WO2016028828A1 - Head-mounted display with electrochromic dimming module for augmented and virtual reality perception - Google Patents
- Publication number
- WO2016028828A1 (PCT/US2015/045779)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- dimming
- value
- dimming module
- ambient light
- eye display
- Prior art date
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/10—Intensity circuits
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02F—OPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
- G02F1/00—Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
- G02F1/01—Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour
- G02F1/15—Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on an electrochromic effect
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0118—Head-up displays characterised by optical features comprising devices for improving the contrast of the display / brillance control visibility
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/02—Improving the quality of display appearance
- G09G2320/0233—Improving the luminance or brightness uniformity across the screen
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2360/00—Aspects of the architecture of display systems
- G09G2360/14—Detecting light within display terminals, e.g. using a single or a plurality of photosensors
- G09G2360/144—Detecting light within display terminals, e.g. using a single or a plurality of photosensors the light being ambient light
Definitions
- a near-eye display (NED) device may be worn by a user for experiences such as an augmented reality (AR) experience and a virtual reality (VR) experience.
- a NED device may include a display module that may provide a computer-generated image (CGI), or other information, in a near-eye display of the NED device.
- CGI computer-generated image
- a near-eye display of a NED device may include an optical see-through lens to allow a CGI to be superimposed on a real-world view of a user.
- a NED device may be included in a head-mounted display (HMD).
- HMD head-mounted display
- a HMD having a NED device may be in the form of a helmet, visor, glasses, or goggles, or may be attached by one or more straps.
- HMDs may be used in at least aviation, engineering, science, medicine, computer gaming, video, sports, training, simulations and other applications.
- the technology provides embodiments of a dimming module for a near-eye display (NED) device that controls an amount of ambient light that passes through the transmissive (or optical see-through) near-eye display to a user.
- the dimming module includes at least one electrochromic cell that enables variable density dimming (or selectable dimming levels/ambient light transmission percentages) so that the NED device may be used in an augmented reality (AR) and/or virtual reality (VR) application.
- AR augmented reality
- VR virtual reality
- An AR application may prefer partial dimming of ambient light or a transmissive (clear) state, while a VR application may prefer a darkened (opaque) state.
- a NED device having a dimming module may be included in a visor, or other type of head-mounted display (HMD).
- a NED device may be disposed on a support structure of a HMD, such as a frame of a visor or glasses.
- the dimming module includes at least one monochromic electrochromic cell with stacked layers of material.
- the stacked layers include a monochrome electrochromic compound and an insulator sandwiched between a pair of transparent substrates and conductors.
- the transparent substrates may be lenses of a HMD.
- An amount of current may be applied from a control circuit to the conductor layers to control an amount of dimming in response to a dimming value.
- the dimming value may be determined based upon an ambient light value from an ambient light sensor, user preference value, type of executing application, near-eye display brightness value, dimming module neutral density (ND) range (dimmable range) and near-eye display brightness range, singly or in combination.
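The combination of inputs described above can be sketched as a single function; the 10000-lux normalizer, the linear blend, and all parameter names are illustrative assumptions, not taken from the patent:

```python
def determine_dimming_value(ambient_lux, display_brightness,
                            brightness_range, nd_range, user_bias=0.0):
    """Combine ambient light, display brightness, the display's brightness
    range and the module's dimmable (ND) range into one dimming value.

    brightness_range: (min, max) luminance the near-eye display can produce.
    nd_range: (min, max) selectable dimming values of the module.
    """
    _, max_brightness = brightness_range
    nd_min, nd_max = nd_range
    # A display with more brightness headroom needs less ambient dimming.
    headroom = display_brightness / max_brightness if max_brightness else 0.0
    demand = min(1.0, ambient_lux / 10000.0) * (1.0 - headroom)
    value = nd_min + demand * (nd_max - nd_min) + user_bias
    return max(nd_min, min(nd_max, value))  # clamp to the module's range
```

The user preference enters here as a simple additive bias; a real implementation could equally treat it as an override of the computed value.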
- multiple monochromic electrochromic cells may be stacked and used to increase a dimmable range.
- an apparatus comprises a support structure, a near-eye display to provide image light and a dimming module.
- the dimming module comprises: a first transparent substrate, a first transparent conductor, a monochrome electrochromic compound, an insulator, a second transparent conductor and a second transparent substrate.
- the dimming module controls light passing through the dimming module in response to a current applied between the first and second transparent conductors.
- the technology also provides one or more embodiments of a method comprising sensing an ambient light value associated with an amount of ambient light received by a transmissive near-eye display.
- An application is executed to provide image light to the transmissive near-eye display.
- a near-eye display brightness value associated with the application is retrieved and a dimming value for a dimming module is determined in response to the ambient light value and the near-eye display brightness value.
- the amount of ambient light that passes through the dimming module is limited in response to the dimming value.
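The claimed method steps (sense, retrieve, determine, limit) can be sketched as one control iteration; the sensor, display and module interfaces and the stub class are hypothetical names for illustration only:

```python
class FakeDevice:
    """Stand-in for the sensor, display and dimming module interfaces
    (hypothetical; real parts would wrap hardware registers)."""
    def __init__(self, lux, brightness):
        self.lux, self.brightness, self.applied = lux, brightness, None
    def read_lux(self):
        return self.lux                  # sensed ambient light value
    def brightness_value(self):
        return self.brightness           # near-eye display brightness value
    def apply(self, dimming_value):
        self.applied = dimming_value     # limit ambient light through module

def dimming_step(sensor, display, module, determine):
    """One iteration: sense ambient light, retrieve the display brightness
    value, determine a dimming value, then limit transmitted ambient light."""
    ambient = sensor.read_lux()
    brightness = display.brightness_value()
    dimming_value = determine(ambient, brightness)
    module.apply(dimming_value)
    return dimming_value

# In this sketch one object plays all three roles:
hmd = FakeDevice(lux=12000.0, brightness=2000.0)
dimming_step(hmd, hmd, hmd, lambda a, b: min(1.0, a / (a + b)))
```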
- the technology also provides one or more embodiments including a computing system and a HMD having a near-eye display.
- the computing system provides an electronic signal representing image data.
- the HMD provides image data in response to the electronic signal.
- the HMD includes a NED device comprising a display engine to output the image data, a transmissive near-eye display to provide image light in response to the image data, a dimming module and a control circuit.
- the dimming module controls an amount of ambient light that passes through the dimming module in response to a current.
- the control circuit provides the current in response to a dimming value.
- a dimming module comprises first and second transparent conductors and a monochrome electrochromic cell. Current is applied to the first and second conductors from a control circuit.
- Figure 1 is a block diagram depicting example components of an embodiment of a near-eye display (NED) device system.
- NED near-eye display
- Figure 2 is a block diagram of example hardware components in control circuitry of a NED device.
- Figure 3 is a top view of an embodiment of a near-eye display having a dimming module.
- Figure 4 illustrates a single monochrome cell in a dimming module used in a near-eye display.
- Figure 5 illustrates multiple monochrome cells in a dimming module used in a near-eye display.
- Figure 6 is Table I, illustrating dimming ranges for a dimming module having one to four monochrome cells.
- Figure 7 is a block diagram of an embodiment of a system from a software perspective for displaying image data by a NED device having a dimming module.
- Figures 8A-D are flowcharts of embodiments of methods for operating a NED device and/or NED device system having a dimming module.
- Figure 9 is a block diagram of one embodiment of a computing system that can be used to implement a network accessible computing system, a companion processing module or control circuitry of a NED device.
- the technology provides embodiments of a dimming module for a NED device that controls an amount of ambient light that passes through the transmissive (or optical see-through) near-eye display to a user.
- the dimming module includes at least one electrochromic cell that enables variable density dimming (or selectable dimming levels/ambient light transmission percentages) so that the NED device may be used in an AR and/or VR application.
- An AR application may prefer partial dimming of ambient light or a transmissive (clear) state, while a VR application may prefer a darkened (opaque) state.
- a NED device having a dimming module may be included in a visor, or other type of HMD.
- a NED device may be disposed on a support structure of a HMD, such as a frame of a visor or glasses, or internal supporting structures, such as the structure supporting a NED display module waveguide.
- a HMD, such as glasses, used for VR applications blocks out the wearer's ambient light environment, enabling the wearer to see the virtual content (or images) without the impediment of ambient light from nearby lamps, windows, etc.
- in AR applications it may be important for a user to see the ambient environment together with augmented imagery.
- a user may wish to view image data, such as direction guidance, while walking down a hallway to a meeting.
- a HMD having a dimming module may be used for both AR and VR applications.
- a dimming module embodiment controls the amount of ambient light for the AR application while blocking the ambient light for the VR applications.
- a dimming module may also have variable density levels because different HMDs may have different near-eye display brightness ranges for AR images due to different display technologies and battery capabilities.
- a dimming module embodiment may then regulate the ambient illuminance so users can see a range of tones in the augmented imagery in a HMD having a particular near-eye display brightness range.
- a dimming module may have a dimming range from fully (or almost) dark to fully (or almost completely) transmissive.
- a dimming module includes at least one monochrome electrochromic cell.
- the dimming module may be formed in a curve shape to wrap around a user's field of view (FOV).
- the dimming module may have relatively low power consumption so as to minimize the weight of batteries used in a HMD.
- a dimming module may have a relatively quick response time, such as about 100 ms, and may not create excessive heat that may interfere with lightweight plastic support structures that may be used in HMDs.
- a monochrome electrochromic cell in the dimming module includes a plurality of materials formed in stacked layers.
- a dimming module includes a plurality of monochrome electrochromic cells oriented in a stacked manner so as to increase a dimming range toward the darkened state.
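Since ambient light must traverse every cell in the stack, the combined transmission is the product of the per-cell transmissions, which is why stacking extends the range toward the darkened state. A sketch with illustrative per-cell figures (not the values of the patent's Table I):

```python
def stacked_range(per_cell_dark, per_cell_clear, n_cells):
    """Transmissions multiply cell by cell, for both end states."""
    return per_cell_dark ** n_cells, per_cell_clear ** n_cells

# Each added cell deepens the achievable dark state while only modestly
# lowering the clear state:
for n in range(1, 5):
    dark, clear = stacked_range(0.20, 0.90, n)
    print(f"{n} cell(s): {dark:.1%} (dark) to {clear:.1%} (clear)")
```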
- a monochrome electrochromic cell has one or more particular materials that change color or opacity when a charge or current is applied. The charge causes ion migration from one area to another, causing a decided change in the material's visible characteristics. Electrochromic cells may have faster or slower switching speed than liquid crystal display (LCD) materials.
- LCD liquid crystal display
- a monochrome electrochromic cell has process temperatures that may not exceed about 120 °C, so as to be adaptable to lightweight plastic substrates.
- a monochrome electrochromic cell does not include an active matrix thin-film transistor (TFT) backplane and/or reflective layer, such as titanium dioxide (TiO2).
- TFT active matrix thin- film transistor
- TiO2 Titanium dioxide
- Figure 1 is a block diagram depicting example components of an embodiment of a near-eye display (NED) device system 8 having a dimming module.
- a NED device system 8 includes a near-eye display (NED) device in a head-mounted display (HMD) device 2 and companion processing module 4.
- HMD 2 is communicatively coupled to companion processing module 4.
- HMD 2 includes a NED device having a projection light engine 120 and near-eye display 14r-l (both shown in Figure 3) having a waveguide as described in detail herein.
- HMD 2 is in the shape of eyeglasses having a frame 115, with each display optical system 14l and 14r positioned at the front of the HMD 2 to be seen through by each eye when worn by a user.
- Each display optical system 14l and 14r is also referred to as a display or near-eye display 14, and the two display optical systems 14l and 14r together may also be referred to as a display or near-eye display 14.
- near-eye display 14 is a transmissive (or optical see-through) near-eye display in which a user may see through the near-eye display.
- near-eye display 14 is curved so as to wrap around a user's visual FOV.
- each display optical system 14l and 14r uses a projection display in which image data (or image light) is projected to generate a display of the image data so that the image data appears to the user at a location in a three-dimensional FOV in front of the user.
- image data or image light
- a user may be playing a shoot down enemy helicopter game in an optical see-through mode in his living room.
- An image of a helicopter appears to the user to be flying over a chair in his living room, not between optional see-through lenses 116 and 118, shown in Figure 3, as a user cannot focus on image data that close to the human eye.
- frame 115 provides a convenient eyeglass frame holding elements of the HMD 2 in place as well as a conduit for electrical connections.
- frame 115 provides a support structure for a projection light engine 120 and a near-eye display 14 as described herein.
- Some other examples of NED device support structures are a helmet, visor frame, goggles support or one or more straps.
- the frame 115 includes a nose bridge 104, a front top cover section 117, a respective projection light engine housing 130 for each of a left side housing (130l) and a right side housing (130r) of HMD 2, as well as left and right temples or side arms 102l and 102r which are designed to rest on each of a user's ears.
- nose bridge 104 includes a microphone 110 for recording sounds and transmitting audio data to control circuit 136.
- an ambient light sensor 257a is also disposed in nose bridge 104 to sense ambient light and provide an ambient light value 244c to memory 244 in control circuit 136 (as shown in Figure 2).
- On the exterior of the side housings 130l and 130r are respective outward facing cameras 113l and 113r which capture image data of the real environment in front of the user for mapping what is in a FOV of a near-eye display (NED) device.
- NED near-eye display
- dashed lines 128 are illustrative examples of some electrical connection paths which connect to control circuit 136, also illustrated in dashed lines.
- One dashed electrical connection line is labeled 128 to avoid overcrowding the drawing.
- the electrical connections and control circuit 136 are in dashed lines to indicate they are under the front top cover section 117 in this example.
- Some examples of connectors 129 as screws are illustrated which may be used for connecting the various parts of the frame together.
- the companion processing module 4 may take various embodiments.
- companion processing module 4 is in a portable form which may be worn on the user's body, e.g. a wrist, or be a separate portable computing system like a mobile device (e.g. smartphone, tablet, laptop).
- the companion processing module 4 may communicate using a wire or wirelessly over one or more communication network(s) 50 to one or more network accessible computing system(s) 12, whether located nearby or at a remote location.
- the functionality of the companion processing module 4 may be integrated in software and hardware components of HMD 2.
- One or more network accessible computing system(s) 12 may be leveraged for processing power and remote data access.
- the complexity and number of components may vary considerably for different embodiments of the network accessible computing system(s) 12 and the companion processing module 4.
- a NED device system 1000 may include near-eye display (NED) device system 8 (with or without companion processing module 4), communication network(s) 50 and network accessible computing system(s) 12.
- network accessible computing system(s) 12 may be located remotely or in a Cloud operating environment.
- Image data is identified for display based on an application (e.g. a game, conferencing application, movie playing application, messaging application) executing on one or more processors in control circuit 136, companion processing module 4 and/or network accessible computing system(s) 12 (or a combination thereof) to provide image data to near-eye display 14.
- an application e.g. a game, conferencing application, movie playing application, messaging application
- FIG. 2 is a block diagram of example hardware components including a computing system within a control circuit of a NED device.
- Control circuit 136 provides various electronics that support the other components of HMD 2.
- the control circuit 136 for a HMD 2 comprises a processing unit 210 and a memory 244 accessible to the processing unit 210 for storing processor readable instructions and data.
- a network communication module 137 is communicatively coupled to the processing unit 210 which can act as a network interface for connecting HMD 2 to another computing system such as the companion processing module 4, a computing system of another NED device or one which is remotely accessible over the Internet.
- a power supply 239 provides power for the components of the control circuit 136 and the other components of the HMD 2 like the capture devices 113, the microphone 110, ambient light sensor 257a, other sensor units, and for power drawing components for displaying image data on near-eye display 14 such as light sources and electronic circuitry associated with an image source like a microdisplay in a projection light engine 120.
- the processing unit 210 may comprise one or more processors (or cores) such as a central processing unit (CPU) or core and a graphics processing unit (GPU) or core. In embodiments without a separate companion processing module 4, processing unit 210 may contain at least one GPU.
- Memory 244 is representative of the various types of memory which may be used by the system such as random access memory (RAM) for application use during execution, buffers for sensor data including captured image data, ambient light data and display data, read only memory (ROM) or Flash memory for instructions and system data, and other types of nonvolatile memory for storing applications and user profile data, for example.
- memory 244 includes processor readable information, such as digital data values or processor readable instructions.
- memory 244 may include at least a portion of an application 244a that may be executing.
- Memory 244 may also include a user preference value 244b that indicates a user preference of a dimming value or dimming environment for HMD 2.
- NUI natural user interface
- a stored user preference value 244b may be increased or decreased by a predetermined step size which then may result in the dimming module increasing or decreasing dimming.
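A minimal sketch of stepping the stored user preference value by a predetermined step size, assuming a clamped numeric range (the step size and bounds are illustrative):

```python
def adjust_preference(value, direction, step=0.1, lo=0.0, hi=1.0):
    """Step a stored user preference value (e.g. 244b) up or down by a
    predetermined step size, clamped to the supported range."""
    value += step if direction == "increase" else -step
    return max(lo, min(hi, value))
```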
- Memory 244 also includes an ambient light value 244c, received from ambient light sensor 257a, that indicates an amount of ambient light received by ambient light sensor 257a, or in particular an amount of ambient light received by an external curved surface 290 of near-eye display 14l (shown in Figure 3).
- Memory 244 may also include a near-eye display brightness range 244d and dimming module neutral density (ND) range 244e.
- Memory 244 may also include a dimming (software component) 244f to calculate or determine a dimming value based on one or more inputs, such as user preference value 244b, dimming module neutral density (ND) value 249a, display brightness range 244d and/or ambient light value 244c.
- display brightness range 244d represents the capable brightness or luminous range of a near-eye display 14.
- a near-eye display may have a brightness (or luminance) range from zero candelas per square meter (cd/m²) to 8000 cd/m².
- dimming module ND range 244e represents the range of selectable dimming or transmission rates (or percentages) of ambient light traveling through a dimming module.
- dimming module ND range 244e may range from allowing approximately 4% transmission of ambient light (dark state) to allowing approximately 84% transmission of ambient light (clear or transmissive state) through a dimming module or near-eye display having a dimming module.
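If the dimming levels are expressed as neutral-density values, transmission and ND relate by T = 10^(-ND). A small sketch converting the description's example endpoints (the function names are ours, not the patent's):

```python
import math

def nd_to_transmission(nd):
    """Fractional transmission for a neutral-density value: T = 10**(-ND)."""
    return 10.0 ** (-nd)

def transmission_to_nd(t):
    """Inverse mapping: ND = -log10(T)."""
    return -math.log10(t)

# The description's example endpoints:
print(f"~4% dark state   -> ND {transmission_to_nd(0.04):.2f}")  # ND 1.40
print(f"~84% clear state -> ND {transmission_to_nd(0.84):.2f}")  # ND 0.08
```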
- digital data values and/or processor readable instructions illustrated in Figure 2 may be stored and/or executed in alternate locations in a NED device system 1000.
- Figure 2 illustrates an electrical connection of a data bus 270 that connects sensor units 257, display driver 246, processing unit 210, memory 244, and network communication module 137.
- Data bus 270 also derives power from power supply 239 through a power bus 272 to which all the illustrated elements of the control circuit are connected for drawing power.
- control circuit 136 may include circuits that are specific to particular types of image generation or near-eye display technologies.
- control circuit 136 may include microdisplay circuitry 259 and display illumination driver 247 for a microdisplay described in detail below.
- other types of control circuits may be used in control circuit 136 for different near-eye display technologies, such as a retina scanning display (RSD) that does not have a microdisplay in an embodiment.
- RSD retina scanning display
- a RSD would receive electronic signals from a display engine rather than a projection light engine. A RSD would then provide image light to a user's retina by scanning image light in response to image data from a display engine.
- Control circuit 136 in at least a microdisplay embodiment, comprises a display driver 246 for receiving digital control data (e.g. control bits) to guide reproduction of image data that may be decoded by microdisplay circuitry 259, display illumination driver 247 and dimming module driver 249.
- the display illumination driver 247 drives an illumination unit 222 (in a projection light engine) in response to a display brightness value 247a in an embodiment.
- a projection light engine may include a microdisplay that may be an active transmissive, emissive or reflective device.
- a microdisplay may be a liquid crystal on silicon (LCoS) device requiring power or a micromechanical machine (MEMs) based device requiring power to move individual mirrors.
- LCoS liquid crystal on silicon
- MEMs micromechanical machine
- a display illumination driver 247 converts digital control data to analog signals for driving an illumination unit 222 which includes one or more light sources, such as one or more lasers or light emitting diodes (LEDs).
- a display brightness value 247a is stored in display driver 246.
- display brightness value 247a may be stored in memory 244.
- display brightness value 247a represents the current display brightness of the near-eye display 14.
- a near-eye display has brightness that corresponds to display brightness value 247a.
- one or more drivers provide one or more analog signals to a near-eye display to provide a corresponding display brightness in response to a display brightness value 247a.
- control circuit 136 includes a dimming module driver 249 to drive dimming module 198 (shown in Figure 3) in response to display driver 246.
- display driver 246 stores a dimming module ND value 249a.
- dimming module ND value 249a is stored in memory 244.
- dimming module driver 249 provides a predetermined amount of current for a predetermined amount of time to dimming module 198 in response to a dimming module ND value 249a or user preference value 244b.
- dimming module driver 249 includes a table to store digital values associated with a predetermined amount of current to apply to dimming module 198 in response to corresponding digital values in dimming module ND value 249a.
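The driver's table can be pictured as a small map from digital ND values to drive parameters. The following Python sketch is purely illustrative: the ND codes, current levels, and durations are assumptions for demonstration, not values from the specification.

```python
# Hypothetical sketch of dimming module driver 249's lookup table:
# each digital ND value maps to a (current_mA, duration_ms) drive pair
# applied to the dimming module's transparent conductors.
# All numeric values below are illustrative assumptions.
DRIVE_TABLE = {
    0: (0.0, 0),      # full transmission state: no drive current
    1: (1.5, 120),    # light dimming
    2: (3.0, 150),    # moderate dimming
    3: (4.5, 180),    # full darkened state
}

def drive_for_nd_value(nd_value: int) -> tuple[float, int]:
    """Return the predetermined (current, time) pair for a digital ND value."""
    if nd_value not in DRIVE_TABLE:
        raise ValueError(f"unsupported ND value: {nd_value}")
    return DRIVE_TABLE[nd_value]
```

The table form mirrors the described behavior: a predetermined amount of current for a predetermined amount of time per stored digital value.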
- the control circuit 136 may include other control units not illustrated here but related to other functions of a HMD 2 such as providing audio output, identifying head orientation and location information.
- Figure 3 is a top view of an embodiment of a near-eye display 141 being coupled with a projection light engine 120 having an external exit pupil 121. In order to show the components of the display optical system 14, in this case 141 for the left eye, a portion of the top frame section 117 covering the near-eye display 141 and the projection light engine 120 is not depicted. Arrow 142 represents an optical axis of the near-eye display 141.
- the near-eye displays 141 are optical see-through displays. In other embodiments, they can be video-see displays.
- Each display includes a display unit 112 illustrated between two optional see-through lenses 116 and 118 and including a waveguide 123.
- the optional see-through lenses 116 and 118 serve as protective coverings and/or to form an image for the viewer at a comfortable visual distance, such as 1 m ahead, for the display unit.
- One or both of them may also be used to implement a user's eyeglass prescription.
- eye space 140 approximates a location of a user's eye when HMD 2 is worn.
- the waveguide directs image data in the form of image light from a projection light engine 120 towards a user eye space 140 while also allowing ambient light 170 from the real world to pass through towards a user's eye space, thereby allowing a user to have an actual direct view of the space in front of HMD 2 in addition to seeing an image of a virtual feature from the projection light engine 120.
- the projection light engine 120 includes a birdbath optical element 234 illustrated as a curved surface.
- the curved surface provides optical power to the beams 235 of image light it reflects, thus collimating them as well. Only one beam is labeled to prevent overcrowding the drawing.
- the radius of curvature of the birdbath optical element is at least -38 millimeters (mm).
- a waveguide 123 may be a diffractive waveguide. Additionally, in some examples, a waveguide 123 is a surface relief grating (SRG) waveguide. An input grating 119 couples an image light from a projection light engine 120. Additionally, a waveguide has a number of exit gratings 125 for an image light to exit the waveguide in the direction of a user eye space 140. One exit grating 125 is labeled to avoid overcrowding the drawing.
- the projection light engine 120 in a left side housing 1301 includes an image source, for example a microdisplay, which produces the image light and a projection optical system which folds an optical path of the image light to form an exit pupil 121 external to the projection light engine 120.
- the shape of the projection light engine 120 is an illustrative example adapting to the shape of the example of left side housing 1301 which conforms around a corner of the frame 115 in Figure 1 reducing bulkiness. The shape may be varied to accommodate different arrangements of the projection light engine 120, for example due to different image source technologies implemented.
- a microdisplay can be implemented using a transmissive projection technology.
- a light source is modulated by optically active material; such microdisplays are usually implemented as transmissive LCD type microdisplays with powerful backlights and high optical energy densities.
- Other microdisplays use a reflective technology for which light from an illumination unit is reflected and modulated by an optically active material.
- the illumination may be a white source or RGB source, depending on the technology.
- Digital light processing (DLP), digital micromirror device (DMD), and LCoS are all examples of reflective technologies which are efficient as most energy is reflected away from the modulated structure and may be used by the display.
- a microdisplay can be self-emitting, such as a color-emitting organic light emitting diode (OLED) microdisplay or an array of LEDs.
- LED arrays may be created conventionally on gallium nitride (GaN) substrates with a phosphor layer for spectral conversion or other color conversion method.
- a self-emissive display is relayed and magnified for a viewer.
- FIG 3 shows half of a HMD 2.
- a full HMD 2 may include another display optical system 14 with another set of optional see-through lenses 116 and 118, another waveguide 123, as well as another projection light engine 120, and another outward facing capture device 113.
- a single projection light engine 120 may be optically coupled to a continuous display viewed by both eyes or be optically coupled to separate displays for the eyes. Additional details of a head mounted personal A/V apparatus are illustrated in United States Patent Application Serial No. 12/905952, entitled Fusing Virtual Content Into Real Content, filed October 15, 2010.
- FIG. 4 illustrates a monochrome electrochromic cell 400 used in a dimming module 198 according to an embodiment.
- a dimming module 198 includes a single or a plurality of monochrome electrochromic cells.
- monochrome electrochromic cell 400 includes additional materials as well.
- Monochrome electrochromic cell 400 includes a stacked plurality of layered substrates.
- the stacked plurality of layered substrates are at least partially contiguous.
- one or more other material is interspersed between the depicted layers.
- Substrates 450a-b are a clear or transparent substrate layer in an embodiment.
- substrates 450a-b are an outer surface of dimming module 198 and are a moldable transparent material, such as poly(methyl methacrylate) (PMMA), polycarbonate, glass or similar material.
- substrates 450a-b correspond to lenses 116 and 118 shown in Figure 3.
- substrates 451a-b are transparent conductive layers to conduct current.
- substrates 451a-b include a material selected from a group consisting of Indium Tin Oxide, metal mesh, silver nanowires, carbon nanotubes, graphene and PEDOT:PSS (Poly(3,4- ethylenedioxythiophene) polystyrene sulfonate).
- substrates 451a-b are coupled to a current source, such as dimming module driver 249 to receive a predetermined amount of current for a predetermined amount of time.
- substrate 453a is a monochrome electrochromic compound layer.
- substrate 454a is an insulator, such as silicon dioxide (SiO2), and is disposed between substrate 453a and substrate 451b.
- a substrate including an electrolyte is included in monochrome electrochromic cell 400.
- a monochrome electrochromic cell 400 has one or more particular materials that change color or opacity when a charge or current is applied. The charge causes ion migration from one area to another, causing a decided change in the material's visible characteristics.
- monochrome electrochromic cell 400 is manufactured or processed using a variety of process steps.
- one or more substrates are formed using one or more spin coating process steps.
- some substrates may be formed by a spin coating process step, while other substrates may be assembled or affixed to each other.
- Figure 5 illustrates a multiple monochrome cell embodiment in a dimming module used in a near-eye display.
- a dimming module 198 includes a plurality of monochrome electrochromic cells 500. Multiple monochrome cells may be used to increase a dimmable range toward a darkened state as illustrated in Figure 6. In an embodiment, multiple monochrome cells may be used in an application when a relatively high darkened state is desirable and a relatively lower transmissive state is satisfactory.
- Table I of Figure 6 illustrates that a single monochrome cell may have a dimmable range of between approximately 4% (darkened state, 4% of ambient light is allowed to pass) to approximately 83.7% (transmissive state, 83.7% of ambient light is allowed to pass); a two monochrome cell may have a dimmable range of between approximately 0.2% (darkened state) to approximately 70% (transmissive state); a three monochrome cell may have a dimmable range of between approximately 0.0064% (darkened state) to approximately 58.6% (transmissive state); and a four monochrome cell may have a dimmable range of between approximately 0.0003% (darkened state) to approximately 49.0% (transmissive state).
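The Table I figures are consistent with each added cell multiplying the stack's transmission: raising the single-cell values of approximately 83.7% (transmissive) and 4% (darkened) to the number of stacked cells reproduces the listed ranges. The following Python sketch checks this, under the assumption (not stated explicitly in the text) that stacked cells attenuate light multiplicatively:

```python
def stacked_range(n_cells, t_clear=0.837, t_dark=0.04):
    """Transmissive and darkened transmission fractions for n stacked cells.

    Assumes each cell passes ~83.7% of light when clear and ~4% when
    darkened, and that stacked cells attenuate multiplicatively.
    """
    return t_clear ** n_cells, t_dark ** n_cells

for n in range(1, 5):
    clear, dark = stacked_range(n)
    print(f"{n} cell(s): transmissive {clear:.1%}, darkened {dark:.4%}")
```

For three cells this yields approximately 58.6% and 0.0064%, matching Table I.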
- Figure 5 illustrates a four monochrome cell embodiment or a plurality of monochrome electrochromic cells 500.
- a plurality of monochrome electrochromic cells 500 include substrates 450a-b, 451a-b, 453a and 454a as similarly described herein regarding the single monochrome cell 400 shown in Figure 4.
- substrates 450a-b are clear or transparent substrate layers in an embodiment.
- substrates 450a-b are an outer surface of dimming module 198 and are a moldable transparent material, such as poly(methyl methacrylate) (PMMA), polycarbonate, glass or similar material.
- a plurality of monochrome electrochromic cells 500 includes additional stacked substrates 451c-e that may also be transparent conductor substrates coupled to a controllable current source as described herein. Additional substrate layers 453b-d and 454b-d may be respective monochrome electrochromic compound layers and insulator layers, as similarly described in regard to Figure 4.
- the stacked plurality of layered substrates in a plurality of monochrome electrochromic cells 500 are at least partially contiguous. In an alternate embodiment, one or more other material is interspersed between the depicted layers.
- a plurality of monochrome electrochromic cells 500 may be manufactured or processed using a variety of process steps.
- one or more substrates are formed using one or more spin coating process steps.
- some substrates may be formed by a spin coating process step, while other substrates may be assembled or affixed to each other.
- Figure 7 is a block diagram of an embodiment of a system from a software perspective for displaying image data or light (such as a CGI) by a near-eye display device.
- Figure 7 illustrates an embodiment of a computing environment 54 from a software perspective which may be implemented by a system like NED system 8, network accessible computing system(s) 12 in communication with one or more NED systems or a combination thereof. Additionally, a NED system can communicate with other NED systems for sharing data and processing resources.
- an executing application determines which image data is to be displayed, some examples of which are text, emails, virtual books or game related images.
- an application 162 may be executing on one or more processors of the NED system 8 and communicating with an operating system 190 and an image and audio processing engine 191.
- a network accessible computing system(s) 12 may also be executing a version 162N of the application as well as other NED systems 8 with which it is in communication for enhancing the experience.
- Application 162 includes a game in an embodiment.
- the game may be stored on a remote server and purchased from a console, computer, or smartphone in embodiments.
- the game may be executed in whole or in part on the server, console, computer, smartphone or on any combination thereof.
- Multiple users might interact with the game using standard controllers, computers, smartphones, or companion devices and use air gestures, touch, voice, or buttons to communicate with the game in embodiments.
- Application(s) data 329 for one or more applications may also be stored in one or more network accessible locations.
- Some examples of application(s) data 329 may be one or more rule data stores for rules linking action responses to user input data, rules for determining which image data to display responsive to user input data, reference data for natural user input like for one or more gestures associated with the application which may be registered with a gesture recognition engine 193, execution criteria for the one or more gestures, voice user input commands which may be registered with a sound recognition engine 194, physics models for virtual objects associated with the application which may be registered with an optional physics engine (not shown) of the image and audio processing engine 191, and object properties like color, shape, facial features, clothing, etc. of the virtual objects and virtual imagery in a scene.
- the software components of a computing environment 54 comprise the image and audio processing engine 191 in communication with an operating system 190.
- the illustrated embodiment of an image and audio processing engine 191 includes an object recognition engine 192, gesture recognition engine 193, display data engine 195, a sound recognition engine 194, and a scene mapping engine 306.
- the individual engines and data stores provide a supporting platform of data and tasks which an application(s) 162 can leverage for implementing its one or more functions by sending requests identifying data for processing and receiving notification of data updates.
- the operating system 190 facilitates communication between the various engines and applications.
- the operating system 190 makes available to applications which objects have been identified by the object recognition engine 192, gestures the gesture recognition engine 193 has identified, which words or sounds the sound recognition engine 194 has identified, and the positions of objects, real and virtual from the scene mapping engine 306.
- the computing environment 54 also stores data in image and audio data buffer(s) 199 which provide memory for image data and audio data which may be captured or received from various sources as well as memory space for image data to be displayed.
- the buffers may exist on both the NED, e.g. as part of the overall memory 244, and may also exist on the companion processing module 4.
- virtual data (or a virtual image) is to be displayed in relation to a real object in the real environment.
- the object recognition engine 192 of the image and audio processing engine 191 detects and identifies real objects, their orientation, and their position in a display FOV based on captured image data and captured depth data from outward facing image capture devices 113 if available or determined depth positions from stereopsis based on the image data of the real environment captured by the capture devices 113.
- the object recognition engine 192 distinguishes real objects from each other by marking object boundaries, for example using edge detection, and comparing the object boundaries with structure data 200. Besides identifying the type of object, an orientation of an identified object may be detected based on the comparison with stored structure data 200.
- structure data 200 may store structural information such as structural patterns for comparison and image data as references for pattern recognition. Reference image data and structural patterns may also be available in user profile data 197 stored locally or accessible in Cloud based storage.
- the scene mapping engine 306 tracks the three dimensional (3D) position, orientation, and movement of real and virtual objects in a 3D mapping of the display FOV.
- Image data is to be displayed in a user's FOV or in a 3D mapping of a volumetric space about the user based on communications with the object recognition engine 192 and one or more executing application(s) 162 causing image data to be displayed.
- An application(s) 162 identifies a target 3D space position in the 3D mapping of the display FOV for an object represented by image data and controlled by the application.
- the helicopter shoot down application identifies changes in the position and object properties of the helicopters based on the user's actions to shoot down the virtual helicopters.
- the display data engine 195 performs translation, rotation, and scaling operations for display of the image data at the correct size and perspective.
- the display data engine 195 relates the target 3D space position in the display FOV to display coordinates of the display unit 112.
- the display data engine may store image data for each separately addressable display location or area (e.g. a pixel) in a Z-buffer and a separate color buffer.
- the display driver 246 translates the image data for each display area to digital control data instructions for microdisplay circuitry 259 or the display illumination driver 247 or both for controlling display of image data by the image source.
- NED system 8 and/or network accessible computing system(s) 12 may be included in an Internet of Things (IoT) embodiment.
- the IoT embodiment may include a network of devices that may have the ability to capture information via sensors. Further, such devices may be able to track, interpret, and communicate collected information. These devices may act in accordance with user preferences and privacy settings to transmit information and work in cooperation with other devices. Information may be communicated directly among individual devices or via a network such as a local area network (LAN), wide area network (WAN), a "cloud" of interconnected LANs or WANs, or across the entire Internet.
- the technology described herein may also be embodied in a Big Data or Cloud operating environment as well.
- in a Cloud operating environment, information including data, images, engines, operating systems, and/or applications described herein may be accessed from a remote storage device via the Internet.
- a modular rented private cloud may be used to access information remotely.
- in a Big Data operating environment, data sets have sizes beyond the ability of typically used software tools to capture, create, manage, and process the data within a tolerable elapsed time.
- image data may be stored remotely in a Big Data operating embodiment.
- a NED device having a dimming module may operate with different applications.
- a user may, but not limited to: 1) play an AR video game; 2) watch a movie; 3) teleconference; and/or 4) view messages or documents.
- the following examples may include methods of operating a NED device that may include one or more steps illustrated in Figures 8A-D described in detail herein.
- a user is playing an AR game using a NED device.
- an ambient light sensor from a HMD having a NED device such as ambient light sensor 257a, senses an ambient light environment of 50 lux. This is an ambient light level that this particular NED device having a dimming module can match. Accordingly, for this particular AR game being played on this particular NED device, no dimming is needed.
- the display brightness level is set to match AR content to the same visual brightness as the user's environment and a dimming module of the NED device is set to the highest transmission state.
- the 50 lux value in the above example is stored as ambient light value 244c in memory 244 and application 224a corresponds to the AR game.
- Dimming 244f executed by processing unit 210 compares this 50 lux value with display brightness range 244d in memory 244. Dimming 244f then determines that dimming module ND value 249a may be set to the full transmission state value in dimming module (ND) range 244e.
- a display brightness value 247a, in display driver 246, is also set by the AR game.
- a user is playing a movie in an airplane using a NED device having a dimming module.
- a dimming module in a NED device is set to a fully darkened state or lowest transmission state.
- dimming 244f executed by processing unit 210 would determine a type of application to be executed, such as a movie player application, and then set dimming module ND value 249a to the full darkened state value in dimming module (ND) range 244e.
- a movie player application corresponds to application 244a and provides its type to dimming 244f.
- a user is teleconferencing with another person on a NED device having a dimming module while also viewing a CGI.
- the user wants to be able to see the person they are speaking to along with the CGI or document during the teleconference.
- the user has set an ambient light user preference for the teleconferencing application such that the user may easily also see their environment as well.
- An ambient light sensor such as ambient light sensor 257a, senses an ambient light environment of 3000 lux and stores a corresponding value in ambient light value 244c.
- This particular NED display cannot generate a brightness level greater than 800 cd/m² (as determined by display brightness range 244d in an embodiment), so the dimming module is used to filter the environment to a brightness level that the displayed information can match.
- a processor such as processing unit 210, makes the following calculation to determine a dimming value (such as dimming module ND value 249a) after comparing the ambient light value to a display brightness range:
- the calculated or determined dimming value, 83.7%, is within the density range of the dimming module (or compared with dimming module ND range 244e), so the desired dimming level is converted to a dimming module drive level (or dimming module ND value 249a in an embodiment).
- the dimming level is set to the closest level within capabilities.
- the dimming module drive level (or dimming module ND value 249a in an embodiment) is then determined by mapping the desired density result through the dimming module's drive vs. density characteristic function. In an embodiment, this calculation may be done with a look-up table in display driver 246. Accordingly, the dimming module and the display brightness may both be set to their desired states.
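The calculation itself is not reproduced in the text, but the 83.7% result can be recovered if the sensed illuminance is converted to an equivalent luminance by dividing by π (a Lambertian-surface assumption, not stated in the text) and compared with the display's maximum brightness:

```python
import math

ambient_lux = 3000.0       # sensed by an ambient light sensor such as 257a
display_max_nits = 800.0   # upper bound of display brightness range 244d

# Assumption: ambient illuminance maps to an equivalent scene luminance
# of lux / pi (Lambertian surface), giving ~955 cd/m^2 here.
ambient_nits = ambient_lux / math.pi

# Transmission needed so the filtered environment does not exceed the
# brightness the display can match: ~0.837, i.e. the 83.7% dimming value.
dimming_value = display_max_nits / ambient_nits
```

This reconstruction matches the stated 83.7% figure, but the π conversion is a hedged guess at the omitted step rather than the patent's own equation.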
- an office worker has abandoned their multiple desktop monitors for a virtual monitor application provided by a NED device having a dimming module.
- the user's preference is to have the NED device have partial dimming, so that the user can see people who stop by yet allow the virtual monitor to dominate the user's visual attention—not the items on the user's wall or what's going on out the window.
- an ambient light sensor may sense an ambient light value of 400 lux.
- a processor may then calculate or determine a dimming value (or dimming module ND value 249a in an embodiment) based on the ambient light value and user preference value for the virtual monitor application. For example, a user preference value and dimming module may cause the ambient light for this particular NED device to be reduced or limited by 50% when using the virtual monitor application with the above ambient light.
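A minimal sketch of this virtual monitor scenario, with assumed variable names standing in for the stored values (user preference value 244b, ambient light value 244c):

```python
# Virtual monitor scenario: reduce ambient light by 50% per the stored
# user preference value. Names and structure are illustrative assumptions.
ambient_lux = 400.0        # sensed by the ambient light sensor
user_preference = 0.5      # pass 50% of ambient light for this application

dimming_value = user_preference                       # requested transmission
perceived_ambient_lux = ambient_lux * dimming_value   # light reaching the eye
```

With these numbers, 200 lux of ambient light would pass through the dimming module while the virtual monitor dominates the user's attention.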
- Figures 8A-D are flowcharts of embodiment of methods for operating a NED device having a dimming module and/or system having a dimming module.
- the steps illustrated in Figures 8A-D may be performed by hardware components, software components and a user, singly or in combination.
- the method embodiments below are described in the context of the system and apparatus embodiments described above. However, the method embodiments are not limited to operating in the system embodiments described herein and may be implemented in other system embodiments. Furthermore, the method embodiments may be continuously performed while the NED system is in operation and an applicable application is executing.
- Step 801 of method 800 shown in Figure 8A begins by sensing an ambient light value associated with an amount of ambient light received by a transmissive near-eye display.
- ambient light sensor 257a senses ambient light 170 shown in Figures 1 and 2.
- Step 802 illustrates storing the ambient light value.
- ambient light value 244c is stored in memory 244 as illustrated in Figure 2.
- step 802 may be omitted and the ambient light value is not stored.
- Step 803 illustrates executing an application to provide image light to the transmissive near-eye display.
- application 244a is executed by at least processing unit 210 as illustrated in Figure 2.
- Step 804 illustrates retrieving a display brightness value associated with the application.
- a display brightness value may be obtained from application 244a or from display brightness value 247a in display driver 246.
- Step 805 illustrates determining a dimming value for a dimming module in response to the ambient light value and the display brightness value.
- a software component such as dimming 244f being executed by processing unit 210 determines a dimming value for dimming module 198 and stores the value as dimming module ND value 249a in display driver 246.
- Step 806 illustrates limiting the amount of ambient light that passes through the dimming module (and then through a transmissive near-eye display) in response to the dimming value.
- dimming module driver 249 applies a predetermined amount of current for a predetermined amount of time to a dimming module 198, in particular to transparent conductors of a one or more monochrome electrochromic cells in a dimming module 198. The current then causes an electrochromic reaction that limits a predetermined amount of light passing through dimming module 198 in response to dimming module ND value 249a.
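Steps 801 through 806 reduce to a single decision: if the display can match the sensed ambient level, use full transmission; otherwise, filter the environment down to what the display can match. This sketch reuses the assumed Lambertian lux-to-luminance conversion from the teleconference example; the function and parameter names are hypothetical, not from the specification.

```python
import math

def method_800_dimming(ambient_lux: float, display_max_nits: float = 800.0) -> float:
    """Condensed sketch of steps 801-806: return a transmission value in (0, 1].

    Assumes an illustrative lux-to-luminance conversion of lux / pi; the
    conversion and the 800 cd/m^2 default are assumptions for this sketch.
    """
    ambient_nits = ambient_lux / math.pi       # steps 801-802: sense ambient light
    if ambient_nits <= display_max_nits:       # steps 803-805: compare with display
        return 1.0                             # display can match: no dimming
    return display_max_nits / ambient_nits     # step 806: filter the environment
```

Under these assumptions, the 50 lux AR game example yields full transmission and the 3000 lux teleconference example yields roughly 83.7% transmission.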
- method 850 of Figure 8B may replace steps 801-805 of method 800.
- Step 851 illustrates receiving a value that indicates a user preference as to the amount of ambient light that passes through the dimming module.
- a user may provide a user preference by way of a user or natural language user interface as described herein. For example, a user may make a speech command, such as "dim 50%.”
- Step 852 illustrates storing the value that indicates the user preference.
- user preference value 244b is stored in memory 244 as illustrated in Figure 2.
- a natural language or 3D user interface software component receives a spoken or gestured user preference and translates the spoken or gestured preference to a digital value representing that preference as user preference value 244b.
- step 852 may be omitted and the user preference may not be stored.
- Step 853 illustrates determining the dimming value based on the value that indicates a user preference.
- dimming 244f executed by processing unit 210 makes this determination in response to user preference value 244b.
- method 860 shown in Figure 8C may replace steps 801-805 of method 800.
- Step 861 illustrates determining a type of the application.
- an application 244a executed by processing unit 210 provides the type of application to an operating system or dimming 244f.
- dimming 244f executed by processing unit 210 queries application 244a for a type.
- Step 862 illustrates determining the dimming value is based on the type of application.
- dimming 244f executed by processing unit 210 sets a dimming value in response to a type of application that is executing or will be executing.
- dimming 244f sets dimming module ND value 249a to a value corresponding to a dark state when a movie player is executing and may set dimming module ND value 249a to correspond to a transmissive state for a messaging application.
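Step 862's type-based selection can be sketched as a simple mapping; the application-type strings and state names below are illustrative assumptions, not identifiers from the specification:

```python
# Hypothetical mapping from application type to a dimming module state,
# mirroring step 862: movie players get a dark state, messaging stays
# transmissive. Keys and values are made-up placeholders.
ND_BY_APP_TYPE = {
    "movie_player": "dark_state",
    "messaging": "transmissive_state",
}

def nd_value_for(app_type: str, default: str = "transmissive_state") -> str:
    """Return the dimming state selected for an application type."""
    return ND_BY_APP_TYPE.get(app_type, default)
```

A real driver would map these states to entries in dimming module (ND) range 244e rather than strings.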
- method 870 shown in Figure 8D is performed in combination with method 800 in Figure 8A.
- Step 871 illustrates retrieving a display brightness range.
- dimming 244f executed by processing unit 210 retrieves a display brightness range from display brightness range 244d in memory 244.
- Step 872 illustrates retrieving a dimming module neutral density (ND) range.
- dimming 244f executed by processing unit 210 retrieves a dimming module neutral density (ND) range 244e stored in memory 244.
- Step 873 illustrates determining the dimming value in response to the ambient light value, display brightness value, display brightness range and dimming module neutral density range.
- dimming 244f executed by processing unit 210 performs at least part of step 873.
- the dimming value is determined based on the ambient light value, display brightness value, display brightness range and dimming module ND range, singly or in combination thereof, as described herein.
- FIG. 9 is a block diagram of one embodiment of an exemplary computing system 900 that can be used to implement a network accessible computing system(s) 12, a companion processing module 4, or another embodiment of control circuit 136 of a HMD 2.
- Computing system 900 may host at least some of the software components of computing environment 54.
- computing system 900 may be embodied in a Cloud server, server, client, peer, desktop computer, laptop computer, hand-held processing device, tablet, smartphone and/or wearable computing/processing device.
- computing system 900 typically includes one or more processing units (or cores) 902 or one or more central processing units (CPU) and one or more graphics processing units (or cores) (GPU).
- Computing system 900 also includes memory 904.
- memory 904 may include volatile memory 905 (such as RAM), non-volatile memory 907 (such as ROM, flash memory, etc.) or some combination thereof.
- computing system 900 may also have additional features/functionality.
- computing system 900 may also include additional storage (removable and/or non-removable) including, but not limited to, magnetic or optical disks or tape.
- additional storage is illustrated in Figure 9 by removable storage 908 and non-removable storage 910.
- functions performed by processing unit(s) 902 can be performed or executed, at least in part, by one or more other hardware logic components.
- illustrative types of hardware logic components include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (ASICs), Application-specific Standard Products (ASSPs), System-on-a-chip systems (SOCs), Complex Programmable Logic Devices (CPLDs) and other similar hardware logic components.
- Computing system 900 may also contain communication module(s) 912 including one or more network interfaces and transceivers that allow the device to communicate with other computer systems.
- Computing system 900 may also have input device(s) 914 such as keyboard, mouse, pen, microphone, touch input device, gesture recognition device, facial recognition device, tracking device or similar input device.
- Output device(s) 916 such as a display, speaker, printer, or similar output device may also be included.
- a user interface (UI) software component to interface with a user may be stored in and executed by computing system 900.
- computing system 900 stores and executes a NUI and/or 3D UI.
- NUIs include using speech recognition, touch and stylus recognition, gesture recognition both on screen and adjacent to the screen, air gestures, head and eye tracking, voice and speech, vision, touch, hover, gestures, and machine intelligence.
- Specific categories of NUI technologies include for example, touch sensitive displays, voice and speech recognition, intention and goal understanding, motion gesture detection using depth cameras (such as stereoscopic or time-of-flight camera systems, infrared camera systems, RGB camera systems and combinations of these), motion gesture detection using accelerometers/gyroscopes, facial recognition, 3D displays, head, eye, and gaze tracking, immersive augmented reality and virtual reality systems, all of which may provide a more natural interface, as well as technologies for sensing brain activity using electric field sensing electrodes (EEG and related methods).
- a UI (including a NUI or 3D UI) software component may be at least partially executed and/or stored on a computing system 900.
- a UI may be at least partially executed and/or stored on server and sent to a client.
- the UI may be generated as part of a service, and it may be integrated with other services, such as social networking services.
- the example computing systems illustrated in the figures include examples of computer readable storage devices.
- a computer readable storage device is also a processor readable storage device.
- Such devices may include volatile and nonvolatile, removable and non-removable memory devices implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data.
- processor or computer readable storage devices are RAM, ROM, EEPROM, cache, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, memory sticks or cards, magnetic cassettes, magnetic tape, a media drive, a hard disk, magnetic disk storage or other magnetic storage devices, or any other device which can be used to store the information and which can be accessed by a computing system.
- One or more embodiments include an apparatus comprising a support structure, a near-eye display to provide image light and a dimming module.
- the dimming module comprises: a first transparent substrate, a first transparent conductor, a monochrome electrochromic compound, an insulator, a second transparent conductor and a second transparent substrate.
- the dimming module controls light passing through the dimming module in response to a current applied between the first and second transparent conductors.
- the monochrome electrochromic compound and insulator are disposed at least partially between the first transparent conductor and the second transparent conductor.
- the insulator is disposed at least partially adjacent the monochrome electrochromic compound and the first transparent conductor is disposed at least partially adjacent the monochrome electrochromic compound.
- the first transparent conductor is disposed at least partially between the first transparent substrate and the monochrome electrochromic compound.
- the second transparent conductor is disposed at least partially between the insulator and the second transparent substrate.
- the dimming module is disposed adjacent the near-eye display.
- the dimming module includes a third transparent conductor, another monochrome electrochromic compound, another insulator and a fourth transparent conductor.
- the first and second transparent substrates include a material selected from a group consisting of poly(methyl methacrylate) (PMMA), polycarbonate and glass.
- the first and second transparent conductors include a material selected from a group consisting of Indium Tin Oxide, metal mesh, silver nanowires, carbon nanotubes, graphene and PEDOT:PSS (Poly(3,4-ethylenedioxythiophene) polystyrene sulfonate) in embodiments.
- the insulator includes silicon dioxide (SiO2) in an embodiment.
- One or more embodiments of a method comprises sensing an ambient light value associated with an amount of ambient light received by a transmissive near-eye display.
- An application executes to provide image light to the transmissive near-eye display.
- a near-eye display brightness value associated with the application is retrieved and a dimming value for a dimming module is determined in response to the ambient light value and the near-eye display brightness value.
- the amount of ambient light that passes through the dimming module is limited in response to the dimming value.
- the method comprises receiving a value that indicates a user preference as to the amount of ambient light that passes through the dimming module.
- the determining the dimming value is based on the value that indicates the user preference rather than in response to the ambient light value and near-eye display brightness value.
- the method further comprises determining a type of the application and wherein determining the dimming value is based on the type of the application rather than in response to the ambient light value and the near-eye display brightness value.
- the type of the application includes a movie type
- determining the dimming value includes setting the dimming value so that the ambient light that passes through the dimming module is reduced to a minimum.
- the determining the dimming value in response to the ambient light value and the near-eye display brightness value comprises: retrieving a near-eye display brightness range and retrieving a dimming module neutral density range. Determining the dimming value is in response to the ambient light value, near-eye display brightness value, near-eye display brightness range and dimming module neutral density range.
- the dimming module includes a monochrome electrochromic cell, and wherein limiting the amount of ambient light that passes through the dimming module in response to the dimming value comprises: applying an amount of current to the monochrome electrochromic cell in response to the dimming value.
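The dimming-value determination described in the method claims above can be sketched in code. The following Python sketch is illustrative only: the function name, the heuristic weighting, and the default ranges are assumptions, not taken from the patent. It mirrors the stated priorities: a user preference value overrides the sensed inputs, a movie-type application forces minimum transmission, and otherwise the dimming value is determined from the ambient light value, display brightness value, brightness range and neutral density range.

```python
def determine_dimming_value(ambient_light, display_brightness,
                            brightness_range=(0.0, 8000.0),
                            nd_range=(0.04, 0.84),
                            user_preference=None,
                            app_type=None):
    """Return a target ambient-light transmission fraction for the dimming module.

    Hypothetical heuristic: a user preference value, when present, overrides
    the sensed inputs; a movie-type application forces minimum transmission;
    otherwise transmission scales with display brightness relative to ambient
    light, clamped to the module's neutral density (ND) range.
    """
    nd_min, nd_max = nd_range
    if user_preference is not None:
        # User preference value takes priority over sensed values.
        return max(nd_min, min(nd_max, user_preference))
    if app_type == "movie":
        # Movie type: reduce ambient light passing through to a minimum.
        return nd_min
    _, brightness_max = brightness_range
    ratio = display_brightness / max(brightness_max, 1e-9)
    # Dim more in brighter rooms (illustrative weighting, not from the patent).
    ambient_factor = 1.0 / (1.0 + ambient_light / 1000.0)
    return nd_min + (nd_max - nd_min) * ratio * ambient_factor
```

A control loop could feed the returned fraction to the dimming module driver, which converts it to a drive current.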
- One or more embodiments include a computing system and a HMD having a near-eye display.
- the computing system provides an electronic signal representing image data.
- the HMD provides image data in response to the electronic signal.
- the HMD includes a NED device comprising a display engine to output the image data, a transmissive near- eye display to provide image light in response to the image data, a dimming module and a control circuit.
- the dimming module reduces an amount of ambient light that passes through the transmissive near-eye display in response to a current.
- the control circuit provides the current in response to a dimming value.
- a dimming module comprises a first and second transparent conductors and a monochrome electrochromic cell. Current is applied to the first and second transparent conductors from a control circuit.
- the apparatus further comprises an ambient light sensor to sense an ambient light value associated with the amount of ambient light.
- the computing system determines the dimming value in response to at least the ambient light value.
- the computing system provides the electrical signal in response to the computing system executing an application, wherein the application has a display brightness value. The computing system determines the dimming value in response to at least one of the ambient light value and the display brightness value.
- the near-eye transmissive display has a display brightness range and the dimming module has a dimming range.
- the computing system determines the dimming value in response to the ambient light value, the display brightness value, the display brightness range and the dimming range.
Abstract
The technology provides a dimming module (198) for a near-eye display, NED, device (14r, l) that controls an amount of ambient light (170) that passes through the transmissive near-eye display (112) to a user. The dimming module (198) includes at least one electrochromic cell (400) that enables variable density dimming so that the NED device (14r, l) may be used in an augmented reality (AR) and/or virtual reality (VR) application. The electrochromic cell (400) may be a monochrome electrochromic cell having stacked layers of a monochrome electrochromic compound layer (453a) and insulator (454a) sandwiched between a pair of transparent substrates (450a, b) and conductors (451a, b). A current may be applied to the conductor layers (451a, b) to control the amount of dimming in response to a dimming value. A NED device (14l, r) having a dimming module (198) may be included in a visor (8), or other type of head-mounted display, HMD (2). The dimming module (198) may be flat and supported by a flat waveguide mount (123) in the user's field of view.
Description
HEAD-MOUNTED DISPLAY WITH ELECTROCHROMIC DIMMING MODULE FOR AUGMENTED AND VIRTUAL REALITY PERCEPTION
BACKGROUND
[0001] A near-eye display (NED) device may be worn by a user for experiences such as an augmented reality (AR) experience and a virtual reality (VR) experience. A NED device may include a display module that may provide a computer-generated image (CGI), or other information, in a near-eye display of the NED device. In an AR experience, a near-eye display of a NED device may include optical see-through lens to allow a CGI to be superimposed on a real-world view of a user.
[0002] A NED device may be included in a head-mounted display (HMD). A HMD having a NED device may be in the form of a helmet, visor, glasses, and goggles or attached by one or more straps. HMDs may be used in at least aviation, engineering, science, medicine, computer gaming, video, sports, training, simulations and other applications.
SUMMARY
[0003] The technology provides embodiments of a dimming module for a near-eye display (NED) device that controls an amount of ambient light that passes through the transmissive (or optical see-through) near-eye display to a user. The dimming module includes at least one electrochromic cell that enables variable density dimming (or selectable dimming levels/ambient light transmission percentages) so that the NED device may be used in an augmented reality (AR) and/or virtual reality (VR) application. An AR application may prefer partial dimming of ambient light or a transmissive (clear) state, while a VR application may prefer a darkened (opaque) state. A NED device having a dimming module may be included in a visor, or other type of head-mounted display (HMD). A NED device may be disposed by a support structure of a HMD, such as a frame of a visor or glasses.
[0004] In an embodiment, the dimming module includes at least one monochrome electrochromic cell with stacked layers of material. The stacked layers include a monochrome electrochromic compound and an insulator sandwiched between a pair of transparent substrates and conductors. In an embodiment, the transparent substrates may be lenses of a HMD. An amount of current may be applied from a control circuit to the conductor layers to control an amount of dimming in response to a dimming value. The dimming value may be determined based upon an ambient light value from an ambient light sensor, a user preference value, a type of executing application, a near-eye display brightness value, a dimming module neutral density (ND) range (dimmable range) and a near-eye display brightness range, singly or in combination. In an embodiment, multiple monochrome electrochromic cells may be stacked and used to increase a dimmable range.
[0005] The technology provides one or more apparatus embodiments. In an embodiment, an apparatus comprises a support structure, a near-eye display to provide image light and a dimming module. The dimming module comprises: a first transparent substrate, a first transparent conductor, a monochrome electrochromic compound, an insulator, a second transparent conductor and a second transparent substrate. The dimming module controls light passing through the dimming module in response to a current applied between the first and second transparent conductors.
[0006] The technology also provides one or more embodiments of a method comprising sensing an ambient light value associated with an amount of ambient light received by a transmissive near-eye display. An application is executed to provide image light to the transmissive near-eye display. A near-eye display brightness value associated with the application is retrieved and a dimming value for a dimming module is determined in response to the ambient light value and the near-eye display brightness value. The amount of ambient light that passes through the dimming module is limited in response to the dimming value.
[0007] The technology also provides one or more embodiments including a computing system and a HMD having a near-eye display. The computing system provides an electronic signal representing image data. The HMD provides image data in response to the electronic signal. The HMD includes a NED device comprising a display engine to output the image data, a transmissive near-eye display to provide image light in response to the image data, a dimming module and a control circuit. The dimming module controls an amount of ambient light that passes through the dimming module in response to a current. The control circuit provides the current in response to a dimming value.
[0008] In an embodiment, a dimming module comprises first and second transparent conductors and a monochrome electrochromic cell. Current is applied to the first and second conductors from a control circuit.
[0009] This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] Figure 1 is a block diagram depicting example components of an embodiment of a near-eye display (NED) device system.
[0011] Figure 2 is a block diagram of example hardware components in control circuitry of a NED device.
[0012] Figure 3 is a top view of an embodiment of a near-eye display having a dimming module.
[0013] Figure 4 illustrates a single monochrome cell in a dimming module used in a near-eye display.
[0014] Figure 5 illustrates multiple monochrome cells in a dimming module used in a near-eye display.
[0015] Figure 6 is a Table I illustrating dimming ranges for a dimming module having one to four monochrome cells.
[0016] Figure 7 is a block diagram of an embodiment of a system from a software perspective for displaying image data by a NED device having a dimming module.
[0017] Figures 8A-D are flowcharts of embodiments of methods for operating a NED device and/or NED device system having a dimming module.
[0018] Figure 9 is a block diagram of one embodiment of a computing system that can be used to implement a network accessible computing system, a companion processing module or control circuitry of a NED device.
DETAILED DESCRIPTION
[0019] The technology provides embodiments of a dimming module for a NED device that controls an amount of ambient light that passes through the transmissive (or optical see-through) near-eye display to a user. The dimming module includes at least one electrochromic cell that enables variable density dimming (or selectable dimming levels/ambient light transmission percentages) so that the NED device may be used in an AR and/or VR application. An AR application may prefer partial dimming of ambient light or a transmissive (clear) state, while a VR application may prefer a darkened (opaque) state. A NED device having a dimming module may be included in a visor, or other type of HMD. A NED device may be disposed by a support structure of a HMD, such as a frame of a visor or glasses, or internal supporting structures, such as the structure supporting a NED display module waveguide.
[0020] A HMD, such as glasses, used for VR applications blocks out the wearer's ambient light environment, enabling the wearer to see the virtual content (or images) without impediment of ambient light from nearby lamps, windows, etc. However, in some AR applications it may be important for a user to see the ambient environment with augmented imagery. For example, in an AR application a user may wish to view image data, such as direction guidance, while walking down a hallway to a meeting.
[0021] In an embodiment, a HMD having a dimming module may be used for both AR and VR applications. A dimming module embodiment controls the amount of ambient light for the AR application while blocking the ambient light for the VR applications. In an embodiment, a dimming module may also have variable density levels because different HMDs may have different near-eye display brightness ranges for AR images due to different display technologies and battery capabilities. A dimming module embodiment may then regulate the ambient illuminance so users can see a range of tones in the augmented imagery in a HMD having a particular near-eye display brightness range. In an embodiment, a dimming module may have a dimming range from fully (or almost) dark to fully (or almost completely) transmissive.
[0022] In embodiments, a dimming module includes at least one monochrome electrochromic cell. The dimming module may be formed in a curved shape to wrap around a user's field of view (FOV). The dimming module may have relatively low power consumption so as to minimize the weight of batteries used in a HMD. Also, a dimming module may have a relatively quick response time, such as about 100 ms, and may not create excessive heat that may interfere with lightweight plastic support structures that may be used in HMDs.
[0023] In an embodiment, a monochrome electrochromic cell in the dimming module includes a plurality of materials formed in stacked layers. In embodiments, a dimming module includes a plurality of monochrome electrochromic cells oriented in stacked manner so as to increase a dimming range toward the darkened state. In an embodiment, a monochrome electrochromic cell has one or more particular materials that change color or opacity when a charge or current is applied. The charge causes ion migration from one area to another, causing a decided change in the material's visible characteristics. Electrochromic cells may have faster or slower switching speed than liquid crystal display (LCD) materials.
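Table I (Figure 6) lists dimming ranges for one to four stacked cells. Assuming, hypothetically, that each stacked cell attenuates independently so that transmissions multiply, the extended dimming range can be sketched as:

```python
def stacked_transmission(single_cell_transmission, n_cells):
    """Overall ambient-light transmission of n identical stacked cells,
    assuming each cell attenuates independently so transmissions multiply."""
    return single_cell_transmission ** n_cells

# e.g. a single cell at 40% transmission in its darkened state:
# two stacked cells pass 16%, three pass 6.4%, four pass about 2.6%.
for n in range(1, 5):
    print(n, round(stacked_transmission(0.4, n), 4))
```

The 40% single-cell figure is an invented example; the point is only that stacking deepens the achievable dark state, as paragraph [0023] describes.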
[0024] In an embodiment, a monochrome electrochromic cell has process temperatures that may not exceed about 120 °C, so as to be adaptable to lightweight plastic substrates. In an embodiment, a monochrome electrochromic cell does not include an active matrix thin-film transistor (TFT) backplane and/or reflective layer, such as titanium dioxide (TiO2).
[0025] Figure 1 is a block diagram depicting example components of an embodiment of a near-eye display (NED) device system 8 having a dimming module. In the illustrated embodiment, a NED device system 8 includes a near-eye display (NED) device in a head-mounted display (HMD) device 2 and companion processing module 4. HMD 2 is communicatively coupled to companion processing module 4. Wireless communication is illustrated in this example, but communication via a wire between companion processing module 4 and HMD 2 may also be implemented. In an embodiment, HMD 2 includes a NED device having a projection light engine 120 and near-eye display 14l, 14r (both shown in Figure 3) having a waveguide as described in detail herein.
[0026] In this embodiment, HMD 2 is in the shape of eyeglasses having a frame 115, with each display optical system 14l and 14r positioned at the front of the HMD 2 to be seen through by each eye when worn by a user. Each display optical system 14l and 14r is also referred to as a display or near-eye display 14, and the two display optical systems 14l and 14r together may also be referred to as a display or near-eye display 14.
[0027] In an embodiment, near-eye display 14 is a transmissive (or optical see-through) near-eye display in which a user may see through the near-eye display. In an embodiment, near-eye display 14 is curved so as to wrap around a user's visual FOV. In this embodiment, each display optical system 14l and 14r uses a projection display in which image data (or image light) is projected to generate a display of the image data so that the image data appears to the user at a location in a three dimensional FOV in front of the user. For example, a user may be playing a shoot-down-enemy-helicopter game in an optical see-through mode in his living room. An image of a helicopter appears to the user to be flying over a chair in his living room, not between optional see-through lenses 116 and 118, shown in Figure 3, as a user cannot focus on image data that close to the human eye.
[0028] In this embodiment, frame 115 provides a convenient eyeglass frame holding elements of the HMD 2 in place as well as a conduit for electrical connections. In an embodiment, frame 115 provides a support structure for a projection light engine 120 and a near-eye display 14 as described herein. Some other examples of NED device support structures are a helmet, visor frame, goggles support or one or more straps. The frame 115 includes a nose bridge 104, a front top cover section 117, a respective projection light engine housing 130 for each of a left side housing (130l) and a right side housing (130r) of HMD 2 as well as left and right temples or side arms 102l and 102r which are designed to rest on each of a user's ears. In this embodiment, nose bridge 104 includes a microphone 110 for recording sounds and transmitting audio data to control circuit 136. In an embodiment, an
ambient light sensor 257a is also disposed in nose bridge 104 to sense ambient light and provide an ambient light value 244c to memory 244 in control circuit 136 (as shown in Figure 2). On the exterior of the side housings 130l and 130r are respective outward facing cameras 113l and 113r which capture image data of the real environment in front of the user for mapping what is in a FOV of a near-eye display (NED) device.
[0029] In this embodiment, dashed lines 128 are illustrative examples of some electrical connection paths which connect to control circuit 136, also illustrated in dashed lines. One dashed electrical connection line is labeled 128 to avoid overcrowding the drawing. The electrical connections and control circuit 136 are in dashed lines to indicate they are under the front top cover section 117 in this example. There may also be other electrical connections (not shown) including extensions of a power bus in the side arms for other components, some examples of which are sensor units including additional cameras, audio output devices like earphones or units, and perhaps an additional processor and memory. Some examples of connectors 129 as screws are illustrated which may be used for connecting the various parts of the frame together.
[0030] The companion processing module 4 may take various embodiments. In some embodiments, companion processing module 4 is in a portable form which may be worn on the user's body, e.g. a wrist, or be a separate portable computing system like a mobile device (e.g. smartphone, tablet, laptop). The companion processing module 4 may communicate using a wire or wirelessly over one or more communication network(s) 50 to one or more network accessible computing system(s) 12, whether located nearby or at a remote location. In other embodiments, the functionality of the companion processing module 4 may be integrated in software and hardware components of HMD 2. Some examples of hardware components of the companion processing module 4 and network accessible computing system(s) 12 are shown in Figure 9.
[0031] One or more network accessible computing system(s) 12 may be leveraged for processing power and remote data access. The complexity and number of components may vary considerably for different embodiments of the network accessible computing system(s) 12 and the companion processing module 4. In an embodiment illustrated in Figure 1, a NED device system 1000 may include near-eye display (NED) device system 8 (with or without companion processing module 4), communication network(s) 50 and network accessible computing system(s) 12. In an embodiment, network accessible computing system(s) 12 may be located remotely or in a Cloud operating environment.
[0032] Image data is identified for display based on an application (e.g. a game, conferencing application, movie playing application, messaging application) executing on one or more processors in control circuit 136, companion processing module 4 and/or network accessible computing system(s) 12 (or a combination thereof) to provide image data to near-eye display 14.
[0033] Figure 2 is a block diagram of example hardware components including a computing system within a control circuit of a NED device. Control circuit 136 provides various electronics that support the other components of HMD 2. In this example, the control circuit 136 for a HMD 2 comprises a processing unit 210 and a memory 244 accessible to the processing unit 210 for storing processor readable instructions and data. A network communication module 137 is communicatively coupled to the processing unit 210 which can act as a network interface for connecting HMD 2 to another computing system such as the companion processing module 4, a computing system of another NED device or one which is remotely accessible over the Internet. A power supply 239 provides power for the components of the control circuit 136 and the other components of the HMD 2 like the capture devices 113, the microphone 110, ambient light sensor 257a, other sensor units, and for power drawing components for displaying image data on near-eye display 14 such as light sources and electronic circuitry associated with an image source like a microdisplay in a projection light engine 120.
[0034] The processing unit 210 may comprise one or more processors (or cores) such as a central processing unit (CPU) or core and a graphics processing unit (GPU) or core. In embodiments without a separate companion processing module 4, processing unit 210 may contain at least one GPU. Memory 244 is representative of the various types of memory which may be used by the system such as random access memory (RAM) for application use during execution, buffers for sensor data including captured image data, ambient light data and display data, read only memory (ROM) or Flash memory for instructions and system data, and other types of nonvolatile memory for storing applications and user profile data, for example.
[0035] In an embodiment, memory 244 includes processor readable information, such as digital data values or processor readable instructions. For example, memory 244 may include at least a portion of an application 244a that may be executing. Memory 244 may also include a user preference value 244b that indicates a user preference of a dimming value or dimming environment for HMD 2. A user may indicate their preference, as described in detail herein, by making a speech command to a natural language user interface (NUI), such
as: "Dim more" or "Dim less." In this embodiment, a stored user preference value 244b may be increased or decreased by a predetermined step size which then may result in the dimming module increasing or decreasing dimming. Memory 244 also includes an ambient light value 244c, received from ambient light sensor 257a, that indicates an amount of ambient light received by ambient light sensor 257a, or in particular an amount of ambient light received by an external curved surface 290 of near-eye display 14l (shown in Figure 3). Memory 244 may also include a near-eye display brightness range 244d and dimming module neutral density (ND) range 244e. Memory 244 may also include a dimming (software component) 244f to calculate or determine a dimming value based on one or more inputs, such as user preference value 244b, dimming module ND value 249a, display brightness range 244d and/or ambient light value 244c.
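The step-wise adjustment of the stored user preference value in response to a speech command might look like the following sketch. The step size, command strings and clamping range are illustrative assumptions, not values from the patent:

```python
DIM_STEP = 0.05  # hypothetical predetermined step size

def apply_voice_command(user_preference, command, nd_range=(0.04, 0.84)):
    """Adjust a stored user preference value (cf. 244b) by a fixed step in
    response to a NUI speech command, clamped to the dimming module's range."""
    nd_min, nd_max = nd_range
    if command == "Dim more":
        user_preference -= DIM_STEP  # pass less ambient light through
    elif command == "Dim less":
        user_preference += DIM_STEP  # pass more ambient light through
    return max(nd_min, min(nd_max, user_preference))
```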
[0036] In an embodiment, display brightness range 244d represents the capable brightness or luminous range of a near-eye display 14. For example, a near-eye display may have a brightness (or luminance) range from zero candelas per square meter (cd/m2) to 8000 cd/m2. In an embodiment, dimming module ND range 244e represents the range of selectable dimming or transmission rates (or percentages) of ambient light traveling through a dimming module. In an example, dimming module ND range 244e may range from allowing approximately 4% transmission of ambient light (dark state) to allowing approximately 84% transmission of ambient light (clear or transmissive state) through a dimming module or near-eye display having a dimming module.
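For reference, a transmission percentage such as those in the example range above can be restated as a conventional neutral-density value via ND = -log10(T). A short sketch (the function name is illustrative):

```python
import math

def transmission_to_nd(transmission):
    """Convert an ambient-light transmission fraction to a neutral-density
    value using the standard definition ND = -log10(T)."""
    return -math.log10(transmission)

# The example range above: ~84% transmission (clear state) is about ND 0.08,
# while ~4% transmission (dark state) is about ND 1.4.
```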
[0037] In alternate embodiments, digital data values and/or processor readable instructions illustrated in Figure 2, such as in memory 244, may be stored and/or executed in alternate locations in a NED device system 1000.
[0038] Figure 2 illustrates an electrical connection of a data bus 270 that connects sensor units 257, display driver 246, processing unit 210, memory 244, and network communication module 137. Data bus 270 also derives power from power supply 239 through a power bus 272 to which all the illustrated elements of the control circuit are connected for drawing power.
[0039] In an embodiment, control circuit 136 may include circuits that are specific to particular types of image generation or near-eye display technologies. For example, control circuit 136 may include microdisplay circuitry 259 and display illumination driver 247 for a microdisplay described in detail below. In alternate embodiments, other types of control circuits may be used in control circuit 136 for different near-eye display technologies, such as a retina scanning display (RSD) that does not have a microdisplay in an embodiment. In
a RSD embodiment, a RSD would receive electronic signals from a display engine rather than a projection light engine. A RSD would then provide image light to a user's retina by scanning image light in response to image data from a display engine.
[0040] Control circuit 136, in at least a microdisplay embodiment, comprises a display driver 246 for receiving digital control data (e.g. control bits) to guide reproduction of image data that may be decoded by microdisplay circuitry 259, display illumination driver 247 and dimming module driver 249. The display illumination driver 247 drives an illumination unit 222 (in a projection light engine) in response to a display brightness value 247a in an embodiment. A projection light engine may include a microdisplay that may be an active transmissive, emissive or reflective device. For example, a microdisplay may be a liquid crystal on silicon (LCoS) device requiring power or a micromechanical machine (MEMs) based device requiring power to move individual mirrors. In an embodiment, a display illumination driver 247 converts digital control data to analog signals for driving an illumination unit 222 which includes one or more light sources, such as one or more lasers or light emitting diodes (LEDs). In an embodiment, a display brightness value 247a is stored in display driver 246. In an alternate embodiment, display brightness value 247a may be stored in memory 244. In an embodiment, display brightness value 247a represents the current display brightness of the near-eye display 14. In an embodiment, a near-eye display has a brightness that corresponds to display brightness value 247a. In an embodiment, one or more drivers provide one or more analog signals to a near-eye display to provide a corresponding display brightness in response to a display brightness value 247a.
[0041] In an embodiment, control circuit 136 includes a dimming module driver 249 to drive dimming module 198 (shown in Figure 3) in response to display driver 246. In an embodiment, display driver 246 stores a dimming module ND value 249a. In an alternate embodiment, dimming module ND value 249a is stored in memory 244. In an embodiment, dimming module driver 249 provides a predetermined amount of current for a predetermined amount of time to dimming module 198 in response to a dimming module ND value 249a or user preference value 244b. In an embodiment, dimming module driver 249 includes a table to store digital values associated with a predetermined amount of current to apply to dimming module 198 in response to corresponding digital values in dimming module ND value 249a.
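The table-driven driver described in paragraph [0041] can be sketched as follows. The specific ND codes, currents and durations are invented for illustration and are not from the patent:

```python
# Hypothetical driver table: dimming module ND value (digital code) maps to a
# (drive current in mA, drive duration in ms) pair. All values are invented
# for illustration; the patent does not specify them.
DRIVE_TABLE = {
    0: (0.0, 0),    # clear (transmissive) state: no drive applied
    1: (1.0, 50),
    2: (2.0, 80),
    3: (3.0, 100),  # darkest state
}

def drive_dimming_module(nd_code):
    """Return the (current, duration) the dimming module driver (cf. 249)
    would apply for a given dimming module ND value (cf. 249a)."""
    try:
        return DRIVE_TABLE[nd_code]
    except KeyError:
        raise ValueError(f"unsupported ND code: {nd_code}")
```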
[0042] The control circuit 136 may include other control units not illustrated here but related to other functions of a HMD 2 such as providing audio output, identifying head orientation and location information.
[0043] Figure 3 is a top view of an embodiment of a near-eye display 14l coupled with a projection light engine 120 having an external exit pupil 121. In order to show the components of the display optical system 14, in this case 14l for the left eye, a portion of the top frame section 117 covering the near-eye display 14l and the projection light engine 120 is not depicted. Arrow 142 represents an optical axis of the near-eye display 14l.
[0044] In an embodiment illustrated in Figure 3, the near-eye displays 14l (and 14r) are optical see-through displays. In other embodiments, they can be video see-through displays. Each display includes a display unit 112 illustrated between two optional see-through lenses 116 and 118 and including a waveguide 123. The optional see-through lenses 116 and 118 serve as protective coverings and/or to form an image for the viewer at a comfortable visual distance, such as 1 m ahead, for the display unit. One or both of them may also be used to implement a user's eyeglass prescription. In this example, eye space 140 approximates a location of a user's eye when HMD 2 is worn. The waveguide directs image data in the form of image light from a projection light engine 120 towards a user eye space 140 while also allowing ambient light 170 from the real world to pass through towards a user's eye space, thereby allowing a user to have an actual direct view of the space in front of HMD 2 in addition to seeing an image of a virtual feature from the projection light engine 120.
[0045] In this top view, the projection light engine 120 includes a birdbath optical element 234 illustrated as a curved surface. The curved surface provides optical power to the beams 235 of image light it reflects, thus collimating them as well. Only one beam is labeled to prevent overcrowding the drawing. In some embodiments, the radius of curvature of the birdbath optical element is at least -38 millimeters (mm).
[0046] In some embodiments, a waveguide 123 may be a diffractive waveguide. Additionally, in some examples, a waveguide 123 is a surface relief grating (SRG) waveguide. An input grating 119 couples an image light from a projection light engine 120. Additionally, a waveguide has a number of exit gratings 125 for an image light to exit the waveguide in the direction of a user eye space 140. One exit grating 125 is labeled to avoid overcrowding the drawing.
[0047] In the illustrated embodiment of Figure 3, the projection light engine 120 in a left side housing 130l includes an image source, for example a microdisplay, which produces the image light, and a projection optical system which folds an optical path of the image light to form an exit pupil 121 external to the projection light engine 120. The shape of the projection light engine 120 is an illustrative example adapting to the shape of the example of left side housing 130l, which conforms around a corner of the frame 115 in Figure 1, reducing bulkiness. The shape may be varied to accommodate different arrangements of the projection light engine 120, for example due to different image source technologies implemented.
[0048] There are different image generation technologies that can be used to implement an image source. For example, a microdisplay can be implemented using a transmissive projection technology. In one example of such technology, a light source is modulated by optically active material: the material is usually implemented using transmissive LCD type microdisplays with powerful backlights and high optical energy densities. Other microdisplays use a reflective technology for which light from an illumination unit is reflected and modulated by an optically active material. The illumination may be a white source or RGB source, depending on the technology. Digital light processing (DLP), digital micromirror device (DMD), and LCoS are all examples of reflective technologies which are efficient as most energy is reflected away from the modulated structure and may be used by the display. Additionally, a microdisplay can be self-emitting, such as a color-emitting organic light emitting diode (OLED) microdisplay or an array of LEDs. LED arrays may be created conventionally on gallium nitride (GaN) substrates with a phosphor layer for spectral conversion or other color conversion method. In an embodiment, a self-emissive display is relayed and magnified for a viewer.
[0049] Figure 3 shows half of a HMD 2. For the illustrated embodiment, a full HMD 2 may include another display optical system 14 with another set of optional see-through lenses 116 and 118, another waveguide 123, as well as another projection light engine 120, and another of the outward facing capture devices 113. In some embodiments, there may be a continuous display viewed by both eyes, rather than a display optical system for each eye. In some embodiments, a single projection light engine 120 may be optically coupled to a continuous display viewed by both eyes or be optically coupled to separate displays for the eyes. Additional details of a head mounted personal A/V apparatus are illustrated in United States Patent Application Serial No. 12/905952 entitled Fusing Virtual Content Into Real Content, filed October 15, 2010.
[0050] Figure 4 illustrates a monochrome electrochromic cell 400 used in a dimming module 198 according to an embodiment. In an embodiment, a dimming module 198 includes a single or a plurality of monochrome electrochromic cells. In embodiments, monochrome electrochromic cell 400 includes additional materials as well.
[0051] Monochrome electrochromic cell 400 includes a stacked plurality of layered substrates. In an embodiment, the stacked plurality of layered substrates are at least partially
contiguous. In an alternate embodiment, one or more other materials are interspersed between the depicted layers. Substrates 450a-b are a clear or transparent substrate layer in an embodiment. In embodiments, substrates 450a-b are an outer surface of dimming module 198 and are a moldable transparent material, such as poly(methyl methacrylate) (PMMA), polycarbonate, glass or similar material. In an embodiment, substrates 450a-b correspond to lenses 116 and 118 shown in Figure 3. In an embodiment, substrates 451a-b are transparent conductive layers to conduct current. In an embodiment, substrates 451a-b include a material selected from a group consisting of Indium Tin Oxide, metal mesh, silver nanowires, carbon nanotubes, graphene and PEDOT:PSS (Poly(3,4-ethylenedioxythiophene) polystyrene sulfonate). In an embodiment, substrates 451a-b are coupled to a current source, such as dimming module driver 249, to receive a predetermined amount of current for a predetermined amount of time. In an embodiment, substrate 453a is a monochrome electrochromic compound layer. In an embodiment, substrate 454a is an insulator, such as silicon dioxide (SiO2), and is disposed between substrate 453a and substrate 451b. In an embodiment, a substrate including an electrolyte is included in monochrome electrochromic cell 400.
[0052] In an embodiment, a monochrome electrochromic cell 400 has one or more particular materials that change color or opacity when a charge or current is applied. The charge causes ion migration from one area to another, causing a decided change in the material's visible characteristics.
[0053] In embodiments, monochrome electrochromic cell 400 is manufactured or processed using a variety of process steps. In an embodiment, one or more substrates are formed using one or more spin coating process steps. In an alternate embodiment, some substrates may be formed by a spin coating process step, while other substrates may be assembled or affixed to each other.
[0054] Figure 5 illustrates a multiple monochrome cell embodiment in a dimming module used in a near-eye display. In an embodiment, a dimming module 198 includes a plurality of monochrome electrochromic cells 500. Multiple monochrome cells may be used to increase a dimmable range toward a darkened state as illustrated in Figure 6. In an embodiment, multiple monochrome cells may be used in an application when a relatively high darkened state is desirable and a relatively lower transmissive state is satisfactory.
[0055] Table I of Figure 6 illustrates that a single monochrome cell may have a dimmable range of between approximately 4% (darkened state, 4% of ambient light is allowed to pass) to approximately 83.7% (transmissive state, 83.7% of ambient light is
allowed to pass); a two monochrome cell may have a dimmable range of between approximately 0.2% (darkened state) to approximately 70% (transmissive state); a three monochrome cell may have a dimmable range of between approximately 0.0064% (darkened state) to approximately 58.6% (transmissive state); and a four monochrome cell may have a dimmable range of between approximately 0.0003% (darkened state) to approximately 49.0% (transmissive state).
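The ranges in Table I follow from multiplying the single-cell transmissions: a stack of N identical cells passes roughly T^N of the ambient light in each state. A short check of that reading (assuming ideal, identical cells; the two-cell darkened value computes to about 0.16%, which Table I rounds to approximately 0.2%):

```python
DARK_T = 0.04    # single cell, darkened state: ~4% of ambient light passes
CLEAR_T = 0.837  # single cell, transmissive state: ~83.7% passes

def stack_range(n_cells: int) -> tuple[float, float]:
    """(darkened, transmissive) fraction of ambient light passed by a stack."""
    return DARK_T ** n_cells, CLEAR_T ** n_cells

for n in range(1, 5):
    dark, clear = stack_range(n)
    print(f"{n} cell(s): {dark:.6%} darkened, {clear:.1%} transmissive")
```

This also illustrates the trade-off stated above: each added cell deepens the darkened state by a factor of ~25 but also lowers the best-case transmissive state by ~16%.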
[0056] Figure 5 illustrates a four monochrome cell embodiment or a plurality of monochrome electrochromic cells 500. In an embodiment, a plurality of monochrome electrochromic cells 500 include substrates 450a-b, 451a-b, 453a and 454a as similarly described herein regarding a single monochrome cell 400 shown in Figure 4. For example, substrates 450a-b are clear or transparent substrate layers in an embodiment. In embodiments, substrates 450a-b are an outer surface of dimming module 198 and are a moldable transparent material, such as poly(methyl methacrylate) (PMMA), polycarbonate, glass or similar material.
[0057] In addition, a plurality of monochrome electrochromic cells 500 includes additional stacked substrates 451c-e that may also be transparent conductor substrates coupled to a controllable current source as described herein. Additional substrate layers 453b-d and 454b-d may be respective monochrome electrochromic compound layers and insulator layers, as similarly described with regard to Figure 4.
[0058] In an embodiment, the stacked plurality of layered substrates in a plurality of monochrome electrochromic cells 500 are at least partially contiguous. In an alternate embodiment, one or more other materials are interspersed between the depicted layers.
[0059] A plurality of monochrome electrochromic cells 500, similar to a single monochrome cell 400, may be manufactured or processed using a variety of process steps. In an embodiment, one or more substrates are formed using one or more spin coating process steps. In an alternate embodiment, some substrates may be formed by a spin coating process step, while other substrates may be assembled or affixed to each other.
[0060] Figure 7 is a block diagram of an embodiment of a system from a software perspective for displaying image data or light (such as a CGI) by a near-eye display device. Figure 7 illustrates an embodiment of a computing environment 54 from a software perspective which may be implemented by a system like NED system 8, network accessible computing system(s) 12 in communication with one or more NED systems or a combination thereof. Additionally, a NED system can communicate with other NED systems for sharing data and processing resources.
[0061] As described herein, an executing application determines which image data is to be displayed, some examples of which are text, emails, virtual books or game related images. In this embodiment, an application 162 may be executing on one or more processors of the NED system 8 and communicating with an operating system 190 and an image and audio processing engine 191. In the illustrated embodiment, a network accessible computing system(s) 12 may also be executing a version 162N of the application as well as other NED systems 8 with which it is in communication for enhancing the experience.
[0062] Application 162 includes a game in an embodiment. The game may be stored on a remote server and purchased from a console, computer, or smartphone in embodiments. The game may be executed in whole or in part on the server, console, computer, smartphone or on any combination thereof. Multiple users might interact with the game using standard controllers, computers, smartphones, or companion devices and use air gestures, touch, voice, or buttons to communicate with the game in embodiments.
[0063] Application(s) data 329 for one or more applications may also be stored in one or more network accessible locations. Some examples of application(s) data 329 may be one or more rule data stores for rules linking action responses to user input data, rules for determining which image data to display responsive to user input data, reference data for natural user input like for one or more gestures associated with the application which may be registered with a gesture recognition engine 193, execution criteria for the one or more gestures, voice user input commands which may be registered with a sound recognition engine 194, physics models for virtual objects associated with the application which may be registered with an optional physics engine (not shown) of the image and audio processing engine 191, and object properties like color, shape, facial features, clothing, etc. of the virtual objects and virtual imagery in a scene.
[0064] As shown in Figure 7, the software components of a computing environment 54 comprise the image and audio processing engine 191 in communication with an operating system 190. The illustrated embodiment of an image and audio processing engine 191 includes an object recognition engine 192, gesture recognition engine 193, display data engine 195, a sound recognition engine 194, and a scene mapping engine 306. The individual engines and data stores provide a supporting platform of data and tasks which an application(s) 162 can leverage for implementing its one or more functions by sending requests identifying data for processing and receiving notification of data updates. The operating system 190 facilitates communication between the various engines and applications. The operating system 190 makes available to applications which objects have
been identified by the object recognition engine 192, gestures the gesture recognition engine 193 has identified, which words or sounds the sound recognition engine 194 has identified, and the positions of objects, real and virtual from the scene mapping engine 306.
[0065] The computing environment 54 also stores data in image and audio data buffer(s) 199 which provide memory for image data and audio data which may be captured or received from various sources as well as memory space for image data to be displayed. The buffers may exist on both the NED, e.g. as part of the overall memory 244, and may also exist on the companion processing module 4.
[0066] In many applications, virtual data (or a virtual image) is to be displayed in relation to a real object in the real environment. The object recognition engine 192 of the image and audio processing engine 191 detects and identifies real objects, their orientation, and their position in a display FOV based on captured image data and captured depth data from outward facing image capture devices 113 if available or determined depth positions from stereopsis based on the image data of the real environment captured by the capture devices 113. The object recognition engine 192 distinguishes real objects from each other by marking object boundaries, for example using edge detection, and comparing the object boundaries with structure data 200. Besides identifying the type of object, an orientation of an identified object may be detected based on the comparison with stored structure data 200. Accessible over one or more communication network(s) 50, structure data 200 may store structural information such as structural patterns for comparison and image data as references for pattern recognition. Reference image data and structural patterns may also be available in user profile data 197 stored locally or accessible in Cloud based storage.
[0067] The scene mapping engine 306 tracks the three dimensional (3D) position, orientation, and movement of real and virtual objects in a 3D mapping of the display FOV. Image data is to be displayed in a user's FOV or in a 3D mapping of a volumetric space about the user based on communications with the object recognition engine 192 and one or more executing application(s) 162 causing image data to be displayed.
[0068] An application(s) 162 identifies a target 3D space position in the 3D mapping of the display FOV for an object represented by image data and controlled by the application. For example, the helicopter shoot down application identifies changes in the position and object properties of the helicopters based on the user's actions to shoot down the virtual helicopters. The display data engine 195 performs translation, rotation, and scaling operations for display of the image data at the correct size and perspective. The display data engine 195 relates the target 3D space position in the display FOV to display coordinates of
the display unit 112. For example, the display data engine may store image data for each separately addressable display location or area (e.g. a pixel) in a Z-buffer and a separate color buffer. The display driver 246 translates the image data for each display area to digital control data instructions for microdisplay circuitry 259 or the display illumination driver 247 or both for controlling display of image data by the image source.
[0069] The technology described herein may be embodied in other specific forms or environments without departing from the spirit or essential characteristics thereof. Likewise, the particular naming and division of modules, engines, routines, applications, features, attributes, methodologies and other aspects are not mandatory, and the mechanisms that implement the technology or its features may have different names, divisions and/or formats.
[0070] The technology described herein may be embodied in a variety of operating environments. For example, NED system 8 and/or network accessible computing system(s) 12 may be included in an Internet of Things (IoT) embodiment. The IoT embodiment may include a network of devices that may have the ability to capture information via sensors. Further, such devices may be able to track, interpret, and communicate collected information. These devices may act in accordance with user preferences and privacy settings to transmit information and work in cooperation with other devices. Information may be communicated directly among individual devices or via a network such as a local area network (LAN), wide area network (WAN), a "cloud" of interconnected LANs or WANs, or across the entire Internet. These devices may be integrated into computers, appliances, smartphones, wearable devices, implantable devices, vehicles (e.g., automobiles, airplanes, and trains), toys, buildings, and other objects.
[0071] The technology described herein may also be embodied in a Big Data or Cloud operating environment as well. In a Cloud operating environment, information including data, images, engines, operating systems, and/or applications described herein may be accessed from a remote storage device via the Internet. In an embodiment, a modular rented private cloud may be used to access information remotely. In a Big Data operating embodiment, data sets have sizes beyond the ability of typically used software tools to capture, create, manage, and process the data within a tolerable elapsed time. In an embodiment, image data may be stored remotely in a Big Data operating embodiment.
[0072] There are a variety of ways or embodiments a NED device having a dimming module may operate with different applications. For example, a user may, but not limited to: 1) play an AR video game; 2) watch a movie; 3) teleconference; and/or 4) view messages
or documents. In embodiments, the following examples may include methods of operating a NED device that may include one or more steps illustrated in Figures 8A-D described in detail herein.
[0073] In a first example, a user is playing an AR game using a NED device. In this example, an ambient light sensor from a HMD having a NED device, such as ambient light sensor 257a, senses an ambient light environment of 50 lux. This is an ambient light level that this particular NED device having a dimming module can match. Accordingly, for this particular AR game being played on this particular NED device, no dimming is needed. The display brightness level is set to match AR content to the same visual brightness as the user's environment and a dimming module of the NED device is set to the highest transmission state.
[0074] In an embodiment using hardware components of Figure 2, the 50 lux value in the above example is stored as ambient light value 244c in memory 244 and application 224a corresponds to the AR game. Dimming 244f executed by processing unit 210 compares this 50 lux value with display brightness range 244d in memory 244. Dimming 244f then determines that dimming module ND value 249a may be set to a full transmission state value, or to the full transmission state value in dimming module (ND) range 244e. A display brightness value 247a, in display driver 246, is also set by the AR game.
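The decision in this first example can be sketched as a comparison of the sensed ambient level against the brightness the display can match, converting illuminance to an approximate Lambertian luminance by dividing by pi (as in the calculation of the third example below). The function name and the sketch itself are illustrative assumptions.

```python
import math

def needs_dimming(ambient_lux: float, display_max_cdm2: float) -> bool:
    """True when the ambient light level exceeds what the display can match.

    Illuminance (lux) is converted to an approximate Lambertian luminance
    (cd/m^2) by dividing by pi.
    """
    return (ambient_lux / math.pi) > display_max_cdm2

print(needs_dimming(50, 800))    # the 50 lux AR game example: no dimming needed
print(needs_dimming(3000, 800))  # bright ambient: dimming required
```

When no dimming is needed, the dimming module ND value would be set to the full transmission state value, as described above.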
[0075] In a second example, a user is playing a movie in an airplane using a NED device having a dimming module. For a movie application, or a movie player application, it is desirable to block out the user's environment and provide them with a clean background for the movie. In this example, a dimming module in a NED device is set to a fully darkened state or lowest transmission state.
[0076] In an embodiment using the hardware components of Figure 2, dimming 244f executed by processing unit 210 would determine a type of application to be executed, such as a movie player application, and then set a dimming module ND value 249a to a full darkened state value or at the full darkened state value in dimming module (ND) range 244e. In an embodiment, a movie player application corresponds to application 244a and provides its type to dimming 244f.
[0077] In a third example, a user is teleconferencing with another person on a NED device having a dimming module while also viewing a CGI. The user wants to be able to see the person they are speaking to along with the CGI or document during the teleconference. The user has set an ambient light user preference for the teleconferencing application such that the user may easily also see their environment as well.
[0078] An ambient light sensor, such as ambient light sensor 257a, senses an ambient light environment of 3000 lux and stores a corresponding value in ambient light value 244c. This particular NED display cannot generate a brightness level greater than 800 cd/m2 (as determined by display brightness range 244d in an embodiment), so the dimming module is used to filter the environment to a brightness level that the displayed information can match. A processor, such as processing unit 210, makes the following calculation to determine a dimming value (such as dimming module ND value 249a) after comparing the ambient light value to a display brightness range:
[0079] Dimming value (%) = display brightness highest range value / (ambient light value / π); for example, 83.7% = 800 cd/m2 / (3000 / 3.14159).
[0080] The calculated or determined dimming value, 83.7%, is within the density range of the dimming module (or compared with dimming module ND range 244e), so the desired dimming level is converted to a dimming module drive level (or dimming module ND value 249a in an embodiment). When the dimming level is beyond the dimming module capability (or beyond dimming module ND range 244e), the dimming level is set to the closest level within capabilities. The dimming module drive level (or dimming module ND value 249a in an embodiment) is then determined by mapping the desired density result through the dimming module's drive vs. density characteristic function. In an embodiment, this calculation may be done with a look-up table in display driver 246. Accordingly, the dimming module and the display brightness may both be set to their desired states.
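Assuming the formula and ranges above, the calculation plus the clamp-to-capability step can be sketched as follows. Note that the raw quotient 800 / (3000 / π) evaluates to about 83.8%, which the clamp limits to the single-cell module's 83.7% transmissive maximum, matching the value in the example.

```python
import math

def dimming_value(ambient_lux: float, display_max_cdm2: float,
                  nd_min: float, nd_max: float) -> float:
    """Fraction of ambient light to pass, clamped to the dimming module's
    neutral density (ND) range when the desired level is beyond capability."""
    desired = display_max_cdm2 / (ambient_lux / math.pi)
    return min(max(desired, nd_min), nd_max)

# Third example: 3000 lux ambient, an 800 cd/m2 display, and a single-cell
# module with a 4% to 83.7% dimmable range.
value = dimming_value(3000, 800, nd_min=0.04, nd_max=0.837)
print(f"{value:.1%}")  # 83.7%
```

The resulting fraction would then be mapped through the module's drive-versus-density characteristic function (or a look-up table) to obtain the actual drive level.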
[0081] In a fourth example, an office worker has abandoned their multiple desktop monitors for a virtual monitor application provided by a NED device having a dimming module. The user's preference is for the NED device to provide partial dimming, so that the user can see people who stop by, yet allow the virtual monitor to dominate the user's visual attention rather than the items on the user's wall or what is going on out the window. As described herein, an ambient light sensor may sense an ambient light value of 400 lux. A processor, as described herein, may then calculate or determine a dimming value (or dimming module ND value 249a in an embodiment) based on the ambient light value and user preference value for the virtual monitor application. For example, a user preference value and dimming module may cause the ambient light for this particular NED device to be reduced or limited by 50% when using the virtual monitor application with the above ambient light.
[0082] Figures 8A-D are flowcharts of embodiment of methods for operating a NED device having a dimming module and/or system having a dimming module. The steps illustrated in Figures 8A-D may be performed by hardware components, software
components and a user, singly or in combination. For illustrative purposes, the method embodiments below are described in the context of the system and apparatus embodiments described above. However, the method embodiments are not limited to operating in the system embodiments described herein and may be implemented in other system embodiments. Furthermore, the method embodiments may be continuously performed while the NED system is in operation and an applicable application is executing.
[0083] Step 801 of method 800, shown in Figure 8A, begins by sensing an ambient light value associated with an amount of ambient light received by a transmissive near-eye display. In an embodiment, ambient light sensor 257a senses ambient light 170 shown in Figures 1 and 2.
[0084] Step 802 illustrates storing the ambient light value. In an embodiment, ambient light value 244c is stored in memory 244 as illustrated in Figure 2. In an alternate embodiment, step 802 may be omitted and the ambient light value is not stored.
[0085] Step 803 illustrates executing an application to provide image light to the transmissive near-eye display. In an embodiment, application 244a is executed by at least processing unit 210 as illustrated in Figure 2.
[0086] Step 804 illustrates retrieving a display brightness value associated with the application. In an embodiment, a display brightness value may be obtained from application 244a or from display brightness value 247a in display driver 246.
[0087] Step 805 illustrates determining a dimming value for a dimming module in response to the ambient light value and the display brightness value. In embodiments, a software component, such as dimming 244f executed by processing unit 210, determines a dimming value for dimming module 198 and stores the value as dimming module ND value 249a in display driver 246.
[0088] Step 806 illustrates limiting the amount of ambient light that passes through the dimming module (and then through a transmissive near-eye display) in response to the dimming value. In an embodiment, dimming module driver 249 applies a predetermined amount of current for a predetermined amount of time to a dimming module 198, in particular to transparent conductors of a one or more monochrome electrochromic cells in a dimming module 198. The current then causes an electrochromic reaction that limits a predetermined amount of light passing through dimming module 198 in response to dimming module ND value 249a.
[0089] In an embodiment, method 850 of Figure 8B may replace steps 801-805 of method 800. Step 851 illustrates receiving a value that indicates a user preference as to the
amount of ambient light that passes through the dimming module. In an embodiment, a user may provide a user preference by way of a user or natural language user interface as described herein. For example, a user may make a speech command, such as "dim 50%."
[0090] Step 852 illustrates storing the value that indicates the user preference. In an embodiment, user preference value 244b is stored in memory 244 as illustrated in Figure 2. In an embodiment, a natural language or 3D user interface software component receives a spoken or gestured user preference and translates the spoken or gestured preference to a digital value representing that preference as user preference value 244b. In an alternate embodiment, step 852 may be omitted and the user preference may not be stored.
[0091] Step 853 illustrates determining the dimming value based on the value that indicates a user preference. In an embodiment, dimming 244f executed by processing unit 210 makes this determination in response to user preference value 244b.
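Steps 851-853 can be sketched as parsing a spoken command into a stored digital preference value that then determines the dimming value. The "dim 50%" grammar follows the example above; the helper names, and the choice to interpret the percentage as the fraction of ambient light to block, are assumptions.

```python
import re

def parse_dim_command(utterance: str):
    """Steps 851/852: translate a spoken preference such as "dim 50%" into a
    digital value (fraction of ambient light to block), or None if no match."""
    match = re.search(r"dim\s+(\d+)\s*%", utterance.lower())
    return None if match is None else int(match.group(1)) / 100.0

def dimming_value_from_preference(preference: float) -> float:
    """Step 853: the fraction of ambient light allowed to pass."""
    return 1.0 - preference

pref = parse_dim_command("Dim 50%")
print(dimming_value_from_preference(pref))  # 0.5
```

In a full system the parsed value would be stored as user preference value 244b and consumed by dimming 244f.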
[0092] In an embodiment, method 860 shown in Figure 8C may replace steps 801-805 of method 800. Step 861 illustrates determining a type of the application. In an embodiment, an application 244a executed by processing unit 210 provides the type of application to an operating system or dimming 244f. In an alternate embodiment, dimming 244f executed by processing unit 210 queries application 244a for a type.
[0093] Step 862 illustrates determining the dimming value based on the type of application. In an embodiment, dimming 244f executed by processing unit 210 sets a dimming value in response to a type of application that is executing or will be executing. For example, dimming 244f sets dimming module ND value 249a to a value corresponding to a dark state when a movie player is executing and may set dimming module ND value 249a to correspond to a transmissive state for a messaging application.
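A minimal sketch of this type-to-value mapping; the application type strings, the ND fractions, and the default are illustrative assumptions, not values specified by the patent.

```python
# Fraction of ambient light to pass for each application type (illustrative).
APP_TYPE_DIMMING = {
    "movie_player": 0.04,   # darkened state: block out the environment
    "messaging":    0.837,  # transmissive state: keep the environment visible
}

def dimming_for_app(app_type: str, default: float = 0.837) -> float:
    """Step 862: choose a dimming value from the executing application's type."""
    return APP_TYPE_DIMMING.get(app_type, default)

print(dimming_for_app("movie_player"))  # 0.04
```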
[0094] In an embodiment, method 870 shown in Figure 8D is performed in combination with method 800 in Figure 8A. Step 871 illustrates retrieving a display brightness range. In an embodiment, dimming 244f executed by processing unit 210 retrieves a display brightness range from display brightness range 244d in memory 244.
[0095] Step 872 illustrates retrieving a dimming module neutral density (ND) range. In an embodiment, dimming 244f executed by processing unit 210 retrieves a dimming module neutral density (ND) range 244e stored in memory 244.
[0096] Step 873 illustrates determining the dimming value in response to the ambient light value, display brightness value, display brightness range and dimming module neutral density range. In an embodiment, dimming 244f executed by processing unit 210 performs at least part of step 873. In an embodiment, the dimming value is determined based on the
ambient light value, display brightness value, display brightness range and dimming module ND range, singly or in combination thereof, as described herein.
[0097] Figure 9 is a block diagram of one embodiment of an exemplary computing system 900 that can be used to implement a network accessible computing system(s) 12, a companion processing module 4, or another embodiment of control circuit 136 of a HMD 2. Computing system 900 may host at least some of the software components of computing environment 54. In an embodiment, computing system 900 may be embodied in a Cloud server, server, client, peer, desktop computer, laptop computer, hand-held processing device, tablet, smartphone and/or wearable computing/processing device.
[0098] In its most basic configuration, computing system 900 typically includes one or more processing units (or cores) 902, such as one or more central processing units (CPUs) and one or more graphics processing units (GPUs). Computing system 900 also includes memory 904. Depending on the exact configuration and type of computer system, memory 904 may include volatile memory 905 (such as RAM), non-volatile memory 907 (such as ROM, flash memory, etc.) or some combination thereof. This most basic configuration is illustrated in Figure 9 by dashed line 906.
[0099] Additionally, computing system 900 may also have additional features/functionality. For example, computing system 900 may also include additional storage (removable and/or non-removable) including, but not limited to, magnetic or optical disks or tape. Such additional storage is illustrated in Figure 9 by removable storage 908 and non-removable storage 910.
[00100] Alternatively, or in addition to processing unit(s) 902, the functionality described herein can be performed or executed, at least in part, by one or more other hardware logic components. For example, and without limitation, illustrative types of hardware logic components that can be used include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (ASICs), Application-specific Standard Products (ASSPs), System-on-a-chip systems (SOCs), Complex Programmable Logic Devices (CPLDs) and other similar hardware logic components.
[00101] Computing system 900 may also contain communication module(s) 912 including one or more network interfaces and transceivers that allow the device to communicate with other computer systems. Computing system 900 may also have input device(s) 914 such as keyboard, mouse, pen, microphone, touch input device, gesture recognition device, facial recognition device, tracking device or similar input device. Output device(s) 916 such as a display, speaker, printer, or similar output device may also be included.
[00102] A user interface (UI) software component to interface with a user may be stored in and executed by computing system 900. In an embodiment, computing system 900 stores and executes a NUI and/or 3D UI. Examples of NUI methods include speech recognition, touch and stylus recognition, gesture recognition both on screen and adjacent to the screen, air gestures, head and eye tracking, voice and speech, vision, touch, hover, gestures, and machine intelligence. Specific categories of NUI technologies include, for example, touch sensitive displays, voice and speech recognition, intention and goal understanding, motion gesture detection using depth cameras (such as stereoscopic or time-of-flight camera systems, infrared camera systems, RGB camera systems and combinations of these), motion gesture detection using accelerometers/gyroscopes, facial recognition, 3D displays, head, eye, and gaze tracking, immersive augmented reality and virtual reality systems, all of which may provide a more natural interface, as well as technologies for sensing brain activity using electric field sensing electrodes (EEG and related methods).
[00103] In an embodiment, a UI (including a NUI or 3D UI) software component may be at least partially executed and/or stored on a computing system 900. In an alternate embodiment, a UI may be at least partially executed and/or stored on server and sent to a client. The UI may be generated as part of a service, and it may be integrated with other services, such as social networking services.
[00104] The example computing systems illustrated in the figures include examples of computer readable storage devices. A computer readable storage device is also a processor readable storage device. Such devices may include volatile and nonvolatile, removable and non-removable memory devices implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Some examples of processor or computer readable storage devices are RAM, ROM, EEPROM, cache, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, memory sticks or cards, magnetic cassettes, magnetic tape, a media drive, a hard disk, magnetic disk storage or other magnetic storage devices, or any other device which can be used to store the information and which can be accessed by a computing system.
Aspects of Certain Embodiments
[00105] One or more embodiments include an apparatus comprising a support structure, a near-eye display to provide image light and a dimming module. The dimming module comprises: a first transparent substrate, a first transparent conductor, a monochrome electrochromic compound, an insulator, a second transparent conductor and a second
transparent substrate. The dimming module controls light passing through the dimming module in response to a current applied between the first and second transparent conductors.
[00106] In an embodiment, the monochrome electrochromic compound and insulator are disposed at least partially between the first transparent conductor and the second transparent conductor.
[00107] In another embodiment, the insulator is disposed at least partially adjacent the monochrome electrochromic compound and the first transparent conductor is disposed at least partially adjacent the monochrome electrochromic compound.
[00108] In still another embodiment, the first transparent conductor is disposed at least partially between the first transparent substrate and the monochrome electrochromic compound. The second transparent conductor is disposed at least partially between the insulator and the second transparent substrate.
[00109] In an embodiment, the dimming module is disposed adjacent the near-eye display.
[00110] In an embodiment, the dimming module includes a third transparent conductor, another monochrome electrochromic compound, another insulator and a fourth transparent conductor.
[00111] In embodiments, the first and second transparent substrates include a material selected from a group consisting of poly(methyl methacrylate) (PMMA), polycarbonate and glass. The first and second transparent conductors include a material selected from a group consisting of Indium Tin Oxide, metal mesh, silver nanowires, carbon nanotubes, graphene and PEDOT:PSS (Poly(3,4-ethylenedioxythiophene) polystyrene sulfonate) in embodiments. The insulator includes silicon dioxide (SiO2) in an embodiment.
[00112] One or more embodiments of a method comprise sensing an ambient light value associated with an amount of ambient light received by a transmissive near-eye display. An application executes to provide image light to the transmissive near-eye display. A near-eye display brightness value associated with the application is retrieved and a dimming value for a dimming module is determined in response to the ambient light value and the near-eye display brightness value. The amount of ambient light that passes through the dimming module is limited in response to the dimming value.
[00113] In another method embodiment, the method comprises receiving a value that indicates a user preference as to the amount of ambient light that passes through the dimming module. Determining the dimming value is then based on the value that indicates the user preference rather than in response to the ambient light value and near-eye display brightness value.
[00114] In an embodiment, the method further comprises determining a type of the application and wherein determining the dimming value is based on the type of the application rather than in response to the ambient light value and the near-eye display brightness value.
[00115] In an embodiment, the type of the application includes a movie type, and wherein determining the dimming value includes setting the dimming value so that the ambient light that passes through the dimming module is reduced to a minimum.
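The override behavior described in paragraphs [00113]–[00115] can be sketched as follows; the precedence order (user preference first, then application type, then the ambient-light-based value) and all names are assumptions for illustration:

```python
MOVIE = "movie"

def select_dimming_value(ambient_based_value, nd_range,
                         user_preference=None, app_type=None):
    """Choose the dimming value, honoring user and application overrides.

    ambient_based_value: dimming value computed from ambient light and
        display brightness
    nd_range: (min, max) neutral density the dimming module can provide
    user_preference: optional user-chosen dimming value
    app_type: optional application type (e.g. MOVIE)
    """
    nd_min, nd_max = nd_range
    if user_preference is not None:
        # A user preference replaces the ambient-light-based computation.
        return user_preference
    if app_type == MOVIE:
        # For a movie-type application, drive dimming to its maximum so
        # the ambient light passing through the module is at a minimum.
        return nd_max
    return ambient_based_value
```

For example, a running movie application would receive maximum dimming even in dim surroundings, while an explicit user preference would take effect regardless of the sensed ambient light.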
[00116] In an embodiment, determining the dimming value in response to the ambient light value and the near-eye display brightness value comprises: retrieving a near-eye display brightness range and retrieving a dimming module neutral density range. Determining the dimming value is in response to the ambient light value, near-eye display brightness value, near-eye display brightness range and dimming module neutral density range.
[00117] In an embodiment, the dimming module includes a monochrome electrochromic cell, and wherein limiting the amount of ambient light that passes through the dimming module in response to the dimming value comprises: applying an amount of current to the monochrome electrochromic cell in response to the dimming value.
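Applying current in response to the dimming value can be sketched as a simple transfer function; the linear mapping and the 5 mA full-scale value are illustrative assumptions, as a real electrochromic cell has a device-specific transfer curve and settling behavior:

```python
def dimming_value_to_current(dimming_value, max_current_ma=5.0):
    """Convert a normalized dimming value in [0, 1] into a drive current (mA)
    for the monochrome electrochromic cell.

    The clamp keeps the control circuit from driving the cell outside
    its assumed operating range.
    """
    clamped = max(0.0, min(1.0, dimming_value))
    # Linear mapping: full dimming draws the full-scale current.
    return clamped * max_current_ma
```

A control circuit embodiment would feed the returned current to the first and second transparent conductors, darkening the electrochromic compound in proportion to the dimming value.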
[00118] One or more embodiments include a computing system and an HMD having a near-eye display. The computing system provides an electronic signal representing image data. The HMD provides image data in response to the electronic signal. The HMD includes a NED device comprising a display engine to output the image data, a transmissive near-eye display to provide image light in response to the image data, a dimming module and a control circuit. The dimming module reduces an amount of ambient light that passes through the transmissive near-eye display in response to a current. The control circuit provides the current in response to a dimming value.
[00119] In an embodiment, a dimming module comprises first and second transparent conductors and a monochrome electrochromic cell. Current is applied to the first and second transparent conductors from a control circuit.
[00120] In an embodiment, the apparatus further comprises an ambient light sensor to sense an ambient light value associated with the amount of ambient light. The computing system determines the dimming value in response to at least the ambient light value.
[00121] In an embodiment, the computing system provides the electronic signal in response to the computing system executing an application, wherein the application has a display brightness value. The computing system determines the dimming value in response to at least one of the ambient light value and the display brightness value.
[00122] In an embodiment, the near-eye transmissive display has a display brightness range and the dimming module has a dimming range. The computing system determines the dimming value in response to the ambient light value, the display brightness value, the display brightness range and the dimming range.
[00123] Embodiments described in the previous paragraphs may also be combined with one or more of the specifically disclosed alternatives.
[00124] Although the subject matter has been described in language specific to structural features and/or acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as examples of implementing the claims and other equivalent features and acts that would be recognized by one skilled in the art are intended to be within the scope of the claims.
Claims
1. An apparatus comprising:
a support structure;
a near-eye display, disposed on the support structure, to provide image light; and
a dimming module comprising:
a first transparent conductor,
a monochrome electrochromic compound,
an insulator, and
a second transparent conductor,
wherein the dimming module controls light passing through the dimming module in response to a current applied between the first and second transparent conductors.
2. The apparatus of claim 1, wherein the dimming module further comprises:
a first transparent substrate;
a second transparent substrate; and
wherein the insulator and monochrome electrochromic compound are disposed at least partially between the first transparent conductor and the second transparent conductor.
3. The apparatus of claim 2, wherein the first transparent conductor is disposed at least partially adjacent the monochrome electrochromic compound, and
wherein the insulator is disposed at least partially adjacent the monochrome electrochromic compound.
4. The apparatus of claim 3, wherein the first transparent conductor is disposed at least partially between the first transparent substrate and the electrochromic compound, and
wherein the second transparent conductor is disposed at least partially between the insulator and the second transparent substrate.
5. The apparatus of claim 4, wherein the dimming module is disposed adjacent the near-eye display,
wherein the near-eye display provides image light in response to a signal from a display engine.
6. The apparatus of claim 2, wherein the dimming module comprises:
a third transparent conductor;
another monochrome electrochromic compound;
another insulator; and
a fourth transparent conductor.
7. The apparatus of claim 2, wherein the first and second transparent substrates include a material selected from the group consisting of poly(methyl methacrylate), polycarbonate and glass.
8. The apparatus of claim 7, wherein the first and second transparent conductors include a material selected from a group consisting of Indium Tin Oxide, metal mesh, silver nanowires, carbon nanotubes, graphene and Poly(3,4-ethylenedioxythiophene) polystyrene sulfonate.
9. The apparatus of claim 8, wherein the insulator includes silicon dioxide.
10. A method comprising:
sensing an ambient light value associated with an amount of ambient light received by a transmissive near-eye display;
executing an application to provide image light to the transmissive near-eye display;
retrieving a near-eye display brightness value associated with the application;
determining a dimming value for a dimming module in response to the ambient light value and the near-eye display brightness value; and
limiting the amount of ambient light that passes through the dimming module in response to the dimming value.
11. The method of claim 10, comprising:
receiving a value that indicates a user preference as to the amount of ambient light that passes through the dimming module,
wherein determining the dimming value is based on the value that indicates the user preference rather than in response to the ambient light value and near-eye display brightness value.
12. The method of claim 10, comprising:
determining a type of the application and wherein determining the dimming value is based on the type of the application rather than in response to the ambient light value and the near-eye display brightness value.
13. The method of claim 12, wherein the type of the application includes a movie type, and wherein determining the dimming value includes setting the dimming value so that the ambient light that passes through the dimming module is reduced to a minimum.
14. The method of claim 10, wherein determining the dimming value in response to the ambient light value and the near-eye display brightness value comprises:
retrieving a near-eye display brightness range; and
retrieving a dimming module neutral density range,
wherein determining the dimming value is in response to the ambient light value, near-eye display brightness value, near-eye display brightness range and dimming module neutral density range.
15. The method of claim 10, wherein the dimming module includes a monochrome electrochromic cell, and wherein limiting the amount of ambient light that passes through the dimming module in response to the dimming value comprises:
applying an amount of current to the monochrome electrochromic cell in response to the dimming value.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020177007770A KR102373940B1 (en) | 2014-08-21 | 2015-08-19 | Head-mounted display with electrochromic dimming module for augmented and virtual reality perception |
EP15756522.7A EP3183615A1 (en) | 2014-08-21 | 2015-08-19 | Head-mounted display with electrochromic dimming module for augmented and virtual reality perception |
CN201580045001.0A CN106662747B (en) | 2014-08-21 | 2015-08-19 | Head-mounted display with electrochromic dimming module for augmented reality and virtual reality perception |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/465,000 US9626936B2 (en) | 2014-08-21 | 2014-08-21 | Dimming module for augmented and virtual reality |
US14/465,000 | 2014-08-21 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016028828A1 (en) | 2016-02-25 |
Family
ID=54011899
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2015/045779 WO2016028828A1 (en) | 2014-08-21 | 2015-08-19 | Head-mounted display with electrochromic dimming module for augmented and virtual reality perception |
Country Status (5)
Country | Link |
---|---|
US (1) | US9626936B2 (en) |
EP (1) | EP3183615A1 (en) |
KR (1) | KR102373940B1 (en) |
CN (1) | CN106662747B (en) |
WO (1) | WO2016028828A1 (en) |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9861446B2 (en) | 2016-03-12 | 2018-01-09 | Philipp K. Lang | Devices and methods for surgery |
US10194131B2 (en) | 2014-12-30 | 2019-01-29 | Onpoint Medical, Inc. | Augmented reality guidance for spinal surgery and spinal procedures |
US10294415B2 (en) | 2014-06-09 | 2019-05-21 | iGlass Technology, Inc. | Electrochromic composition and electrochromic device using same |
US10344208B2 (en) | 2014-06-09 | 2019-07-09 | iGlass Technology, Inc. | Electrochromic device and method for manufacturing electrochromic device |
CN110286538A (en) * | 2019-06-28 | 2019-09-27 | Oppo广东移动通信有限公司 | Display methods, display device, head-mounted display apparatus and storage medium |
US11048089B2 (en) | 2016-04-05 | 2021-06-29 | Ostendo Technologies, Inc. | Augmented/virtual reality near-eye displays with edge imaging lens comprising a plurality of display devices |
US11348257B2 (en) | 2018-01-29 | 2022-05-31 | Philipp K. Lang | Augmented reality guidance for orthopedic and other surgical procedures |
US11553969B1 (en) | 2019-02-14 | 2023-01-17 | Onpoint Medical, Inc. | System for computation of object coordinates accounting for movement of a surgical site for spinal and other procedures |
US11751944B2 (en) | 2017-01-16 | 2023-09-12 | Philipp K. Lang | Optical guidance for surgical, medical, and dental procedures |
US11786206B2 (en) | 2021-03-10 | 2023-10-17 | Onpoint Medical, Inc. | Augmented reality guidance for imaging systems |
US11801114B2 (en) | 2017-09-11 | 2023-10-31 | Philipp K. Lang | Augmented reality display for vascular and other interventions, compensation for cardiac and respiratory motion |
US11857378B1 (en) | 2019-02-14 | 2024-01-02 | Onpoint Medical, Inc. | Systems for adjusting and tracking head mounted displays during surgery including with surgical helmets |
US12053247B1 (en) | 2020-12-04 | 2024-08-06 | Onpoint Medical, Inc. | System for multi-directional tracking of head mounted displays for real-time augmented reality guidance of surgical procedures |
US12055720B2 (en) | 2019-02-28 | 2024-08-06 | Sony Group Corporation | Head-mounted display and glasses |
US12127795B2 (en) | 2024-04-08 | 2024-10-29 | Philipp K. Lang | Augmented reality display for spinal rod shaping and placement |
Families Citing this family (89)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11726332B2 (en) | 2009-04-27 | 2023-08-15 | Digilens Inc. | Diffractive projection apparatus |
US9933684B2 (en) * | 2012-11-16 | 2018-04-03 | Rockwell Collins, Inc. | Transparent waveguide display providing upper and lower fields of view having a specific light output aperture configuration |
KR20160036763A (en) * | 2014-09-25 | 2016-04-05 | 한국전자통신연구원 | Apparatus and method for augmented perception |
CN107873086B (en) | 2015-01-12 | 2020-03-20 | 迪吉伦斯公司 | Environmentally isolated waveguide display |
US9632226B2 (en) | 2015-02-12 | 2017-04-25 | Digilens Inc. | Waveguide grating device |
US11016302B2 (en) * | 2015-03-17 | 2021-05-25 | Raytrx, Llc | Wearable image manipulation and control system with high resolution micro-displays and dynamic opacity augmentation in augmented reality glasses |
US11461936B2 (en) | 2015-03-17 | 2022-10-04 | Raytrx, Llc | Wearable image manipulation and control system with micro-displays and augmentation of vision and sensing in augmented reality glasses |
SG11201708481TA (en) * | 2015-04-15 | 2017-11-29 | Razer (Asia-Pacific) Pte Ltd | Filtering devices and filtering methods |
CN104765152B (en) * | 2015-05-06 | 2017-10-24 | 京东方科技集团股份有限公司 | A kind of virtual reality glasses |
JP2017044768A (en) * | 2015-08-25 | 2017-03-02 | 株式会社ジャパンディスプレイ | Display device and head-mounted display device |
KR102404648B1 (en) * | 2015-09-21 | 2022-05-31 | 엘지디스플레이 주식회사 | Display device |
US10690916B2 (en) | 2015-10-05 | 2020-06-23 | Digilens Inc. | Apparatus for providing waveguide displays with two-dimensional pupil expansion |
JP2017228942A (en) * | 2016-06-22 | 2017-12-28 | 富士通株式会社 | Head-mounted display, transmission control program, and transmission control method |
CN106293067B (en) * | 2016-07-27 | 2019-09-27 | 上海与德通讯技术有限公司 | A kind of display changeover method and wearable display equipment |
CN107783289B (en) * | 2016-08-30 | 2024-06-04 | 北京亮亮视野科技有限公司 | Multimode head-mounted visual device |
US10032314B2 (en) | 2016-10-11 | 2018-07-24 | Microsoft Technology Licensing, Llc | Virtual reality headset |
CA3046662A1 (en) | 2016-12-22 | 2018-06-28 | Magic Leap, Inc. | Systems and methods for manipulating light from ambient light sources |
US10209520B2 (en) | 2016-12-30 | 2019-02-19 | Microsoft Technology Licensing, Llc | Near eye display multi-component dimming system |
US10545346B2 (en) | 2017-01-05 | 2020-01-28 | Digilens Inc. | Wearable heads up displays |
US10534185B1 (en) | 2017-02-14 | 2020-01-14 | Facebook Technologies, Llc | Multi-planar display with waveguide and lens stacks |
US10690919B1 (en) | 2017-02-17 | 2020-06-23 | Facebook Technologies, Llc | Superluminous LED array for waveguide display |
US10185393B2 (en) * | 2017-04-03 | 2019-01-22 | Facebook Technologies, Llc | Waveguide display with spatially switchable grating |
KR102700054B1 (en) | 2017-05-31 | 2024-08-28 | 나이키 이노베이트 씨.브이. | Sport chair with game integration |
CN107217959B (en) * | 2017-06-05 | 2019-01-18 | 哈尔滨工业大学 | A kind of Intelligent Dynamic Color tunable section Low-E glass |
US10338400B2 (en) | 2017-07-03 | 2019-07-02 | Holovisions LLC | Augmented reality eyewear with VAPE or wear technology |
US10859834B2 (en) | 2017-07-03 | 2020-12-08 | Holovisions | Space-efficient optical structures for wide field-of-view augmented reality (AR) eyewear |
US10895746B1 (en) | 2017-08-07 | 2021-01-19 | Facebook Technologies, Llc | Expanding field-of-view in direct projection augmented reality and virtual reality systems |
US10534209B1 (en) * | 2017-08-21 | 2020-01-14 | Facebook Technologies, Llc | Liquid crystal structure for controlling brightness uniformity in a waveguide display |
WO2019046334A1 (en) | 2017-08-29 | 2019-03-07 | Pcms Holdings, Inc. | System and method of compensating for real world illumination changes in augmented reality |
US10930709B2 (en) | 2017-10-03 | 2021-02-23 | Lockheed Martin Corporation | Stacked transparent pixel structures for image sensors |
CN111465887A (en) | 2017-10-11 | 2020-07-28 | 奇跃公司 | Augmented reality display including eyepiece with transparent emissive display |
CN111566723A (en) | 2017-10-26 | 2020-08-21 | 奇跃公司 | Broadband adaptive lens assembly for augmented reality displays |
US10510812B2 (en) | 2017-11-09 | 2019-12-17 | Lockheed Martin Corporation | Display-integrated infrared emitter and sensor structures |
CA3088116A1 (en) * | 2018-01-17 | 2019-07-25 | Magic Leap, Inc. | Display systems and methods for determining registration between a display and a user's eyes |
IL275822B2 (en) * | 2018-01-17 | 2024-02-01 | Magic Leap Inc | Eye center of rotation determination, depth plane selection, and render camera positioning in display systems |
US10690910B2 (en) | 2018-02-07 | 2020-06-23 | Lockheed Martin Corporation | Plenoptic cellular vision correction |
US10129984B1 (en) | 2018-02-07 | 2018-11-13 | Lockheed Martin Corporation | Three-dimensional electronics distribution by geodesic faceting |
US10951883B2 (en) | 2018-02-07 | 2021-03-16 | Lockheed Martin Corporation | Distributed multi-screen array for high density display |
US11616941B2 (en) | 2018-02-07 | 2023-03-28 | Lockheed Martin Corporation | Direct camera-to-display system |
US10838250B2 (en) | 2018-02-07 | 2020-11-17 | Lockheed Martin Corporation | Display assemblies with electronically emulated transparency |
US10594951B2 (en) | 2018-02-07 | 2020-03-17 | Lockheed Martin Corporation | Distributed multi-aperture camera array |
US10979699B2 (en) | 2018-02-07 | 2021-04-13 | Lockheed Martin Corporation | Plenoptic cellular imaging system |
US10652529B2 (en) | 2018-02-07 | 2020-05-12 | Lockheed Martin Corporation | In-layer Signal processing |
US10613332B1 (en) | 2018-02-15 | 2020-04-07 | Facebook Technologies, Llc | Near-eye display assembly with enhanced display resolution |
JP7064371B2 (en) * | 2018-04-17 | 2022-05-10 | 株式会社Nttドコモ | Dimming control method for content display system, server device, display terminal and content display system |
JP2021522552A (en) * | 2018-04-24 | 2021-08-30 | メンター アクイジション ワン, エルエルシー | See-through computer display system with visual correction and increased content density |
US10845600B2 (en) | 2018-04-24 | 2020-11-24 | Samsung Electronics Co., Ltd. | Controllable modifiable shader layer for head mountable display |
US10747309B2 (en) | 2018-05-10 | 2020-08-18 | Microsoft Technology Licensing, Llc | Reconfigurable optics for switching between near-to-eye display modes |
US11567336B2 (en) | 2018-07-24 | 2023-01-31 | Magic Leap, Inc. | Display systems and methods for determining registration between display and eyes of user |
CN108965857A (en) * | 2018-08-09 | 2018-12-07 | 张家港康得新光电材料有限公司 | A kind of stereo display method and device, wearable stereoscopic display |
JP2021536592A (en) | 2018-08-31 | 2021-12-27 | マジック リープ, インコーポレイテッドMagic Leap, Inc. | Spatically decomposed dynamic dimming for augmented reality devices |
US11030973B2 (en) * | 2018-09-12 | 2021-06-08 | Google Llc | Wearable heads-up displays with ambient light detection and adjustable display brightness |
US10997948B2 (en) | 2018-09-21 | 2021-05-04 | Apple Inc. | Electronic device with adaptive lighting system |
DE102018218987A1 (en) * | 2018-11-07 | 2020-05-07 | Robert Bosch Gmbh | Spectacle lens for data glasses, data glasses and method for operating a spectacle lens or data glasses |
US10866413B2 (en) | 2018-12-03 | 2020-12-15 | Lockheed Martin Corporation | Eccentric incident luminance pupil tracking |
WO2020131962A1 (en) * | 2018-12-21 | 2020-06-25 | Magic Leap, Inc. | Eyepiece architecture incorporating artifact mitigation |
EP3908876A4 (en) | 2019-01-11 | 2022-03-09 | Magic Leap, Inc. | Time-multiplexed display of virtual content at various depths |
US11448918B2 (en) | 2019-01-30 | 2022-09-20 | Samsung Electronics Co., Ltd. | Grating device, screen including the grating device, method of manufacturing the screen and display apparatus for augmented reality and/or virtual reality including the screen |
CN110007485B (en) * | 2019-03-12 | 2020-08-25 | 联想(北京)有限公司 | Glasses and image processing method |
US10698201B1 (en) | 2019-04-02 | 2020-06-30 | Lockheed Martin Corporation | Plenoptic cellular axis redirection |
US11366321B1 (en) | 2019-04-23 | 2022-06-21 | Apple Inc. | Predictive dimming of optical passthrough displays |
US11474377B2 (en) * | 2019-04-24 | 2022-10-18 | Google Llc | Combiner lens and method of making thereof |
US11287655B2 (en) | 2019-06-21 | 2022-03-29 | Samsung Electronics Co., Ltd. | Holographic display apparatus and method for providing expanded viewing window |
CN110346937A (en) * | 2019-07-23 | 2019-10-18 | 业成科技(成都)有限公司 | Rear-mounted virtual reality of wearing shows equipment |
KR20220054386A (en) | 2019-08-29 | 2022-05-02 | 디지렌즈 인코포레이티드. | Vacuum Bragg grating and manufacturing method thereof |
WO2021046242A1 (en) | 2019-09-05 | 2021-03-11 | Dolby Laboratories Licensing Corporation | Viewer synchronized illumination sensing |
US10888037B1 (en) * | 2019-09-23 | 2021-01-05 | Microsoft Technology Licensing, Llc | Anti-fogging HMD utilizing device waste heat |
WO2021138607A1 (en) * | 2020-01-03 | 2021-07-08 | Digilens Inc. | Modular waveguide displays and related applications |
CN111090172A (en) * | 2020-01-09 | 2020-05-01 | 深圳珑璟光电技术有限公司 | Near-to-eye display system and device for adjusting transparency by using electrochromic material |
CN111077679A (en) * | 2020-01-23 | 2020-04-28 | 福州贝园网络科技有限公司 | Intelligent glasses display and imaging method thereof |
WO2021168449A1 (en) | 2020-02-21 | 2021-08-26 | Raytrx, Llc | All-digital multi-option 3d surgery visualization system and control |
US11243400B1 (en) * | 2020-07-17 | 2022-02-08 | Rockwell Collins, Inc. | Space suit helmet having waveguide display |
WO2022071963A1 (en) * | 2020-10-02 | 2022-04-07 | Hewlett-Packard Development Company, L.P. | User identification via extended reality image capture |
JP2022086237A (en) * | 2020-11-30 | 2022-06-09 | セイコーエプソン株式会社 | Virtual image display device |
KR20220078093A (en) * | 2020-12-03 | 2022-06-10 | 삼성전자주식회사 | Wearable electronic device including light emitting unit |
US11454816B1 (en) | 2020-12-07 | 2022-09-27 | Snap Inc. | Segmented illumination display |
KR20220085620A (en) | 2020-12-15 | 2022-06-22 | 삼성전자주식회사 | waveguide type display apparatus |
CN112630981A (en) * | 2021-03-08 | 2021-04-09 | 宁波圻亿科技有限公司 | Wearable device |
CN113282141A (en) * | 2021-05-31 | 2021-08-20 | 华北水利水电大学 | Wearable portable computer and teaching platform based on mix virtual reality |
KR20230053215A (en) * | 2021-10-14 | 2023-04-21 | 삼성전자주식회사 | Wearable electronic device adjusting the transmittance of a visor and the brightness of a display |
KR20230053414A (en) * | 2021-10-14 | 2023-04-21 | 삼성전자주식회사 | Wearable electronic device including sensor module |
EP4369076A1 (en) | 2021-10-14 | 2024-05-15 | Samsung Electronics Co., Ltd. | Wearable electronic device comprising sensor module |
US11817022B2 (en) * | 2021-11-30 | 2023-11-14 | Meta Platforms Technologies, Llc | Correcting artifacts in tiled display assemblies for artificial reality headsets |
US20230204958A1 (en) * | 2021-12-28 | 2023-06-29 | David Fliszar | Eyewear electronic tinting lens with integrated waveguide |
CN114280789B (en) * | 2021-12-29 | 2024-02-27 | Oppo广东移动通信有限公司 | Near-eye display optical system and near-eye display optical apparatus |
US20240162200A1 (en) * | 2022-06-15 | 2024-05-16 | Lumileds Llc | Lamination of a light source having a low-density set of light-emitting elements |
WO2024025126A1 (en) * | 2022-07-26 | 2024-02-01 | 삼성전자 주식회사 | Wearable electronic device and operation method thereof |
EP4433862A1 (en) * | 2022-09-02 | 2024-09-25 | Google LLC | See-through display with varying thickness conductors |
CN118573833A (en) * | 2024-08-01 | 2024-08-30 | 雷鸟创新技术(深圳)有限公司 | Near-eye display device, display compensation method thereof, electronic equipment and storage medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS56146123A (en) * | 1980-04-15 | 1981-11-13 | Asahi Glass Co Ltd | Electro-optical panel |
DE202010013016U1 (en) * | 2010-11-30 | 2011-02-17 | Loewe Opta Gmbh | Glasses with controllable transparency |
US20120105473A1 (en) * | 2010-10-27 | 2012-05-03 | Avi Bar-Zeev | Low-latency fusing of virtual and real content |
JP2014021452A (en) * | 2012-07-23 | 2014-02-03 | Ricoh Co Ltd | Electrochromic device and manufacturing method of the same |
US20140111838A1 (en) * | 2012-10-24 | 2014-04-24 | Samsung Electronics Co., Ltd. | Method for providing virtual image to user in head-mounted display device, machine-readable storage medium, and head-mounted display device |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7850304B2 (en) * | 2001-01-23 | 2010-12-14 | Kenneth Martin Jacobs | Continuous adjustable 3Deeps filter spectacles for optimized 3Deeps stereoscopic viewing and its control method and means |
JP5226528B2 (en) | 2005-11-21 | 2013-07-03 | マイクロビジョン,インク. | Display having an image guiding substrate |
EP2426552A1 (en) * | 2006-03-03 | 2012-03-07 | Gentex Corporation | Electro-optic elements incorporating improved thin-film coatings |
US8456410B2 (en) * | 2006-12-12 | 2013-06-04 | Intersil Americas Inc. | Backlight control using light sensors with infrared suppression |
FR2950982B1 (en) | 2009-10-06 | 2017-05-19 | Thales Sa | Vision equipment comprising an optical plate with a controlled light transmission coefficient |
US8477425B2 (en) | 2010-02-28 | 2013-07-02 | Osterhout Group, Inc. | See-through near-eye display glasses including a partially reflective, partially transmitting optical element |
US20120050044A1 (en) * | 2010-08-25 | 2012-03-01 | Border John N | Head-mounted display with biological state detection |
US8780014B2 (en) | 2010-08-25 | 2014-07-15 | Eastman Kodak Company | Switchable head-mounted display |
US9087471B2 (en) | 2011-11-04 | 2015-07-21 | Google Inc. | Adaptive brightness control of head mounted display |
JP6003903B2 (en) * | 2012-01-24 | 2016-10-05 | ソニー株式会社 | Display device |
US20130286053A1 (en) | 2012-04-25 | 2013-10-31 | Rod G. Fleck | Direct view augmented reality eyeglass-type display |
US20130328925A1 (en) | 2012-06-12 | 2013-12-12 | Stephen G. Latta | Object focus in a mixed reality environment |
US9430055B2 (en) | 2012-06-15 | 2016-08-30 | Microsoft Technology Licensing, Llc | Depth of field control for see-thru display |
US9398844B2 (en) | 2012-06-18 | 2016-07-26 | Microsoft Technology Licensing, Llc | Color vision deficit correction |
US9720231B2 (en) * | 2012-09-26 | 2017-08-01 | Dolby Laboratories Licensing Corporation | Display, imaging system and controller for eyewear display device |
US9311718B2 (en) * | 2014-01-23 | 2016-04-12 | Microsoft Technology Licensing, Llc | Automated content scrolling |
US9766459B2 (en) | 2014-04-25 | 2017-09-19 | Microsoft Technology Licensing, Llc | Display devices with dimming panels |
- 2014
  - 2014-08-21 US US14/465,000 patent/US9626936B2/en active Active
- 2015
  - 2015-08-19 CN CN201580045001.0A patent/CN106662747B/en active Active
  - 2015-08-19 KR KR1020177007770A patent/KR102373940B1/en active IP Right Grant
  - 2015-08-19 WO PCT/US2015/045779 patent/WO2016028828A1/en active Application Filing
  - 2015-08-19 EP EP15756522.7A patent/EP3183615A1/en not_active Ceased
Cited By (53)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11698565B2 (en) | 2014-06-09 | 2023-07-11 | Vitro Flat Glass Llc | Electrochromic device and method for manufacturing electrochromic device |
US10698285B2 (en) | 2014-06-09 | 2020-06-30 | iGlass Technology, Inc. | Electrochromic device and method for manufacturing electrochromic device |
US10344208B2 (en) | 2014-06-09 | 2019-07-09 | iGlass Technology, Inc. | Electrochromic device and method for manufacturing electrochromic device |
US10294415B2 (en) | 2014-06-09 | 2019-05-21 | iGlass Technology, Inc. | Electrochromic composition and electrochromic device using same |
US11483532B2 (en) | 2014-12-30 | 2022-10-25 | Onpoint Medical, Inc. | Augmented reality guidance system for spinal surgery using inertial measurement units |
US10594998B1 (en) | 2014-12-30 | 2020-03-17 | Onpoint Medical, Inc. | Augmented reality guidance for spinal procedures using stereoscopic optical see-through head mounted displays and surface representations |
US10194131B2 (en) | 2014-12-30 | 2019-01-29 | Onpoint Medical, Inc. | Augmented reality guidance for spinal surgery and spinal procedures |
US10326975B2 (en) | 2014-12-30 | 2019-06-18 | Onpoint Medical, Inc. | Augmented reality guidance for spinal surgery and spinal procedures |
US12010285B2 (en) | 2014-12-30 | 2024-06-11 | Onpoint Medical, Inc. | Augmented reality guidance for spinal surgery with stereoscopic displays |
US11652971B2 (en) | 2014-12-30 | 2023-05-16 | Onpoint Medical, Inc. | Image-guided surgery with surface reconstruction and augmented reality visualization |
US10951872B2 (en) | 2014-12-30 | 2021-03-16 | Onpoint Medical, Inc. | Augmented reality guidance for spinal procedures using stereoscopic optical see-through head mounted displays with real time visualization of tracked instruments |
US11350072B1 (en) | 2014-12-30 | 2022-05-31 | Onpoint Medical, Inc. | Augmented reality guidance for bone removal and osteotomies in spinal surgery including deformity correction |
US10511822B2 (en) | 2014-12-30 | 2019-12-17 | Onpoint Medical, Inc. | Augmented reality visualization and guidance for spinal procedures |
US11750788B1 (en) | 2014-12-30 | 2023-09-05 | Onpoint Medical, Inc. | Augmented reality guidance for spinal surgery with stereoscopic display of images and tracked instruments |
US10602114B2 (en) | 2014-12-30 | 2020-03-24 | Onpoint Medical, Inc. | Augmented reality guidance for spinal surgery and spinal procedures using stereoscopic optical see-through head mounted displays and inertial measurement units |
US11272151B2 (en) | 2014-12-30 | 2022-03-08 | Onpoint Medical, Inc. | Augmented reality guidance for spinal surgery with display of structures at risk for lesion or damage by penetrating instruments or devices |
US12063338B2 (en) | 2014-12-30 | 2024-08-13 | Onpoint Medical, Inc. | Augmented reality guidance for spinal surgery with stereoscopic displays and magnified views |
US10742949B2 (en) | 2014-12-30 | 2020-08-11 | Onpoint Medical, Inc. | Augmented reality guidance for spinal procedures using stereoscopic optical see-through head mounted displays and tracking of instruments and devices |
US11153549B2 (en) | 2014-12-30 | 2021-10-19 | Onpoint Medical, Inc. | Augmented reality guidance for spinal surgery |
US11050990B2 (en) | 2014-12-30 | 2021-06-29 | Onpoint Medical, Inc. | Augmented reality guidance for spinal procedures using stereoscopic optical see-through head mounted displays with cameras and 3D scanners |
US10841556B2 (en) | 2014-12-30 | 2020-11-17 | Onpoint Medical, Inc. | Augmented reality guidance for spinal procedures using stereoscopic optical see-through head mounted displays with display of virtual surgical guides |
US11172990B2 (en) | 2016-03-12 | 2021-11-16 | Philipp K. Lang | Systems for augmented reality guidance for aligning physical tools and instruments for arthroplasty component placement, including robotics |
US10368947B2 (en) | 2016-03-12 | 2019-08-06 | Philipp K. Lang | Augmented reality guidance systems for superimposing virtual implant components onto the physical joint of a patient |
US11013560B2 (en) | 2016-03-12 | 2021-05-25 | Philipp K. Lang | Systems for augmented reality guidance for pinning, drilling, reaming, milling, bone cuts or bone resections including robotics |
US10799296B2 (en) | 2016-03-12 | 2020-10-13 | Philipp K. Lang | Augmented reality system configured for coordinate correction or re-registration responsive to spinal movement for spinal procedures, including intraoperative imaging, CT scan or robotics |
US9980780B2 (en) | 2016-03-12 | 2018-05-29 | Philipp K. Lang | Guidance for surgical procedures |
US10743939B1 (en) | 2016-03-12 | 2020-08-18 | Philipp K. Lang | Systems for augmented reality visualization for bone cuts and bone resections including robotics |
US9861446B2 (en) | 2016-03-12 | 2018-01-09 | Philipp K. Lang | Devices and methods for surgery |
US10603113B2 (en) | 2016-03-12 | 2020-03-31 | Philipp K. Lang | Augmented reality display systems for fitting, sizing, trialing and balancing of virtual implant components on the physical joint of the patient |
US11311341B2 (en) | 2016-03-12 | 2022-04-26 | Philipp K. Lang | Augmented reality guided fitting, sizing, trialing and balancing of virtual implants on the physical joint of a patient for manual and robot assisted joint replacement |
US10159530B2 (en) | 2016-03-12 | 2018-12-25 | Philipp K. Lang | Guidance for surgical interventions |
US11957420B2 (en) | 2016-03-12 | 2024-04-16 | Philipp K. Lang | Augmented reality display for spinal rod placement related applications |
US11452568B2 (en) | 2016-03-12 | 2022-09-27 | Philipp K. Lang | Augmented reality display for fitting, sizing, trialing and balancing of virtual implants on the physical joint of a patient for manual and robot assisted joint replacement |
US10405927B1 (en) | 2016-03-12 | 2019-09-10 | Philipp K. Lang | Augmented reality visualization for guiding physical surgical tools and instruments including robotics |
US11850003B2 (en) | 2016-03-12 | 2023-12-26 | Philipp K Lang | Augmented reality system for monitoring size and laterality of physical implants during surgery and for billing and invoicing |
US11602395B2 (en) | 2016-03-12 | 2023-03-14 | Philipp K. Lang | Augmented reality display systems for fitting, sizing, trialing and balancing of virtual implant components on the physical joint of the patient |
US10849693B2 (en) | 2016-03-12 | 2020-12-01 | Philipp K. Lang | Systems for augmented reality guidance for bone resections including robotics |
US10292768B2 (en) | 2016-03-12 | 2019-05-21 | Philipp K. Lang | Augmented reality guidance for articular procedures |
US10278777B1 (en) | 2016-03-12 | 2019-05-07 | Philipp K. Lang | Augmented reality visualization for guiding bone cuts including robotics |
US11048089B2 (en) | 2016-04-05 | 2021-06-29 | Ostendo Technologies, Inc. | Augmented/virtual reality near-eye displays with edge imaging lens comprising a plurality of display devices |
US11751944B2 (en) | 2017-01-16 | 2023-09-12 | Philipp K. Lang | Optical guidance for surgical, medical, and dental procedures |
US11801114B2 (en) | 2017-09-11 | 2023-10-31 | Philipp K. Lang | Augmented reality display for vascular and other interventions, compensation for cardiac and respiratory motion |
US11348257B2 (en) | 2018-01-29 | 2022-05-31 | Philipp K. Lang | Augmented reality guidance for orthopedic and other surgical procedures |
US11727581B2 (en) | 2018-01-29 | 2023-08-15 | Philipp K. Lang | Augmented reality guidance for dental procedures |
US12086998B2 (en) | 2018-01-29 | 2024-09-10 | Philipp K. Lang | Augmented reality guidance for surgical procedures |
US11553969B1 (en) | 2019-02-14 | 2023-01-17 | Onpoint Medical, Inc. | System for computation of object coordinates accounting for movement of a surgical site for spinal and other procedures |
US11857378B1 (en) | 2019-02-14 | 2024-01-02 | Onpoint Medical, Inc. | Systems for adjusting and tracking head mounted displays during surgery including with surgical helmets |
US12055720B2 (en) | 2019-02-28 | 2024-08-06 | Sony Group Corporation | Head-mounted display and glasses |
CN110286538A (en) * | 2019-06-28 | 2019-09-27 | Oppo广东移动通信有限公司 | Display methods, display device, head-mounted display apparatus and storage medium |
US12053247B1 (en) | 2020-12-04 | 2024-08-06 | Onpoint Medical, Inc. | System for multi-directional tracking of head mounted displays for real-time augmented reality guidance of surgical procedures |
US11786206B2 (en) | 2021-03-10 | 2023-10-17 | Onpoint Medical, Inc. | Augmented reality guidance for imaging systems |
US12127890B1 (en) | 2021-08-11 | 2024-10-29 | Navakanth Gorrepati | Mixed reality endoscopic retrograde cholangiopancreatography (ERCP) procedure |
US12127795B2 (en) | 2024-04-08 | 2024-10-29 | Philipp K. Lang | Augmented reality display for spinal rod shaping and placement |
Also Published As
Publication number | Publication date |
---|---|
CN106662747B (en) | 2020-09-08 |
US9626936B2 (en) | 2017-04-18 |
KR102373940B1 (en) | 2022-03-11 |
KR20170044706A (en) | 2017-04-25 |
EP3183615A1 (en) | 2017-06-28 |
US20160055822A1 (en) | 2016-02-25 |
CN106662747A (en) | 2017-05-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9626936B2 (en) | Dimming module for augmented and virtual reality | |
US11768417B2 (en) | Electrochromic systems for head-worn computer systems | |
US11622426B2 (en) | See-through computer display systems | |
US10746994B2 (en) | Spherical mirror having a decoupled aspheric | |
US8692845B2 (en) | Head-mounted display control with image-content analysis | |
US10667981B2 (en) | Reading assistance system for visually impaired | |
US11327307B2 (en) | Near-eye peripheral display device | |
US8873149B2 (en) | Projection optical system for coupling image light to a near-eye display | |
KR102370445B1 (en) | Reduced Current Drain in AR/VR Display Systems | |
EP2652940B1 (en) | Comprehension and intent-based content for augmented reality displays | |
US10955665B2 (en) | Concurrent optimal viewing of virtual objects | |
US20160097930A1 (en) | Microdisplay optical system having two microlens arrays | |
US8594381B2 (en) | Method of identifying motion sickness | |
JP2020502567A (en) | System and method for manipulating light from an ambient light source | |
US20120182206A1 (en) | Head-mounted display control with sensory stimulation | |
US20150262424A1 (en) | Depth and Focus Discrimination for a Head-mountable device using a Light-Field Display System | |
US10997948B2 (en) | Electronic device with adaptive lighting system | |
EP3097461A1 (en) | Automated content scrolling | |
US9336779B1 (en) | Dynamic image-based voice entry of unlock sequence | |
US11887263B1 (en) | Adaptive rendering in artificial reality environments | |
CN112346558A (en) | Eye tracking system | |
US11366321B1 (en) | Predictive dimming of optical passthrough displays | |
Peddie et al. | Technology issues | |
Young | OLED displays and the immersive experience |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 15756522; Country of ref document: EP; Kind code of ref document: A1 |
| REEP | Request for entry into the European phase | Ref document number: 2015756522; Country of ref document: EP |
| NENP | Non-entry into the national phase | Ref country code: DE |
| ENP | Entry into the national phase | Ref document number: 20177007770; Country of ref document: KR; Kind code of ref document: A |