NZ795800A - Systems and methods for manipulating light from ambient light sources - Google Patents

Systems and methods for manipulating light from ambient light sources

Info

Publication number
NZ795800A
NZ795800A
Authority
NZ
New Zealand
Prior art keywords
user
ambient light
display
stimulus
wearable device
Prior art date
Application number
NZ795800A
Other versions
NZ795800B2 (en)
Inventor
Nastasja Robaina
Mark Baerenrodt
Eric Baerenrodt
Christopher Harrises
Nicolas Samec
Original Assignee
Magic Leap Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Magic Leap Inc filed Critical Magic Leap Inc
Priority claimed from NZ754406A external-priority patent/NZ754406B2/en
Publication of NZ795800A publication Critical patent/NZ795800A/en
Publication of NZ795800B2 publication Critical patent/NZ795800B2/en

Landscapes

  • Eye Examination Apparatus (AREA)

Abstract

A user-wearable display device comprising a frame configured to mount on the user, an augmented reality display attached to the frame and configured to direct images to an eye of the user, a sensor configured to obtain information about ambient light condition in an environment surrounding the user, the sensor comprising an image capture device configured to capture an image of a scene in a field of view of the user, a variable optical material that undergoes a physical and/or a chemical change in response to a stimulus, a source configured to provide the stimulus and processing electronics configured to: analyze the image of the scene in the field of view of the user to identify one or more locations in the scene where the intensity of the ambient light is above a threshold value, wherein the analyzing of the image includes analyzing at least a portion of the image corresponding to a periphery of the field of view of the user; trigger the source to provide the stimulus to the variable optical material to effect a physical and/or a chemical change in the material based on the information obtained by the sensor such that at least one of intensity of ambient light, spectral content of ambient light or direction of ambient light is changed at one or more first portions differently than at one or more second portions, the one or more first portions of the display appearing, as seen by the user’s eye, to be respectively aligned with the one or more identified locations of above-threshold ambient light in the scene in the field of view of the user, and the one or more second portions of the display not appearing, as seen by the user’s eye, to be aligned with the one or more identified locations of above-threshold ambient light in the scene in the field of view of the user.
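The image-analysis step described in the abstract amounts to thresholding a captured scene image to locate regions of above-threshold ambient light. A minimal sketch of that step, assuming an 8-bit camera capture and an illustrative threshold value (the names and constants here are hypothetical, not from the patent):

```python
import numpy as np

# Hypothetical parameter: an 8-bit luminance value treated as "bright".
INTENSITY_THRESHOLD = 200

def find_bright_locations(image, threshold=INTENSITY_THRESHOLD):
    """Return a boolean mask marking pixels whose ambient-light intensity
    exceeds the threshold; the whole frame, including the periphery of the
    field of view, is analyzed."""
    # Reduce an RGB capture to a rough luminance estimate.
    luminance = image.mean(axis=2) if image.ndim == 3 else image
    return luminance > threshold

# A toy 4x4 grayscale "scene" with one bright source in the upper-left corner.
scene = np.zeros((4, 4), dtype=np.uint8)
scene[0, 0] = 255
mask = find_bright_locations(scene)
```

The `True` entries of `mask` correspond to the "one or more first portions" of the display; everything else corresponds to the "second portions".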

Description

A user-wearable display device comprising a frame configured to mount on the user, an augmented reality display attached to the frame and configured to direct images to an eye of the user, a sensor configured to obtain information about ambient light condition in an environment surrounding the user, the sensor comprising an image capture device configured to capture an image of a scene in a field of view of the user, a variable optical material that undergoes a physical and/or a chemical change in response to a stimulus, a source configured to provide the stimulus and processing electronics configured to: analyze the image of the scene in the field of view of the user to identify one or more locations in the scene where the intensity of the ambient light is above a threshold value, wherein the analyzing of the image includes analyzing at least a portion of the image corresponding to a periphery of the field of view of the user; trigger the source to provide the stimulus to the variable optical material to effect a physical and/or a chemical change in the material based on the information obtained by the sensor such that at least one of intensity of ambient light, spectral content of ambient light or direction of ambient light is changed at one or more first portions differently than at one or more second portions, the one or more first portions of the display appearing, as seen by the user's eye, to be respectively aligned with the one or more identified locations of above-threshold ambient light in the scene in the field of view of the user, and the one or more second portions of the display not appearing, as seen by the user's eye, to be aligned with the one or more identified locations of above-threshold ambient light in the scene in the field of view of the user. 795800 A1 SYSTEMS AND METHODS FOR MANIPULATING LIGHT FROM AMBIENT LIGHT SOURCES PRIORITY
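The description requires that first portions of the display appear, as seen by the user's eye, aligned with the identified bright scene locations. One simple way to model that alignment is a coordinate mapping from camera-image pixels to display portions. The sketch below assumes the camera and display fields of view overlap and differ only by scale (a deliberate simplification; a real device would need per-user calibration, and all names here are illustrative):

```python
def camera_to_display(px, py, cam_w, cam_h, disp_w, disp_h):
    """Scale a camera pixel (px, py) to the display portion that appears,
    as seen by the user's eye, aligned with that scene location.
    Integer division keeps the result on the display's coarser grid."""
    return (px * disp_w // cam_w, py * disp_h // cam_h)

# A bright pixel at the center of a 1280x720 capture maps to the center
# portion of a 64x36 grid of display portions.
portion = camera_to_display(640, 360, 1280, 720, 64, 36)
```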

Claims (25)

CLAIM
1. A user-wearable display device comprising: a frame configured to mount on the user; an augmented reality display attached to the frame and configured to direct images to an eye of the user; a sensor configured to obtain information about ambient light condition in an environment surrounding the user, the sensor comprising an image capture device configured to capture an image of a scene in a field of view of the user; a variable optical material that undergoes a physical and/or a chemical change in response to a stimulus; a source configured to provide the stimulus; and processing electronics configured to: analyze the image of the scene in the field of view of the user to identify one or more locations in the scene where the intensity of the ambient light is above a threshold value, wherein the analyzing of the image includes analyzing at least a portion of the image corresponding to a periphery of the field of view of the user; trigger the source to provide the stimulus to the variable optical material to effect a physical and/or a chemical change in the material based on the information obtained by the sensor such that at least one of intensity of ambient light, spectral content of ambient light or direction of ambient light is changed at one or more first portions differently than at one or more second portions, the one or more first portions of the display appearing, as seen by the user's eye, to be respectively aligned with the one or more identified locations of above-threshold ambient light in the scene in the field of view of the user, and the one or more second portions of the display not appearing, as seen by the user's eye, to be aligned with the one or more identified locations of above-threshold ambient light in the scene in the field of view of the user.
2. The user-wearable device of claim 1, wherein the augmented reality display comprises a waveguide configured to: allow a view of the environment surrounding the user through the waveguide; form images by directing light out of the waveguide and into an eye of the user.
3. The user-wearable device of claim 2, wherein the waveguide is part of a stack of waveguides, wherein each waveguide of the stack is configured to output light with different amounts of divergence in comparison to one or more other waveguides of the stack of waveguides.
4. The user-wearable device of any one of claims 1 to 3, wherein the sensor further comprises at least one of a light sensor, a global positioning sub-system, or an environmental sensor.
5. The user-wearable device of any one of claims 1 to 4, further comprising an image capture device configured to track movement of eyes of the user.
6. The user-wearable device of any one of claims 1 to 5, further comprising a light source configured to generate a projection beam based on data associated with the images directed to the eye of the user.
7. The user-wearable device of any one of claims 1 to 6, wherein the source comprises an optical source configured to direct visible or invisible light to one or more portions of the display.
8. The user-wearable device of any one of claims 1 to 6, wherein the source comprises an electrical source configured to provide an electrical signal to one or more portions of the display.
9. The user-wearable device of any one of claims 1 to 6, wherein the source comprises a thermal source configured to provide thermal radiation to one or more portions of the display.
10. The user-wearable device of any one of claims 1 to 6, wherein the source comprises a sonic/ultrasonic system configured to provide sonic/ultrasonic energy to one or more portions of the display.
11. The user-wearable device of any one of claims 1 to 10, wherein the variable optical material is embedded in a surface of the display.
12. The user-wearable device of any one of claims 1 to 10, wherein the variable optical material is disposed over a surface of the display.
13. The user-wearable device of any one of claims 1 to 12, wherein the variable optical material includes organic or inorganic compounds.
14. The user-wearable device of any one of claims 1 to 12, wherein the variable optical material comprises electroactive proteins.
15. The user-wearable device of any one of claims 1 to 12, wherein the variable optical material comprises molecules that exhibit a change in size or shape in response to the stimulus.
16. The user-wearable device of any one of claims 1 to 12, wherein the variable optical material comprises molecules that move, rotate, twist or shift in response to the stimulus.
17. The user-wearable device of any one of claims 1 to 12, wherein the variable optical material comprises molecules that move together and/or adhere together in response to the stimulus.
18. The user-wearable device of any one of claims 1 to 12, wherein the variable optical material comprises molecules that move away from each other in response to the stimulus.
19. The user-wearable device of any one of claims 1 to 12, wherein the variable optical material comprises molecules that form nanostructures in response to the stimulus.
20. The user-wearable device of any one of claims 1 to 19, wherein the display comprises a first ocular region corresponding to a first eye of the user and a second ocular region corresponding to a second eye of the user, and wherein the processing electronics are configured to trigger the source to provide the stimulus to a portion of the display to effect a physical and/or a chemical change in the variable optical material based on the information obtained by the sensor such that at least one of intensity of ambient light, spectral content of ambient light or direction of ambient light is changed through the first ocular region as a result of stimulus from a source triggered by the processing electronics.
21. The user-wearable device of any one of claims 1 to 19, wherein the display comprises a first ocular region corresponding to a first eye of the user and a second ocular region corresponding to a second eye of the user, and wherein the processing electronics are configured to trigger the source to provide the stimulus to a portion of the display to effect a physical and/or a chemical change in the material based on the information obtained by the sensor such that at least one of intensity of ambient light, spectral content of ambient light or direction of ambient light through the first ocular region is changed differently as compared to intensity of ambient light, spectral content of ambient light or direction of ambient light through the second ocular region.
22. The user-wearable device of any one of claims 1 to 19, wherein the processing electronics are configured to trigger the source to provide the stimulus to the display to effect a physical and/or a chemical change in the material based on the information obtained by the sensor such that attenuation of intensity of ambient light transmitted through the first portion of the display is greater than attenuation of intensity of ambient light transmitted through the second portion of the display.
23. The user-wearable device of claim 22, wherein the processing electronics are configured to trigger the source to provide the stimulus to the display to effect a physical and/or a chemical change in the material based on the information obtained by the sensor such that the intensity of ambient light transmitted through the second portion of the display is reduced.
24. The user-wearable device of any one of claims 1 to 19, wherein the display comprises a first ocular region corresponding to a first eye of the user and a second ocular region corresponding to a second eye of the user, and wherein the processing electronics are configured to trigger the source to provide the stimulus to the display to effect a physical and/or a chemical change in the material based on the information obtained by the sensor such that intensity of ambient light transmitted through a portion of the first ocular region is reduced.
25. A method of manipulating light transmitted through a user-wearable display device comprising a display surface including a variable optical material that varies at least one of intensity of ambient light, spectral content of ambient light or direction of ambient light transmitted through the display surface in response to a stimulus, the method comprising: obtaining a measurement about ambient light condition in an environment surrounding the user using a sensor, the sensor comprising an image capture device and the measurement comprising an image of a scene in a field of view of the user; analyzing the image of the scene in the field of view of the user to identify one or more locations in the scene where the intensity of the ambient light is above a threshold value, wherein the analyzing of the image includes analyzing at least a portion of the image corresponding to a periphery of the field of view of the user; and triggering a source to provide a stimulus to the variable optical material to effect a physical and/or a chemical change in the material based on the measurement obtained by the sensor such that at least one of intensity of ambient light, spectral content of ambient light or direction of ambient light is changed through the display at one or more first portions differently than at one or more second portions, the one or more first portions of the display appearing, as seen by the user's eye, to be respectively aligned with the one or more identified locations of above-threshold ambient light in the scene in the field of view of the user, and the one or more second portions of the display not appearing, as seen by the user's eye, to be aligned with the one or more identified locations of above-threshold ambient light in the scene in the field of view of the user.
[FIG. 1: schematic of the wearable system showing a Local Processing & Data Module linked to a Remote Processing Module and a Remote Data Repository]
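Claim 22 requires greater attenuation at the first portions (those aligned with bright scene locations) than at the second portions. A minimal numeric sketch of that triggering outcome, modeling the material's response as per-portion attenuation factors (all names and values here are illustrative assumptions, not from the patent):

```python
import numpy as np

def attenuation_map(bright_mask, strong=0.7, weak=0.1):
    """Per-portion attenuation: first portions (aligned with identified
    bright locations) receive the stronger attenuation, second portions
    the weaker one."""
    return np.where(bright_mask, strong, weak)

def transmitted_intensity(ambient, bright_mask):
    """Ambient-light intensity transmitted through the display after the
    variable optical material responds to the stimulus."""
    return ambient * (1.0 - attenuation_map(bright_mask))

ambient = np.array([[255.0, 100.0]])  # bright source on the left
mask = ambient > 200                  # first vs. second portions
out = transmitted_intensity(ambient, mask)
```

Note that even the second portions are attenuated slightly (per claim 23, the intensity through the second portion is also reduced), just less than the first portions.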
NZ795800A 2017-12-21 Systems and methods for manipulating light from ambient light sources NZ795800B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662438325P 2016-12-22 2016-12-22
NZ754406A NZ754406B2 (en) 2017-12-21 Systems and methods for manipulating light from ambient light sources

Publications (2)

Publication Number Publication Date
NZ795800A true NZ795800A (en) 2024-02-23
NZ795800B2 NZ795800B2 (en) 2024-05-24


Similar Documents

Publication Publication Date Title
US10845606B1 (en) Eye tracking for a head mounted display including a pancake lens block
Li et al. Human sensing using visible light communication
CN110998223B (en) Detector for determining the position of at least one object
US10429927B1 (en) Eye tracking for a head mounted display including a pancake lens block
CN110325891A (en) System and method for manipulating the light from environment light source
US9285893B2 (en) Object detection and tracking with variable-field illumination devices
US11145097B2 (en) Changing view order of augmented reality objects based on user gaze
KR100911376B1 (en) The method and apparatus for realizing augmented reality using transparent display
WO2019050687A3 (en) Electronic devices with ambient light sensors
US9626773B2 (en) Augmented reality alteration detector
CN110291565A (en) It is presented using the 3D object of the feature detected
US10429657B1 (en) Eye tracking for a head mounted display including a pancake lens block
CN104755968A (en) See through near-eye display
US20210044742A1 (en) Dynamically programmable image sensor
EP2386243A3 (en) Ophthalmoscope
JP2012252091A5 (en)
US11841502B2 (en) Reflective polarizer for augmented reality and virtual reality display
MX2021002251A (en) Testing device for head-up display (hud).
WO2006071720A3 (en) Motion-compensating light-emitting apparatus
KR20220130809A (en) Location tracking system for head-worn display systems
CN114930273A (en) Position tracking system for head-mounted display system including angle-sensitive detector
CN112639687B (en) Eye tracking using reverse biased light emitting diode devices
WO2007076485A3 (en) Motion-compensating light-emitting apparatus
NZ795800A (en) Systems and methods for manipulating light from ambient light sources
US20180267601A1 (en) Light Projection for Guiding a User within a Physical User Area During Virtual Reality Operations

Legal Events

Date Code Title Description
PSEA Patent sealed