WO2020170253A1 - Changing the opacity of augmented reality glasses in response to external light sources - Google Patents

Info

Publication number
WO2020170253A1
Authority
WO
WIPO (PCT)
Prior art keywords
cgi
opacity
location
eye
incoming light
Prior art date
Application number
PCT/IL2020/050191
Other languages
French (fr)
Inventor
Daniel Grinberg
Aviad Hellman
Shay CHAIM
Eli Campo
Original Assignee
Reality Plus Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Reality Plus Ltd. filed Critical Reality Plus Ltd.
Publication of WO2020170253A1 publication Critical patent/WO2020170253A1/en

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/147Digital output to display device ; Cooperation and interconnection of the display device with other functional units using display panels
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0118Head-up displays characterised by optical features comprising devices for improving the contrast of the display / brillance control visibility
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B2027/0192Supplementary details
    • G02B2027/0194Supplementary details with combiner of laminated type, for optical or mechanical aspects
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2300/00Aspects of the constitution of display devices
    • G09G2300/04Structural and physical details of display devices
    • G09G2300/0439Pixel structures
    • G09G2300/0456Pixel structures with a reflective area and a transmissive area combined in one pixel, such as in transflectance pixels
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/02Improving the quality of display appearance
    • G09G2320/0261Improving the quality of display appearance in the context of movement of objects on the screen or movement of the observer relative to the screen
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • G09G2320/0626Adjustment of display parameters for control of overall brightness
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • G09G2320/0666Adjustment of display parameters for control of colour parameters, e.g. colour temperature
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2360/00Aspects of the architecture of display systems
    • G09G2360/14Detecting light within display terminals, e.g. using a single or a plurality of photosensors
    • G09G2360/144Detecting light within display terminals, e.g. using a single or a plurality of photosensors the light being ambient light

Definitions

  • the comparing RGB values comprises differentiating between the colors of the CGI and the colors of the incoming light.
  • the at least one digital display is parallel to the transmissive display.
  • the method also includes adjusting the pixel based opacity according to a change in the incoming light.
  • the determining an intensity of incoming light includes measuring the relative intensity within a spectrum of the incoming light and its azimuth of the light source.
  • the determining a location utilizes the coordinate system of the glasses, the coordinate system of the at least one digital display and a CGI coordinate system.
  • FIGs. 1A and 1B are schematic illustrations showing how opacity or shading should be varied in order to view a CGI clearly;
  • FIGs. 2A and 2B are schematic illustrations of how the opacity of Fig. 1 takes into account the position of the eye viewing the CGI, the location of the incoming light and the balance between incoming light intensity and CGI light intensity; constructed and operative in accordance with the present invention;
  • FIG. 3 is a schematic illustration of a pair of augmented reality glasses; constructed and operative in accordance with the present invention
  • FIG. 4 is a schematic illustration of a system controlling the location and opacity level for the glasses of Fig. 3; constructed and operative in accordance with the present invention
  • FIG. 5 is a schematic illustration of the coordinate system required by the opacity locator of Fig. 4 constructed and operative in accordance with the present invention
  • FIGs. 6A and 6B are schematic illustrations of the shift of a pixel on a digital display to a location on a transmissive display according to the direction the eye is gazing; constructed and operative in accordance with the present invention.
  • the determined opacity should be selective and be applied only where it is needed, which may also be a function of where the user is looking at the CGI (i.e. the location of the eye based on eye tracking relative to any incoming light). This may be at the pixel level on the transmissive display and may help to preserve the clarity of the CGI itself.
  • the intensity of the applied opacity should be a balance between the intensity of the incoming light and the light strength of the CGI. Applicant has further realized that if such variable opacity is implemented, less power will also be utilized to make the CGI visible.
  • Figs. 1A and 1B show how the opacity or shading should be varied according to the incoming light.
  • the amount of opacity should be more if the background light is very strong (e.g. sunlight) and less when the background light is lower (e.g. indoors).
  • FIG. 1A shows a user’s eye 40, a CGI 42, a transmissive display 25 and a light source 12.
  • light source 12 is strong and therefore transmissive display 25 requires a fully darkened “shade” 45A behind CGI 42, in the direction that eye 40 is viewing.
  • in Fig. 1B, light source 12 is covered by a cloud 46 and therefore transmissive display 25 is only required to partially darken the shade, here labeled 45B, behind CGI 42.
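The behavior of Figs. 1A and 1B may be sketched, for illustration only, as a simple control rule. The function below is not taken from the specification; the scale constant, units (lux) and function name are assumptions:

```python
def shade_level(ambient_lux, cgi_lux, max_lux=100_000.0):
    """Return an opacity level in [0, 1] for the shade behind the CGI.

    Hypothetical helper: 1.0 fully darkens the transmissive-display
    pixels (e.g. direct sunlight), while lower values only partially
    darken them (e.g. an overcast sky or indoor lighting).
    """
    if ambient_lux <= cgi_lux:
        return 0.0                      # CGI already outshines the background
    excess = ambient_lux - cgi_lux
    return min(1.0, excess / max_lux)   # clamp at a fully opaque shade
```

A stronger light source thus yields a darker shade 45A, and a cloud-covered source the lighter shade 45B.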
  • the amount of opacity or shadow required in order to block light should also be a function of the intensity balance between the light coming in from the outside real world and the light required to see CGI 42 on the digital display, as is illustrated in Figs. 2A and 2B to which reference is now made. It will be appreciated that if there is not much incoming light from the outside world, then less light is required to see CGI 42 than if it were viewed, for example, in bright sunlight.
  • Applicant has further realized that it is not sufficient to merely provide opacity behind an image as is illustrated in Figs. 1 A and IB since each user has a different vision capability and may view CGI 42 differently i.e. the position of the glasses on the face, the distance of CGI 42 from the eye, movement of CGI 42, eye movement etc.
  • opacity or shading 45C is typically applied behind CGI 42 (assuming that the digital display 35 and the transmissive display 25 are in parallel and that the user is looking straight ahead). When a user is not looking straight ahead (i.e. gazing at an angle), shading 45D may be needed in addition to, or instead of, shading 45C for the user to see CGI 42 clearly.
  • Applicant has further realized that the above mentioned issue may be resolved by augmented reality (AR) glasses integrated with a system for determining the exact intensity and location for opacity on a transmissive display required for the incoming light from the outside world that may take into account the position of the eye from where a user sees a CGI together with the amount of light required to see the CGI clearly.
  • the device may use the metrics/distances between where the CGI is projected on the transparent display, the relative position of the incoming external light on the transmissive display and the location of each eye, ensuring that the light between the CGI and the real world is balanced, as described herein, in order to provide the correct shading so that a clear image is seen by the user.
  • FIG. 3 is an illustration of a device 100 for determining location and intensity of opacity for augmented reality glasses based on the position of the eye within the eye box, the location and intensity of the incoming light from the outside world and the location and light strength of the CGI, according to an embodiment of the present invention.
  • Device 100 may comprise an exemplary pair of augmented reality (AR) glasses 8 having a transmissive display 25, such as a liquid crystal or a light-emitting display, and a digital display 35, such as a waveguide digital display, in addition to a single light sensor 10, an opacity controller 20 and the elements needed to display CGI 42, such as a CGI generator 30.
  • AR glasses 8 may be any form of eyeglasses having lenses 6.
  • CGI generator 30 may display CGI 42 to the user on digital display 35, which the user may see together with his or her view of the real world. It will be appreciated that CGI generator 30 may have knowledge of the strength of light required to display CGI 42.
  • Light sensor 10 may determine the intensity of incoming light from the real world together with the position of the incoming light. Light sensor 10 may calculate the coordinates of the incoming light relative to transmissive display 25 as a function of the measurement of the relative intensity within a spectrum and the azimuth of the light source, as is known in the art.
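As an illustration of how a measured light direction might be converted to display coordinates, the sketch below projects azimuth/elevation angles onto pixel offsets from the center of the transmissive display. It is not the patented method itself; the sensor sitting at the world-coordinate origin, the parameter names and the small-angle geometry are all assumptions:

```python
import math

def light_location_on_display(azimuth_deg, elevation_deg, display_z_cm, m_lcd):
    """Project an incoming-light direction, measured as azimuth and
    elevation angles, onto transmissive-display pixel offsets from the
    display center, given the display's distance along z (cm) and its
    pixel density m_lcd = (pixels/cm in x, pixels/cm in y)."""
    dx = display_z_cm * math.tan(math.radians(azimuth_deg))    # horizontal offset, cm
    dy = display_z_cm * math.tan(math.radians(elevation_deg))  # vertical offset, cm
    return (round(dx * m_lcd[0]), round(dy * m_lcd[1]))        # offsets in LCD pixels
```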
  • eye tracker 15 may determine the location of eye 40 within the eye box.
  • the incoming light may be from any type of light source 12, such as from the sun, from indoor lighting, etc.
  • opacity controller 20 may receive input from light sensor 10, eye tracker 15 (per eye: 15L for the left eye and 15R for the right eye) and CGI generator 30. It may also receive coordinate system 50 of AR glasses 8, LCD coordinate system 54 of transmissive display 25, and the location of the projected CGI 42 together with its 2D “waveguide” or CGI coordinate system 52. Opacity controller 20 may further comprise an opacity calculator 21 and a location calculator 22. Opacity calculator 21 may calculate the balance of light required to see CGI 42 clearly and, as a result, determine the level of opacity required for transmissive displays 25.
  • Location calculator 22 may determine the gaze direction of eye 40 and therefore the position of a corrected opacity spot behind CGI 42, relative to the position of eye 40 within the eye box, the location of the incoming light and the location of CGI 42, as described in more detail herein below. It will be appreciated that opacity controller 20 may control the opacity of transmissive displays 25 at the pixel level by shading the required pixels. It will be further appreciated that controller 20 may control each side of glasses 8 individually.
  • Opacity controller 20 may receive the measurements of the intensities of the real world light from light sensor 10 and of CGI 42 from CGI generator 30, and opacity calculator 21 may calculate the required opacity level as described in more detail herein below. As discussed herein above, opacity controller 20 may also receive the coordinates of CGI 42 on digital display 35 and the coordinates of the location of the incoming light relative to transmissive display 25 from light sensor 10.
  • Location calculator 22 may determine where to place opacity or shade 45 as a function of the relationship between the location of user’s eye 40, the location of incoming light on display 25 and the location of the CGI on digital display 35 for each side of glasses 8 as is illustrated in Fig. 5 to which reference is now made.
  • Fig. 5 illustrates the relationship between three coordinate systems - the coordinate system 50 of glasses 8 denoting the location of eye 40 (the eye 40 coordinate origin), a 2D“waveguide” or CGI coordinate system 52 and the coordinates of transmissive display 25 (2D LCD coordinate system 54) as discussed in more detail herein below.
  • the coordinate systems are determined from the output of an eye tracker 15 and light sensor 10, together with the coordinates of the location of CGI 42 on digital display 35.
  • Location calculator 22 may use the above mentioned relationship between the coordinate systems to calculate the optimal location for opacity of transmissive displays 25.
  • opacity controller 20 may change the opacity of transmissive displays 25 in the area which the user views as being behind CGI 42, separately for each eye.
  • World coordinate system 50 may be the coordinate system of AR glasses 8 and may have its origin located at the middle point between eyes 40. These coordinates may denote the location of each eye 40 viewing CGI 42. Variables in this coordinate system are denoted with the superscript“world”.
  • Each eye only has a distance to the right or left of the origin of world coordinate system 50, i.e. each eye lies at ±E^world/2, with E^world ≈ 6.3 cm as an average pupillary distance.
  • a better approximation is to calibrate the pupillary distance per user and use this value for E^world.
  • the best solution is to use eye tracking methods for getting, in real time, the accurate values for the eye location within the eye box.
  • Waveguide coordinate system 52 may be the coordinate system of digital displays 35 and may be denoted “WG”. Its pixel density may be denoted M^WG = (M^WG_x, M^WG_y), in pixel/cm units.
  • LCD coordinate system 54 may be the coordinate system of transmissive display 25, with an analogous pixel density M^LCD = (M^LCD_x, M^LCD_y). Its center 55, in world coordinate system 50, is given by the coordinates L^world = (L^world_x, L^world_y, L^world_z); its center 53 in LCD coordinate system 54, the middle of transmissive display 25, is given by half of the width and half of the height of display 25.
  • the actual gaze direction may be defined as the difference between the position of the eye within the eye box (from eye tracker 15) and the position of the external light relative to transmissive display 25, as measured by light sensor 10.
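The eye-location fallbacks and the gaze-direction definition above may be sketched, for illustration only, as follows; the function names, the 2D representation and the coordinate conventions are assumptions, not taken from the specification:

```python
def eye_location(tracker_xy=None, ipd_cm=6.3, right=True):
    """Eye position within the eye box, in world-coordinate cm.

    Prefers a live eye-tracker reading; otherwise falls back to half the
    pupillary distance (the 6.3 cm population average, or a per-user
    calibrated value passed as ipd_cm), as described above."""
    if tracker_xy is not None:
        return tracker_xy                              # real-time eye-box location
    return (ipd_cm / 2.0 if right else -ipd_cm / 2.0, 0.0)

def gaze_direction(eye_xy, light_xy):
    """Gaze direction as defined above: the difference between the eye
    location and the location of the incoming light on the display."""
    return (light_xy[0] - eye_xy[0], light_xy[1] - eye_xy[1])
```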
  • Location calculator 22 may use the above definitions to provide a basis to calculate the transition of each pixel position on digital display 35 to a corresponding pixel position in transmissive display 25.
  • a pixel P on digital display 35, given in waveguide coordinate system 52 as P^WG, may be transferred to world coordinates P^world by dividing its pixel coordinates by the pixel density M^WG and adding the world-coordinate center of digital display 35.
  • the distance between the two displays is Δd = L^world_z − W^world_z, in centimeters, where W^world is the center of digital display 35 in world coordinates.
  • for a gaze direction at angle α, the lateral shift of the line of sight over that distance is Δx = Δd · tan(α).
  • the position of the corresponding pixel Q on transmissive display 25 in world coordinates is Q^world = P^world + Δx.
  • the difference between pixel Q and the center of transmissive display 25 in world coordinates is: ΔQ^world = (Q^world_x − L^world_x, Q^world_y − L^world_y).
  • ΔQ^LCD, the difference in LCD coordinate system 54, is determined merely by converting ΔQ^world to a pixel difference, i.e. ΔQ^LCD = (ΔQ^world_x · M^LCD_x, ΔQ^world_y · M^LCD_y).
  • the real position of pixel Q in LCD coordinate system 54 is then given by adding ΔQ^LCD to center 53 of transmissive display 25 (equation 1).
  • location calculator 22 may utilize equation 1 to convert each pixel of CGI 42 on digital display 35 to each pixel of shade 45 on transmissive display 25, relative to the gaze of eye 40 and the location of CGI 42.
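The chain of conversions of equation 1 may be sketched, for illustration only, as a single function; the parameter names, the per-axis gaze angles and the aligned, parallel display geometry are assumptions:

```python
import math

def waveguide_to_lcd(p_wg, m_wg, m_lcd, wg_center_world, lcd_center_world,
                     lcd_center_px, gaze_deg):
    """Map a CGI pixel on the digital (waveguide) display to the shade
    pixel on the transmissive (LCD) display.

    p_wg             -- (x, y) pixel on the waveguide display
    m_wg, m_lcd      -- (x, y) pixel densities, pixel/cm
    wg_center_world, lcd_center_world -- display centers (x, y, z) in cm,
                                         world coordinate system 50
    lcd_center_px    -- LCD center 53, in LCD pixels
    gaze_deg         -- gaze direction angles (ax, ay), degrees
    """
    # pixel -> world coordinates of P on the waveguide display
    p_world = (wg_center_world[0] + p_wg[0] / m_wg[0],
               wg_center_world[1] + p_wg[1] / m_wg[1])
    # distance between the two parallel displays: Δd = L_z - W_z, in cm
    dd = lcd_center_world[2] - wg_center_world[2]
    # lateral shift of the line of sight over that distance: Δx = Δd·tan(α)
    shift = (dd * math.tan(math.radians(gaze_deg[0])),
             dd * math.tan(math.radians(gaze_deg[1])))
    q_world = (p_world[0] + shift[0], p_world[1] + shift[1])
    # offset from the LCD center, converted to LCD pixels (equation 1)
    return (round(lcd_center_px[0] + (q_world[0] - lcd_center_world[0]) * m_lcd[0]),
            round(lcd_center_px[1] + (q_world[1] - lcd_center_world[1]) * m_lcd[1]))
```

With a straight-ahead gaze the CGI pixel maps directly behind itself; a non-zero gaze angle shifts the shade pixel laterally, as in Fig. 6B.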
  • an unclear image of CGI 42 may be caused if only a portion of the light behind CGI 42 is blocked, such as may happen when the shift of Fig. 6B is ignored.
  • opacity controller 20 may keep such an unclear image, caused by an imbalance of light, from happening.
  • Opacity calculator 21 may balance between the external light coming in from outside and the intensity of the display light of CGI 42, as discussed herein above. Opacity calculator 21 may utilize the information from CGI generator 30 about CGI 42 to calculate the strength of light to be displayed on digital display 35 and may compare it to the amount of external light as measured by light sensor 10. As input (as is illustrated in Fig. 4, back to which reference is now made), opacity calculator 21 may receive the output of light sensor 10 and may receive information from CGI generator 30 about CGI 42 as a set of pixel intensities.
  • the intensity of the incoming light from light sensor 10 may be considered a light spectrum and be measured in terms of its RGB (red, green, blue) intensities.
  • Opacity calculator 21 may also determine the light intensity of CGI 42 by its colors (i.e. by dividing CGI 42 into its RGB components (red, green and blue elements)).
  • opacity calculator 21 may determine the amount of external light to be blocked by the transmissive display 25 by calculating the difference between the two sets of RGB values in order to ensure that the external light does not overwhelm the light of CGI 42.
  • the external light behind CGI 42 as defined above, may be fully or partially blocked to reduce the external light strength to no larger than the light strength of CGI 42.
  • opacity controller 20 may apply different blocking amounts to the different colors of the CGI 42 accordingly.
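The per-channel blocking described above may be sketched, for illustration only, as computing the fraction of external light to remove in each RGB channel; the linear-intensity units and the function name are assumptions:

```python
def per_channel_block(ambient_rgb, cgi_rgb):
    """Fraction of external light to block, per RGB channel, so that the
    background never exceeds the CGI's own intensity in that channel.
    Intensities are arbitrary linear units."""
    block = []
    for amb, cgi in zip(ambient_rgb, cgi_rgb):
        if amb <= cgi:
            block.append(0.0)                 # channel already balanced
        else:
            block.append((amb - cgi) / amb)   # reduce ambient to the CGI level
    return block
```

A reddish light source behind a bluish CGI would thus be blocked mostly in the red channel and left largely unblocked in blue.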
  • device 100 may solve the issue of an unclear image of CGI 42 by calculating the gaze of the user’s eye 40 relative to the location of CGI 42 and by blocking the relevant pixels on the transmissive display.
  • Embodiments of the present invention may include apparatus for performing the operations herein.
  • This apparatus may be specially constructed for the desired purposes, or it may comprise a computing device or system typically having at least one processor and at least one memory, selectively activated or reconfigured by a computer program stored in the computer.
  • the resultant apparatus when instructed by software may turn the general purpose computer into inventive elements as discussed herein.
  • the instructions may define the inventive device in operation with the computer platform for which it is desired.
  • Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk, including optical disks, magneto-optical disks, read-only memories (ROMs), volatile and non-volatile memories, random access memories (RAMs), electrically programmable read-only memories (EPROMs), electrically erasable and programmable read-only memories (EEPROMs), magnetic or optical cards, Flash memory, disk-on-key or any other type of media suitable for storing electronic instructions and capable of being coupled to a computer system bus.
  • the computer readable storage medium may also be implemented in cloud storage.
  • Some general purpose computers may comprise at least one communication element to enable communication with a data network and/or a mobile communications network.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)

Abstract

A device for augmented reality glasses includes at least one digital display to display a computer generated image (CGI) having a CGI intensity to an eye; at least one eye tracker to track the location of the eye when looking at said CGI; a transmissive display to provide pixel-based opacity; a light sensor to determine an intensity of incoming light to the glasses and the location of the incoming light relative to the transmissive display; and an opacity controller to determine a level and location for the opacity. The opacity controller includes an opacity calculator to determine the level of said opacity by comparing RGB values between the incoming light and the CGI intensity, and an opacity locator to determine a location for the opacity by calculating a gaze direction for the eye viewing the CGI, where the gaze direction is the difference between the location of the eye and the location of the incoming light; the opacity locator then calculates the shift between the location of the gaze direction and the location of the CGI on the at least one digital display.

Description

TITLE OF THE INVENTION
CHANGING THE OPACITY OF AUGMENTED REALITY GLASSES IN RESPONSE TO EXTERNAL LIGHT SOURCES
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority from US provisional patent applications 62/809,638, filed February 24, 2019, and 62/889,588, filed August 21, 2019, both of which are incorporated herein by reference.
FIELD OF THE INVENTION
[0002] The present invention relates to opacity of glasses generally and to opacity responsiveness for augmented reality and otherwise, in particular.
BACKGROUND OF THE INVENTION
[0003] Augmented reality (AR) units combine a real-world scene with a computer-generated image (CGI) overlaid over a portion of the real-world scene.
[0004] AR glasses consisting of a transparent near-eye display are less efficient when used where there is low light strength in the surrounding area, such as indoors. Alternatively, where there is strong light, the projection of the CGI on the transparent display becomes more transparent to the viewer as the external light becomes stronger than the light projected on the display. Eventually, when the external light is stronger than the light projected, the projected image becomes invisible.
[0005] US 8,941,559 to Microsoft provides a practical solution consisting of AR glasses comprising a transparent display (a see through lens) and a means to block the external light on the CGI using an opacity filter or transmissive display, such as an LCD display, placed between the transparent display and the external light. The transmissive display blocks the external light on the pixels behind the CGI being displayed. US 8,941,559 also uses eye-tracking to determine the direction that the user is looking and blocks in that direction too by making the peripheral regions around the CGI opaque. Thus US 8,941,559 determines its opacity based on the position of a CGI projected onto the transparent display and takes into account where the eye is looking.
SUMMARY OF THE PRESENT INVENTION
[0006] There is provided in accordance with a preferred embodiment of the present invention, a device for augmented reality glasses, the device includes at least one digital display to display a computer generated image (CGI) to an eye, the CGI having a CGI intensity; at least one eye tracker to track the location of the eye within an eye box when looking at the CGI; a transmissive display located behind each digital display, the transmissive display to provide pixel-based opacity; a light sensor to determine an intensity of incoming light to the glasses; the light sensor to also determine the location of the incoming light relative to the transmissive display; and an opacity controller to determine a location for the pixel-based opacity and to apply a level of pixel-based opacity at the location. The opacity controller further includes an opacity calculator to determine the level of the pixel-based opacity by comparing RGB values between the incoming light and the CGI intensity; where the level of the pixel-based opacity balances the incoming light intensity and the CGI intensity and an opacity locator to determine a location for the pixel-based opacity on the transmissive display, the opacity locator to calculate a gaze direction for the eye viewing the CGI, where the gaze direction is the difference between the location of the eye and the location of the incoming light, the opacity locator to then calculate the shift between the location of the gaze direction and the location of the CGI on the at least one digital display.
[0007] Moreover, in accordance with a preferred embodiment of the present invention, the comparing RGB values includes differentiating between the colors of the CGI and the colors of the incoming light.
[0008] Further, in accordance with a preferred embodiment of the present invention, the at least one digital display is parallel to the transmissive display.
[0009] Still further, in accordance with a preferred embodiment of the present invention, the pixel based opacity is adjustable according to a change in the incoming light.

[0010] Additionally, in accordance with a preferred embodiment of the present invention, the intensity of the incoming light is a measurement of the relative intensity within a spectrum of the incoming light and the azimuth of the light source.
[0011] Moreover, in accordance with a preferred embodiment of the present invention, the opacity locator utilizes the coordinate system of the glasses, the coordinate system of the at least one digital display and a CGI coordinate system.
[0012] There is provided, in accordance with a preferred embodiment of the present invention, a method for augmented reality glasses, the method including: displaying, via at least one digital display, a computer generated image (CGI) to an eye, the CGI having a CGI intensity; tracking the location of the eye within an eye box when looking at the CGI; determining an intensity of incoming light to the glasses; determining the location of the incoming light to the glasses relative to a transmissive display providing pixel based opacity; and determining a location and level for the pixel-based opacity, the determining a location and level for the pixel-based opacity further including: comparing RGB values between the incoming light and the CGI intensity; calculating a gaze direction for the eye viewing the CGI, where the gaze direction is the difference between the location of the eye and the location of the incoming light to the glasses; and calculating the shift between the location of the gaze direction and the location of the CGI on the at least one digital display.
[0013] Moreover, in accordance with a preferred embodiment of the present invention, the comparing RGB values comprises differentiating between the colors of the CGI and the colors of the incoming light.
[0014] Further, in accordance with a preferred embodiment of the present invention, the at least one digital display is parallel to the transmissive display.

[0015] Still further, in accordance with a preferred embodiment of the present invention, the method also includes adjusting the pixel based opacity according to a change in the incoming light.
[0016] Additionally, in accordance with a preferred embodiment of the present invention, the determining an intensity of incoming light includes measuring the relative intensity within a spectrum of the incoming light and the azimuth of the light source.
[0017] Moreover, in accordance with a preferred embodiment of the present invention, the determining a location utilizes the coordinate system of the glasses, the coordinate system of the at least one digital display and a CGI coordinate system.
BRIEF DESCRIPTION OF THE DRAWINGS
[0018] The subject matter regarded as the invention is particularly pointed out and distinctly claimed in the concluding portion of the specification. The invention, however, both as to organization and method of operation, together with objects, features, and advantages thereof, may best be understood by reference to the following detailed description when read with the accompanying drawings in which:
[0019] Figs. 1A and 1B are schematic illustrations showing how opacity or shading should be varied in order to view a CGI clearly;
[0020] Figs. 2A and 2B are schematic illustrations of how the opacity of Fig. 1 takes into account the position of the eye viewing the CGI, the location of the incoming light and the balance between incoming light intensity and CGI light intensity; constructed and operative in accordance with the present invention;
[0021] Fig. 3 is a schematic illustration of a pair of augmented reality glasses; constructed and operative in accordance with the present invention;
[0022] Fig. 4 is a schematic illustration of a system controlling the location and opacity level for the glasses of Fig. 3; constructed and operative in accordance with the present invention;
[0023] Fig. 5 is a schematic illustration of the coordinate system required by the opacity locator of Fig. 4 constructed and operative in accordance with the present invention;
[0024] Figs. 6A and 6B are schematic illustrations of the shift of a pixel on a digital display to a location on a transmissive display according to the direction the eye is gazing; constructed and operative in accordance with the present invention.
[0025] It will be appreciated that for simplicity and clarity of illustration, elements shown in the figures have not necessarily been drawn to scale. For example, the dimensions of some of the elements may be exaggerated relative to other elements for clarity. Further, where considered appropriate, reference numerals may be repeated among the figures to indicate corresponding or analogous elements.
DETAILED DESCRIPTION OF THE PRESENT INVENTION
[0026] In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the invention. However, it will be understood by those skilled in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, and components have not been described in detail so as not to obscure the present invention.
[0027] As discussed herein above, light intensity can impede the viewing of a CGI in an augmented reality environment. Applicant has realized that the determined opacity should be selective and be applied only where it is needed, which may also be a function of where the user is looking at the CGI (i.e. the location of the eye based on eye tracking relative to any incoming light). This may be at the pixel level on the transmissive display and may help to preserve the clarity of the CGI itself. Furthermore, the intensity of the applied opacity should be a balance between the intensity of the incoming light and the light strength of the CGI. Applicant has further realized that if such variable opacity is implemented, less power will also be utilized to make the CGI visible.
[0028] Figs. 1A and 1B, to which reference is now made, show how the opacity or shading should be varied according to the incoming light. The amount of opacity should be greater when the background light is very strong (e.g. sunlight) and less when the background light is weaker (e.g. indoors).
[0029] Each figure shows a user's eye 40, a CGI 42, a transmissive display 25 and a light source 12. In Fig. 1A, light source 12 is strong and therefore transmissive display 25 requires a fully darkened "shade" 45A behind CGI 42, in the direction that eye 40 is viewing. In Fig. 1B, light source 12 is covered by a cloud 46 and therefore transmissive display 25 is only required to partially darken the shade, here labeled 45B, behind CGI 42.

[0030] As discussed herein above, in order to view a clear image of CGI 42, the amount of opacity or shadow required in order to block light should also be a function of the intensity balance between the light coming in from the outside real world and the light required to see CGI 42 on the digital display, as is illustrated in Figs. 2A and 2B to which reference is now made. It will be appreciated that if there is not much incoming light from the outside world, then less blocking is required than if CGI 42 were to be viewed, for example, in bright sunlight.
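The proportional-shading idea of Figs. 1A and 1B can be sketched as a small function. This is a hypothetical illustration, not the claimed implementation: the brightness unit (nits) and the specific attenuation rule (block only the excess of background light over CGI light) are assumptions made for the sketch.

```python
def shade_level(background_nits: float, cgi_nits: float) -> float:
    """Return a shade level in [0, 1] for the transmissive-display pixels
    behind the CGI: 0 is fully transparent, 1 is fully darkened.

    Only the excess of background light over the CGI brightness is blocked,
    so the two are balanced rather than the background being removed outright.
    """
    if background_nits <= cgi_nits:
        return 0.0  # the CGI already dominates; no blocking needed
    return 1.0 - cgi_nits / background_nits

# Very strong background light (Fig. 1A): shade is nearly fully darkened.
assert shade_level(512000.0, 500.0) > 0.99
# Weaker, cloud-covered light (Fig. 1B): only a partial shade is needed.
assert shade_level(800.0, 600.0) == 0.25
```

Any monotone mapping from the brightness imbalance to an opacity level would serve; the linear rule above is simply the most direct reading of "balancing" the two intensities.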
[0031] Applicant has further realized that it is not sufficient to merely provide opacity behind an image as is illustrated in Figs. 1A and 1B, since each user has a different vision capability and may view CGI 42 differently, i.e. the position of the glasses on the face, the distance of CGI 42 from the eye, movement of CGI 42, eye movement, etc. As is illustrated in Figs. 1A and 1B and in Fig. 2A to which reference is now made, opacity or shading 45C is typically applied directly behind CGI 42 (assuming that digital display 35 and transmissive display 25 are parallel and that the user is looking straight ahead). When a user is not looking straight ahead (i.e. eye 40 is in a different position within the eye box), there may be a shift or offset between his direction of gaze (based on where the eye is looking and the position of the incoming light) and the position of CGI 42 on the transparent display, as is illustrated in Fig. 2B to which reference is now made. This shift may cause an unclear image of CGI 42, since the opacity is not directly behind CGI 42 according to where the user is looking and where the incoming light is falling, thereby causing an imbalance of light. Therefore, the position at which the opacity is placed should be calculated based on the relationship between the location and direction of the user's gaze on CGI 42 and the location of the digital display onto which CGI 42 is projected. Thus, shading 45D may be needed in addition to, or instead of, shading 45C for the user to see CGI 42 clearly.

[0032] Applicant has further realized that the above mentioned issue may be resolved by augmented reality (AR) glasses integrated with a system that determines the exact intensity and location of the opacity on a transmissive display required for the incoming light from the outside world, taking into account the position of the eye from which a user sees a CGI together with the amount of light required to see the CGI clearly.
Since it is the transmissive display that provides the shade or opacity for the glasses, the device may use the metrics/distances between where the CGI is projected on the transparent display, the relative position of the incoming external light on the transmissive display and the location of each eye, ensuring that the light between the CGI and the real world is balanced as described herein, in order to provide the correct shading so that a clear image is seen by the user.
[0033] Reference is now made to Fig. 3, which is an illustration of a device 100 for determining the location and intensity of opacity for augmented reality glasses based on the position of the eye within the eye box, the location and intensity of the incoming light from the outside world and the location and light strength of the CGI, according to an embodiment of the present invention. Device 100 may comprise an exemplary pair of augmented reality (AR) glasses 8 having a transmissive display 25, such as a liquid crystal display or a light-emitting display, and a digital display 35, such as a waveguide digital display, in addition to a single light sensor 10, an opacity controller 20 and the elements needed to display CGI 42, such as a CGI generator 30. AR glasses 8 may be any form of eyeglasses having lenses 6.
[0034] When a user puts AR glasses 8 on, he or she may see the real world through the regular lenses 6 and displays 25 and 35. CGI generator 30 may display CGI 42 to the user on digital display 35, which the user may see together with his or her view of the real world. It will be appreciated that CGI generator 30 may have knowledge of the strength of light required to display CGI 42. Light sensor 10 may determine the intensity of incoming light from the real world together with the position of the incoming light. Light sensor 10 may calculate the coordinates of the incoming light relative to transmissive display 25 as a function of the measurement of the relative intensity within a spectrum and the azimuth of the light source, as is known in the art.
[0035] It will be appreciated that eye tracker 15 may determine the location of eye 40 within the eye box. As discussed herein above, the incoming light may be from any type of light source 12, such as from the sun, from indoor lighting, etc.
[0036] It will be appreciated that, with AR glasses 8, in a dark room, the light strength on CGI 42 may be low and no external light blocking may be needed. On the other hand, when using AR glasses 8 outdoors on a sunny day, there may be full light strength on CGI 42 and therefore, full light blocking may be needed. For cloudy days, the need for light blocking may vary as strength of the light may change.
[0037] Reference is now made to Fig. 4 which illustrates a system diagram of device 100. As is illustrated, opacity controller 20 may receive input from light sensor 10, eye tracker 15 (per eye: 15L for the left eye and 15R for the right eye) and CGI generator 30. It may also receive coordinate system 50 of AR glasses 8, LCD coordinate system 54 showing the location of the projected CGI 42, together with a 2D "waveguide" or CGI coordinate system 52.

[0038] Opacity controller 20 may further comprise an opacity calculator 21 and a location calculator 22. Opacity calculator 21 may calculate the balance of light required to see CGI 42 clearly and, as a result, determine the level of opacity required for transmissive displays 25. Location calculator 22 may determine the gaze direction of eye 40 and therefore the position of a corrected opacity spot behind CGI 42, relative to the position of eye 40 within the eye box, the location of the incoming light and the location of CGI 42, as described in more detail herein below. It will be appreciated that opacity controller 20 may control the opacity of displays 25 at the pixel level by shading the required pixels. It will be further appreciated that controller 20 may control each side of glasses 8 individually.
[0039] Opacity controller 20 may receive the measurements of the intensities of the real world light from light sensor 10 and of CGI 42 from CGI generator 30, and opacity calculator 21 may calculate the required opacity level as described in more detail herein below. As discussed herein above, opacity controller 20 may also receive the coordinates 54 of CGI 42 on digital display 35 and the coordinates of the location of the incoming light relative to transmissive display 25 from light sensor 10.
[0040] Location calculator 22 may determine where to place opacity or shade 45 as a function of the relationship between the location of the user's eye 40, the location of the incoming light on display 25 and the location of the CGI on digital display 35, for each side of glasses 8, as is illustrated in Fig. 5 to which reference is now made. Fig. 5 illustrates the relationship between three coordinate systems - the coordinate system 50 of glasses 8 denoting the location of eye 40 (the eye 40 coordinate origin), a 2D "waveguide" or CGI coordinate system 52 and the coordinates of transmissive display 25 (2D LCD coordinate system 54), as discussed in more detail herein below. As discussed herein above, the coordinate systems are determined from the output of eye tracker 15 and light sensor 10, together with the coordinates of the location of CGI 42 on digital display 35.
[0041] Location calculator 22 may use the above mentioned relationship between the coordinate systems to calculate the optimal location for opacity of transmissive displays 25. In response, opacity controller 20 may change the opacity of transmissive displays 25 in the area which the user views as being behind CGI 42, separately for each eye.
[0042] World coordinate system 50 may be the coordinate system of AR glasses 8 and may have its origin located at the middle point between eyes 40. These coordinates may denote the location of each eye 40 viewing CGI 42. Variables in this coordinate system are denoted with the superscript "world". The location of each eye in world coordinate system 50 is denoted E^world = (E_x^world, E_y^world, E_z^world), given in centimeters (cm). As an approximation, assuming that each eye has no depth or width, E_y^world = E_z^world = 0. Each eye then only has a distance to the right or left of the origin of world coordinate system 50 (i.e. the middle point between the eyes) and thus E_x^world = ±6.3 cm, based on an average pupillary distance. A better approximation is to calibrate the pupillary distance per user and use this value for E_x^world. The best solution is to use eye tracking methods to obtain, in real time, accurate values for the eye location within the eye box.
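A minimal sketch of the eye-location approximation just described, as a hypothetical helper. Placing each eye at half the pupillary distance to either side of the origin is our reading of the symmetric layout (the text quotes 6.3 cm as an average pupillary distance); an eye tracker would replace this constant with measured values.

```python
# Average pupillary distance quoted in the text, in centimeters.
AVERAGE_PUPILLARY_DISTANCE_CM = 6.3

def approximate_eye_location(side: str,
                             pd_cm: float = AVERAGE_PUPILLARY_DISTANCE_CM):
    """Return (x, y, z) of one eye in world coordinates, in cm.

    The origin sits midway between the eyes, so each eye is offset along
    the X axis by half the pupillary distance; depth and height are
    approximated as zero, as in the text.
    """
    sign = 1.0 if side == "right" else -1.0
    return (sign * pd_cm / 2.0, 0.0, 0.0)
```

A per-user calibrated pupillary distance would simply be passed as `pd_cm`.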
[0043] Waveguide coordinate system 52 may be the coordinate system of digital displays 35 and may be denoted as "WG". A center 53 of waveguide coordinate system 52 in world coordinate system 50 is given by the coordinates W^world = (W_x^world, W_y^world, W_z^world). Center 53 of digital display 35 in waveguide coordinate system 52 is given by the coordinates C^WG = (C_x^WG, C_y^WG). For example, the middle of digital display 35 is given by half of the width and half of the height of the screen: C^WG = (width^WG / 2, height^WG / 2), in centimeters. To convert from the centimeters of the world coordinates to the pixel values of displays 35, a conversion factor M^WG = (M_x^WG, M_y^WG), in pixel/cm units, is utilized.

[0044] LCD coordinate system 54, denoted by LCD, may be the coordinate system of transmissive display 25. Its center 55, in world coordinate system 50, is given by the coordinates L^world = (L_x^world, L_y^world, L_z^world). In LCD coordinate system 54, center 55, the middle of transmissive display 25, is given by half of the width and half of the height of display 25: C^LCD = (width^LCD / 2, height^LCD / 2), which may be converted to pixels using an LCD conversion factor M^LCD = (M_x^LCD, M_y^LCD).
[0045] Eye tracker 15 may provide the 3D eye location for each eye 40, which may be provided in world coordinates as O^world = (O_x^world, O_y^world, O_z^world). The actual gaze direction may be defined as the difference between the position of the eye within the eye box (from eye tracker 15) and the position of the external light relative to transmissive display 25, as measured by light sensor 10. World coordinate system 50 may represent the gaze direction as two angles in spherical coordinates, α and β, where α is the azimuth, with range α ∈ [0, 2π), and β is the inclination, with range β ∈ [0, π]. If the user's gaze direction is perpendicular to digital display 35, the angles take the values for which the shifts Δx and Δy, described herein below, are zero.
[0046] Location calculator 22 may use the above definitions to provide a basis to calculate the transition of each pixel position on digital display 35 to a corresponding pixel position on transmissive display 25. A pixel P on digital display 35, which is in waveguide coordinate system 52, P^WG = (P_x^WG, P_y^WG), may be transferred to world coordinates as follows:

P^world = (W_x^world + (P_x^WG − C_x^WG) / M_x^WG, W_y^world + (P_y^WG − C_y^WG) / M_y^WG, W_z^world)

[0048] Assuming that a digital display 35 and a transmissive display 25 are parallel to each other, the distance between them is Δd = L_z^world − W_z^world, in centimeters. Moreover, as shown in Figs. 6A and 6B to which reference is now made, there is a shift in the X and Y directions from the location of pixel P on digital display 35 to a location of a pixel Q on transmissive display 25 due to the gaze direction and thus, Δx = Δd tan(α) and Δy = Δd tan(β).

[0049] Accordingly, the position of pixel Q on transmissive display 25 in world coordinates is Q^world = (P_x^world + Δx, P_y^world + Δy, L_z^world). The difference between pixel Q and the center of transmissive display 25 in world coordinates is ΔQ^world = (Q_x^world − L_x^world, Q_y^world − L_y^world). Since ΔQ^world is a difference, the difference in LCD coordinate system 54 is determined merely by converting ΔQ^world to a pixel difference, i.e. ΔQ^LCD = (ΔQ_x^world · M_x^LCD, ΔQ_y^world · M_y^LCD). Finally, the real position of pixel Q in LCD coordinate system 54 is given by:

Q^LCD = C^LCD + ΔQ^LCD (equation 1)
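The pixel transfer described above can be sketched end-to-end as follows. This is a hypothetical illustration under stated assumptions: the parameter names are ours, the displays are taken as parallel as the text assumes, and the shifts are applied as Δx = Δd·tan(α), Δy = Δd·tan(β).

```python
import math

def waveguide_pixel_to_lcd_pixel(
    p_wg,        # pixel (x, y) of the CGI on the digital display (WG coords)
    c_wg,        # center of the digital display, WG coords
    m_wg,        # (pixels/cm, pixels/cm) conversion for the digital display
    w_world,     # center of the digital display in world coords, cm
    c_lcd,       # center of the transmissive display, LCD pixel coords
    m_lcd,       # (pixels/cm, pixels/cm) conversion for the transmissive display
    l_world,     # center of the transmissive display in world coords, cm
    alpha, beta, # gaze-direction angles
):
    """Map a CGI pixel P on the digital display to the pixel Q on the
    transmissive display that lies behind it along the gaze direction."""
    # P in world coordinates (cm), in the plane of the digital display.
    px = w_world[0] + (p_wg[0] - c_wg[0]) / m_wg[0]
    py = w_world[1] + (p_wg[1] - c_wg[1]) / m_wg[1]
    # Gap between the two (parallel) displays along the Z axis.
    delta_d = l_world[2] - w_world[2]
    # Shift caused by an oblique gaze.
    qx = px + delta_d * math.tan(alpha)
    qy = py + delta_d * math.tan(beta)
    # Offset from the LCD center, converted back to pixels (equation 1).
    return (c_lcd[0] + (qx - l_world[0]) * m_lcd[0],
            c_lcd[1] + (qy - l_world[1]) * m_lcd[1])
```

With both displays centered, unit pixel/cm factors and a perpendicular gaze (zero angles), the mapping is the identity: pixel (3, 4) on the digital display maps to pixel (3.0, 4.0) on the transmissive display.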
[0051] Thus, location calculator 22 may utilize equation 1 to convert each pixel of CGI 42 on digital display 35 to a pixel of shade 45 on transmissive display 25, relative to the gaze of eye 40 and the location of CGI 42.
[0052] It will be appreciated that an unclear image of CGI 42 (as discussed herein above) may be caused if only a portion of the light behind CGI 42 is blocked, such as may happen when the shift of Fig. 6B is ignored. By sensing the relative location of CGI 42 and the position of eye 40 (using eye tracker 15), as described hereinabove, and then blocking those pixels Q on transmissive display 25 which eye 40 may view as being behind CGI 42, opacity controller 20 may prevent such an unclear image, caused by an imbalance of light, from occurring.
[0053] Opacity calculator 21 may balance the external light coming in from outside against the intensity of the display light of CGI 42, as discussed herein above. Opacity calculator 21 may utilize the information from CGI generator 30 about CGI 42 to calculate the strength of light to be displayed on digital display 35 and may compare it to the amount of external light as measured by light sensor 10. As input (as is illustrated in Fig. 4, back to which reference is now made), opacity calculator 21 may receive the output of light sensor 10 and may receive information from CGI generator 30 about CGI 42 as a set of pixel intensities.
[0054] It will be appreciated that the intensity of the incoming light from light sensor 10 may be considered a light spectrum and be measured in terms of its RGB (red, green, blue) intensities. Opacity calculator 21 may also determine the light intensity of CGI 42 by its colors (i.e. by dividing CGI 42 into its RGB components (red, green and blue elements)). Thus opacity calculator 21 may determine the amount of external light to be blocked by the transmissive display 25 by calculating the difference between the two sets of RGB values in order to ensure that the external light does not overwhelm the light of CGI 42. In general, the external light behind CGI 42, as defined above, may be fully or partially blocked to reduce the external light strength to no larger than the light strength of CGI 42.
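The per-channel balancing described above can be sketched as a short function. This is a hypothetical illustration: the linear attenuation rule (reduce each external channel to no more than the corresponding CGI channel) is our reading of "the difference between the two sets of RGB values", and the intensity scale is arbitrary.

```python
def per_channel_block(external_rgb, cgi_rgb):
    """For each color channel, return the fraction of external light to
    block so that the external light does not exceed the CGI light.

    0.0 means pass the channel through unchanged; 1.0 means block it fully.
    """
    blocked = []
    for ext, cgi in zip(external_rgb, cgi_rgb):
        if ext <= cgi:
            blocked.append(0.0)              # CGI already dominates
        else:
            blocked.append(1.0 - cgi / ext)  # reduce external to CGI level
    return blocked

# Strong red-dominated external light behind a neutral CGI: the red channel
# is blocked hardest, in line with the per-color blocking of paragraph [0055].
assert per_channel_block((200.0, 50.0, 10.0), (20.0, 20.0, 20.0)) == [0.9, 0.6, 0.0]
```

Treating the channels independently is what allows different blocking amounts per color, as the next paragraph describes.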
[0055] In accordance with a preferred embodiment of the present invention, opacity controller 20 may apply different blocking amounts to the different colors of the CGI 42 accordingly.
[0056] Thus, device 100 may solve the issue of an unclear image of CGI 42 by calculating the gaze of the user's eye 40 relative to the location of CGI 42 and by blocking the relevant pixels on the transmissive display.
[0057] Unless specifically stated otherwise, as apparent from the preceding discussions, it is appreciated that, throughout the specification, discussions utilizing terms such as "processing," "computing," "calculating," "determining," or the like, refer to the action and/or processes of a general purpose computer of any type, such as a client/server system, mobile computing devices, smart appliances, cloud computing units or similar electronic computing devices that manipulate and/or transform data within the computing system’s registers and/or memories into other data within the computing system’s memories, registers or other such information storage, transmission or display devices.
[0058] Embodiments of the present invention may include apparatus for performing the operations herein. This apparatus may be specially constructed for the desired purposes, or it may comprise a computing device or system typically having at least one processor and at least one memory, selectively activated or reconfigured by a computer program stored in the computer. The resultant apparatus when instructed by software may turn the general purpose computer into inventive elements as discussed herein. The instructions may define the inventive device in operation with the computer platform for which it is desired. Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk, including optical disks, magnetic-optical disks, read-only memories (ROMs), volatile and non-volatile memories, random access memories (RAMs), electrically programmable read-only memories (EPROMs), electrically erasable and programmable read-only memories (EEPROMs), magnetic or optical cards, Flash memory, disk-on-key or any other type of media suitable for storing electronic instructions and capable of being coupled to a computer system bus. The computer readable storage medium may also be implemented in cloud storage.
[0059] Some general purpose computers may comprise at least one communication element to enable communication with a data network and/or a mobile communications network.
[0060] The processes and displays presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct a more specialized apparatus to perform the desired method. The desired structure for a variety of these systems will appear from the description below. In addition, embodiments of the present invention are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the invention as described herein.
[0061] While certain features of the invention have been illustrated and described herein, many modifications, substitutions, changes, and equivalents will now occur to those of ordinary skill in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the invention.

CLAIMS

What is claimed is:
1. A device for augmented reality glasses, the device comprising:
at least one digital display to display a computer generated image (CGI) to an eye, said CGI having a CGI intensity;
at least one eye tracker to track the location of said eye within an eye box when looking at said CGI;
a transmissive display located behind each said digital display, said transmissive display to provide pixel-based opacity;
a light sensor to determine an intensity of incoming light to said glasses, said light sensor to also determine the location of said incoming light relative to said transmissive display; and
an opacity controller to determine a location for said pixel-based opacity and to apply a level of pixel-based opacity at said location, said opacity controller further comprising:
an opacity calculator to determine said level of said pixel-based opacity by comparing RGB values between said incoming light and said CGI intensity, wherein said level of said pixel-based opacity balances said incoming light intensity and said CGI intensity; and
an opacity locator to determine a location for said pixel-based opacity on said transmissive display, said opacity locator to calculate a gaze direction for said eye viewing said CGI, wherein said gaze direction is the difference between said location of said eye and said location of said incoming light, said opacity locator to then calculate the shift between the location of said gaze direction and the location of said CGI on said at least one digital display.
2. The device according to claim 1 wherein said comparing RGB values comprises differentiating between the colors of said CGI and the colors of said incoming light.
3. The device according to claim 1 wherein said at least one digital display is parallel to said transmissive display.
4. The device according to claim 1 wherein said pixel based opacity is adjustable according to a change in said incoming light.
5. The device according to claim 1 wherein said intensity of said incoming light is a measurement of the relative intensity within a spectrum of said incoming light and the azimuth of the light source.
6. The device according to claim 1 wherein said opacity locator utilizes the coordinate system of said glasses, the coordinate system of said at least one digital display and a CGI coordinate system.
7. A method for determining location and level of opacity for augmented reality glasses, the method comprising:
displaying, via at least one digital display, a computer generated image (CGI) to an eye, said CGI having a CGI intensity;
tracking the location of said eye within an eye box when looking at said CGI;
determining an intensity of incoming light to said glasses;
determining the location of said incoming light to said glasses relative to a transmissive display providing pixel based opacity; and
determining a location and level for said pixel-based opacity, said determining a location and level for said pixel-based opacity further comprising:
comparing RGB values between said incoming light and said CGI intensity;
calculating a gaze direction for said eye viewing said CGI, wherein said gaze direction is the difference between said location of said eye and said location of said incoming light to said glasses; and
calculating the shift between the location of said gaze direction and the location of said CGI on said at least one digital display.
8. The method according to claim 7 wherein said comparing RGB values comprises differentiating between the colors of said CGI and the colors of said incoming light.
9. The method according to claim 7 wherein said at least one digital display is parallel to said transmissive display.
10. The method according to claim 7 and also comprising adjusting said pixel based opacity according to a change in said incoming light.
11. The method according to claim 7 wherein said determining an intensity of incoming light comprises measuring the relative intensity within a spectrum of said incoming light and the azimuth of the light source.
12. The method according to claim 7 wherein said determining a location utilizes the coordinate system of said glasses, the coordinate system of said at least one digital display and a CGI coordinate system.
PCT/IL2020/050191 2019-02-24 2020-02-20 Changing the opacity of augmented reality glasses in response to external light sources WO2020170253A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201962809638P 2019-02-24 2019-02-24
US62/809,638 2019-02-24
US201962889588P 2019-08-21 2019-08-21
US62/889,588 2019-08-21

Publications (1)

Publication Number Publication Date
WO2020170253A1 true WO2020170253A1 (en) 2020-08-27

Family

ID=72144386


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023279923A1 (en) * 2021-07-05 2023-01-12 北京有竹居网络技术有限公司 Wearable display apparatus, light transmittance regulation method and apparatus, and device and medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140085282A1 (en) * 2012-09-21 2014-03-27 Nvidia Corporation See-through optical image processing
US20140347391A1 (en) * 2013-05-23 2014-11-27 Brian E. Keane Hologram anchoring and dynamic positioning
US20180249151A1 (en) * 2015-03-17 2018-08-30 Raytrx, Llc Wearable image manipulation and control system with correction for vision defects and augmentation of vision and sensing
US20180314066A1 (en) * 2017-04-28 2018-11-01 Microsoft Technology Licensing, Llc Generating dimming masks to enhance contrast between computer-generated images and a real-world view


Similar Documents

Publication Publication Date Title
US11676333B2 (en) Spatially-resolved dynamic dimming for augmented reality device
US10209520B2 (en) Near eye display multi-component dimming system
US11526032B2 (en) Smart sunglasses, smart window and smart mirror apparatus for augmenting human vision by means of adaptive polarization filter grids
KR102478370B1 (en) Use of pupil position to correct optical lens distortion
US20240036288A1 (en) Augmented reality device for adjusting focus region according to direction of user's view and operating method of the same
US9720238B2 (en) Method and apparatus for a dynamic “region of interest” in a display system
US20180314066A1 (en) Generating dimming masks to enhance contrast between computer-generated images and a real-world view
US20180188536A1 (en) Near eye display multi-component dimming system
US10371998B2 (en) Display apparatus and method of displaying using polarizers and optical combiners
US11719941B2 (en) Systems and methods for external light management
US11885973B2 (en) Ambient light management systems and methods for wearable devices
CN105093796A (en) Display device
CN109803133B (en) Image processing method and device and display device
US10699383B2 (en) Computational blur for varifocal displays
KR100514241B1 (en) Method for evaluating binocular performance of spectacle lenses, method for displaying said performance and apparatus therefor
CN108957742B (en) Augmented reality helmet and method for realizing virtual transparent dynamic adjustment of picture
WO2020170253A1 (en) Changing the opacity of augmented reality glasses in response to external light sources
US20080158686A1 (en) Surface reflective portable eyewear display system and methods
WO2023154586A1 (en) Determining display gazability and placement of virtual try-on glasses using optometric measurements
Hwang et al. 23.4: Augmented Edge Enhancement for Vision Impairment using Google Glass
Hwang et al. Augmented Edge Enhancement on Google Glass for Vision‐Impaired Users
US12032170B2 (en) Systems and methods for external light management
US20240144605A1 (en) Augmented reality lens selective tint adjustments
US20200036962A1 (en) Mixed reality glasses which display virtual objects that move naturally throughout a complete field of view of a user
WO2020026226A1 (en) Mixed reality glasses which display virtual objects that move naturally throughout a user's complete field of view

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20758969

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20758969

Country of ref document: EP

Kind code of ref document: A1