EP3414899B1 - Multi-depth plane display system with reduced switching between depth planes - Google Patents
Multi-depth plane display system with reduced switching between depth planes
Info
- Publication number
- EP3414899B1 (application EP17750889.2A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- user
- cues
- virtual object
- depth
- accommodation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/113—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G02B30/20—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
- G02B30/26—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
- G02B30/27—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving lenticular arrays
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B6/00—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B6/00—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
- G02B6/0001—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings specially adapted for lighting devices or systems
- G02B6/0011—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings specially adapted for lighting devices or systems the light guides being planar or of plate-like form
- G02B6/0013—Means for improving the coupling-in of light from the light source into the light guide
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B6/00—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
- G02B6/0001—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings specially adapted for lighting devices or systems
- G02B6/0011—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings specially adapted for lighting devices or systems the light guides being planar or of plate-like form
- G02B6/0033—Means for improving the coupling-out of light from the light guide
- G02B6/0035—Means for improving the coupling-out of light from the light guide provided on the surface of the light guide or in the bulk of it
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B6/00—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
- G02B6/0001—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings specially adapted for lighting devices or systems
- G02B6/0011—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings specially adapted for lighting devices or systems the light guides being planar or of plate-like form
- G02B6/0033—Means for improving the coupling-out of light from the light guide
- G02B6/005—Means for improving the coupling-out of light from the light guide provided by one optical element, or plurality thereof, placed on the light output side of the light guide
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B6/00—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
- G02B6/0001—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings specially adapted for lighting devices or systems
- G02B6/0011—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings specially adapted for lighting devices or systems the light guides being planar or of plate-like form
- G02B6/0075—Arrangements of multiple light guides
- G02B6/0076—Stacked arrangements of multiple light guides of the same or different cross-sectional area
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/111—Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
- H04N13/117—Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation the virtual viewpoint locations being selected by the viewers or determined by viewer tracking
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/122—Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/128—Adjusting depth or disparity
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/344—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/366—Image reproducers using viewer tracking
- H04N13/383—Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/388—Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
- H04N13/395—Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume with depth sampling, i.e. the volume being constructed from a stack or sequence of 2D image planes
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0112—Head-up displays characterised by optical features comprising device for genereting colour display
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0127—Head-up displays characterised by optical features comprising devices increasing the depth of field
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0181—Adaptation to the pilot/driver
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0185—Displaying image at variable distance
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
Definitions
- a virtual reality, or "VR", scenario typically involves the presentation of digital or virtual image information without transparency to other actual real-world visual input;
- an augmented reality, or "AR", scenario typically involves presentation of digital or virtual image information as an augmentation to visualization of the actual world around the user.
- a mixed reality, or "MR", scenario is a type of AR scenario and typically involves virtual objects that are integrated into, and responsive to, the natural world.
- an MR scenario may include AR image content that appears to be blocked by or is otherwise perceived to interact with objects in the real world.
- the display detects a face of an interlocutor in the real world and covers the face with virtual content visible to a user of the display, so that the user can see virtual content while the interlocutor believes he has the attention of the user.
- a stereoscopic HMD with augmented reality tracks the gaze of the user's eyes and adjusts the accommodation and binocular cues accordingly.
- an augmented reality scene 10 is depicted.
- the user of an AR technology sees a real-world park-like setting 20 featuring people, trees, buildings in the background, and a concrete platform 30.
- the user also perceives that he/she "sees" "virtual content" such as a robot statue 40 standing upon the real-world platform 30, and a flying cartoon-like avatar character 50 which seems to be a personification of a bumble bee.
- These elements 50, 40 are "virtual" in that they do not exist in the real world.
- because the human visual perception system is complex, it is challenging to produce AR technology that facilitates a comfortable, natural-feeling, rich presentation of virtual image elements amongst other virtual or real-world imagery elements.
- vari-focal display systems may present virtual content at discrete depth planes, with all virtual content being presented at a same depth plane for each frame presented to the user (e.g., only one depth plane is active, or outputting image information, at a time).
- the vari-focal display system may determine a three-dimensional fixation point at which the user is fixating, and may select a depth plane to present all virtual content based on the fixation point.
- the display system may monitor the orientations and/or shapes of the user's eyes and determine a three-dimensional location at which respective determined gazes of the eyes intersect.
- each depth plane may encompass a particular range of depths from the user, such that a depth plane may be selected according to the three-dimensional location at which the user's eyes are fixating.
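The fixation-point-based depth plane selection described above can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation: the fixation point is estimated as the closest approach of the two gaze rays reported by eye tracking, and a depth plane is chosen from hypothetical depth ranges measured from the user.

```python
import numpy as np

def fixation_point(origin_l, dir_l, origin_r, dir_r):
    """Midpoint of the shortest segment between the two gaze rays."""
    d_l = dir_l / np.linalg.norm(dir_l)
    d_r = dir_r / np.linalg.norm(dir_r)
    w0 = origin_l - origin_r
    a, b, c = d_l @ d_l, d_l @ d_r, d_r @ d_r
    d, e = d_l @ w0, d_r @ w0
    denom = a * c - b * b
    t_l = (b * e - c * d) / denom if denom > 1e-9 else 0.0
    t_r = (a * e - b * d) / denom if denom > 1e-9 else 0.0
    return (origin_l + t_l * d_l + origin_r + t_r * d_r) / 2.0

# Hypothetical depth planes, each covering a range of distances from the user (meters).
DEPTH_PLANE_RANGES = [("plane_1", 0.0, 1.5), ("plane_2", 1.5, 4.0), ("plane_3", 4.0, np.inf)]

def select_depth_plane(fix_point, eye_midpoint):
    depth = np.linalg.norm(fix_point - eye_midpoint)
    for name, near, far in DEPTH_PLANE_RANGES:
        if near <= depth < far:
            return name
    return DEPTH_PLANE_RANGES[-1][0]
```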
- a display system may include one or more waveguides configured to output light with different wavefront divergence corresponding to different depth planes.
- different accommodation cues may be provided to the user and the display system may cause a first virtual object to appear to be located at a first depth in the user's field of view, while causing a second virtual object (using light having different wavefront divergence) to appear to be located at a second depth in the user's field of view.
- the display system may present different images to each eye of the user; for example, the display system may be a binocular display that presents slightly different views of a virtual object to each eye, with each view corresponding to the view of the virtual object by each respective eye at a given depth plane. That is, the display system may provide dichoptic presentations of the virtual object to the eyes, such that depth can, in part, be represented through binocular disparity. A virtual object may therefore be perceived by the user as existing at different depths based on the output wavefront divergence and different views of the virtual object provided to each eye.
- the vari-focal display system may therefore present all virtual content using a particular wavefront divergence (corresponding to a particular depth plane) for each frame.
- light having different wavefront divergence may be selected to present other virtual content on other depth planes.
- a user that is switching fixation between two objects on different depth planes may cause rapid and repeated switching between depth planes, which may be perceived by the user as flicker as the different depth planes become active at different times.
- the techniques described herein may advantageously be utilized to reduce the occurrence of flicker and/or the lag in viewing content caused by depth plane switching and/or display system initiated changes in the user's accommodation.
- the display system may have access to a map or database indicating where various virtual objects may be placed in three-dimensional space.
- the display system may determine that a user is switching between viewing a first virtual object and either (1) a second virtual object or (2) a real-world object that is located at a different depth plane (e.g., farther or closer) than the first virtual object. The display system may then cause the objects to be presented at a same depth plane.
- the display system may cause the first virtual object and the second virtual object to be output with same, or similar, wavefront divergence (e.g., presented via a same waveguide) regardless of whether the user is fixating on the first virtual object or second virtual object. Additionally, the display system may cause the first virtual object to be output with wavefront divergence that is associated with a depth of the real-world object. In some other embodiments, the display system may determine the depth plane at which the user is fixating and modify the depth plane indicated by the display system's map before outputting the virtual content in the first instance.
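As a minimal sketch of the co-location behavior described above (hypothetical object attributes; not the patent's rendering code), both objects are rendered with the wavefront divergence of a single target depth plane while their per-eye views are left untouched:

```python
def colocate_on_depth_plane(first_obj, second_obj, target_plane):
    """Render both objects with the same wavefront divergence (same depth plane),
    leaving each object's dichoptic (per-eye) views, and hence its perceived
    3D location via binocular disparity, unchanged."""
    first_obj.render_depth_plane = target_plane
    second_obj.render_depth_plane = target_plane
```

In a vari-focal system, target_plane would typically be the depth plane of the object the user is currently fixating on, or the plane associated with a real-world object that the virtual content accompanies.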
- the display system reduces switching in the accommodative response of a user by adjusting an output wavefront divergence, so that a virtual object is presented with the same wavefront divergence (corresponding to the same depth plane) as another virtual object or real object, while retaining (e.g., not adjusting) the views of the virtual object that are provided to each eye.
- the perceived location in three-dimensional space may be preserved based on binocular cues (e.g., binocular disparity). As a result, the perceived three-dimensional location may be maintained while avoiding changes in accommodation.
- modifying wavefront divergence of a virtual object while retaining the same views of the virtual object presented to each eye may result in negative physiological responses (e.g., headaches, eye strain, fatigue, etc.), due to a mismatch between a perceived depth associated with a modified wavefront divergence, and a perceived depth associated with binocular disparity.
- the perceived depth (e.g., in diopters) associated with a particular wavefront divergence and the perceived depth (e.g., in diopters) associated with binocular disparity may be determined by the display system and the mismatch between these two values may be calculated to determine the accommodation-vergence mismatch.
- the display system may perform additional actions (in addition to modifying the output wavefront divergence) if the mismatch is greater than a threshold (e.g., 0.33 diopter).
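A hedged sketch of that mismatch calculation, with depths converted to diopters (1/distance in meters) and the example 0.33 diopter threshold from above; names and the optical-infinity convention are illustrative:

```python
MISMATCH_THRESHOLD_DPT = 0.33

def to_diopters(distance_m):
    # optical infinity maps to 0 dpt
    return 0.0 if distance_m == float("inf") else 1.0 / distance_m

def accommodation_vergence_mismatch(accommodation_depth_m, vergence_depth_m):
    """Difference between the depth cued by wavefront divergence and by binocular disparity."""
    return abs(to_diopters(accommodation_depth_m) - to_diopters(vergence_depth_m))

def needs_additional_action(accommodation_depth_m, vergence_depth_m):
    return accommodation_vergence_mismatch(accommodation_depth_m, vergence_depth_m) > MISMATCH_THRESHOLD_DPT

# Example: accommodation cue at 1 m (1.0 dpt) vs. disparity cue at 3 m (~0.33 dpt)
# gives a mismatch of ~0.67 dpt, which exceeds the 0.33 dpt threshold.
```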
- the display system may adjust binocular disparity, for example, modifying views of the virtual object provided to each eye to correspond to the depth plane associated with the wavefront divergence.
- the display system may adjust binocular disparity of a virtual object, without modifying wavefront divergence of the virtual object. That is, in contrast to the display system adjusting wavefront divergence to limit a frequency with which a user accommodates to virtual objects, the display system may adjust binocular disparity such that virtual objects are perceived to be at a same, or similar, depth plane as other objects.
- the techniques and systems disclosed herein may be applied to healthcare contexts where virtual objects may be viewed in conjunction with real objects.
- a surgeon may be operating on a real-life patient while wearing a display system.
- the display system may advantageously present medical information to the surgeon, such as heart rate information.
- the display system may cause the medical information to be presented on a depth plane closest to the patient, such that the surgeon may avoid having to switch accommodation between the patient and the medical information as the surgeon performs his/her duties.
- a user may be playing a racecar driving game while wearing a display system.
- the data may include data a) captured from sensors (which may be, e.g., operatively coupled to the frame 80 or otherwise attached to the user 90), such as image capture devices (such as cameras), microphones, inertial measurement units, accelerometers, compasses, GPS units, radio devices, gyros, and/or other sensors disclosed herein; and/or b) acquired and/or processed using remote processing module 150 and/or remote data repository 160 (including data relating to virtual content), possibly for passage to the display 70 after such processing or retrieval.
- one or more intervening optical structures may be provided between the scanning fiber, or fibers, and the one or more waveguides 270, 280, 290, 300, 310 to, e.g., redirect light exiting the scanning fiber into the one or more waveguides 270, 280, 290, 300, 310.
- a controller 560 controls the operation of the stacked waveguide assembly 260, including operation of the image injection devices 360, 370, 380, 390, 400, the light source 530, and the light modulator 540.
- the controller 560 is part of the local data processing module 140.
- the controller 560 includes programming (e.g., instructions in a non-transitory medium) that regulates the timing and provision of image information to the waveguides 270, 280, 290, 300, 310 according to, e.g., any of the various schemes disclosed herein.
- the controller may be a single integral device, or a distributed system connected by wired or wireless communication channels.
- the controller 560 may be part of the processing modules 140 or 150 ( Figure 2 ) in some embodiments.
- the waveguides 270, 280, 290, 300, 310 may be configured to propagate light within each respective waveguide by total internal reflection (TIR).
- the waveguides 270, 280, 290, 300, 310 may each be planar or have another shape (e.g., curved), with major top and bottom surfaces and edges extending between those major top and bottom surfaces.
- the waveguides 270, 280, 290, 300, 310 may each include out-coupling optical elements 570, 580, 590, 600, 610 that are configured to extract light out of a waveguide by redirecting the light, propagating within each respective waveguide, out of the waveguide to output image information to the eye 210.
- Extracted light may also be referred to as out-coupled light, and the out-coupling optical elements may also be referred to as light extracting optical elements.
- An extracted beam of light may be outputted by the waveguide at locations at which the light propagating in the waveguide strikes a light extracting optical element.
- the out-coupling optical elements 570, 580, 590, 600, 610 may, for example, be gratings, including diffractive optical features, as discussed further herein.
- each waveguide 270, 280, 290, 300, 310 is configured to output light to form an image corresponding to a particular depth plane.
- the waveguide 270 nearest the eye may be configured to deliver collimated light (which was injected into such waveguide 270) to the eye 210.
- the collimated light may be representative of the optical infinity focal plane.
- the next waveguide up 280 may be configured to send out collimated light which passes through the first lens 350 (e.g., a negative lens) before it can reach the eye 210; such first lens 350 may be configured to create a slight convex wavefront curvature so that the eye/brain interprets light coming from that next waveguide up 280 as coming from a first focal plane closer inward toward the eye 210 from optical infinity.
- the third up waveguide 290 passes its output light through both the first 350 and second 340 lenses before reaching the eye 210; the combined optical power of the first 350 and second 340 lenses may be configured to create another incremental amount of wavefront curvature so that the eye/brain interprets light coming from the third waveguide 290 as coming from a second focal plane that is even closer inward toward the person from optical infinity than was light from the next waveguide up 280.
- the other waveguide layers 300, 310 and lenses 330, 320 are similarly configured, with the highest waveguide 310 in the stack sending its output through all of the lenses between it and the eye for an aggregate focal power representative of the closest focal plane to the person.
- a compensating lens layer 620 may be disposed at the top of the stack to compensate for the aggregate power of the lens stack 320, 330, 340, 350 below.
- Such a configuration provides as many perceived focal planes as there are available waveguide/lens pairings.
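The aggregate focal power per waveguide can be illustrated numerically. The lens powers below are made-up values, not taken from the patent; the point is only that each waveguide's focal plane is set by the sum of the lens powers between it and the eye.

```python
# Negative-lens powers (diopters) for lenses 350, 340, 330, 320, listed eye-side first.
lens_powers_dpt = [-0.5, -0.5, -1.0, -1.0]

def aggregate_power_dpt(waveguide_index):
    """Waveguide 0 (nearest the eye) outputs collimated light; waveguide N's light
    passes through the first N lenses on its way to the eye."""
    return sum(lens_powers_dpt[:waveguide_index])

for i in range(len(lens_powers_dpt) + 1):
    power = abs(aggregate_power_dpt(i))
    distance_m = float("inf") if power == 0 else 1.0 / power
    print(f"waveguide {i}: {power:.1f} dpt -> perceived focal plane at {distance_m} m")
```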
- Both the out-coupling optical elements of the waveguides and the focusing aspects of the lenses may be static (i.e., not dynamic or electro-active). In some alternative embodiments, either or both may be dynamic using electro-active features.
- two or more of the waveguides 270, 280, 290, 300, 310 may have the same associated depth plane.
- multiple waveguides 270, 280, 290, 300, 310 may be configured to output images set to the same depth plane, or multiple subsets of the waveguides 270, 280, 290, 300, 310 may be configured to output images set to the same plurality of depth planes, with one set for each depth plane. This may provide advantages for forming a tiled image to provide an expanded field of view at those depth planes.
- the out-coupling optical elements 570, 580, 590, 600, 610 may be configured to both redirect light out of their respective waveguides and to output this light with the appropriate amount of divergence or collimation for a particular depth plane associated with the waveguide.
- waveguides having different associated depth planes may have different configurations of out-coupling optical elements 570, 580, 590, 600, 610, which output light with a different amount of divergence depending on the associated depth plane.
- the light extracting optical elements 570, 580, 590, 600, 610 may be volumetric or surface features, which may be configured to output light at specific angles.
- the light extracting optical elements 570, 580, 590, 600, 610 may be volume holograms, surface holograms, and/or diffraction gratings.
- the features 320, 330, 340, 350 may not be lenses; rather, they may simply be spacers (e.g., cladding layers and/or structures for forming air gaps).
- one or more DOEs may be switchable between "on" states in which they actively diffract, and "off" states in which they do not significantly diffract.
- a switchable DOE may comprise a layer of polymer dispersed liquid crystal, in which microdroplets comprise a diffraction pattern in a host medium, and the refractive index of the microdroplets may be switched to substantially match the refractive index of the host material (in which case the pattern does not appreciably diffract incident light) or the microdroplet may be switched to an index that does not match that of the host medium (in which case the pattern actively diffracts incident light).
- the camera assembly 630 may be attached to the frame 80 ( Figure 2 ) and may be in electrical communication with the processing modules 140 and/or 150, which may process image information from the camera assembly 630. In some embodiments, one camera assembly 630 may be utilized for each eye, to separately monitor each eye.
- an inward facing camera may also be configured to detect the accommodative response, or accommodation state, of the user's eyes, to display content to the user without requiring the user to change that accommodative response.
- the inward facing camera may be configured to detect the accommodative response, or accommodation state, of each of the user's eyes.
- the displayed content may include alerts, menu items, or other content that may be beneficial for the user to clearly see irrespective of the depth at which their eyes are focused.
- Referring to Figure 7, an example of exit beams outputted by a waveguide is shown.
- One waveguide is illustrated, but it will be appreciated that other waveguides in the waveguide assembly 260 ( Figure 6 ) may function similarly, where the waveguide assembly 260 includes multiple waveguides.
- Light 640 is injected into the waveguide 270 at the input surface 460 of the waveguide 270 and propagates within the waveguide 270 by TIR. At points where the light 640 impinges on the DOE 570, a portion of the light exits the waveguide as exit beams 650.
- the exit beams 650 are illustrated as substantially parallel but, as discussed herein, they may also be redirected to propagate to the eye 210 at an angle (e.g., forming divergent exit beams), depending on the depth plane associated with the waveguide 270. It will be appreciated that substantially parallel exit beams may be indicative of a waveguide with out-coupling optical elements that out-couple light to form images that appear to be set on a depth plane at a large distance (e.g., optical infinity) from the eye 210.
- a full color image may be formed at each depth plane by overlaying images in each of the component colors, e.g., three or more component colors.
- Figure 8 illustrates an example of a stacked waveguide assembly in which each depth plane includes images formed using multiple different component colors.
- the illustrated embodiment shows depth planes 240a - 240f, although more or fewer depths are also contemplated.
- Each depth plane may have three or more component color images associated with it, including: a first image of a first color, G; a second image of a second color, R; and a third image of a third color, B.
- Different depth planes are indicated in the figure by different numbers for diopters (dpt) following the letters G, R, and B.
- the numbers following each of these letters indicate diopters (1/m), or inverse distance of the depth plane from a viewer, and each box in the figures represents an individual component color image.
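As a brief worked example of this convention (illustrative values, not read from the figure): a component color image labeled 1 dpt is placed at 1/1 = 1 m from the viewer, a 2 dpt image at 1/2 = 0.5 m, and a 0 dpt image at optical infinity.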
- the exact placement of the depth planes for different component colors may vary. For example, different component color images for a given depth plane may be placed on depth planes corresponding to different distances from the user. Such an arrangement may increase visual acuity and user comfort and/or may decrease chromatic aberrations.
- each depth plane may have multiple waveguides associated with it.
- each box in the figures including the letters G, R, or B may be understood to represent an individual waveguide, and three waveguides may be provided per depth plane where three component color images are provided per depth plane. While the waveguides associated with each depth plane are shown adjacent to one another in this drawing for ease of description, it will be appreciated that, in a physical device, the waveguides may all be arranged in a stack with one waveguide per level. In some other embodiments, multiple component colors may be outputted by the same waveguide, such that, e.g., only a single waveguide may be provided per depth plane.
- one or more of the in-coupling optical elements 700, 710, 720 may be disposed on the bottom major surface of the respective waveguide 670, 680, 690 (particularly where the one or more in-coupling optical elements are reflective, deflecting optical elements). As illustrated, the in-coupling optical elements 700, 710, 720 may be disposed on the upper major surface of their respective waveguide 670, 680, 690 (or the top of the next lower waveguide), particularly where those in-coupling optical elements are transmissive, deflecting optical elements. In some embodiments, the in-coupling optical elements 700, 710, 720 may be disposed in the body of the respective waveguide 670, 680, 690.
- the in-coupling optical elements 700, 710, 720 are wavelength selective, such that they selectively redirect one or more wavelengths of light, while transmitting other wavelengths of light. While illustrated on one side or corner of their respective waveguide 670, 680, 690, it will be appreciated that the in-coupling optical elements 700, 710, 720 may be disposed in other areas of their respective waveguide 670, 680, 690 in some embodiments.
- each in-coupling optical element 700, 710, 720 may be laterally offset from one another.
- each in-coupling optical element may be offset such that it receives light without that light passing through another in-coupling optical element.
- each in-coupling optical element 700, 710, 720 may be configured to receive light from a different image injection device 360, 370, 380, 390, and 400 as shown in Figure 6, and may be separated (e.g., laterally spaced apart) from other in-coupling optical elements 700, 710, 720 such that it substantially does not receive light from the other ones of the in-coupling optical elements 700, 710, 720.
- the light distributing elements 730, 740, 750 may be disposed on both the top and bottom major surfaces of the associated waveguides 670, 680, 690, respectively; or the light distributing elements 730, 740, 750 may be disposed on different ones of the top and bottom major surfaces in different associated waveguides 670, 680, 690, respectively.
- the waveguides 670, 680, 690 may be spaced apart and separated by, e.g., gas, liquid, and/or solid layers of material.
- layer 760a may separate waveguides 670 and 680; and layer 760b may separate waveguides 680 and 690.
- the layers 760a and 760b are formed of low refractive index materials (that is, materials having a lower refractive index than the material forming the immediately adjacent one of waveguides 670, 680, 690).
- the refractive index of the material forming the layers 760a, 760b is 0.05 or more, or 0.10 or more, less than the refractive index of the material forming the waveguides 670, 680, 690.
- the lower refractive index layers 760a, 760b may function as cladding layers that facilitate total internal reflection (TIR) of light through the waveguides 670, 680, 690 (e.g., TIR between the top and bottom major surfaces of each waveguide).
- the layers 760a, 760b are formed of air. While not illustrated, it will be appreciated that the top and bottom of the illustrated set 660 of waveguides may include immediately neighboring cladding layers.
- the materials forming the waveguides 670, 680, 690 are similar or the same, and the materials forming the layers 760a, 760b are similar or the same.
- the material forming the waveguides 670, 680, 690 may be different between one or more waveguides, and/or the material forming the layers 760a, 760b may be different, while still holding to the various refractive index relationships noted above.
- the light rays 770, 780, 790 have different properties, e.g., different wavelengths or different ranges of wavelengths, which may correspond to different colors.
- the incoupling optical elements 700, 710, 720 each deflect the incident light such that the light propagates through a respective one of the waveguides 670, 680, 690 by TIR.
- the incoupling optical elements 700, 710, 720 each selectively deflect one or more particular wavelengths of light, while transmitting other wavelengths to an underlying waveguide and associated incoupling optical element.
- in-coupling optical element 700 may be configured to deflect ray 770, which has a first wavelength or range of wavelengths, while transmitting rays 780 and 790, which have different second and third wavelengths or ranges of wavelengths, respectively.
- the transmitted ray 780 impinges on and is deflected by the incoupling optical element 710, which is configured to deflect light of a second wavelength or range of wavelengths.
- the ray 790 is deflected by the in-coupling optical element 720, which is configured to selectively deflect light of a third wavelength or range of wavelengths.
- the deflected light rays 770, 780, 790 are deflected so that they propagate through a corresponding waveguide 670, 680, 690; that is, the in-coupling optical elements 700, 710, 720 of each waveguide deflects light into that corresponding waveguide 670, 680, 690 to in-couple light into that corresponding waveguide.
- the light rays 770, 780, 790 are deflected at angles that cause the light to propagate through the respective waveguide 670, 680, 690 by TIR.
- the light rays 770, 780, 790 propagate through the respective waveguide 670, 680, 690 by TIR until impinging on the waveguide's corresponding light distributing elements 730, 740, 750.
- Referring to Figure 9B, a perspective view of an example of the plurality of stacked waveguides of Figure 9A is illustrated.
- the in-coupled light rays 770, 780, 790 are deflected by the in-coupling optical elements 700, 710, 720, respectively, and then propagate by TIR within the waveguides 670, 680, 690, respectively.
- the light rays 770, 780, 790 then impinge on the light distributing elements 730, 740, 750, respectively.
- the light distributing elements 730, 740, 750 deflect the light rays 770, 780, 790 so that they propagate towards the out-coupling optical elements 800, 810, 820, respectively.
- the second object 1008 may be perceived to still be located further in depth from the first object 1006 via binocular disparity (e.g., binocular cues may be maintained).
- adjusting accommodation cues involves changing the wavefront divergence of light that provides image information for forming the user's view of a virtual object.
- adjusting binocular cues involves changing the views of the virtual object presented to one or both eyes of the viewer. Adjusting accommodation cues while retaining binocular cues may result in accommodation-vergence mismatches, such that binocular disparity indicates a first depth to the user, while wavefront divergence indicates a second depth to the user. If this mismatch exceeds a threshold (e.g., 0.33 diopters or more, 0.5 diopters or more), negative physiological responses may be encountered by the user (e.g., headaches).
- the display system may determine that the mismatch exceeds the threshold, and may further adjust the binocular cues for the second object 1008 and/or the first object 1006.
- the views of the second object 1008 being provided to each eye of the user by the display system may be adjusted to correspond to views of the second object 1008 as they would be seen at a same, or similar, depth as depth plane B (e.g., binocular cues may be adjusted). That is, the second object 1008 will be perceived as being located at a same, or similar, depth from the user as the first object 1006. While this may introduce a perceptible modification in the three-dimensional location of the second object 1008, the reduction in negative physiological responses may be advantageous.
- a perceived size of the second object 1008 may be adjusted according to a perceived distance that the second object 1008 appears to have moved. For example, as illustrated in Figure 10A the second object 1008 was initially presented as being farther from the first object 1006 (e.g., based at least in part on binocular disparity). Therefore, if binocular cues are adjusted, such that the second object 1008 is perceived to be located closer to the user, the display system may scale the second virtual object 1008 so that it is perceived to be a similar size. For example, the display system may scale the second virtual object 1008 so that it takes up a same quantity (e.g., volume and/or area) of the field of view 1004 of the user. As another example, the display system may scale the second virtual object 1008 so that it matches an angular resolution of the eyes of the user. For example, the font size of text can increase for a book moved further from the user.
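A sketch of that size compensation under simple pinhole geometry (hypothetical names; the scaling rule is an illustration of the behavior described above, not code from the patent): scaling the object's linear size by the ratio of new to old distance keeps its angular size, and hence its share of the field of view, approximately constant.

```python
import math

def scale_factor(old_distance_m, new_distance_m):
    return new_distance_m / old_distance_m

def angular_size_rad(linear_size_m, distance_m):
    return 2.0 * math.atan(linear_size_m / (2.0 * distance_m))

old_d, new_d, size = 3.0, 1.0, 0.6               # object perceived to move from 3 m to 1 m
scaled = size * scale_factor(old_d, new_d)       # 0.6 * (1/3) = 0.2 m
assert abs(angular_size_rad(size, old_d) - angular_size_rad(scaled, new_d)) < 1e-9
```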
- the display system may determine that the mismatch exceeds the threshold, and may adjust binocular cues of the second object 1008, such that the mismatch is less than or equal to the threshold. That is, in contrast to modifying binocular cues (e.g., views of the object 1008 being provided to each eye) to correspond with a same depth as the first virtual object 1006, the display system may modify the binocular cues such that the mismatch is merely less than the threshold. This may have the effect of the second object 1008 being perceived as being closer in depth to the user, but not suddenly be perceived as being at the same depth as the first object 1006.
- the display system may optionally adjust the depth of the second object 1008 upon determining that the user is switching between the first object 1006 and the second object 1008 greater than a threshold metric, e.g., greater than a threshold number of times, greater than a threshold frequency, and/or greater than a threshold duration. For example, if the user routinely fixates on the first object 1006, and rarely fixates or sporadically fixates on the second object 1008, the display system may determine not to adjust a depth plane associated with the second object 1008. On the other hand, as an example, if the user fixates back and forth between the first object 1006 and the second object 1008 over the span of several seconds, or minutes, the display system may place the first and the second object at a common depth.
- That common depth may be selected based on various criteria. For example, if the first object 1006 is a real object (e.g., a book) and the second object 1008 is a virtual object, the display system may optionally adjust a depth plane associated with the second object 1008 (e.g., the adjusted depth plane may be depth plane B).
- the first object 1006 and the second object 1008 may be virtual objects, and the determination regarding which depth plane to place both objects on may be made based on which depth plane the user is predominantly fixated on. For example, the determination may be based on an amount of time the user fixates on each virtual object.
- the display system may adjust a depth plane associated with the second object 1008 such that the second object is placed on depth plane B.
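The switching heuristic and the choice of a common depth plane might be sketched as follows. The window length, switch count, and object attributes are assumptions for illustration, not values from the patent:

```python
import time
from collections import deque

SWITCH_COUNT_THRESHOLD = 3      # example threshold metric
WINDOW_SECONDS = 30.0

_switch_times = deque()

def record_fixation_switch(now=None):
    """Record a fixation switch between the two objects; return True once the
    number of switches within the recent window reaches the threshold."""
    now = time.monotonic() if now is None else now
    _switch_times.append(now)
    while _switch_times and now - _switch_times[0] > WINDOW_SECONDS:
        _switch_times.popleft()
    return len(_switch_times) >= SWITCH_COUNT_THRESHOLD

def choose_common_depth_plane(obj_a, obj_b, fixation_seconds):
    """Real objects keep their plane; otherwise the predominantly fixated object wins."""
    if getattr(obj_a, "is_real", False):
        return obj_a.depth_plane
    if getattr(obj_b, "is_real", False):
        return obj_b.depth_plane
    return (obj_a if fixation_seconds[obj_a] >= fixation_seconds[obj_b] else obj_b).depth_plane
```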
- the display system may obtain information indicating a preference of the user. For example, the user may indicate that he/she considers the second object 1008 to be the main virtual object, and therefore a depth plane associated with the first object 1006 may be adjusted to correspond to the depth plane of the second object 1008.
- particular types of virtual objects may be identified as not being available for depth adjustment. As an example, a virtual object that contains text or fine-detail may maintain its present depth plane, while other virtual objects are adjusted.
- the display system may adjust a location of the second virtual object 1008, e.g., such that the second virtual object 1008 is shifted in location.
- the user may indicate a location at which the second virtual object 1008 is to be located.
- a user that is a surgeon may prefer that virtual content in the form of medical information is displayed at a particular location with respect to a patient.
- the surgeon may indicate this particular location using inputs, such as hand gestures, voice input, totems, and so on.
- the display system may recognize a hand gesture as indicating a particular location, and the display system may adjust the location of the second virtual object 1008 so that it is perceived to be at the particular location.
- a wavefront divergence of any virtual object to be presented at a depth within range 1106A-1106B may be the same, and therefore be associated with depth plane 2.
- the sizes and shapes of the depth planes can be different than as illustrated in Figure 11 .
- the volumes defining the depth planes may have curved shapes.
- the display system may determine a fixation point of the user. If the fixation point falls within range 1106A-1106B, the display system may present virtual content with a wavefront divergence associated with depth plane 2. If the user then fixates on a location that falls within a range associated with depth plane 1, the display system may present content with a wavefront divergence associated with depth plane 1.
- the display system may be a vari-focal display system, such that for any frame being presented to the user, a single depth plane is utilized. For example, one waveguide may be utilized to output all virtual content for each frame.
- the display system may adjust depth planes at which to present particular virtual objects. In this way, the display system may limit an extent to which a depth plane selected to present virtual content needs to be modified.
- a first object may be presented in depth plane 1 volume 1108, and a second object may be presented in depth plane 2 volume 1109.
- the display system may present the first object and the second object on depth plane 2 1104 (e.g., via a same waveguide associated with depth plane 2 1104). Subsequently, as the user switches fixation to be on the second object, the display may present the first object and the second object via depth plane 1 1102.
- both objects may be placed on the same depth plane. That is, a depth plane associated with either the first object or the second object may be adjusted, so that either depth plane 2 1104 or depth plane 1 1102 may be associated with both objects.
- Figure 12 illustrates a flowchart of an example process 1200 for adjusting a depth plane associated with a virtual object.
- the process 1200 may be described as being performed by a display system (e.g., the wearable display system 60, which may include processing hardware and software, and optionally may provide information to an outside system of one or more computers or other processing, for instance to offload processing to the outside system, and receive information from the outside system).
- the display system monitors three-dimensional fixation points of a user's eyes.
- the display system may include sensors to monitor information associated with the user's eyes (e.g., the orientation of the eyes).
- a non-exhaustive list of sensors includes infrared sensors, ultraviolet sensors, and visible wavelength light sensors.
- the sensors may optionally output infrared, ultraviolet, visible light, and/or polarized light onto the user's eyes, and determine reflections of the outputted light from the user's eyes.
- infrared light may be output by an infrared light emitter, and its reflections detected by an infrared light sensor. It will be appreciated that the sensor, which may include a light emitter, may correspond to the imaging device 630 of Figure 6.
- the display system may monitor determined fixation points to track objects that the user is viewing. For example, the display system may determine that the user is viewing a first virtual object based on a determined three-dimensional fixation point corresponding to a three-dimensional location at which the first virtual object is presented. Additionally, the display system may determine that the user is fixating at a location not corresponding to a virtual object, and may determine that a real-world object is likely located at the fixation point.
- the display system obtains location information associated with virtual objects for presentation to the user.
- the display system may obtain three-dimensional location information associated with the virtual objects.
- the virtual objects may be presented to the user such that the content appears to be located in the real world (e.g., the content may be located at different depth planes within the user's field of view).
- the display system may include, or may have access to, a three-dimensional map of the ambient environment, including the intended locations of any virtual content in this ambient environment. With reference to this map, the display system may access and provide information specifying three-dimensional locations of virtual content within the user's field of view (e.g., locations within a display frustum, as illustrated in Figures 10A-10B ).
- location information for a virtual object may include a three-dimensional location. Based on the three-dimensional location, the virtual object may be associated with a particular depth plane (e.g., as illustrated and described in Figure 11 ), such that if the user fixates on the virtual object, the particular depth plane may be utilized to present all virtual content for each frame until the user switches fixation.
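A minimal sketch of associating a mapped virtual object with a depth plane (the map format, plane names, and the 1.5 m boundary are assumptions for illustration):

```python
import numpy as np

def depth_plane_for_location(location_xyz, user_position_xyz):
    """Nearer than ~1.5 m (0.66 dpt) -> the near plane; otherwise the far plane."""
    distance_m = np.linalg.norm(np.asarray(location_xyz) - np.asarray(user_position_xyz))
    depth_dpt = 1.0 / max(distance_m, 1e-6)
    return "plane_1" if depth_dpt >= 0.66 else "plane_2"

def depth_plane_for_object(world_map, object_id, user_position_xyz):
    location = world_map[object_id]["location"]     # intended 3D location from the content map
    return depth_plane_for_location(location, user_position_xyz)
```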
- the display system adjusts a depth plane associated with a virtual object.
- the display system may determine that the user is switching between two or more fixation points greater than a threshold metric, e.g., greater than a threshold number of times or greater than a threshold frequency. For example, the user may switch between a first fixation point and a second fixation point once every 30 seconds, every minute, every several minutes, and so on.
- if the first fixation point and the second fixation point are associated with different depth planes (for example, as illustrated in Figure 11, where the fixation points are located in distinct depth plane volumes), then the display system could be required to present content at a different depth plane every time the user switches between the fixation points (e.g., without utilizing the techniques described herein).
- the display system may therefore adjust a depth plane associated with a virtual object. As described herein, this may include associating the virtual object with a depth plane of another object (e.g., virtual object, real-world object) the user is also fixating on. That is, the accommodation cues associated with the virtual object (e.g., wavefront divergence) may be adjusted to correspond to a different depth plane.
- the display system may utilize the obtained location information to determine the virtual object's three-dimensional location, and may update the location information with the adjusted depth plane. As described above, the display system may select a virtual object to adjust according to a frequency with which the virtual object is being fixated upon. For example, if the user is fixating on a first virtual object greater than a second virtual object, the display system may adjust the second virtual object so that it is presented on the depth plane on which the first virtual object is already being presented.
- the display system may obtain an indication of a virtual object that is to be adjusted.
- the user may indicate that a particular virtual object is to be given preference with respect to the adjustment, such that other virtual objects are adjusted and the particular virtual object remains at its associated depth plane.
- as another example, virtual objects that are moving (e.g., moving faster than a threshold rate) may be identified as not to be adjusted.
- the display system may obtain indications of virtual objects that are not to be adjusted.
- the display system may optionally obtain indications of virtual objects that are to remain linked and in focus to the user, such that they are presented at a same depth plane, no matter what adjustments are made. For example, the display system may present five virtual objects, and may ensure that two of the five virtual objects remain in focus (e.g., the two virtual objects may be presented at a same depth plane).
- the display system presents the adjusted content to the user.
- the adjusted virtual object may be output to the user with an adjusted wavefront divergence. That is, if the user fixates on the virtual object, the virtual object may be output to the user with a wavefront divergence corresponding to the adjusted depth plane, and not its original depth plane (as indicated by the display system's map of the ambient environment).
- An example of adjusting wavefront divergence is adjusting a particular waveguide from which the virtual object is to be displayed, where different waveguides output light with different amounts of wavefront divergence.
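That waveguide selection might look like the following sketch; the plane-to-waveguide mapping and object attributes are hypothetical:

```python
# Each waveguide outputs light with a fixed amount of wavefront divergence,
# so realizing an adjusted depth plane amounts to routing the object's image
# light to a different waveguide.
WAVEGUIDE_FOR_PLANE = {"plane_1": "waveguide_270",   # e.g. collimated / optical infinity
                       "plane_2": "waveguide_280",
                       "plane_3": "waveguide_290"}

def waveguide_for_object(virtual_object):
    """Prefer the adjusted depth plane when one has been assigned."""
    plane = getattr(virtual_object, "adjusted_depth_plane", None) or virtual_object.depth_plane
    return WAVEGUIDE_FOR_PLANE[plane]
```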
- Figure 13 illustrates a flowchart of another example process 1300 for adjusting a depth plane associated with a virtual object.
- the process 1300 may be described as being performed by a display system (e.g., the wearable display system 60, which may include processing hardware and software, and optionally may provide information to an outside system of one or more computers or other processing, for instance to offload processing to the outside system, and receive information from the outside system).
- the display system monitors three-dimensional fixation points, and the display system obtains location information associated with presented content in block 1304, similar to blocks 1202, 1204, respectively ( Figure 12 ).
- the display system adjusts a depth plane associated with a virtual object.
- the display system may associate the virtual object with a different depth plane, thereby adjusting accommodation cues associated with the virtual object.
- the display system may adjust binocular cues to adjust a perceived depth of the virtual object. That is, the display system may adjust the views of the virtual object that are provided to each eye of the user; such an adjustment may involve adjusting the binocular disparity of the virtual content through dichoptic presentation.
- the virtual object can appear to be presented at a same, or similar, depth as other virtual content, or at a depth different from the virtual object's original depth.
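- For intuition, the dichoptic adjustment amounts to re-rendering each eye's view with the horizontal parallax implied by the new perceived depth. The toy pinhole-eye geometry below, including the `screen_disparity_m` helper and its default interpupillary distance, is an assumption of this write-up, not the patented rendering pipeline:

```python
def screen_disparity_m(perceived_depth_m: float, image_plane_m: float = 2.0,
                       ipd_m: float = 0.063) -> float:
    """Horizontal separation between the left- and right-eye renderings of a
    midline point on a virtual image plane `image_plane_m` away, chosen so the
    point is perceived at `perceived_depth_m` (simple pinhole-eye geometry)."""
    return ipd_m * (1.0 - image_plane_m / perceived_depth_m)


# Depths beyond the image plane need uncrossed (positive) disparity;
# depths in front of it need crossed (negative) disparity.
print(screen_disparity_m(4.0))  # ->  0.0315  (appears behind the image plane)
print(screen_disparity_m(1.0))  # -> -0.063   (appears in front of the image plane)
```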
- the display system then presents the adjusted content to the user at block 1308, as described above for block 1208 with respect to Figure 12 .
- Figure 14 illustrates a flowchart of an example process for adjusting accommodation cues and binocular cues associated with a virtual object.
- the process 1400 may be described as being performed by a display system (e.g., the wearable display system 60, which may include processing hardware and software, and which may optionally provide information to an outside system of one or more computers or other processing units, for instance to offload processing to the outside system, and receive information from the outside system).
- the display system obtains an indication of a virtual object to be adjusted, as discussed above for blocks 1202, 1204 (Figure 12).
- the presentation of a particular virtual object, or of one or more virtual objects, may be adjusted with respect to an associated depth plane.
- a depth plane associated with the virtual object may be adjusted relative to the depth plane indicated by the display system's map.
- the wavefront divergence of the virtual object may be adjusted so that it is the same as that of one or more other virtual objects on which the user is also fixating. Although adjusting the depth plane adjusts the wavefront divergence, by utilizing binocular disparity the virtual object may still be perceived by the user as being at the same, unadjusted, three-dimensional location.
- the display system may present the same views of the virtual object to each eye of the user as would have been presented without adjusting the wavefront divergence of the virtual object, for example, by maintaining the same dichoptic presentations.
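- In other words, only the accommodation-related state changes while the per-eye images stay fixed. The sketch below is a hypothetical illustration of that bookkeeping; the `RenderState` structure and its field names are invented for this example:

```python
from dataclasses import dataclass, replace


@dataclass(frozen=True)
class RenderState:
    left_view: str        # identifier of the left-eye rendering (unchanged here)
    right_view: str       # identifier of the right-eye rendering (unchanged here)
    waveguide_index: int  # carries the accommodation cue (wavefront divergence)


def adjust_accommodation_only(state: RenderState, new_waveguide: int) -> RenderState:
    """Switch the waveguide (accommodation cue) while leaving the dichoptic
    presentation untouched, so the perceived 3D location is preserved."""
    return replace(state, waveguide_index=new_waveguide)


before = RenderState("cup_left_frame", "cup_right_frame", waveguide_index=2)
after = adjust_accommodation_only(before, new_waveguide=1)
print(after)  # same per-eye views, different wavefront divergence
```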
- adjusting the wavefront divergence without also adjusting the binocular cues may, however, introduce a perceptible mismatch to the user.
- the display system therefore adjusts the binocular cues associated with the virtual object at block 1406 so that those binocular cues are consistent with the changes in the accommodation cues noted above.
- the display system may adjust the views of the virtual object that are presented to each eye of the user. For example, if the accommodation cues associated with the virtual object are adjusted such that depth is perceived to be closer to the user, the display system may similarly adjust the binocular cues to correspond to that closer depth.
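- Using the same toy pinhole geometry as the disparity sketch above, a consistent update can recompute both cues from the new depth plane, so that neither cue is left at its old value. The `matched_cues` helper and its default parameters are assumptions introduced for illustration:

```python
def matched_cues(new_plane_diopters: float, image_plane_m: float = 2.0,
                 ipd_m: float = 0.063) -> dict:
    """Recompute both cue sets for a depth-plane change so they stay consistent:
    the accommodation cue follows the new plane, and the binocular disparity is
    recomputed for that same depth rather than left at its old value."""
    depth_m = float("inf") if new_plane_diopters == 0 else 1.0 / new_plane_diopters
    disparity_m = ipd_m if depth_m == float("inf") else ipd_m * (1.0 - image_plane_m / depth_m)
    return {"accommodation_diopters": new_plane_diopters,
            "perceived_depth_m": depth_m,
            "disparity_m": disparity_m}


# Re-associating content with a 1.5-diopter plane (about 0.66 m) pulls both the
# accommodation cue and the binocular cue to that nearer depth.
print(matched_cues(1.5))  # disparity becomes crossed (negative) for the nearer depth
```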
- the display system may implement an adjustment of the binocular cues after determining that the user is switching between two or more fixation points greater than a threshold metric, e.g., greater than a threshold number of times or greater than a threshold frequency.
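- One hypothetical way to realize such a threshold metric is a sliding-window count of switches between fixation targets; the `SwitchMonitor` class and its default thresholds below are illustrative assumptions, not values from the disclosure:

```python
import time
from collections import deque


class SwitchMonitor:
    """Track switches between fixation targets and decide when the user is
    alternating often enough to justify consolidating depth planes."""

    def __init__(self, max_switches=3, window_s=5.0):
        self.max_switches = max_switches   # threshold number of switches
        self.window_s = window_s           # sliding window, so the count implies a frequency
        self.switch_times = deque()
        self.last_target = None

    def observe(self, target, now=None):
        """Record the currently fixated target; return True once the threshold is exceeded."""
        now = time.monotonic() if now is None else now
        if self.last_target is not None and target != self.last_target:
            self.switch_times.append(now)
        self.last_target = target
        # Discard switches that fell out of the sliding window.
        while self.switch_times and now - self.switch_times[0] > self.window_s:
            self.switch_times.popleft()
        return len(self.switch_times) > self.max_switches


monitor = SwitchMonitor(max_switches=3, window_s=5.0)
events = [("menu", 0.0), ("robot", 1.0), ("menu", 2.0), ("robot", 3.0), ("menu", 4.0)]
print([monitor.observe(target, at) for target, at in events])  # last entry is True
```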
- the adjusted virtual content is presented to the user. Since both the accommodation cues and the binocular cues are being adjusted, the virtual content will be perceived as being located at a new location in three-dimensional space corresponding to the updated depth. Since the virtual content will be perceived at the new location, optionally the display system may update the location information associated with the virtual content to correspond to the new location.
- each of the processes, methods, and algorithms described herein and/or depicted in the figures may be embodied in, and fully or partially automated by, code modules executed by one or more physical computing systems, hardware computer processors, application-specific circuitry, and/or electronic hardware configured to execute specific and particular computer instructions.
- computing systems may include general purpose computers (e.g., servers) programmed with specific computer instructions or special purpose computers, special purpose circuitry, and so forth.
- a code module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language.
- particular operations and methods may be performed by circuitry that is specific to a given function.
- a video may include many frames, with each frame having millions of pixels, and specifically programmed computer hardware is necessary to process the video data to provide a desired image processing task or application in a commercially reasonable amount of time.
- Code modules or any type of data may be stored on any type of non-transitory computer-readable medium, such as physical computer storage including hard drives, solid state memory, random access memory (RAM), read only memory (ROM), optical disc, volatile or non-volatile storage, combinations of the same and/or the like.
- the non-transitory computer-readable medium may be part of one or more of the local processing and data module (140), the remote processing module (150), and remote data repository (160).
- the methods and modules may also be transmitted as generated data signals (e.g., as part of a carrier wave or other analog or digital propagated signal) on a variety of computer-readable transmission mediums, including wireless-based and wired/cable-based mediums, and may take a variety of forms (e.g., as part of a single or multiplexed analog signal, or as multiple discrete digital packets or frames).
- the results of the disclosed processes or process steps may be stored, persistently or otherwise, in any type of non-transitory, tangible computer storage or may be communicated via a computer-readable transmission medium.
- conditional language used herein such as, among others, “can,” “could,” “might,” “may,” “e.g.,” and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without author input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Optics & Photonics (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Biomedical Technology (AREA)
- General Health & Medical Sciences (AREA)
- Veterinary Medicine (AREA)
- Biophysics (AREA)
- Ophthalmology & Optometry (AREA)
- Public Health (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Software Systems (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Controls And Circuits For Display Device (AREA)
- Ultrasonic Diagnosis Equipment (AREA)
- Investigating Or Analyzing Materials By The Use Of Ultrasonic Waves (AREA)
- Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)
- Diffracting Gratings Or Hologram Optical Elements (AREA)
- Eyeglasses (AREA)
- Mechanical Optical Scanning Systems (AREA)
- User Interface Of Digital Computer (AREA)
- Stereoscopic And Panoramic Photography (AREA)
- Indication In Cameras, And Counting Of Exposures (AREA)
Applications Claiming Priority (8)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201662294147P | 2016-02-11 | 2016-02-11 | |
| US201662366533P | 2016-07-25 | 2016-07-25 | |
| US201662366599P | 2016-07-25 | 2016-07-25 | |
| US201662396071P | 2016-09-16 | 2016-09-16 | |
| US201662440332P | 2016-12-29 | 2016-12-29 | |
| US201662440336P | 2016-12-29 | 2016-12-29 | |
| US201762445630P | 2017-01-12 | 2017-01-12 | |
| PCT/US2017/017505 WO2017139667A1 (en) | 2016-02-11 | 2017-02-10 | Multi-depth plane display system with reduced switching between depth planes |
Publications (3)
| Publication Number | Publication Date |
|---|---|
| EP3414899A1 EP3414899A1 (en) | 2018-12-19 |
| EP3414899A4 EP3414899A4 (en) | 2019-11-06 |
| EP3414899B1 true EP3414899B1 (en) | 2025-08-13 |
Family
ID=59563501
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP17750889.2A Active EP3414899B1 (en) | 2016-02-11 | 2017-02-10 | Multi-depth plane display system with reduced switching between depth planes |
Country Status (10)
| Country | Link |
|---|---|
| US (2) | US12182945B2 (en) |
| EP (1) | EP3414899B1 (en) |
| JP (4) | JP7089475B2 (en) |
| KR (3) | KR102503155B1 (en) |
| CN (2) | CN108886612B (en) |
| AU (2) | AU2017217972B2 (en) |
| CA (1) | CA3014189A1 (en) |
| IL (3) | IL260939B2 (en) |
| NZ (2) | NZ744822A (en) |
| WO (1) | WO2017139667A1 (en) |
Families Citing this family (33)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11138793B2 (en) | 2014-03-14 | 2021-10-05 | Magic Leap, Inc. | Multi-depth plane display system with reduced switching between depth planes |
| KR102503155B1 (ko) | 2016-02-11 | 2023-02-22 | 매직 립, 인코포레이티드 | 깊이 평면들 간의 감소된 스위칭을 갖는 다중-깊이 평면 디스플레이 시스템 |
| CN114690881A (zh) | 2016-03-04 | 2022-07-01 | 奇跃公司 | 减少用电的显示系统以及用于减少显示系统的用电的方法 |
| IL311155A (en) | 2016-03-25 | 2024-04-01 | Magic Leap Inc | Virtual and augmented reality systems and methods |
| EP3523782B1 (en) | 2016-10-05 | 2024-10-30 | Magic Leap, Inc. | Periocular test for mixed reality calibration |
| KR102170123B1 (ko) | 2016-12-14 | 2020-10-26 | 주식회사 엘지화학 | 차광막이 형성되어 있는 도파관 및 이의 제조방법 |
| US10373936B2 (en) * | 2017-08-22 | 2019-08-06 | Facebook Technologies, Llc | Pixel elements including light emitters of variable heights |
| CN107835403B (zh) * | 2017-10-20 | 2020-06-26 | 华为技术有限公司 | 一种以3d视差效果显示的方法及装置 |
| KR20250049566A (ko) | 2018-01-17 | 2025-04-11 | 매직 립, 인코포레이티드 | 디스플레이 시스템들에서의 눈 회전 중심 결정, 깊이 평면 선택, 및 렌더 카메라 포지셔닝 |
| US10917634B2 (en) | 2018-01-17 | 2021-02-09 | Magic Leap, Inc. | Display systems and methods for determining registration between a display and a user's eyes |
| US10284753B1 (en) | 2018-07-03 | 2019-05-07 | Sony Corporation | Virtual reality media content generation in multi-layer structure based on depth of field |
| WO2020018938A1 (en) | 2018-07-19 | 2020-01-23 | Magic Leap, Inc. | Content interaction driven by eye metrics |
| EP4478161A3 (en) | 2018-07-24 | 2025-03-26 | Magic Leap, Inc. | Display systems and methods for determining registration between a display and eyes of a user |
| US11468640B2 (en) | 2018-08-03 | 2022-10-11 | Magic Leap, Inc. | Depth plane selection for multi-depth plane display systems by user categorization |
| JP7444861B2 (ja) | 2018-09-26 | 2024-03-06 | マジック リープ, インコーポレイテッド | 屈折力を有する回折光学要素 |
| CN109803133B (zh) * | 2019-03-15 | 2023-04-11 | 京东方科技集团股份有限公司 | 一种图像处理方法及装置、显示装置 |
| CN111726602B (zh) * | 2019-03-22 | 2022-04-22 | 舜宇光学(浙江)研究院有限公司 | 异视场角摄像-显像系统的匹配方法及其系统和计算系统 |
| EP3911992A4 (en) | 2019-04-11 | 2022-03-23 | Samsung Electronics Co., Ltd. | HEAD MOUNTED DISPLAY DEVICE AND METHOD OF OPERATION THEREOF |
| CN110244839B (zh) * | 2019-05-20 | 2022-11-18 | 联想(上海)信息技术有限公司 | 控制方法、电子设备和存储介质 |
| JP7426413B2 (ja) * | 2019-05-23 | 2024-02-01 | マジック リープ, インコーポレイテッド | ブレンドモード3次元ディスプレイシステムおよび方法 |
| EP3796267A1 (en) * | 2019-09-23 | 2021-03-24 | Dassault Systèmes | A computer-implemented method for assisting a positioning of a 3d object in a 3d scene |
| CN115053270A (zh) * | 2019-12-09 | 2022-09-13 | 奇跃公司 | 用于基于用户身份来操作头戴式显示系统的系统和方法 |
| WO2021119171A1 (en) | 2019-12-10 | 2021-06-17 | Magic Leap, Inc. | Increased depth of field for mixed-reality display |
| JP7717091B2 (ja) * | 2020-05-26 | 2025-08-01 | マジック リープ, インコーポレイテッド | ウェアラブルデバイスのためのモノビジョンディスプレイ |
| CN115668106A (zh) * | 2020-06-05 | 2023-01-31 | 奇跃公司 | 基于图像的神经网络分析的增强眼睛跟踪技术 |
| WO2022219877A1 (ja) * | 2021-04-12 | 2022-10-20 | ソニーグループ株式会社 | 情報処理装置、情報処理方法およびプログラム |
| EP4348587A1 (en) * | 2021-06-02 | 2024-04-10 | Dolby Laboratories Licensing Corporation | Method, encoder, and display device for representing a three-dimensional scene and depth-plane data thereof |
| CN113419350B (zh) * | 2021-06-18 | 2023-05-23 | 深圳市腾讯计算机系统有限公司 | 虚拟现实显示设备、画面呈现方法、装置及存储介质 |
| CN113426110B (zh) * | 2021-06-24 | 2023-11-17 | 腾讯科技(上海)有限公司 | 虚拟角色交互方法、装置、计算机设备和存储介质 |
| EP4430579A4 (en) * | 2021-11-14 | 2025-04-16 | Bria Artificial Intelligence Ltd | FACILITATING THE GENERATION AND USE OF VISUAL CONTENT |
| US11863730B2 (en) * | 2021-12-07 | 2024-01-02 | Snap Inc. | Optical waveguide combiner systems and methods |
| CN115251827B (zh) * | 2022-09-26 | 2022-12-30 | 广东视明科技发展有限公司 | 基于虚实结合的深度知觉评估方法及系统 |
| KR20250121794A (ko) * | 2024-02-05 | 2025-08-12 | 삼성전자주식회사 | 이미지를 디스플레이하기 위한 파라미터를 조절하는 전자 장치 및 그 동작 방법 |
Family Cites Families (89)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6222525B1 (en) | 1992-03-05 | 2001-04-24 | Brad A. Armstrong | Image controllers with sheet connected sensors |
| JP3298082B2 (ja) * | 1994-12-13 | 2002-07-02 | 日本電信電話株式会社 | ヘッドマウントディスプレィ装置 |
| US5670988A (en) | 1995-09-05 | 1997-09-23 | Interlink Electronics, Inc. | Trigger operated electronic device |
| JP3651204B2 (ja) | 1996-12-18 | 2005-05-25 | トヨタ自動車株式会社 | 立体画像表示装置、立体画像表示方法及び記録媒体 |
| JP3882273B2 (ja) * | 1997-06-03 | 2007-02-14 | 日産自動車株式会社 | 両眼立体視表示装置 |
| JPH11109279A (ja) | 1997-10-03 | 1999-04-23 | Minolta Co Ltd | 映像表示装置 |
| AU2002361572A1 (en) * | 2001-10-19 | 2003-04-28 | University Of North Carolina At Chape Hill | Methods and systems for dynamic virtual convergence and head mountable display |
| USD514570S1 (en) | 2004-06-24 | 2006-02-07 | Microsoft Corporation | Region of a fingerprint scanning device with an illuminated ring |
| US8982109B2 (en) | 2005-03-01 | 2015-03-17 | Eyesmatch Ltd | Devices, systems and methods of capturing and displaying appearances |
| US20070081123A1 (en) | 2005-10-07 | 2007-04-12 | Lewis Scott W | Digital eyewear |
| US8696113B2 (en) | 2005-10-07 | 2014-04-15 | Percept Technologies Inc. | Enhanced optical and perceptual digital eyewear |
| US11428937B2 (en) | 2005-10-07 | 2022-08-30 | Percept Technologies | Enhanced optical and perceptual digital eyewear |
| JP5129795B2 (ja) | 2009-08-19 | 2013-01-30 | 東洋ガラス株式会社 | 物体識別装置および物体選別装置 |
| US20110075257A1 (en) * | 2009-09-14 | 2011-03-31 | The Arizona Board Of Regents On Behalf Of The University Of Arizona | 3-Dimensional electro-optical see-through displays |
| JP2011064760A (ja) | 2009-09-15 | 2011-03-31 | Nippon Seiki Co Ltd | 車両用表示装置 |
| JP4679661B1 (ja) | 2009-12-15 | 2011-04-27 | 株式会社東芝 | 情報提示装置、情報提示方法及びプログラム |
| US8467133B2 (en) | 2010-02-28 | 2013-06-18 | Osterhout Group, Inc. | See-through display with an optical assembly including a wedge-shaped illumination system |
| EP2539759A1 (en) | 2010-02-28 | 2013-01-02 | Osterhout Group, Inc. | Local advertising content on an interactive head-mounted eyepiece |
| US20130278631A1 (en) * | 2010-02-28 | 2013-10-24 | Osterhout Group, Inc. | 3d positioning of augmented reality information |
| WO2012057434A1 (en) * | 2010-10-29 | 2012-05-03 | Lg Electronics Inc. | Stereoscopic image processing system and device and glasses |
| US9304319B2 (en) | 2010-11-18 | 2016-04-05 | Microsoft Technology Licensing, Llc | Automatic focus improvement for augmented reality displays |
| US10156722B2 (en) | 2010-12-24 | 2018-12-18 | Magic Leap, Inc. | Methods and systems for displaying stereoscopy with a freeform optical system with addressable focus for virtual and augmented reality |
| JP6185844B2 (ja) | 2010-12-24 | 2017-08-30 | マジック リープ, インコーポレイテッド | 人間工学的ヘッドマウントディスプレイデバイスおよび光学システム |
| JP5799521B2 (ja) * | 2011-02-15 | 2015-10-28 | ソニー株式会社 | 情報処理装置、オーサリング方法及びプログラム |
| CA3035118C (en) | 2011-05-06 | 2022-01-04 | Magic Leap, Inc. | Massive simultaneous remote digital presence world |
| US8743051B1 (en) | 2011-09-20 | 2014-06-03 | Amazon Technologies, Inc. | Mirror detection-based device functionality |
| US10795448B2 (en) | 2011-09-29 | 2020-10-06 | Magic Leap, Inc. | Tactile glove for human-computer interaction |
| RU2017115669A (ru) | 2011-10-28 | 2019-01-28 | Мэджик Лип, Инк. | Система и способ для дополненной и виртуальной реальности |
| WO2013077895A1 (en) | 2011-11-23 | 2013-05-30 | Magic Leap, Inc. | Three dimensional virtual and augmented reality display system |
| US20150097772A1 (en) * | 2012-01-06 | 2015-04-09 | Thad Eugene Starner | Gaze Signal Based on Physical Characteristics of the Eye |
| CA2869781C (en) | 2012-04-05 | 2021-04-27 | Magic Leap, Inc | Wide-field of view (fov) imaging devices with active foveation capability |
| CN103472909B (zh) * | 2012-04-10 | 2017-04-12 | 微软技术许可有限责任公司 | 用于头戴式、增强现实显示器的逼真遮挡 |
| CN104247411B (zh) * | 2012-04-19 | 2017-05-03 | 汤姆逊许可公司 | 校正由立体显示的调节效应引起的失真误差的方法和装置 |
| US10502876B2 (en) * | 2012-05-22 | 2019-12-10 | Microsoft Technology Licensing, Llc | Waveguide optics focus elements |
| JP2013251813A (ja) | 2012-06-01 | 2013-12-12 | Nikon Corp | 電子機器、表示制御システム、およびプログラム |
| WO2013188464A1 (en) | 2012-06-11 | 2013-12-19 | Magic Leap, Inc. | Multiple depth plane three-dimensional display using a wave guide reflector array projector |
| US9671566B2 (en) | 2012-06-11 | 2017-06-06 | Magic Leap, Inc. | Planar waveguide apparatus with diffraction element(s) and system employing same |
| US9860522B2 (en) | 2012-08-04 | 2018-01-02 | Paul Lapstun | Head-mounted light field display |
| US9841563B2 (en) | 2012-08-04 | 2017-12-12 | Paul Lapstun | Shuttered waveguide light field display |
| KR20150054967A (ko) | 2012-09-11 | 2015-05-20 | 매직 립, 인코포레이티드 | 인체공학적 헤드 마운티드 디스플레이 디바이스 및 광학 시스템 |
| US9310611B2 (en) * | 2012-09-18 | 2016-04-12 | Qualcomm Incorporated | Methods and systems for making the use of head-mounted displays less obvious to non-users |
| US20140092006A1 (en) * | 2012-09-28 | 2014-04-03 | Joshua Boelter | Device and method for modifying rendering based on viewer focus area from eye tracking |
| US9448404B2 (en) * | 2012-11-13 | 2016-09-20 | Qualcomm Incorporated | Modifying virtual object display properties to increase power performance of augmented reality devices |
| CA2898283C (en) | 2013-01-15 | 2021-05-11 | Magic Leap, Inc. | Ultra-high resolution scanning fiber display |
| EP2967322A4 (en) | 2013-03-11 | 2017-02-08 | Magic Leap, Inc. | System and method for augmented and virtual reality |
| US9041741B2 (en) | 2013-03-14 | 2015-05-26 | Qualcomm Incorporated | User interface for a head mounted display |
| NZ735754A (en) | 2013-03-15 | 2019-04-26 | Magic Leap Inc | Display system and method |
| JP5955348B2 (ja) | 2013-05-22 | 2016-07-20 | 株式会社テレパシーホールディングス | 撮影画像のプライバシー保護機能を有するウェアラブルデバイス及びその制御方法並びに画像共有システム |
| US9874749B2 (en) | 2013-11-27 | 2018-01-23 | Magic Leap, Inc. | Virtual and augmented reality systems and methods |
| US10262462B2 (en) | 2014-04-18 | 2019-04-16 | Magic Leap, Inc. | Systems and methods for augmented and virtual reality |
| US10330931B2 (en) | 2013-06-28 | 2019-06-25 | Microsoft Technology Licensing, Llc | Space carving based on human physical data |
| US9952042B2 (en) | 2013-07-12 | 2018-04-24 | Magic Leap, Inc. | Method and system for identifying a user location |
| US10134296B2 (en) | 2013-10-03 | 2018-11-20 | Autodesk, Inc. | Enhancing movement training with an augmented reality mirror |
| CA3134562A1 (en) | 2013-10-16 | 2015-04-23 | Magic Leap, Inc. | Virtual or augmented reality headsets having adjustable interpupillary distance |
| US20150123820A1 (en) | 2013-11-04 | 2015-05-07 | Airbus S.A.S. | Systems and methods for detecting pilot over focalization |
| US9672649B2 (en) | 2013-11-04 | 2017-06-06 | At&T Intellectual Property I, Lp | System and method for enabling mirror video chat using a wearable display device |
| JP5825328B2 (ja) * | 2013-11-07 | 2015-12-02 | コニカミノルタ株式会社 | 透過型hmdを有する情報表示システム及び表示制御プログラム |
| CN107329260B (zh) | 2013-11-27 | 2021-07-16 | 奇跃公司 | 虚拟和增强现实系统与方法 |
| US9857591B2 (en) * | 2014-05-30 | 2018-01-02 | Magic Leap, Inc. | Methods and system for creating focal planes in virtual and augmented reality |
| US9836122B2 (en) * | 2014-01-21 | 2017-12-05 | Osterhout Group, Inc. | Eye glint imaging in see-through computer display systems |
| NZ722903A (en) | 2014-01-31 | 2020-05-29 | Magic Leap Inc | Multi-focal display system and method |
| EP3712680B1 (en) | 2014-01-31 | 2022-07-13 | Magic Leap, Inc. | Multi-focal display system and method |
| US9313481B2 (en) | 2014-02-19 | 2016-04-12 | Microsoft Technology Licensing, Llc | Stereoscopic display responsive to focal-point shift |
| WO2015134740A1 (en) | 2014-03-05 | 2015-09-11 | Arizona Board Of Regents On Behalf Of The University Of Arizona | Wearable 3d augmented reality display with variable focus and/or object recognition |
| US10203762B2 (en) * | 2014-03-11 | 2019-02-12 | Magic Leap, Inc. | Methods and systems for creating virtual and augmented reality |
| US11138793B2 (en) | 2014-03-14 | 2021-10-05 | Magic Leap, Inc. | Multi-depth plane display system with reduced switching between depth planes |
| US9977572B2 (en) | 2014-04-01 | 2018-05-22 | Hallmark Cards, Incorporated | Augmented reality appearance enhancement |
| JP6320143B2 (ja) | 2014-04-15 | 2018-05-09 | 株式会社東芝 | 健康情報サービスシステム |
| US10620700B2 (en) | 2014-05-09 | 2020-04-14 | Google Llc | Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects |
| USD759657S1 (en) | 2014-05-19 | 2016-06-21 | Microsoft Corporation | Connector with illumination region |
| JP6577962B2 (ja) | 2014-05-30 | 2019-09-18 | マジック リープ, インコーポレイテッドMagic Leap,Inc. | 仮想または拡張現実装置を用いて仮想コンテンツ表示を生成する方法およびシステム |
| CA3114696A1 (en) | 2014-05-30 | 2015-12-03 | Magic Leap, Inc. | Methods and system for creating focal planes in virtual and augmented reality |
| USD752529S1 (en) | 2014-06-09 | 2016-03-29 | Comcast Cable Communications, Llc | Electronic housing with illuminated region |
| US20150363654A1 (en) | 2014-06-12 | 2015-12-17 | GM Global Technology Operations LLC | Vision-based wet road surface detection using mirrored and real images |
| US9547365B2 (en) | 2014-09-15 | 2017-01-17 | Google Inc. | Managing information display |
| US9494799B2 (en) | 2014-09-24 | 2016-11-15 | Microsoft Technology Licensing, Llc | Waveguide eye tracking employing switchable diffraction gratings |
| EP3198192A1 (en) * | 2014-09-26 | 2017-08-02 | Milan Momcilo Popovich | Holographic waveguide opticaltracker |
| US9936195B2 (en) | 2014-11-06 | 2018-04-03 | Intel Corporation | Calibration for eye tracking systems |
| US9804669B2 (en) | 2014-11-07 | 2017-10-31 | Eye Labs, Inc. | High resolution perception of content in a wide field of view of a head-mounted display |
| JP6542547B2 (ja) | 2015-03-09 | 2019-07-10 | 古河電気工業株式会社 | レーダ装置およびレーダ装置の対象物検知方法 |
| NZ773815A (en) | 2015-03-16 | 2022-07-01 | Magic Leap Inc | Methods and systems for diagnosing and treating health ailments |
| USD758367S1 (en) | 2015-05-14 | 2016-06-07 | Magic Leap, Inc. | Virtual reality headset |
| US10289908B2 (en) | 2015-10-21 | 2019-05-14 | Nokia Technologies Oy | Method, apparatus, and computer program product for tracking eye gaze and eye movement |
| US9984507B2 (en) | 2015-11-19 | 2018-05-29 | Oculus Vr, Llc | Eye tracking for mitigating vergence and accommodation conflicts |
| US10204451B2 (en) | 2015-11-30 | 2019-02-12 | Microsoft Technology Licensing, Llc | Multi-optical surface optical design |
| EP3185176A1 (en) | 2015-12-21 | 2017-06-28 | THOMSON Licensing | Method and device for synthesizing an image of a face partially occluded |
| KR102503155B1 (ko) | 2016-02-11 | 2023-02-22 | 매직 립, 인코포레이티드 | 깊이 평면들 간의 감소된 스위칭을 갖는 다중-깊이 평면 디스플레이 시스템 |
| IL311155A (en) | 2016-03-25 | 2024-04-01 | Magic Leap Inc | Virtual and augmented reality systems and methods |
| KR20210025721A (ko) | 2016-08-02 | 2021-03-09 | 매직 립, 인코포레이티드 | 고정-거리 가상 및 증강 현실 시스템들 및 방법들 |
-
2017
- 2017-02-10 KR KR1020187026210A patent/KR102503155B1/ko active Active
- 2017-02-10 WO PCT/US2017/017505 patent/WO2017139667A1/en not_active Ceased
- 2017-02-10 IL IL260939A patent/IL260939B2/en unknown
- 2017-02-10 CN CN201780022733.7A patent/CN108886612B/zh active Active
- 2017-02-10 IL IL302656A patent/IL302656B2/en unknown
- 2017-02-10 AU AU2017217972A patent/AU2017217972B2/en active Active
- 2017-02-10 JP JP2018540761A patent/JP7089475B2/ja active Active
- 2017-02-10 CA CA3014189A patent/CA3014189A1/en active Pending
- 2017-02-10 KR KR1020237005780A patent/KR102587841B1/ko active Active
- 2017-02-10 KR KR1020237034082A patent/KR102726904B1/ko active Active
- 2017-02-10 CN CN202110498429.8A patent/CN113225547A/zh active Pending
- 2017-02-10 NZ NZ744822A patent/NZ744822A/en unknown
- 2017-02-10 IL IL312473A patent/IL312473A/en unknown
- 2017-02-10 EP EP17750889.2A patent/EP3414899B1/en active Active
- 2017-02-10 NZ NZ758505A patent/NZ758505A/en unknown
-
2021
- 2021-10-04 US US17/493,163 patent/US12182945B2/en active Active
- 2021-12-27 JP JP2021212125A patent/JP7273940B2/ja active Active
-
2022
- 2022-05-09 AU AU2022203090A patent/AU2022203090A1/en not_active Abandoned
-
2023
- 2023-04-28 JP JP2023074418A patent/JP7596436B2/ja active Active
-
2024
- 2024-10-30 JP JP2024190540A patent/JP2025013385A/ja active Pending
- 2024-11-21 US US18/955,217 patent/US20250086907A1/en active Pending
Also Published As
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12182945B2 (en) | Multi-depth plane display system with reduced switching between depth planes | |
| US11138793B2 (en) | Multi-depth plane display system with reduced switching between depth planes | |
| US11966059B2 (en) | Virtual and augmented reality systems and methods |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
| STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
| 17P | Request for examination filed |
Effective date: 20180731 |
|
| AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
| AX | Request for extension of the european patent |
Extension state: BA ME |
|
| RIC1 | Information provided on ipc code assigned before grant |
Ipc: G02B 27/22 20180101ALI20170905BHEP Ipc: H04N 13/04 20060101AFI20170905BHEP Ipc: G06T 19/20 20110101ALI20170905BHEP Ipc: A61B 3/08 20060101ALI20170905BHEP Ipc: G03B 35/16 20060101ALI20170905BHEP |
|
| RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: HARRISES, CHRISTOPHER, M. Inventor name: SAMEC, NICOLE, ELIZABETH Inventor name: BAERENRODT, MARK Inventor name: ROBAINA, NASTASJA, U. |
|
| DAV | Request for validation of the european patent (deleted) | ||
| DAX | Request for extension of the european patent (deleted) | ||
| R17P | Request for examination filed (corrected) |
Effective date: 20180731 |
|
| A4 | Supplementary search report drawn up and despatched |
Effective date: 20191008 |
|
| RIC1 | Information provided on ipc code assigned before grant |
Ipc: G06K 9/00 20060101ALI20191001BHEP Ipc: H04N 13/383 20180101ALI20191001BHEP Ipc: G06K 9/78 20060101ALI20191001BHEP Ipc: H04N 13/344 20180101ALI20191001BHEP Ipc: G02B 27/22 20180101ALI20191001BHEP Ipc: G02B 27/01 20060101AFI20191001BHEP Ipc: H04N 13/122 20180101ALI20191001BHEP Ipc: G06K 9/80 20060101ALI20191001BHEP |
|
| STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
| 17Q | First examination report despatched |
Effective date: 20200618 |
|
| REG | Reference to a national code |
Ref country code: DE Ref legal event code: R079 Free format text: PREVIOUS MAIN CLASS: G02B0027010000 Ipc: H04N0013122000 Ref document number: 602017091176 Country of ref document: DE |
|
| GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
| STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
| RIC1 | Information provided on ipc code assigned before grant |
Ipc: H04N 13/122 20180101AFI20250220BHEP |
|
| INTG | Intention to grant announced |
Effective date: 20250305 |
|
| GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
| GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
| STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
| AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
| REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
| REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
|
| REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602017091176 Country of ref document: DE |
|
| REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
| REG | Reference to a national code |
Ref country code: NL Ref legal event code: FP |
|
| P01 | Opt-out of the competence of the unified patent court (upc) registered |
Free format text: CASE NUMBER: UPC_APP_8519_3414899/2025 Effective date: 20250930 |