WO2025101397A1 - Activating a pass through mode of a headset based on eye pattern - Google Patents
- Publication number
- WO2025101397A1 (PCT/US2024/053521)
- Authority
- WO
- WIPO (PCT)
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/113—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
Definitions
- Light 312a from the display 312 is not passed directly through the eyepiece lens 316, but instead passes through a first beam splitter 426 and reflects from a second beam splitter 426 before passing through the eyepiece lens 316 and into the eye 429 of the user.
- Light 428a from the illuminator 428 (e.g., infrared (IR) light) may be used to illuminate features on or inside the eye 429 that may be imaged or otherwise detected by the pupil detectors 422.
- In other implementations, light from the display 312 could pass directly through the eyepiece lens 316; for example, the display 312 could be positioned directly behind the eyepiece lens 316, with or without utilizing beam splitters.
- The illuminator 428 may be positioned beside the eyepiece lens 316 (e.g., on axes similar to the pupil detectors 422) instead of behind the eyepiece lens 316 (e.g., with or without utilizing beam splitters).
- In the VR mode, the display 312 outputs an ophthalmic examination VR environment to the user.
- The VR environment may comprise an eye test for visual acuity, corneal topography, visual field, cover/uncover, color blindness, eye motility, contrast, or pupil response.
- In the pass through mode, the display 312 outputs to the user a view of a real environment of the user.
- The headset 100 may process digital images generated by one or more of the exterior cameras 115 (e.g., a digital, visible light camera) that may be integrated into the headset 100.
- The exterior cameras 115 may be configured to “look” outward, while the eye tracking subsystem (which may also be integrated into the headset 100) is configured to look inward at the user’s eyes (e.g., via an infrared light camera).
- The pupil detectors 422 may observe other parts of the eye, including but not limited to the eyelids, eyelashes, cornea, IOL, corneal tear film, tear ducts, and Meibomian glands, instead of or in addition to the pupil.
- Other arrangements of optical components (e.g., illuminators, cameras, beam splitters, and lenses) may be used to achieve the effects disclosed herein.
- FIG. 5 is a diagrammatic representation of light paths within the headset 100 according to a second implementation shown by way of example.
- The display 312, divider 314, two eyepiece lenses 316, and two beam splitters 426 are situated within the display housing 110.
- The divider 314 defines two separate regions of the display 312. Each portion of the display emits light 312a (e.g., an image) that passes through a beam splitter 426 and through an eyepiece lens 316.
- Light 422a from the illuminator 428, reflecting from the surface of each eye, passes through an eyepiece lens 316, reflects from a beam splitter 426, and into a pupil detector 422, which sits within a sensor housing 120.
- The pupil detectors 422 are behind the eyepiece lens 316.
- The source of the light 422a which reflects from the surface of the eye may include one or more illuminators, such as infrared (IR) illuminators 428 (see FIGS. 4, 7A, 7B).
- In some implementations, the beam splitter 426 reflects IR light but does not reflect visible light, such as light provided by the display 312.
- In other implementations, the beam splitters 426 reflect IR light and additional light, such as visible light.
- Periocular temperature and humidity controller circuit boards 1200 may be located within the sensor housings 120.
- The temperature and humidity controller circuit boards 1200 may comprise one or more of: a humidity sensor configured to measure a humidity within the periocular space of the headset 100; a temperature sensor configured to measure a temperature within the periocular space of the headset 100; a dehumidifier or humidity control element configured to control or adjust the humidity within the periocular space of the headset 100; a heating element; and/or a cooling element.
- The headset 100 can include a single temperature and humidity controller circuit board, or multiple temperature and humidity controller circuit boards.
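A humidity control element of the kind listed above could, for instance, be driven by a simple hysteresis (bang-bang) loop. The following sketch is illustrative only; the setpoints and function names are assumptions, not details from this disclosure.

```python
# Hypothetical hysteresis controller for the periocular humidity
# control element. The 35-55% relative-humidity band is an assumed
# example, not a value specified by the disclosure.

def humidity_control(reading, element_on, low=35.0, high=55.0):
    """Return whether the dehumidifying element should be on, given a
    relative-humidity reading (%) and the element's current state."""
    if reading > high:
        return True          # too humid: start dehumidifying
    if reading < low:
        return False         # dry enough: stop
    return element_on        # inside the band: keep current state
```

The hysteresis band keeps the element from rapidly switching on and off when the reading hovers near a single threshold.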
- FIG. 6 is a front view of a portion of the headset 100 including the eye tracking subsystem in accordance with at least one implementation of the present disclosure.
- The display 312 has been removed from this view. Visible are the display housing 110, divider 314, two eyepiece lenses 316, two pupil detectors 422, and two beam splitters 426.
- FIG. 7A is a rear view of a portion of the headset 100 in accordance with at least one implementation of the present disclosure. Visible are the display housing 110, head strap attachment 130, forehead rest 150, and eyepiece lenses 316. Surrounding each eyepiece lens 316 is a plurality of illuminators 428 (e.g., infrared LEDs) that are configured to illuminate the eye for observation by the pupil detectors 422.
- FIG. 7B shows an illuminator assembly 430, according to an implementation of the present disclosure.
- The illuminator assembly includes a plurality of illuminators 438 (e.g., infrared LEDs), a positive power connector 434a, a negative power connector 434b, and an aperture 432 or opening.
- The aperture 432 is positioned around a camera of an ophthalmic testing unit.
- The reflections 1008 may represent glints from the illuminators 428.
- Various features of this image may be used to assess the hydration or dryness of the eyes, including but not limited to blink pattern, blink rate or frequency, blink duration, between-blink interval, blink speed or slope, the size/brightness/location of reflections 1008 on the surface of the eye, and direct observation and measurement (e.g., pixel count and rate of change) of the thickness of the corneal tear film 1001, including any meniscus and convexity formed at the edge of the eyelid.
- Other features of this image may be used to assess other ophthalmic conditions.
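Several of the blink features named above (blink count, blink duration, between-blink interval) can be derived from a time series of eye-closed observations. The sketch below is illustrative: the boolean-sample representation, sampling interval, and function name are assumptions, not the disclosure's implementation.

```python
# Hypothetical derivation of blink metrics from boolean eye-closed
# samples taken every `dt` seconds (e.g., from the pupil detectors).

def blink_metrics(samples, dt):
    """Return (blink_count, mean_duration_s, mean_interval_s)."""
    durations, intervals = [], []
    run = gap = 0
    for closed in samples:
        if closed:
            if gap and durations:          # open gap between two blinks
                intervals.append(gap * dt)
            gap, run = 0, run + 1
        else:
            if run:                        # a closed run just ended
                durations.append(run * dt)
                run = 0
            gap += 1
    if run:                                # closed at end of recording
        durations.append(run * dt)
    n = len(durations)
    mean_dur = sum(durations) / n if n else 0.0
    mean_int = sum(intervals) / len(intervals) if intervals else 0.0
    return n, mean_dur, mean_int
```

Blink rate or frequency would then follow as the blink count divided by the recording length.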
- FIG. 9 is an exemplary pupil camera image 1012 from the headset 100 in accordance with at least one implementation of the present disclosure. Visible are the pupil 1002, eyelids 1003, and tear duct 1004.
- The user has been asked, via output to the display of the headset, to manually pull open their eyelids 1003, revealing Meibomian glands 1009.
- Irregularities in the Meibomian glands or tear duct may be indicative of both the existence and potential causes of a dry eye problem.
- Images of the Meibomian glands may be extremely useful in the process of determining a diagnosis and treatment.
- The user may be asked via the headset to assist with other measurements.
- Such pupil camera images 1010 may be communicated to one or more clinicians, either co-located with the headset and the device or in one or more remote locations.
- FIG. 10 is an example of a system 1020 for activating a pass through mode of a headset based on an eye pattern of a user.
- The headset 100 may include an eye tracking subsystem 1022 and an ophthalmic testing unit 1024, such as the eye tracking subsystem and the ophthalmic testing unit described above with respect to FIGS. 1-9.
- The system 1020 may also include a device in communication with the headset 100, such as a user device 1026 and/or a system device 1028.
- The user device 1026 may be a mobile device (e.g., a mobile phone, laptop, or tablet) or desktop computer utilized by the user.
- The user device 1026 may run an application program (e.g., an app) to wirelessly control the headset 100, including for testing, monitoring, and evaluating eyes of the user via the display 312 and the ophthalmic testing unit 1024.
- The system device 1028 may be a workstation or server (e.g., a cloud based computer) utilized by clinical staff.
- The headset 100 and/or the user device 1026 may communicate with the system device 1028 via a system network, including for testing, monitoring, and evaluating eyes of the user via the display 312 and the ophthalmic testing unit 1024.
- In some implementations, the system device 1028 might not be present, and in some implementations, the user device 1026 might not be present.
- The headset 100, the user device 1026, and/or the system device 1028 may each include one or more processors and memory to perform the various steps described herein.
- The headset 100 may utilize the display 312 to generate an output which the user can see while wearing the headset 100.
- The display 312 can output virtual reality (VR), augmented reality (AR), and/or mixed reality (MR) content to the user.
- The headset 100 can operate alternately in a VR mode in which the display outputs the ophthalmic examination VR environment (e.g., via the ophthalmic testing unit 1024, to perform an eye test) to the user, and in a pass through mode in which the display outputs to the user a view of a real environment of the user (e.g., via the exterior cameras 115, oriented in front of the headset 100, transmitting a signal, corresponding to the physical environment of the user, to the display 312).
- The headset 100 may utilize the ophthalmic testing unit 1024 in the VR mode to output the ophthalmic examination VR environment and obtain sensor information from the user (e.g., testing one or both eyes).
- The ophthalmic testing unit 1024 may utilize the sensors described herein (e.g., the pupil detectors 422) to obtain sensor information indicating one or more physiological measurements of one or more eyes of the user.
- A physiological measurement could comprise a measure of visual acuity, drooping eyelids, blink pattern, blood vessels, inflammatory cells, intraocular pressure (IOP), muscle movement, corneal curvature, or pupil response.
- The ophthalmic testing unit 1024 may enable testing or grading for visual acuity, corneal topography, visual field, cover/uncover, color blindness, eye motility, contrast, or pupil response, such as for determining an ophthalmic score.
- The headset 100 may also utilize the eye tracking subsystem 1022 to track one or both eyes of the user.
- The eye tracking subsystem 1022 may utilize the sensors described herein (e.g., the pupil detectors 422) to detect eye patterns of one or both eyes of the user.
- On detecting an eye pattern, the eye tracking subsystem 1022 can trigger the headset 100 to switch (e.g., via the display 312) from the VR mode to the pass through mode.
- The eye pattern can include a sequence of movements of one or both eyes and/or a closing of one or both eyes. The eye pattern can be detected while the headset 100 is operating in the VR mode.
- The eye tracking subsystem 1022 can trigger the headset 100 to toggle between the VR mode and the pass through mode (e.g., to go back and forth between the modes based on the same or different eye patterns).
- The eye pattern can also be detected while the headset 100 is operating in the pass through mode.
- The system 1020 may enable a more natural interaction between a patient and clinical staff during the administration of diagnostic tests utilizing the headset 100.
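One way to picture the switch/toggle behavior described above is a matcher over classified gaze directions. The sketch below is a hypothetical illustration: the direction tokens, trigger pattern, and class names are assumptions, not the disclosure's implementation.

```python
# Hypothetical sketch: match a predefined eye-movement sequence
# against a window of gaze-direction tokens, then toggle modes.
# The side-to-side trigger pattern is an illustrative example.

TRIGGER_PATTERN = ["left", "right", "left", "right"]

def detect_eye_pattern(gaze_tokens, pattern=TRIGGER_PATTERN):
    """True if `pattern` occurs as a contiguous run in the tokens."""
    n, m = len(gaze_tokens), len(pattern)
    return any(gaze_tokens[i:i + m] == pattern for i in range(n - m + 1))

class Headset:
    """Minimal mode holder: 'vr' or 'pass_through'."""
    def __init__(self):
        self.mode = "vr"

    def on_gaze_window(self, gaze_tokens):
        # Toggle between the two modes whenever the trigger pattern
        # is seen, mirroring the switch/toggle behavior above.
        if detect_eye_pattern(gaze_tokens):
            self.mode = "pass_through" if self.mode == "vr" else "vr"
```

In practice the tokens would come from the eye tracking subsystem's classification of successive gaze samples.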
- The user can enter a training mode, via the headset 100, in which the user performs the eye pattern that will cause the switch or the toggle to be recorded in the data structure.
- The system 1020 can output, via the display 312, an indication to the user of the eye pattern that will cause the switch or the toggle.
- The user can receive training, via the headset 100, in which the user is taught the eye pattern to perform in order to cause the switch or the toggle.
- The eye pattern may correspond to passing or failing an eye test, causing the switch or the toggle.
- For example, the eye test may require the user to follow an object moving in the display 312. The user failing to move their eyes to follow the object, or successfully moving their eyes to follow the object, may determine whether to cause the switch or the toggle.
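A pass/fail decision for the follow-the-object test could be sketched as a mean tracking error between the target's trajectory and the measured gaze trajectory. The function name, coordinate convention, and error threshold below are assumptions for illustration only.

```python
# Hypothetical pass/fail check: did the gaze track a moving target?
# Positions are (x, y) pairs in normalized display coordinates, and
# the 0.15 threshold is an assumed example value.

def followed_target(target_xy, gaze_xy, max_mean_error=0.15):
    """True if the mean Euclidean distance between time-aligned
    target and gaze positions stays within the threshold."""
    assert len(target_xy) == len(gaze_xy)
    total = sum(((tx - gx) ** 2 + (ty - gy) ** 2) ** 0.5
                for (tx, ty), (gx, gy) in zip(target_xy, gaze_xy))
    return total / len(target_xy) <= max_mean_error
```

The boolean result could then feed the switch/toggle decision either way, as described above.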
- A machine learning model may be trained to detect the eye pattern.
- The machine learning model may, for example, be or include one or more of a neural network (e.g., a convolutional neural network, recurrent neural network, deep neural network, or other neural network), decision tree, support vector machine, Bayesian network, cluster-based system, genetic algorithm, deep learning system separate from a neural network, or other machine learning model.
- FIGS. 11-14 are examples of eye patterns for activating a pass through mode of the headset 100, including a sequence of movements of one or both eyes and/or a closing of one or both eyes.
- The eye patterns of one or more of FIGS. 11-14 may cause toggling of the headset 100 between the VR mode and the pass through mode based on detecting different eye patterns (e.g., going back and forth between the modes based on same or different eye patterns).
- The eye tracking subsystem 1022 may be utilized to detect the eye pattern to cause the switch or the toggle.
- For example, the eye tracking subsystem 1022 can detect a sequence of eye movements that includes the user moving their eyes from side to side a number of times as shown in FIG.
- The eye pattern may include a combination of such movements, in a defined order, such as detecting the user moving their eyes up, then down, and then to the left.
- The eye pattern may include the user moving their eyes to corners and/or in circles.
- The eye tracking subsystem 1022 can detect a closing of one or both eyes for a minimum amount of time as shown in FIG. 13, such as an amount of time (e.g., 3 seconds) that is greater than an amount of time associated with a blink.
- The eye tracking subsystem 1022 can detect an eye pattern including a sequence of movements of one eye (e.g., the right eye) and the closing of the other eye (e.g., the left eye) as shown in FIG. 14.
- The eye pattern shown in FIG. 14 could be a combination of the eye patterns shown in FIGS. 11-13.
- The eye tracking subsystem 1022 can detect an eye pattern including opening and closing one or both eyes in a blink pattern (e.g., the user looking directly ahead and blinking quickly and/or slowly in a coded sequence).
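The closure-duration criterion above (a closure longer than a blink, e.g., 3 seconds) can be sketched as a run-length check over eye-closed samples. The sampling interval and blink bound below are assumptions; only the 3-second example comes from the passage above.

```python
# Hypothetical detector separating a deliberate eye closure from an
# ordinary blink by duration. Typical blinks last well under a
# second; 3 s is the example threshold given above.

DELIBERATE_MIN_S = 3.0   # example threshold from the passage above

def closure_events(samples, dt):
    """Given boolean eye-closed samples at interval `dt` seconds,
    yield the duration of each closed run."""
    run = 0
    for closed in samples:
        if closed:
            run += 1
        elif run:
            yield run * dt
            run = 0
    if run:
        yield run * dt

def deliberate_closure(samples, dt=0.1):
    """True if any closure lasts at least the deliberate threshold."""
    return any(d >= DELIBERATE_MIN_S for d in closure_events(samples, dt))
```

The same run-length stream could also drive recognition of a coded quick/slow blink sequence by classifying each closure as short or long.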
- FIG. 15 is a flowchart of an example of a process 1500 for activating a pass through mode of a headset based on an eye pattern of a user.
- The process 1500 can be executed using computing devices, such as the systems, hardware, and software described with respect to FIGS. 1-14.
- The process 1500 can be performed, for example, by executing a machine-readable program or other computer-executable instructions, such as routines, instructions, programs, or other code.
- The operations of the process 1500 or another technique, method, process, or algorithm described in connection with the implementations disclosed herein can be implemented directly in hardware, firmware, software executed by hardware, circuitry, or a combination thereof.
- the process 1500 is depicted and described herein as a series of operations. However, the operations in accordance with this disclosure can occur in various orders and/or concurrently. Additionally, other operations not presented and described herein may be used. Furthermore, not all illustrated operations may be required to implement a technique in accordance with the disclosed subject matter.
- At operation 1502, a system can determine an eye pattern of one or both eyes of a user to cause a headset to switch from a VR mode to a pass through mode and/or to toggle between the VR mode and the pass through mode.
- The headset 100, the user device 1026, and/or the system device 1028 may be utilized to determine the eye pattern.
- The eye pattern may be stored in a data structure (e.g., the data structure 1040).
- The eye pattern may be determined based on input from the user.
- The eye pattern may be taught to the user via a display of the headset (e.g., the display 312).
- The eye pattern may be associated with passing or failing an eye test.
- A machine learning model may be trained to detect the eye pattern.
- At operation 1506, the system may determine whether the eye pattern is detected (e.g., the eye pattern determined at operation 1502). In some implementations, the system may continuously or periodically scan for the eye pattern while the eye test is output to the display 312. If the eye pattern is detected (“Yes”), at operation 1508, the system may switch the headset from the VR mode to the pass through mode. For example, detecting the eye pattern may trigger the system to enter the pass through mode in which the display within the headset changes to display a view from cameras at the exterior of the headset. This would allow the user to see their surroundings.
- The staff could provide visual and/or audio instructions without the user having to remove their headset and possibly cause a loss of eye tracking calibration (e.g., associated with the eye test). This may also cause the eye test to be suspended. Then, if the same eye pattern is detected again (e.g., toggling), or if a different predetermined eye pattern is detected (e.g., to switch from the pass through mode to the VR mode), the system may return to operation 1504 to resume the eye test. In some cases, the system could return to operation 1504 by selecting a button on a remote interface (e.g., the user or staff selecting a button on the user device 1026 or the system device 1028).
- At operation 1510, the system may determine whether the eye testing is complete. If eye testing is not complete (“No”), the system may return to operation 1504 to continue the eye testing, and scanning for the eye pattern, until the eye testing is complete. However, if at operation 1510 the eye testing is complete (“Yes”), at operation 1512 the system may terminate the eye test. In some implementations, terminating the eye test may include proceeding to a next eye test or moving to a waiting room in the VR environment. In some implementations, terminating the eye test may also include generating an ophthalmic score for the eye test based on results from the ophthalmic testing unit. In some implementations, terminating the eye test may also include the system switching from the VR mode to the pass through mode.
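The control flow of process 1500 can be sketched as a small event loop: the test runs in VR mode, a detected pattern suspends it in pass through mode, a second detection resumes it, and the test eventually terminates. The event names and return shape below are illustrative assumptions, not the claimed implementation.

```python
# Hedged sketch of the process-1500 flow. `events` is an iterable of
# hypothetical event tokens: 'pattern' (eye pattern detected,
# operation 1506), 'step' (a unit of eye-test progress, operation
# 1504), or 'done' (testing complete, operations 1510/1512).

def run_eye_test(events):
    """Return (final_mode, steps_completed, suspensions)."""
    mode, steps, suspensions = "vr", 0, 0
    for ev in events:
        if ev == "pattern":
            if mode == "vr":               # operation 1508: suspend test,
                mode = "pass_through"      # show the exterior-camera view
                suspensions += 1
            else:                          # toggle back and resume test
                mode = "vr"
        elif ev == "step" and mode == "vr":
            steps += 1                     # test progresses only in VR mode
        elif ev == "done":
            break                          # operation 1512: terminate test
    return mode, steps, suspensions
```

Note how test progress is gated on the VR mode, matching the description of the eye test being suspended while the user is in the pass through mode.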
- The implementations of this disclosure can be described in terms of functional block components and various processing operations. Such functional block components can be realized by a number of hardware or software components that perform the specified functions.
- The disclosed implementations can employ various integrated circuit components (e.g., memory elements, processing elements, logic elements, look-up tables, and the like), which can carry out a variety of functions under the control of one or more microprocessors or other control devices.
- The systems and techniques can be implemented with a programming or scripting language, such as C, C++, Java, JavaScript, assembler, or the like, with the various algorithms being implemented with a combination of data structures, objects, processes, routines, or other programming elements.
- The term “system,” as used herein and in the figures, but in any event based on its context, may be understood as corresponding to a functional unit implemented using software, hardware (e.g., an integrated circuit, such as an ASIC), or a combination of software and hardware.
- Such systems or mechanisms may be understood to be a processor-implemented software system or processor-implemented software mechanism that is part of or callable by an executable program, which may itself be wholly or partly composed of such linked systems or mechanisms.
- Implementations or portions of implementations of the above disclosure can take the form of a computer program product accessible from, for example, a computer-usable or computer-readable medium.
- A computer-usable or computer-readable medium can be a device that can, for example, tangibly contain, store, communicate, or transport a program or data structure for use by or in connection with a processor.
- The medium can be, for example, an electronic, magnetic, optical, electromagnetic, or semiconductor device.
- Such computer-usable or computer-readable media can be referred to as non-transitory memory or media and can include volatile memory or non-volatile memory that can change over time.
- The quality of memory or media being non-transitory refers to such memory or media storing data for some period of time or otherwise based on device power or a device power cycle.
- A memory of an apparatus described herein, unless otherwise specified, does not have to be physically contained by the apparatus, but is one that can be accessed remotely by the apparatus, and does not have to be contiguous with other memory that might be physically contained by the apparatus.
Abstract
Disclosed is a system for monitoring one or both eyes of a user. The system includes a headset worn by the user and one or more processors. The headset includes a display to generate an output which the user can see while wearing the headset and an eye tracking subsystem to track one or both eyes of the user. The headset can operate alternately in a virtual reality (VR) mode and in a pass through mode. The one or more processors can detect, via the eye tracking subsystem, an eye pattern of one or both eyes of the user. The eye pattern can be detected while the headset is operating in the VR mode. The one or more processors can switch the headset from the VR mode to the pass through mode based on detecting the eye pattern. Other aspects are also described and claimed.
Description
ACTIVATING A PASS THROUGH MODE OF A HEADSET BASED ON EYE PATTERN
CROSS-REFERENCE TO RELATED APPLICATION
The present application claims the benefit of and priority to U.S. Provisional Patent Application No. 63/597,653, filed November 9, 2023, which is incorporated by reference herein in its entirety.
TECHNICAL FIELD
This disclosure relates generally to monitoring eyes of a user via a headset and, more specifically, to activating a pass through mode of a headset based on eye pattern. Other aspects are also described.
BACKGROUND
A virtual reality (VR) headset is a head mounted device (HMD) designed to immerse a user in a VR environment. VR headsets may be used in many applications, including video games, simulators, and control systems. VR headsets often include a display that the user can see while wearing the headset, providing separate images to each eye of the user to generate the VR environment. The headset may also include a sound system and various sensors, such as accelerometers and gyroscopes for tracking an orientation of the user in a real world environment for corresponding to an orientation of the user in the VR environment.
The information included in this Background section of the specification, including any references cited herein and any description or discussion thereof, is included for technical reference purposes and is not to be regarded as subject matter by which the scope of the disclosure is to be bound.
SUMMARY
Implementations of this disclosure include configuring a portable VR headset (or simply headset), used to perform an eye test, to detect an eye pattern of a user that enables transitioning the headset from a VR mode to a pass through mode. This can enable a more natural interaction between a patient and clinical staff during the administration of the eye test. Some implementations may include a system for monitoring one or both eyes of a user (e.g., a patient). The system may include a headset configured to be worn by the user and one or more processors which may be implemented by the headset, a user device, and/or a system device. The headset can include a display to generate an output which the user can see while wearing the headset, and an eye tracking subsystem to track one or both eyes of the user. The headset can operate alternately in a VR mode in which the display outputs an ophthalmic examination VR environment to the user to perform the eye test, and in a pass through mode in which the display outputs to the user a view of a real environment of the user. The one or more processors may be configured by instructions stored in memory to detect, via the eye tracking subsystem, an eye pattern of one or both eyes of the user. The eye pattern can include i) a sequence of movements of one or both eyes or ii) a closing of one or both eyes. The eye pattern can be detected while the headset is operating in the VR mode. The one or more processors may be further configured by instructions to switch the headset from the VR mode to the pass through mode based on detecting the eye pattern. Other aspects are also described and claimed.
The above summary does not include an exhaustive list of all aspects of the present disclosure. It is contemplated that the disclosure includes all systems and methods that can be practiced from all suitable combinations of the various aspects summarized above, as well as those disclosed in the Detailed Description below and particularly pointed out in the Claims section. Such combinations may have particular advantages not specifically recited in the above summary.
BRIEF DESCRIPTION OF THE DRAWINGS
Several aspects of the disclosure here are illustrated by way of example and not by way of limitation in the figures of the accompanying drawings in which like references indicate similar elements. It should be noted that references to “an” or “one” aspect in this disclosure are not necessarily to the same aspect, and they mean at least one. Also, in the interest of conciseness
and reducing the total number of figures, a given figure may be used to illustrate the features of more than one aspect of the disclosure, and not all elements in the figure may be required for a given aspect.
FIG. 1 is a front isometric view of an example of a headset including a display and an eye tracking subsystem.
FIG. 2 is a side view of an example of a headset incorporating a display and an eye tracking subsystem.
FIG. 3 is an exploded view of a portion of an example of a headset including a display and an eye tracking subsystem.
FIG. 4 is a diagrammatic representation of light paths within an exemplary headset that includes a display and an eye tracking subsystem.
FIG. 5 is another diagrammatic representation of light paths within an exemplary headset that includes a display and an eye tracking subsystem.
FIG. 6 is a front cutaway view showing exemplary optical components in an exemplary headset that includes a display and an eye tracking subsystem.
FIG. 7A is a rear view of a portion of an exemplary headset that includes a display and an eye tracking subsystem.
FIG. 7B is a front view of an example of an illuminator assembly for an ophthalmic testing unit for a headset.
FIG. 8 is an exemplary pair of pupil camera images from a headset that includes a display and an eye tracking subsystem.
FIG. 9 is an exemplary pupil camera image from a headset that includes a display and an eye tracking subsystem.
FIG. 10 is an example of a system for activating a pass through mode of a headset based on eye pattern.
FIGS. 11-14 are examples of eye patterns for activating a pass through mode of a headset, including a sequence of movements of one or both eyes and closing of one or both eyes.
FIG. 15 is a flowchart of an example of a process for activating a pass through mode of a headset based on eye pattern.
DETAILED DESCRIPTION
VR headsets may be utilized by physicians to allow patients to conduct diagnostic tests, such as ophthalmic screening tests, as well as to receive content. For example, a headset can operate in a VR mode in which the display outputs an ophthalmic examination VR environment to the user. The headset can therefore improve clinic efficiency by enabling clinical staff to perform other tasks while a headset is utilized to test and evaluate a patient. However, some patients may feel insecure when fully immersed in the VR environment. For example, some patients may feel isolated from their surroundings with limited control when conducting diagnostic tests in the VR environment. In addition, if patients need to request help, staff may be limited to communicating with the patients via audio commands.
Some headsets may include a pass through mode in which the display outputs to the user a view of a real environment of the user instead of the VR environment. For example, the headset may include one or more external cameras oriented in front of the user. The headset can switch to a view of those cameras when the pass through mode is selected. However, accessing the pass through mode typically involves a user utilizing a controller and/or selecting a button in the VR environment. Some patients may find this difficult.
Implementations of this disclosure address problems such as these by configuring a portable VR headset (or simply headset), used to perform an eye test, to detect an eye pattern of a user that enables transitioning the headset from a VR mode to a pass through mode. This can enable a more natural interaction between a patient and clinical staff during the administration of the eye test. Some implementations may include a system for monitoring one or both eyes of a user (e.g., a patient). The system may include a headset configured to be worn by the user and one or more processors which may be implemented by the headset, a user device, and/or a system device. The headset can include a display to generate an output which the user can see while wearing the headset, and an eye tracking subsystem to track one or both eyes of the user. The headset can operate alternately in a VR mode in which the display outputs an ophthalmic examination VR environment to the user to perform the eye test, and in a pass through mode in which the display outputs to the user a view of a real environment of the user. The one or more processors may be configured by instructions stored in memory to detect, via the eye tracking subsystem, an eye pattern (e.g., a predetermined eye pattern) of one or both eyes of the user. The eye pattern can include i) a sequence of movements of one or both eyes or ii) a closing of one or
both eyes. The eye pattern can be detected while the headset is operating in the VR mode. The one or more processors may be further configured by instructions to switch the headset from the VR mode to the pass through mode based on detecting the eye pattern. In some cases, the one or more processors may be configured by instructions to toggle the headset between the VR mode and the pass through mode based on detecting the eye pattern (e.g., going back and forth between the modes based on the same or different eye patterns). As a result, the system may enable a more natural interaction between a patient and clinical staff during the administration of diagnostic tests utilizing the headset.
In some implementations, the system may detect when a user (e.g., the patient) closes their eyes for a minimum duration of time (e.g., 3 seconds), longer than a blink, and/or when the patient performs a specific eye pattern (e.g., moves their eyes up then down and to the left). The detection by the system can trigger the system to then enter a pass through mode in which the display within the headset changes to display what the cameras at the exterior of the headset pointing outwards are viewing. This would enable the patient to see their surroundings. In addition, if the patient needs assistance from staff, the staff could provide visual and/or audio instructions without the patient having to remove their headset and possibly cause a loss of eye tracking calibration. Further, the system could return to the VR mode either by the staff member selecting a button on a remote interface (e.g., a tablet or computer) or the patient performing the same eye pattern again or another eye pattern.
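The closure-duration logic described above, which distinguishes a deliberate eye closure from an ordinary blink before triggering the pass through mode, may be sketched as follows. The class name, threshold constant, and per-frame sampling interface are illustrative assumptions, not part of the specification.

```python
# Illustrative sketch: detect eyes held closed longer than a blink,
# assuming the eye tracking subsystem reports a per-frame open/closed
# state with a timestamp. Names here are hypothetical.

MIN_CLOSE_SECONDS = 3.0  # closures shorter than this are treated as blinks


class EyeClosureDetector:
    def __init__(self, min_close_seconds: float = MIN_CLOSE_SECONDS):
        self.min_close_seconds = min_close_seconds
        self._closed_since = None  # timestamp when the eyes first closed

    def update(self, eyes_open: bool, timestamp: float) -> bool:
        """Feed one eye-tracking sample; return True once the eyes have
        stayed closed long enough to trigger the pass through mode."""
        if eyes_open:
            # Any reopening (e.g., the end of a blink) resets the timer.
            self._closed_since = None
            return False
        if self._closed_since is None:
            self._closed_since = timestamp
        return (timestamp - self._closed_since) >= self.min_close_seconds
```

In use, the detector would be fed at the eye tracker's frame rate; a brief blink resets the timer, so only a sustained closure reaches the threshold.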
These descriptions are provided for exemplary purposes only and should not be considered to limit the scope of the disclosure. Certain features may be added, removed, or modified without departing from the spirit of the claimed subject matter.
FIG. 1 is a front isometric view of an example of a headset 100 including a display and an eye tracking subsystem. In the example shown, the headset 100 includes a display housing 110, exterior cameras 115, sensor housings 120, a head strap attachment 130, a head strap 140, and a forehead rest 150. In some implementations, the headset 100 may comprise a VR, augmented reality (AR), and/or mixed reality (MR) headset. The headset 100 can include a display to generate an output which the user can see while wearing the headset 100. The headset 100 can also include an eye tracking subsystem to track one or both eyes of the user while wearing the headset 100. The headset 100 can operate alternately in a VR mode in which the
display outputs an ophthalmic examination VR environment to the user (e.g., to test one or both eyes) and in a pass through mode in which the display outputs to the user a view of a real environment of the user via the exterior cameras 115 (e.g., pointing outwards, in front of the user). In the illustrated implementation, the sensor housings 120 are coupled to a right and left side of the display housing 110. The sensor housings 120 can be housings for the exterior cameras 115, internal cameras, and/or other optical detection devices, including the eye tracking subsystem.
In some implementations, the sensor housings 120 may include one or more temperature and/or humidity sensors. In some implementations, the sensor housings 120 may be in communication with a periocular space within the headset 100 such that one or more sensors (e.g., humidity, temperature sensors) are configured to measure air conditions (e.g., humidity, temperature) of the periocular space within the headset 100. In other implementations, one or both of the sensor housings 120, or one or more components of the sensor housings 120, may be positioned on the front of the display housing 110, on a top surface of the display housing 110, on a bottom surface of the display housing 110, and/or on the head strap 140. In some implementations, the headset 100 does not include sensor housings 120 coupled to an exterior of the headset 100. In that regard, electronic components positioned within the sensor housings 120 could be positioned within one or more components of the headset 100, such as the display housing 110, or the head strap 140. Further, the exterior cameras 115 could be positioned on an opaque front surface of the headset 100.
FIG. 2 is a side view of an example of the headset 100. In the example shown, the headset 100 includes a head strap tightener or tension adjuster 260 and a power cord 270. Also visible are the display housing 110, sensor housing 120, head strap attachment 130, head strap 140, and forehead rest 150. It will be understood that, in some implementations, the headset 100 does not include the power cord 270, but rather includes a battery coupled to the headset 100 and configured to provide electrical power to the components of the headset 100. The headset 100 may also include speakers 280. The speakers 280 may be used, for example, to output audio notes or cues to the user wearing the headset 100. For example, one audio note could indicate to the user that a switch from the VR mode to the pass through mode (or from the pass through mode to the VR mode) has occurred due to an eye pattern detected by the eye tracking subsystem.
FIG. 3 is an exploded view of a portion of the headset 100 in accordance with at least one implementation of the present disclosure. For example, FIG. 3 shows an exploded view of a display housing of a headset 100, such as the display housing 110. Shown is a display 312, which attaches to the display housing 110. Also shown is a divider 314 separating the display view between the user's eyes. In an example, the display 312 shows an image through one lens 316, and a slightly different image through the other lens 316, such that the user perceives a 3D image. In an example, the headset may also be used to display 2D images, or to display images only to one eye at a time. In other implementations, the headset 100 does not include the divider 314 and/or the lenses 316.
As explained further below, in some implementations, the display 312 is configured to display images to the patient or subject while the headset 100 is performing a protocol to test for ophthalmic conditions. In some aspects, the display 312 can be used to condition or induce stress in the eye to facilitate testing. The display 312 can operate alternately between a VR mode and a pass through mode.
FIG. 4 is a diagrammatic representation of an optical configuration of an eye tracking subsystem within the headset 100 according to a first implementation shown by way of example. The eye tracking subsystem may be used to detect a predetermined or specific eye pattern of one or both eyes of the user (e.g., via cameras oriented toward pupils of the user). For example, the eye pattern may include a sequence of movements of one or both eyes and/or a closing of one or both eyes. The eye pattern may be used to switch the headset 100 from the VR mode to the pass through mode, and/or to toggle the headset 100 between the VR mode and the pass through mode. The eye tracking subsystem may also be utilized by an ophthalmic testing unit comprising one or more sensors (e.g., pupil detectors and/or retinal detectors) for diagnosing and/or grading ophthalmic conditions. In that regard, the ophthalmic testing unit is configured to obtain one or more physiological measurements of an eye, for example, blink rate, frequency, or pattern, blink duration, blink speed or slope, the size/brightness/location of reflections on the surface of the eye, and/or the thickness of the tear film of the eye, including any meniscus and convexity formed at the edge of the eyelid. Other examples may include a measure of visual acuity, drooping eye lids, blood vessels, inflammatory cells, IOP, muscle movement, corneal curvature, or pupil response. The eye tracking subsystem and the ophthalmic testing unit shown in this figure may utilize one or more structures in common. For example, they may both utilize two
off-axis pupil detectors 422, two beam splitters 426, and/or an illuminator 428. The pupil detectors 422 can include optical detectors, cameras, or any other suitable detector. In this first implementation shown in FIG. 4, the pupil detectors 422 are beside the eyepiece lens 316. The example shown in this figure may include a same or a different configuration than shown in FIG. 3.
In this example, light 312a from the display 312 is not passed directly through the eyepiece lens 316, but instead passes through a first beam splitter 426 and reflects from a second beam splitter 426 before passing through the eyepiece lens 316 and into the eye 429 of the user. Light 428a from the illuminator 428 (e.g., infrared (IR) light) reflects from a first beam splitter 426 and a second beam splitter 426 before passing through the eyepiece lens 316 to the eye 429. The light 428a may be used to illuminate features on or inside the eye 429 that may be imaged or otherwise detected by the pupil detectors 422. In this example, light 422a reflecting from the surface of the eye 429 passes into the pupil detectors 422 without first passing through any other optical components. In some implementations, light from the display 312 could pass directly through the eyepiece lens 316. For example, the display 312 could be positioned directly behind the eyepiece lens 316 with or without utilizing beam splitters. Additionally, in some implementations, the illuminator 428 may be configured beside the eyepiece lens 316 (e.g., on axes similar to the pupil detectors 422) instead of behind the eyepiece lens 316 (e.g., with or without utilizing beam splitters).
In the VR mode, the display 312 outputs an ophthalmic examination VR environment to the user. For example, the VR environment may comprise an eye test for visual acuity, corneal topography, visual field, cover/uncover, color blindness, eye motility, contrast, or pupil response. In the pass through mode, the display 312 outputs to the user a view of a real environment of the user. For example, the headset 100 may process digital images generated by one or more of the exterior cameras 115 (e.g., a digital, visible light camera) that may be integrated into the headset 100. The exterior cameras 115 may be configured to “look” outward, while the eye tracking subsystem (which may also be integrated into the headset 100) is configured to look inward at the user’s eyes (e.g., via an infrared light camera).
In some implementations, the pupil detectors 422 may observe other parts of the eye, including but not limited to the eyelids, eyelashes, cornea, IOL, corneal tear film, tear ducts, and Meibomian glands, instead of or in addition to the pupil. In some implementations, other
arrangements of optical components (e.g., illuminators, cameras, beam splitters, and lenses) may be used to achieve the effects disclosed herein.
FIG. 5 is a diagrammatic representation of light paths within the headset 100 according to a second implementation shown by way of example. In this example, the display 312, divider 314, two eyepiece lenses 316, and two beam splitters 426 are situated within the display housing 110. The divider 314 defines two separate regions of the display 312. Each portion of the display emits light 312a (e.g., an image) that passes through a beam splitter 426 and through an eyepiece lens 316. Light 422a from the illuminator 428 reflecting from the surface of each eye passes through an eyepiece lens 316, reflects from a beam splitter 426, and into a pupil detector 422, which sits within a sensor housing 120. In this second implementation shown in FIG. 5, the pupil detectors 422 are behind the eyepiece lens 316. The source of light 422a which reflects from the surface of the eye may include one or more illuminators, such as infrared (IR) illuminators 428 (see FIGS. 4, 7A, 7B). In the implementation shown in FIG. 5, the beam splitter 426 reflects IR light but does not reflect visible light, such as light provided by the display 312. In other implementations, such as the implementation shown in FIG. 4, the beam splitters 426 reflect IR light and additional light, such as visible light.
In some implementations, periocular temperature and humidity controller circuit boards 1200 may be located within the sensor housings 120. The temperature and humidity controller circuit boards 1200 may comprise one or more of a humidity sensor configured to measure a humidity within the periocular space of the headset 100, a temperature sensor configured to measure a temperature within the periocular space of the headset 100, a dehumidifier or humidity control element configured to control or adjust the humidity within the periocular space of the headset 100, a heating element, and/or a cooling element. Although more than one temperature and humidity controller circuit boards 1200 are shown in FIG. 5, the headset 100 can include a single temperature and humidity controller circuit board, or multiple temperature and humidity controller circuit boards. Further, in some implementations, the headset can include one or more temperature and/or humidity sensors in communication with the temperature and humidity controller circuit boards 1200 and configured to monitor a temperature and/or a humidity in the external environment around the headset 100.
FIG. 6 is a front view of a portion of the headset 100 including the eye tracking subsystem in accordance with at least one implementation of the present disclosure. For clarity,
the display 312 has been removed from this view. Visible are the display housing 110, divider 314, two eyepiece lenses 316, two pupil detectors 422, and two beam splitters 426.
FIG. 7A is a rear view of a portion of the headset 100 in accordance with at least one implementation of the present disclosure. Visible are the display housing 110, head strap attachment 130, forehead rest 150, and eyepiece lenses 316. Surrounding each eyepiece lens 316 is a plurality of illuminators 428 (e.g., infrared LEDs) that are configured to illuminate the eye for observation by the pupil detectors 422. FIG. 7B shows an illuminator assembly 430, according to an implementation of the present disclosure. In that regard, the illuminator assembly includes a plurality of illuminators 438 (e.g., infrared LEDs), a positive power connector 434a, a negative power connector 434b, and an aperture 432 or opening. In some implementations, the aperture 432 is positioned around a camera of an ophthalmic testing unit.
FIG. 8 is an exemplary view of a graphical interface for testing or grading ophthalmic conditions (e.g., eye tests for visual acuity, corneal topography, visual field, cover/uncover, color blindness, eye motility, contrast, or pupil response), such as for determining an ophthalmic score in connection with an eye test. The interface 1000 shows a pair of pupil camera images 1010 from a headset 100 in accordance with at least one implementation of the present disclosure. Visible are corneal tear film 1001, pupil 1002, eyelids 1003, tear ducts 1004, corneas 1006, intraocular lenses 1007, reflections 1008, and computer-generated gaze indicators 1005 at the center of each pupil 1002. In some aspects, the reflections 1008 may represent glints from the illuminators 428. For example, for a dry test, various features of this image may be used to assess the hydration or dryness of the eyes, including but not limited to blink pattern, blink rate or frequency, blink duration, between-blink interval, blink speed or slope, the size/brightness/location of reflections 1008 on the surface of the eye, and direct observation and measurement (e.g., pixel count and rate of change) of the thickness of the corneal tear film 1001, including any meniscus and convexity formed at the edge of the eyelid. In other aspects, other features of this image may be used to assess other ophthalmic conditions.
FIG. 9 is an exemplary pupil camera image 1012 from the headset 100 in accordance with at least one implementation of the present disclosure. Visible are the pupil 1002, eyelids 1003, and tear duct 1004. In this example, the user has been asked, via output to the display of the headset, to manually pull open their eyelids 1003, revealing Meibomian glands 1009. For example, irregularities in the Meibomian glands or tear duct may be indicative of both the
existence and potential causes of a dry eye problem. Thus, images of the Meibomian glands may be extremely useful in the process of determining a diagnosis and treatment. In other aspects, the user may be asked via the headset to assist with other measurements. In addition to automated analysis, such pupil camera images 1010 may be communicated to one or more clinicians, either co-located with the headset and the device or in one or more remote locations.
FIG. 10 is an example of a system 1020 for activating a pass through mode of a headset based on an eye pattern of a user. For example, the headset 100 may include an eye tracking subsystem 1022 and an ophthalmic testing unit 1024, such as the eye tracking subsystem and the ophthalmic testing unit described above with respect to FIGS. 1-9. The system 1020 may also include a device in communication with the headset 100, such as a user device 1026 and/or a system device 1028. For example, the user device 1026 may be a mobile device (e.g., a mobile phone, laptop, or tablet) or desktop computer utilized by the user. The user device 1026 may run an application program (e.g., an app) to wirelessly control the headset 100, including for testing, monitoring, and evaluating eyes of the user via the display 312 and the ophthalmic testing unit 1024. The system device 1028 may be a workstation or server (e.g., a cloud based computer) utilized by clinical staff. The headset 100 and/or the user device 1026 may communicate with the system device 1028 via a system network, including for testing, monitoring, and evaluating eyes of the user via the display 312 and the ophthalmic testing unit 1024. In some implementations, the system device 1028 might not be present, and in some implementations, the user device 1026 might not be present. The headset 100, the user device 1026, and/or the system device 1028 may each include one or more processors and memory to perform the various steps described herein.
The headset 100 may utilize the display 312 to generate an output which the user can see while wearing the headset 100. In various implementations, the display 312 can generate VR, AR, and/or MR to the user. The headset 100 can operate alternately in a VR mode in which the display outputs the ophthalmic examination VR environment (e.g., via the ophthalmic testing unit 1024, to perform an eye test) to the user, and in a pass through mode in which the display outputs to the user a view of a real environment of the user (e.g., via the exterior cameras 115, oriented in front of the headset 100, transmitting a signal, corresponding to the physical environment of the user, to the display 312).
The headset 100 may utilize the ophthalmic testing unit 1024 in the VR mode to output the ophthalmic examination VR environment and obtain sensor information from the user (e.g.,
testing one or both eyes). The ophthalmic testing unit 1024 may utilize the sensors described herein (e.g., the pupil detectors 422) to obtain sensor information indicating one or more physiological measurements of one or more eyes of the user. For example, a physiological measurement could comprise a measure of visual acuity, drooping eye lids, blink pattern, blood vessels, inflammatory cells, IOP, muscle movement, corneal curvature, or pupil response. The ophthalmic testing unit 1024 may enable testing or grading for visual acuity, corneal topography, visual field, cover/uncover, color blindness, eye motility, contrast, or pupil response, such as for determining an ophthalmic score.
The headset 100 may also utilize the eye tracking subsystem 1022 to track one or both eyes of the user. The eye tracking subsystem 1022 may utilize the sensors described herein (e.g., the pupil detectors 422) to detect eye patterns of one or both eyes of the user. When the eye tracking subsystem 1022 detects a predetermined or specific eye pattern of the user, the eye tracking subsystem 1022 can trigger the headset 100 to switch (e.g., via the display 312) from the VR mode to the pass through mode. The eye pattern can include a sequence of movements of one or both eyes and/or a closing of one or both eyes. The eye pattern can be detected while the headset 100 is operating in the VR mode. In some cases, the eye tracking subsystem 1022 can trigger the headset 100 to toggle between the VR mode and the pass through mode (e.g., to go back and forth between the modes based on the same or different eye patterns). For example, the eye pattern can also be detected while the headset 100 is operating in the pass through mode. As a result, the system 1020 may enable a more natural interaction between a patient and clinical staff during the administration of diagnostic tests utilizing the headset 100.
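The switch and toggle behavior described above may be sketched as a small state machine that the eye tracking subsystem notifies when a pattern is detected. The class names and callback interface are illustrative assumptions for this sketch; the specification does not prescribe a particular software structure.

```python
# Illustrative sketch: a mode controller toggled by eye-pattern events,
# assuming the eye tracking subsystem calls on_eye_pattern() each time
# a trigger pattern is detected. Names are hypothetical.

from enum import Enum


class Mode(Enum):
    VR = "vr"                      # ophthalmic examination VR environment
    PASS_THROUGH = "pass_through"  # view of the real environment


class ModeController:
    """Toggles the headset between the VR mode and the pass through
    mode each time a trigger eye pattern is detected."""

    def __init__(self):
        self.mode = Mode.VR  # the headset starts in the VR mode

    def on_eye_pattern(self) -> Mode:
        self.mode = Mode.PASS_THROUGH if self.mode is Mode.VR else Mode.VR
        return self.mode
```

A switch-only variant (rather than a toggle) would simply ignore pattern events while already in the pass through mode, leaving the return to VR mode to staff input on a remote interface.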
In some implementations, the eye pattern may be a predetermined or specific eye pattern that is stored in a data structure (e.g., a look up table in the data structure 1040, such as a database accessible by the user device 1026 and/or the system device 1028). For example, the eye pattern may be determined based on saved information associated with the user, the eye test, and/or the equipment (e.g., the headset 100). In some implementations, the system 1020 can receive, via the eye tracking subsystem 1022, input from the user indicating the eye pattern to cause a switch from the VR mode to the pass through mode, or a toggle between the VR mode and the pass through mode. For example, the user can enter a training mode, via the headset 100, in which the user performs the eye pattern that will cause the switch or the toggle to be recorded in the data structure. In some implementations, the system 1020 can output, via the display 312,
an indication to the user of the eye pattern that will cause the switch or the toggle. For example, the user can receive training, via the headset 100, in which the user is taught the eye pattern to perform in order to cause the switch or the toggle. In some implementations, the eye pattern may correspond to passing or failing an eye test, causing the switch or the toggle. For example, the eye test may require the user to follow an object moving in the display 312. The user failing to move their eyes to follow the object, or successfully moving their eyes to follow it, may determine whether to cause the switch or the toggle.
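The per-user storage of trigger patterns described above may be sketched as a simple keyed table with a fallback default. The layout, key fields, and default pattern are assumptions for illustration; the specification only calls for a look up table in a data structure such as the data structure 1040.

```python
# Illustrative sketch: recording (e.g., in a training mode) and looking
# up the trigger eye pattern per user and per eye test. The direction
# tokens and the (user_id, test_id) key are hypothetical.


class EyePatternStore:
    def __init__(self):
        # Maps (user_id, test_id) to a recorded sequence of gaze tokens.
        self._table = {}

    def record(self, user_id: str, test_id: str, pattern) -> None:
        """Save the pattern the user performed during the training mode."""
        self._table[(user_id, test_id)] = list(pattern)

    def lookup(self, user_id: str, test_id: str,
               default=("up", "down", "left")):
        """Return the stored pattern, or a default when none was recorded."""
        return self._table.get((user_id, test_id), list(default))
```

In a fuller system the table would be backed by a database accessible to the user device 1026 and/or the system device 1028 rather than held in memory.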
In some implementations, a machine learning model may be trained to detect the eye pattern. The machine learning model may, for example, be or include one or more of a neural network (e.g., a convolutional neural network, recurrent neural network, deep neural network, or other neural network), decision tree, vector machine, Bayesian network, cluster-based system, genetic algorithm, deep learning system separate from a neural network, or other machine learning model.
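As one minimal stand-in for the learned detectors listed above, a nearest-centroid classifier over simple gaze-trace features can illustrate the training-and-prediction shape of such a model. The feature choice (total horizontal and vertical gaze travel) and all names are assumptions made purely for this sketch, not the specification's method.

```python
# Illustrative sketch: a tiny nearest-centroid classifier standing in
# for the neural-network or decision-tree models the disclosure
# mentions. Gaze traces are lists of (x, y) positions; features are
# the total horizontal and vertical travel of the gaze.


def features(trace):
    dx = sum(abs(b[0] - a[0]) for a, b in zip(trace, trace[1:]))
    dy = sum(abs(b[1] - a[1]) for a, b in zip(trace, trace[1:]))
    return (dx, dy)


class NearestCentroid:
    def fit(self, traces, labels):
        sums = {}
        for trace, label in zip(traces, labels):
            fx, fy = features(trace)
            s = sums.setdefault(label, [0.0, 0.0, 0])
            s[0] += fx
            s[1] += fy
            s[2] += 1
        self.centroids = {l: (s[0] / s[2], s[1] / s[2])
                          for l, s in sums.items()}
        return self

    def predict(self, trace):
        fx, fy = features(trace)
        return min(self.centroids,
                   key=lambda l: (fx - self.centroids[l][0]) ** 2
                               + (fy - self.centroids[l][1]) ** 2)
```

A production model would of course use richer features (velocities, fixation durations) and one of the architectures the disclosure enumerates.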
In some implementations, the system 1020 can detect, via the exterior cameras 115, a gesture from a second user that is near the headset 100 (e.g., standing a few feet in front of the headset 100). The headset 100 can then toggle between the VR mode and the pass through mode based on detecting the gesture (e.g., a same or different gestures). For example, a member of the clinical staff helping the user could make a first gesture (e.g., an up and down hand wave) in front of the headset 100 to switch the headset 100 from the VR mode to the pass through mode. Then, the member could make the first gesture again, or alternatively a second gesture (e.g., a side to side hand wave) in front of the headset 100, to switch the headset 100 from the pass through mode to the VR mode.
In some implementations, the headset 100 can output, via the display 312 when operating in the pass through mode, a portion or all of the real environment to one eye that is not output to the other eye. For example, to perform an eye test in the real environment, such as a confrontation test in which a member of the clinical staff holds fingers to the left, right, up, and down in front of the user, the VR system could only display the fingers to one tested eye (or both eyes) without requiring the user to close the other eye as in a standard confrontation test. The selection could be made by input from staff via the user device 1026 and/or the system device 1028. In another example, the input could cause the display 312 to selectively occlude one eye or the other in the pass through mode for eye testing.
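The per-eye output selection described above (e.g., for a confrontation-style test) may be sketched as routing the exterior camera frame to one eye, both eyes, or neither. The function name, the `tested_eye` parameter, and the `OCCLUDED` placeholder are assumptions for illustration.

```python
# Illustrative sketch: choose which eye's display receives the pass
# through camera feed, with the other eye occluded, based on a staff
# selection from the user device or system device.

OCCLUDED = None  # placeholder for whatever blank frame occludes an eye


def frames_for_eyes(camera_frame, tested_eye: str = "both"):
    """Return (left_frame, right_frame) for the given staff selection."""
    if tested_eye == "both":
        return (camera_frame, camera_frame)
    if tested_eye == "left":
        return (camera_frame, OCCLUDED)
    if tested_eye == "right":
        return (OCCLUDED, camera_frame)
    raise ValueError("tested_eye must be 'left', 'right', or 'both'")
```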
FIGS. 11-14 are examples of eye patterns for activating a pass through mode of the headset 100, including a sequence of movements of one or both eyes and/or a closing of one or both eyes. In some implementations, the eye patterns of one or more of FIGS. 11-14 may cause toggling of the headset 100 between the VR mode and the pass through mode based on detecting different eye patterns (e.g., going back and forth between the modes based on same or different eye patterns). The eye tracking subsystem 1022 may be utilized to detect the eye pattern to cause the switch or the toggle. For example, the eye tracking subsystem 1022 can detect a sequence of eye movements that includes the user moving their eyes from side to side a number of times as shown in FIG. 11, and/or the user moving their eyes up and down a number of times as shown in FIG. 12. In some cases, the eye pattern may include a combination of such movements, in a defined order, such as detecting the user moving their eyes up, then down, and then to the left. In some cases, the eye pattern may include the user moving their eyes to corners and/or in circles. In another example, the eye tracking subsystem 1022 can detect a closing of one or both eyes for a minimum amount of time as shown in FIG. 13, such as an amount of time (e.g., 3 seconds) that is greater than an amount of time associated with a blink. In another example, the eye tracking subsystem 1022 can detect an eye pattern including a sequence of movements of one eye (e.g., the right eye) and the closing of the other eye (e.g., the left eye) as shown in FIG. 14. For example, the eye pattern shown in FIG. 14 could be a combination of the eye patterns shown in FIGS. 11-13. In another example, the eye tracking subsystem 1022 can detect an eye pattern including opening and closing one or both eyes in a blink pattern (e.g., the user looking directly ahead and blinking quickly and/or slowly in a coded sequence).
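The ordered-movement patterns described above (e.g., up, then down, then to the left) may be sketched by quantizing successive gaze positions into direction tokens and comparing the most recent tokens against a trigger sequence. The token names, threshold, and y-axis convention (positive y is up) are assumptions for this sketch.

```python
# Illustrative sketch: quantize gaze motion into direction tokens and
# match the recent history against a trigger sequence such as
# ("up", "down", "left"). All names and thresholds are hypothetical.

TRIGGER = ("up", "down", "left")


def quantize(prev, curr, threshold: float = 0.1):
    """Turn two gaze positions into a direction token, or None when the
    movement is below the noise threshold. Positive y is taken as up."""
    dx, dy = curr[0] - prev[0], curr[1] - prev[1]
    if abs(dx) < threshold and abs(dy) < threshold:
        return None
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "up" if dy > 0 else "down"


def matches_trigger(movements, trigger=TRIGGER) -> bool:
    """True when the most recent movements equal the trigger sequence."""
    return tuple(movements[-len(trigger):]) == tuple(trigger)
```

A blink-coded pattern, by contrast, would match a sequence of closure durations (short versus long) rather than direction tokens, using the same sliding-window comparison.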
FIG. 15 is a flowchart of an example of a process 1500 for activating a pass through mode of a headset based on an eye pattern of a user. The process 1500 can be executed using computing devices, such as the systems, hardware, and software described with respect to FIGS. 1-14. The process 1500 can be performed, for example, by executing a machine-readable program or other computer-executable instructions, such as routines, instructions, programs, or other code. The operations of the process 1500 or another technique, method, process, or algorithm described in connection with the implementations disclosed herein can be implemented directly in hardware, firmware, software executed by hardware, circuitry, or a combination thereof.
For simplicity of explanation, the process 1500 is depicted and described herein as a series of operations. However, the operations in accordance with this disclosure can occur in various orders and/or concurrently. Additionally, other operations not presented and described herein may be used. Furthermore, not all illustrated operations may be required to implement a technique in accordance with the disclosed subject matter.
At operation 1502, a system (e.g., the system 1020) can determine an eye pattern of one or both eyes of a user to cause a headset to switch from a VR mode to a pass through mode and/or to toggle between the VR mode and the pass through mode. For example, the headset 100, the user device 1026, and/or the system device 1028 may be utilized to determine the eye pattern. In some implementations, the eye pattern may be stored in a data structure (e.g., the data structure 1040). In some implementations, the eye pattern may be determined based on input from the user. In some implementations, the eye pattern may be taught to the user via display of the headset (e.g., the display 312). In some implementations, the eye pattern may be associated with passing or failing an eye test. In some implementations, a machine learning model may be trained to detect the eye pattern.
At operation 1504, the system may operate the headset in the VR mode. For example, the headset 100 may output an ophthalmic examination VR environment to the user via the display 312 to perform the eye test. In some implementations, the eye testing may be for visual acuity, corneal topography, visual field, cover/uncover, color blindness, eye motility, contrast, or pupil response (e.g., via the ophthalmic testing unit 1024). In some implementations, the eye testing may result in an ophthalmic score when the eye testing is completed.
At operation 1506, the system (e.g., the eye tracking subsystem 1022) may determine whether the eye pattern is detected (e.g., the eye pattern determined at operation 1502). In some implementations, the system may continuously or periodically scan for the eye pattern while the eye test is output to the display 312. If the eye pattern is detected (“Yes”), at operation 1508, the system may switch the headset from the VR mode to the pass through mode. For example, detecting the eye pattern may trigger the system to enter the pass through mode in which the display within the headset changes to display a view from cameras at the exterior of the headset. This would allow the user to see their surroundings. If the user needs assistance from staff, the staff could provide visual and/or audio instructions without the user having to remove their headset and possibly cause a loss of eye tracking calibration (e.g., associated with the eye test).
This may also cause the eye test to be suspended. Then, if the same eye pattern is detected again (e.g., toggling), or if a different predetermined eye pattern is detected (e.g., to switch from the pass through mode to the VR mode), the system may return to operation 1504 to resume the eye test. In some cases, the system could return to operation 1504 by selecting a button on a remote interface (e.g., the user or staff selecting a button on the user device 1026 or the system device 1028).
However, if at operation 1506 the eye pattern is not detected (“No”), at operation 1510 the system may determine whether the eye testing is complete. If eye testing is not complete (“No”), the system may return to operation 1504 to continue the eye testing, and scanning for the eye pattern, until the eye testing is complete. However, if at operation 1510 the eye testing is complete (“Yes”), at operation 1512 the system may terminate the eye test. In some implementations, terminating the eye test may include proceeding to a next eye test or moving to a waiting room in the VR environment. In some implementations, terminating the eye test may also include generating an ophthalmic score for the eye test based on results from the ophthalmic testing unit. In some implementations, terminating the eye test may also include the system switching from the VR mode to the pass through mode.
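The control flow of operations 1504-1512 can be sketched as a small state machine. The event names and the `run_eye_test` helper below are hypothetical conveniences, not part of the disclosed process; they only illustrate the toggle-and-terminate logic of the flowchart.

```python
from enum import Enum, auto


class Mode(Enum):
    VR = auto()
    PASS_THROUGH = auto()


def run_eye_test(events):
    """Drive the mode switches of process 1500 over a stream of events:
    'pattern' (eye pattern detected, operations 1506/1508),
    'step' (an eye test step runs, operation 1504),
    'done' (eye testing complete, operations 1510/1512).
    Returns a log of mode changes and actions."""
    mode = Mode.VR
    log = []
    for ev in events:
        if ev == "pattern":
            # toggle between the VR mode and the pass through mode
            mode = Mode.PASS_THROUGH if mode is Mode.VR else Mode.VR
            log.append(mode.name)
        elif ev == "done" and mode is Mode.VR:
            log.append("TERMINATED")  # operation 1512
            break
        elif ev == "step" and mode is Mode.VR:
            log.append("TESTING")     # test continues at operation 1504
    return log
```

Note that while the headset is in the pass through state, test steps are ignored (the test is suspended) until a pattern event toggles the system back to the VR mode.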
The implementations of this disclosure can be described in terms of functional block components and various processing operations. Such functional block components can be realized by a number of hardware or software components that perform the specified functions. For example, the disclosed implementations can employ various integrated circuit components (e.g., memory elements, processing elements, logic elements, look-up tables, and the like), which can carry out a variety of functions under the control of one or more microprocessors or other control devices. Similarly, where the elements of the disclosed implementations are implemented using software programming or software elements, the systems and techniques can be implemented with a programming or scripting language, such as C, C++, Java, JavaScript, assembler, or the like, with the various algorithms being implemented with a combination of data structures, objects, processes, routines, or other programming elements.
Functional aspects can be implemented in algorithms that execute on one or more processors. Furthermore, the implementations of the systems and techniques disclosed herein could employ a number of conventional techniques for electronics configuration, signal processing or control, data processing, and the like. The term “system” as used herein and in the figures may, based on its context, be understood as corresponding to a functional unit implemented using software, hardware (e.g., an integrated circuit, such as an ASIC), or a combination of software and hardware. In certain contexts, such systems or mechanisms may be understood to be a processor-implemented software system or processor-implemented software mechanism that is part of or callable by an executable program, which may itself be wholly or partly composed of such linked systems or mechanisms.
Implementations or portions of implementations of the above disclosure can take the form of a computer program product accessible from, for example, a computer-usable or computer-readable medium. A computer-usable or computer-readable medium can be a device that can, for example, tangibly contain, store, communicate, or transport a program or data structure for use by or in connection with a processor. The medium can be, for example, an electronic, magnetic, optical, electromagnetic, or semiconductor device.
Other suitable mediums are also available. Such computer-usable or computer-readable media can be referred to as non-transitory memory or media and can include volatile memory or non-volatile memory that can change over time. The quality of memory or media being non-transitory refers to such memory or media storing data for some period of time or otherwise based on device power or a device power cycle. A memory of an apparatus described herein, unless otherwise specified, does not have to be physically contained by the apparatus, but is one that can be accessed remotely by the apparatus, and does not have to be contiguous with other memory that might be physically contained by the apparatus.
While the disclosure has been described in connection with certain implementations, it is to be understood that the disclosure is not to be limited to the disclosed implementations but, on the contrary, is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims, which scope is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures as is permitted under the law.
Claims
1. A system for monitoring one or both eyes of a user, comprising: a headset configured to be worn by the user, the headset including a display to generate an output which the user can see while wearing the headset and an eye tracking subsystem to track one or both eyes of the user, wherein the headset operates alternately in a virtual reality (VR) mode in which the display outputs an ophthalmic examination VR environment to the user and in a pass through mode in which the display outputs to the user a view of a real environment of the user; and one or more processors configured by instructions stored in memory to: detect, via the eye tracking subsystem, an eye pattern of one or both eyes of the user, the eye pattern including at least one of i) a sequence of movements of one or both eyes or ii) a closing of one or both eyes, the eye pattern detected while the headset is operating in the VR mode; and switch the headset from the VR mode to the pass through mode based on detecting the eye pattern.
2. The system of claim 1, wherein the one or more processors are further configured by instructions stored in memory to: toggle the headset between the VR mode and the pass through mode based on detecting the eye pattern.
3. The system of claim 1, wherein the sequence of movements includes two or more of up, down, left, or right in a defined order.
4. The system of claim 1, wherein the eye pattern comprises closing one or both eyes for a minimum amount of time that is greater than an amount of time associated with a blink.
5. The system of claim 1, wherein the eye pattern comprises opening and closing one or both eyes in a blink pattern.
6. The system of claim 1, wherein the one or more processors are further configured by instructions stored in memory to: receive, via the eye tracking subsystem, input from the user indicating the eye pattern to cause a switch from the VR mode to the pass through mode.
7. The system of claim 1, wherein the one or more processors are further configured by instructions stored in memory to: output, via the display, an indication to the user of the eye pattern to cause a switch from the VR mode to the pass through mode.
8. The system of claim 1, wherein the one or more processors are further configured by instructions stored in memory to: output an eye test to the user via the display, wherein the eye pattern is associated with passing or failing the eye test.
9. The system of claim 1, wherein the ophthalmic examination VR environment performs an eye test for at least one of i) visual acuity, ii) corneal topography, iii) visual field, iv) cover/uncover, v) color blindness, vi) eye motility, vii) contrast, or viii) pupil response.
10. The system of claim 1, wherein the one or more processors are further configured by instructions stored in memory to: output, via a speaker of the headset, an audio note indicating a switch from the VR mode to the pass through mode caused by the eye pattern.
11. The system of claim 1, wherein the eye tracking subsystem comprises one or more cameras oriented toward pupils of the user.
12. The system of claim 1, wherein the one or more processors are further configured by instructions stored in memory to: detect a gesture from a second user near the headset; and toggle the headset between the VR mode and the pass through mode based on detecting the gesture.
13. The system of claim 1, wherein the one or more processors are further configured by instructions stored in memory to: output, via the display operating in the pass through mode, at least a portion of the real environment to one eye that is not output to the other eye.
14. A method for monitoring one or both eyes of a user, comprising: detecting, via an eye tracking subsystem of a headset that tracks one or both eyes of a user, an eye pattern of one or both eyes, the eye pattern including at least one of i) a sequence of movements of one or both eyes or ii) a closing of one or both eyes, the eye pattern detected while the headset is operating in a virtual reality (VR) mode in which a display of the headset, which the user can see while wearing the headset, outputs an ophthalmic examination VR environment to the user; and switching the headset, based on detecting the eye pattern, from the VR mode to a pass through mode in which the display outputs to the user a view of a real environment of the user.
15. The method of claim 14, further comprising: toggling the headset between the VR mode and the pass through mode based on detecting different eye patterns.
16. The method of claim 14, wherein the eye pattern includes the sequence of movements and the closing of both eyes.
17. The method of claim 14, wherein the eye pattern comprises the sequence of movements of one eye and the closing of the other eye.
18. The method of claim 14, wherein the eye pattern comprises opening and closing one eye in a blink pattern.
19. The method of claim 14, further comprising: receiving, via the eye tracking subsystem, input from the user indicating the eye pattern to cause a switch from the VR mode to the pass through mode.
20. The method of claim 14, further comprising: outputting, via the display, an indication to the user of the eye pattern to cause a switch from the VR mode to the pass through mode.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202363597653P | 2023-11-09 | 2023-11-09 | |
| US63/597,653 | 2023-11-09 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025101397A1 (en) | 2025-05-15 |
Family
ID=95696247
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2024/053521 (WO2025101397A1, pending) | Activating a pass through mode of a headset based on eye pattern | 2023-11-09 | 2024-10-30 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025101397A1 (en) |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20150091793A1 (en) * | 2012-03-08 | 2015-04-02 | Samsung Electronics Co., Ltd. | Method for controlling device on the basis of eyeball motion, and device therefor |
| US20160025978A1 (en) * | 2014-07-22 | 2016-01-28 | Sony Computer Entertainment Inc. | Virtual reality headset with see-through mode |
| US20200312038A1 (en) * | 2016-06-20 | 2020-10-01 | Magic Leap, Inc. | Augmented reality display system for evaluation and modification of neurological conditions, including visual processing and perception conditions |
| US20220128813A1 (en) * | 2019-06-07 | 2022-04-28 | Hewlett-Packard Development Company, L.P. | Eye movement controls in extended reality |
| US20230293004A1 (en) * | 2020-09-16 | 2023-09-21 | Nevada Research & Innovation Corporation | Mixed reality methods and systems for efficient measurement of eye function |
| US20230337910A1 (en) * | 2022-04-25 | 2023-10-26 | Twenty Twenty Therapeutics Llc | Mapping of corneal topography using a vr headset |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24889384; Country of ref document: EP; Kind code of ref document: A1 |