CN117119946A - Apparatus, system and method for objectively assessing ocular accommodation - Google Patents

Apparatus, system and method for objectively assessing ocular accommodation

Info

Publication number
CN117119946A
Authority
CN
China
Prior art keywords
eye
display
subject
controller
beam splitter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202280025971.4A
Other languages
Chinese (zh)
Inventor
V·M·赫南德兹
A·索尔坦扎迪
R·D·安格洛波洛斯
T·W·斯迈利
M·迈高夫
V·斯里尼瓦萨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alcon Inc
Original Assignee
Alcon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alcon Inc filed Critical Alcon Inc
Priority claimed from PCT/US2022/021212 external-priority patent/WO2022216451A1/en
Publication of CN117119946A publication Critical patent/CN117119946A/en
Pending legal-status Critical Current

Landscapes

  • Eye Examination Apparatus (AREA)

Abstract

Devices, systems, and methods for objectively assessing ocular accommodation are disclosed. For example, a method for objectively assessing accommodation may include displaying a stimulation target at a far display for a first duration and displaying the stimulation target at a near display for a second duration. The stimulation target displayed at the near display may be projected onto a first beam splitter positioned at an oblique angle relative to the near display. The stimulation target displayed on the far display may be axially aligned with the stimulation target projected onto the first beam splitter. The method may further include obtaining, at a controller, measurements regarding the refractive state of the eye during the first and second durations from a refractor device in communication with the controller, and determining, using the controller, an accommodative response of the eye based in part on the respective refractive states.

Description

Apparatus, system and method for objectively assessing ocular accommodation
Cross Reference to Related Applications
The present application claims the benefit of U.S. Provisional Application No. 63/171,320, filed on April 6, 2021, and U.S. Provisional Application No. 63/261,801, filed on September 29, 2021, the contents of which are incorporated herein by reference in their entireties.
Technical Field
The present disclosure relates generally to the field of ophthalmic equipment, and more particularly to devices, systems, and methods for objectively assessing ocular accommodation.
Background
Accommodation refers to the increase in the refractive power of the eye as the subject attempts to focus on a near object or target. It is one component of the near triad; the other two components are convergence and pupillary constriction. Accommodative ability decreases with age, with presbyopia occurring due to the loss of elasticity of the crystalline lens.
A subject's accommodative ability is traditionally measured in a clinical setting using subjective techniques such as standard push-up tests and questioning of the patient. Subjective testing, however, is susceptible to both patient and clinician bias and often overestimates the patient's accommodative ability.
Furthermore, while some academic or research institutions have proposed designs for systems that objectively measure accommodation, such systems are difficult to build, and the data obtained from them require extensive post-processing to yield useful results. As such, these systems are not intended for use in clinical environments where ease of use is a primary concern.
Furthermore, some systems for measuring accommodation rely on optically induced accommodation stimuli that are difficult for the patient to focus on and may underestimate the accommodation capacity of the patient.
Accordingly, a solution to the above challenges is needed. Such a solution should provide an accurate and objective assessment of the subject's accommodative ability without being overly complex. Such a solution should be designed with clinical factors in mind and should present the patient with a compelling real-world target capable of holding the patient's focus.
Disclosure of Invention
Disclosed herein are devices, systems, and methods for objectively assessing ocular accommodation. In one embodiment, a system for objectively assessing ocular accommodation of a subject is disclosed. The system may include a near display, a first beam splitter, and a far display positioned farther from the eye than the near display and the first beam splitter. The system may also include a controller in communication with the near display and the far display.
The near display may be oriented in a downward direction. The first beam splitter may be positioned at an oblique angle relative to the near display such that a graphic or image displayed on the near display is projected onto the first beam splitter.
The controller may include one or more processors and a tangible, non-transitory machine-readable medium having instructions stored thereon. The one or more processors may execute at least some of the instructions to cause the controller to instruct the stimulation target to appear on the far display for a first duration and on the near display for a second duration. The near display may be configured to project the stimulation target onto the first beam splitter. The stimulation target displayed on the far display may be axially aligned with the stimulation target projected onto the first beam splitter. The one or more processors may execute further instructions to cause the controller to obtain measurements from a refractor device in communication with the controller regarding the refractive state of the eye during the first duration and the second duration, and determine an accommodative response of the eye based in part on the refractive states.
The system may also include a support assembly and a motorized stage coupled to a top of the support assembly. The motorized stage may be configured to translate the near display and the first beam splitter in a linear direction. The first beam splitter may be automatically translated to a plurality of stimulation locations located at variable distances from the subject's eye. For example, the stimulation locations may be located about 0.80 meters, 0.37 meters, 0.33 meters, and 0.25 meters from the subject's eye. In some embodiments, the far display may be positioned between about 4 meters and 6 meters from the subject's eye.
In some embodiments, the stimulation target may be an optotype letter. The stimulation target may have a height dimension of between about 1.5 cm and 2.0 cm.
The system may further include an angled mirror and a hot mirror positioned above the angled mirror. The hot mirror may be positioned between the near display assembly and the subject.
The refractor device may include a refractor light source configured to generate an illumination beam, and a refractor camera configured to capture or detect light reflected by the eye in response to the illumination beam. The angled mirror and the hot mirror may be configured to steer the illumination beam toward the eye of the subject. Light reflected by the eye may be directed back toward the refractor device via the hot mirror and the angled mirror. One or more processors of the refractor device may be configured to determine the refractive state of the eye based on the light reflected by the eye.
The system may further include a second beam splitter, and an alignment camera in communication with the controller and configured to capture real-time images of the eye. The second beam splitter may be positioned on the line of sight of the eye, at the distal end of the support assembly, between the near display assembly and the far display. The line of sight of the subject's eye may extend through the hot mirror, the first beam splitter, and the second beam splitter, such that the subject views the stimulation target displayed on the far display through the hot mirror, the first beam splitter, and the second beam splitter.
The alignment camera may be positioned offset from the line of sight of the eye. The second beam splitter may be configured to reflect an image of the subject's eye toward the alignment camera. A controller display in communication with the controller may be configured to display a graphical user interface (GUI) showing real-time images of the eye captured by the alignment camera. The GUI may further show a fixed reticle pattern overlaid on the real-time image of the eye. When the GUI shows anatomical features of the eye within at least a portion of the fixed reticle pattern, the subject's eye is optically aligned with the refractor device.
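The reticle-based alignment check described above can be sketched in software. The following Python sketch is illustrative only and is not taken from the disclosure; the reticle coordinates, tolerance radius, and function name are assumptions.

```python
# Hypothetical alignment check: decide whether a pupil center detected in the
# alignment camera's frame falls within the fixed reticle region of the GUI.
RETICLE_CENTER = (320, 240)  # assumed reticle center, in image pixels
RETICLE_RADIUS = 40          # assumed tolerance radius, in image pixels

def is_aligned(pupil_xy):
    """Return True when the pupil center lies inside the reticle circle."""
    dx = pupil_xy[0] - RETICLE_CENTER[0]
    dy = pupil_xy[1] - RETICLE_CENTER[1]
    return (dx * dx + dy * dy) ** 0.5 <= RETICLE_RADIUS
```

In practice, the clinician would adjust the chin rest assembly until such a check, or its visual equivalent on the GUI, indicates alignment.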
The system may also include a far alignment light source positioned at the distal end of the support assembly. The far alignment light source may be configured to project an optical marker onto the far display via the second beam splitter. The stimulation target may be displayed on the far display within an area surrounding the optical marker.
The system may further include an input device configured to receive user input from the subject regarding the stimulation target displayed on at least one of the near display and the far display. The input device may be in communication with the controller. The one or more processors may be configured to execute further instructions to cause the controller to instruct the stimulation target to appear on the far display in a plurality of first rotational orientations during the first duration, receive user inputs from the input device corresponding to the first rotational orientations, instruct the stimulation target to appear on the near display in a plurality of second rotational orientations during the second duration, and receive user inputs from the input device corresponding to the second rotational orientations.
In some embodiments, the input device may be a joystick. In these embodiments, each of the user inputs is a joystick movement initiated by the subject in a direction associated with a rotational orientation of the stimulation target displayed on the near display or the far display.
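The subject-response check described above can be sketched in a few lines of code. The following Python sketch is illustrative only and is not part of the disclosure; the orientation labels, function name, and scoring scheme are assumptions.

```python
# Hypothetical scoring of joystick responses against the rotational
# orientations in which the optotype (e.g., the letter "E") was displayed.
ORIENTATIONS = ("up", "down", "left", "right")

def score_responses(shown, joystick_inputs):
    """Fraction of trials where the joystick direction matched the optotype."""
    correct = sum(s == j for s, j in zip(shown, joystick_inputs))
    return correct / len(shown)

shown = ["up", "left", "right", "down"]    # orientations displayed
inputs = ["up", "left", "down", "down"]    # subject's joystick movements
accuracy = score_responses(shown, inputs)  # 3 of 4 correct -> 0.75
```

A check of this kind confirms that the subject was actually attending to the stimulation target while the refraction measurements were taken.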
The one or more processors of the controller may be configured to execute further instructions to cause the controller to obtain measurements from the refractor device regarding the pupil diameter and gaze displacement of the eye. The controller may determine the accommodative response of the eye only if the pupil diameter exceeds a minimum diameter threshold and the gaze displacement is less than a maximum displacement threshold.
The one or more processors of the controller may be further configured to execute instructions to cause the controller to obtain refractive data about the subject's eye from the refractor device during a calibration procedure. The calibration procedure may include placing an infrared filter in front of the eye, placing each of a plurality of test lenses having different diopter strengths in sequence in front of the eye, and directing the subject to look toward the far display while measuring the refractive state of the eye for each of the test lenses using the refractor device. The controller may then use regression techniques to fit a line to the data points of the test lenses' diopter strengths plotted against the averages of the measured refractive states, and calculate the slope of the line for use as a calibration factor.
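As a minimal sketch of the regression step, the following Python snippet fits a line through hypothetical (measured refraction, lens power) pairs and takes its slope as the calibration factor. The numeric values are invented for illustration; the plotting convention (lens power against measured refraction) is an assumption consistent with the description above.

```python
import numpy as np

# Hypothetical calibration data: mean raw refraction reported by the
# refractor device (device units) for each test lens placed before the eye,
# and the corresponding test-lens powers (diopters).
mean_raw_refraction = np.array([0.1, -0.9, -1.9, -2.9])  # assumed values
lens_powers = np.array([0.0, 1.0, 2.0, 3.0])             # assumed values

# Fit lens power against measured refraction; the slope relates raw device
# units to diopters and serves as the calibration factor (CF).
calibration_factor, intercept = np.polyfit(mean_raw_refraction, lens_powers, 1)
```

With the invented data above, each added diopter of lens power shifts the raw reading by about -1 device unit, so the fitted slope is approximately -1.0.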
The one or more processors of the controller may be configured to execute further instructions to cause the controller to calculate an accommodation response of the eye using the formula:
Accommodative response = CF (X_F − X_N)
In the above formula, CF is the calibration factor, X_F is the average far refraction value calculated using measurements taken during the first duration when the stimulation target is displayed on the far display, and X_N is the average near refraction value calculated using measurements taken during the second duration when the stimulation target is displayed on the near display.
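A minimal Python sketch of this computation follows. The sample values and the unit calibration factor are invented for illustration; the disclosure does not specify the device's raw-unit conventions.

```python
import numpy as np

# Hypothetical raw refraction samples (device units) recorded while the
# stimulation target was shown on the far display (first duration) and on
# the near display (second duration).
far_samples = np.array([0.05, 0.10, 0.08, 0.07])
near_samples = np.array([-2.40, -2.55, -2.50, -2.45])

CF = 1.0  # assumed calibration factor; depends on the device's raw units

x_f = far_samples.mean()   # X_F: average far refraction value
x_n = near_samples.mean()  # X_N: average near refraction value

# Accommodative response = CF * (X_F - X_N), per the formula above.
response = CF * (x_f - x_n)  # 2.55 with these invented samples
```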
A method for objectively assessing ocular accommodation in a subject is also disclosed. In one embodiment, the method may include displaying the stimulation target at the far display for a first duration and displaying the stimulation target at the near display for a second duration.
In some embodiments, the stimulation target displayed at the near display may be projected onto a first beam splitter positioned at an oblique angle relative to the near display. In these and other embodiments, the stimulation target displayed on the far display may be axially aligned with the stimulation target projected onto the first beam splitter.
The method may further include obtaining, at the controller, measurements regarding the refractive state of the eye during the first duration and the second duration from a refractor device in communication with the controller. The method may further include determining, using the controller, an accommodation response of the eye based in part on the respective refractive states.
In another embodiment, the method may include displaying the stimulation target at the far display in a plurality of first rotational orientations during a first duration and displaying the stimulation target at the near display in a plurality of second rotational orientations during a second duration. In this embodiment, the method may further comprise receiving, at the controller, a plurality of user inputs applied by the subject to an input device. The input device (e.g., a joystick) may be in communication with the controller. The user inputs (e.g., joystick movements) may correspond to the first and second rotational orientations. The method may further include obtaining, at the controller, measurements from the refractor device regarding the refractive state of the eye during the first and second durations. The method may further include determining, using the controller, an accommodative response of the eye based in part on the respective refractive states and the user inputs.
In some embodiments, the method may include translating the near display and the first beam splitter to a stimulation position via a motorized stage prior to displaying the stimulation target.
The method may include generating an illumination beam using a refractor light source of the refractor device, and capturing or detecting light reflected by the eye in response to the illumination beam using a refractor camera of the refractor device. In some embodiments, the illumination beam may be steered toward the subject's eye by an angled mirror and a hot mirror positioned above the angled mirror. Light reflected by the eye may be directed back toward the refractor device via the hot mirror and the angled mirror. One or more processors of the refractor device can determine the refractive state of the eye based on the light reflected by the eye.
The method may further include capturing a real-time image of the eye using an alignment camera positioned offset from the line of sight of the eye. The image of the eye may be reflected toward the alignment camera by a second beam splitter positioned on the line of sight of the eye. The method may also include displaying, using a controller display in communication with the controller, a graphical user interface (GUI) showing the real-time image of the eye captured by the alignment camera. The GUI may further show a fixed reticle pattern overlaid on the real-time image of the eye. The method may further include aligning the eye with the refractor device by adjusting at least one of the chin rest assembly and the position of the subject's head until the GUI shows anatomical features of the eye within at least a portion of the fixed reticle pattern.
The method may further comprise obtaining measurements from the refractor device regarding the pupil diameter and gaze displacement of the eye, and determining the accommodative response of the eye only when the pupil diameter exceeds a minimum diameter threshold and the gaze displacement is less than a maximum displacement threshold.
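This gating of measurements can be sketched as follows. The threshold values and array contents are invented for illustration; the disclosure does not specify numeric thresholds.

```python
import numpy as np

# Hypothetical per-frame measurements from the refractor device.
pupil_diameter_mm = np.array([4.1, 3.9, 2.6, 4.3])
gaze_displacement_deg = np.array([0.5, 2.4, 0.4, 0.3])
refraction = np.array([-2.1, -2.2, -2.0, -2.15])

MIN_PUPIL_MM = 3.0  # assumed minimum pupil diameter threshold
MAX_GAZE_DEG = 2.0  # assumed maximum gaze displacement threshold

# Keep only frames where the pupil is large enough and the gaze is stable;
# the accommodative response would be computed from valid frames only.
valid = (pupil_diameter_mm > MIN_PUPIL_MM) & (gaze_displacement_deg < MAX_GAZE_DEG)
mean_valid_refraction = refraction[valid].mean()  # -2.125 here
```

Discarding small-pupil and off-gaze frames helps ensure that refraction changes reflect accommodation rather than measurement artifacts.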
The method may further comprise obtaining refractive data from the subject's eye during a calibration procedure by: placing an infrared filter in front of the eye, placing each of a plurality of test lenses having different diopter strengths in sequence in front of the eye, and directing the subject to look toward the far display while measuring the refractive state of the eye for each of the test lenses using the refractor device. The controller may then use regression techniques to fit a line to the data points of the test lenses' diopter strengths plotted against the averages of the measured refractive states, and calculate the slope of the line for use as a calibration factor.
The method may further include calculating an average far refraction value using measurements taken during the first duration when the stimulation target is displayed on the far display, and calculating an average near refraction value using measurements taken during the second duration when the stimulation target is displayed on the near display. The accommodative response of the subject's eye can then be calculated by subtracting the average near refraction value from the average far refraction value and multiplying the result by the calibration factor.
Drawings
Fig. 1 schematically illustrates one embodiment of a system for objectively assessing ocular accommodation.
Fig. 2 illustrates a perspective view of a portion of the system shown in fig. 1.
Fig. 3A illustrates a perspective view of one embodiment of the system including a far alignment light source.
Fig. 3B and 3C are schematic diagrams depicting examples of alignment of two stimulation targets.
Fig. 4 illustrates a top-down perspective view of a portion of one embodiment of the system, showing the far alignment light source.
Fig. 5A illustrates a remote display of the system showing a number of possible areas where stimulation targets may be displayed.
Fig. 5B illustrates that once a region is selected from the possible regions, the stimulation target may be displayed in that region.
Fig. 6 illustrates a joystick that may be controlled by the subject to match joystick movements to the rotational orientation of the displayed stimulation target.
Fig. 7 illustrates one embodiment of a method of objectively assessing ocular accommodation.
FIG. 8 illustrates one embodiment of an alignment Graphical User Interface (GUI).
FIG. 9 illustrates one embodiment of a calibration GUI.
FIG. 10 illustrates one embodiment of an adjustment GUI.
FIG. 11 is a graph visualizing the change over time in the raw or unscaled refraction measurements made by the refractor device.
Fig. 12 graphically illustrates both an idealized response curve and a subject response curve based on the accommodative response calculated at each stimulation location.
Detailed Description
Fig. 1 schematically illustrates one embodiment of a system 100 for objectively assessing ocular accommodation of a subject. As the subject views the real world stimulus target with both eyes in free space, the system 100 can measure real-time changes in the refractive state of the eye. Fig. 2 is a perspective view of a portion of the system 100 shown in fig. 1.
The system 100 may be used to assess visual accommodation under a variety of conditions. In some embodiments, the system 100 may be used to determine the efficacy of an implanted accommodating intraocular lens (AIOL). For example, the system 100 disclosed herein may be used to determine the efficacy of any AIOL disclosed in the following U.S. patent applications and publications, as well as in the following issued U.S. patents: U.S. Patent Application Serial Nos. 17/060,901 and 17/060,919, both filed on October 1, 2020; U.S. Patent Publication No. 2020/0337833; U.S. Patent Publication No. 2018/0256315; U.S. Patent Publication No. 2018/0153682; U.S. Patent Publication No. 2017/0049561, now U.S. Patent No. 10,299,913; U.S. Patent No. 10,195,020; and U.S. Patent No. 8,968,396, the contents of which are incorporated herein by reference in their entirety.
In other embodiments, the system 100 may be used to evaluate the efficacy of an intraocular lens or phakic intraocular lens. In further embodiments, the system 100 may be used to evaluate the visual accommodation of a subject's natural lens.
As shown in fig. 1 and 2, the system 100 may include a chin rest assembly 102 that further includes a chin rest 104 for supporting a subject's chin. The chin rest 104 may support the chin of a subject when the subject sits in a chair or seat (not shown).
The chin rest assembly 102 may be adjusted with the subject's chin resting on the chin rest 104. Chin rest assembly 102 may be adjusted to align the subject's eyes with certain components of system 100.
For example, the chin rest 104 may be raised or lowered in a vertical direction or moved laterally. Alignment of the subject's eyes will be discussed in more detail in the following paragraphs.
The system 100 may also include a near display assembly 106 and a far display 108 that is farther from the eye than the near display assembly 106. The near display assembly 106 may be automatically translated to a plurality of stimulation locations 110 located at variable distances 112 from the eye. The far display 108 may be located at a far distance 114 from the eye. The far distance 114 may be greater than any of the variable distances 112. In some embodiments, the far distance 114 may be between about 4.0 meters and 6.0 meters (e.g., 4.0 meters, 4.5 meters, 5.0 meters, 5.5 meters, or 6.0 meters). The variable distances 112 may be approximately 0.80 meters, 0.37 meters, 0.33 meters, and 0.25 meters from the subject's eye.
Near display assembly 106 may include a near display 116 and a first beam splitter 118 carried or otherwise supported by a near display frame 120. The first beam splitter 118 may be a dichroic filter or be made in part of a dichroic material.
In some embodiments, the beam splitters disclosed herein (including either the first beam splitter 118 or the second beam splitter 150, see also fig. 4) may be made of tourmaline. Tourmaline is a crystalline borosilicate mineral which also contains aluminum, iron, magnesium, lithium, potassium and/or sodium. In other embodiments, either of first beam splitter 118 or second beam splitter 150 may be a plate beam splitter made by coating a mirror or fused silica substrate in the shape of a plate with a dichroic optical coating or a metallic coating.
As shown in fig. 1 and 2, first beam splitter 118 may be positioned at an oblique angle (i.e., not a right angle) with respect to near display 116. For example, when first beam splitter 118 is designed to have an angle of incidence of 45 degrees, the beam splitter may be positioned at a 45 degree angle with respect to near display 116.
In some embodiments, first beam splitter 118 may be mounted to one or more frame sides of near-display frame 120. The first beam splitter 118 may be coupled to the near display frame 120 via a rotatable hinge such that the angle of the first beam splitter 118 is adjustable.
Near display 116 may display an image or graphic and project it onto first beam splitter 118 for viewing by the subject. Further, as shown in fig. 1 and 2, the near display 116 may be oriented in a vertically downward direction or face down such that the near display 116 faces the base of the near display frame 120.
Motorized stage 122 may be positioned along top of support assembly 124. In some embodiments, the support assembly 124 may be an elongated table or bench. Motorized stage 122 may translate near display frame 120 carrying near display 116 and first beam splitter 118 in a linear direction. In some embodiments, motorized stage 122 may be controlled by one or more servo controllers or motors (e.g., brushless servo controllers). In other embodiments, motorized stage 122 may be controlled or otherwise operated by another type of actuator (such as a linear actuator, a rotary actuator, or a stepper motor).
The first beam splitter 118 can translate along a line of sight 126 of the subject's eye to the plurality of stimulation locations 110 (one such location is shown in phantom in fig. 1). The stimulation locations 110 may be located at any distance between 0.25 meters and 1.0 meters from the subject's eye. More particularly, the stimulation locations 110 may be located at about 0.80 meters, 0.37 meters, 0.33 meters, and 0.25 meters from the subject's eye. These may correspond to diopter distances of 1.25 diopters, 2.7 diopters, 3.0 diopters, and 4.0 diopters, respectively. In other embodiments, the stimulation locations 110 may also be located at approximately 0.66 meters, 0.57 meters, 0.50 meters, and 0.29 meters from the subject's eye (corresponding to diopter distances of 1.50 diopters, 1.75 diopters, 2.0 diopters, and 3.5 diopters, respectively).
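The correspondence between distances and diopters follows from the definition of vergence: the dioptric demand of a target is the reciprocal of its distance in meters. A quick check in Python (illustrative only):

```python
# Dioptric demand (diopters) is the reciprocal of the stimulus distance (meters).
stimulus_distances_m = [0.80, 0.37, 0.33, 0.25]

diopter_demands = [1.0 / d for d in stimulus_distances_m]
# 0.80 m -> 1.25 D, 0.37 m -> ~2.7 D, 0.33 m -> ~3.0 D, 0.25 m -> 4.0 D
```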
The system 100 may include a controller 128 having one or more processors 130 and at least one memory 132. The one or more processors 130 may include one or more Central Processing Units (CPUs), graphics Processing Units (GPUs), application Specific Integrated Circuits (ASICs), field Programmable Gate Arrays (FPGAs), or combinations thereof.
Memory 132 may be a tangible, non-transitory machine-readable medium comprising instructions (e.g., software instructions) stored thereon. For example, memory 132 may refer to non-volatile memory or other types of computer-readable storage. More specifically, the memory 132 may refer to flash memory (in the form of a solid-state drive), Dynamic Random Access Memory (DRAM), Synchronous Dynamic Random Access Memory (SDRAM) (such as Low-Power Double Data Rate (LPDDR) SDRAM), an embedded MultiMediaCard (eMMC) memory device, or a combination thereof. The one or more processors 130 may execute machine-readable instructions stored on the memory 132 to control certain electronic components of the system 100.
The controller 128 may communicate with other electronic components via one or more networks 134. In some embodiments, the network 134 may be or refer to a network established using a physical or wired connection, such as a bi-directional high-speed bus (e.g., a serial communication bus), fiber optic cable, Ethernet, or a combination thereof. For example, network 134 may refer to a portion of a Local Area Network (LAN), such as a Controller Area Network (CAN) or a Controller Area Network with Flexible Data-Rate (CAN FD). In other embodiments, the network 134 may be or refer to a network established using a wireless communication protocol or standard (e.g., the 3G, 4G, 5G, or Long-Term Evolution (LTE) wireless communication standards; the Bluetooth™ (IEEE 802.15.1) or Bluetooth™ Low Energy (BLE) short-range communication protocols; the wireless fidelity (WiFi) (IEEE 802.11) communication protocol; the ultra-wideband (UWB) (IEEE 802.15.3) communication protocol; the ZigBee™ (IEEE 802.15.4) communication protocol; or combinations thereof).
The controller 128 may be communicatively coupled with a controller display 136. In some embodiments, the controller 128 may be a dedicated desktop, laptop, or tablet computer, and the controller display 136 may be the display of a desktop, laptop, or tablet computer, respectively. In other embodiments, the controller display 136 may be a separate display. For example, the controller display 136 may be a high definition television, an ultra high definition television, a projector, or a computer display. Although not shown in the figures, the controller 128 may also communicate with a data management module or cloud storage database to manage patient data or test data.
The controller 128 may be communicatively coupled with the near display 116 and the far display 108. The one or more processors 130 of the controller 128 may execute instructions stored in the memory 132 to cause the controller 128 to instruct the stimulation target 200 to appear on the far display 108 and the near display 116 (see, e.g., figs. 2, 3B, 3C, and 5B).
In some embodiments, the far display 108 and the near display 116 may be electronic flat-panel displays, such as high-resolution liquid crystal displays. For example, at least one of the far display 108 and the near display 116 may have a display resolution of 1280 x 1080. In other embodiments, at least one of the far display 108 and the near display 116 may have a display resolution of 1920 x 1080.
In other embodiments, at least one of the far display 108 and the near display 116 may be a Light Emitting Diode (LED) display, an Organic Light Emitting Diode (OLED) display, or an Active Matrix OLED (AMOLED) display.
In some embodiments, the stimulation target 200 may be a digitally or electronically presented graphic, icon, or letter displayed on the far display 108 or the near display 116. In some embodiments, the stimulation target 200 may be a digitally or electronically presented optotype letter for assessing visual acuity of the subject. For example, the stimulus target 200 may be the letter "E" presented digitally or electronically or another optotype letter (e.g., the letter "F") with spaced lines or patterns. In other embodiments, the stimulation target 200 may be any asymmetric graphic, icon, or letter.
The stimulation target 200 may have a height dimension of between about 1.50cm and 2.0 cm. For example, stimulation target 200 may have a height dimension of approximately 1.75 cm.
Since the stimulus target 200 is electronically generated on the far display 108 and the near display 116, the stimulus target 200 may be resized as the stimulus target 200 is closer to the subject's eye. In some embodiments, the size of the stimulation target 200 may be reduced when the stimulation target 200 is displayed on the near display 116. In other embodiments, the size of the stimulation target 200 may be increased or enlarged when the stimulation target 200 is displayed on the remote display 108.
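One plausible resizing rule, not specified in the disclosure, is to scale the displayed height with distance so that the target subtends a constant visual angle; the reference height and distance in the Python sketch below are assumptions chosen to match the example dimensions discussed above.

```python
import math

ref_height_m = 0.0175  # assumed 1.75 cm reference target height
ref_distance_m = 0.80  # assumed reference stimulus distance

# Visual angle subtended by the reference target: theta = 2 * atan(h / (2*d)).
theta = 2.0 * math.atan(ref_height_m / (2.0 * ref_distance_m))

def height_at(distance_m):
    """Displayed target height (meters) preserving the reference visual angle."""
    return 2.0 * distance_m * math.tan(theta / 2.0)

# At 0.25 m the target shrinks proportionally, to roughly 0.55 cm.
```

Under this rule the height scales linearly with distance, consistent with the target being reduced on the near display and enlarged on the far display.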
In some embodiments, the stimulation target 200 may be a high contrast letter, graphic, or icon. For example, the stimulation target 200 may be a light graphic, icon, or letter (e.g., the white letter "E") displayed on a dark background (e.g., a black background). The contrast between the stimulus target 200 and the background should be large enough to allow the subject to see clearly one or more edges of the stimulus target 200.
In some embodiments, the pixel intensity of the dark background may be set to 0 and the pixel intensity of the stimulus target 200 (e.g., the digitally presented letter "E") may be set to 255.
In other embodiments, the stimulation target 200 may be a dark graphic, icon, or letter (e.g., a dark letter) displayed on a light background. In additional embodiments, the color of the stimulus target 200 may vary as the stimulus target 200 appears on different displays.
The near display 116 may be configured to project the stimulation target 200 onto the underlying first beam splitter 118. The first beam splitter 118 can reflect an image of the stimulation target 200 toward the subject's eye. When the stimulation target 200 is displayed on the far display 108 (and the near display 116 does not display the stimulation target 200), the subject may view the stimulation target 200 displayed on the far display 108 through the first beam splitter 118.
The stimulation target 200 displayed on the far display 108 may be axially aligned with the stimulation target 200 projected on the first beam splitter 118. For example, the stimulation target 200 displayed on the far display 108 and the first beam splitter 118 may each be axially aligned with the line of sight 126 of the subject's eye. As will be discussed in more detail in the following paragraphs, the subject's eye may also be aligned with one or more optical components or cameras/detectors of the system 100.
One technical advantage of the system 100 disclosed herein is that the display and movement of the stimulation target 200 is automated and controlled by the controller 128. The clinician or technician may perform the accommodation assessment by simply clicking a button on a graphical user interface presented on the controller display 136. The system 100 automatically presents the real-world stimulation target 200 to the subject for viewing without the clinician or technician having to physically move such a target or requiring the subject to change positions. Furthermore, the stimulation target 200 is translated in a repeatable manner so that subsequent evaluations may be made and the results compared to earlier evaluations.
Figs. 1 and 2 also illustrate that system 100 may include a refractor device 138 coupled to a rail 140 positioned at the base of support assembly 124. When the stimulation target 200 is shown to the subject via the far display 108 and the first beam splitter 118, the refractor device 138 may determine the refractive state of the subject's eye.
In some embodiments, refractor device 138 may be a photographic refractor, such as an infrared (IR) photographic refractor. For example, refractor device 138 may be a PowerRef 3 photographic refractor from PlusOptix GmbH.
In other embodiments, refractor device 138 may be an autorefractor. For example, refractor device 138 may be a Grand Seiko WR-5100K autorefractor from Shigiya Machinery Works, Ltd.
Refractor device 138 may use eccentric photorefraction to measure the refractive state of the eye. For example, refractor device 138 may use refractor light source 142 to generate and direct an illumination beam 141 (e.g., an IR or near-infrared (NIR) beam at about 880 nm ± 80 nm) toward the subject's eye. Since infrared light is not visible to the human eye, the measurement does not affect patient comfort, pupil diameter, or the accommodation response.
The fundus of the eye may reflect the illumination beam 141 (e.g., IR or NIR beam) back as reflected light 143 (e.g., reflected IR or NIR light) that exits from the pupil of the eye. Reflected light 143 may be captured by a refractor camera 144 (e.g., an infrared camera or detector).
In some embodiments, refractor device 138 may determine the refractive state of the eye based on the extent of the crescent of reflected light observed in the pupil of the eye. A larger crescent size may indicate that the subject's eye has a larger defocus.
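Eccentric photorefraction is commonly described in the literature as producing a brightness gradient across the pupil image whose slope varies roughly linearly with defocus. The sketch below fits that slope by least squares and applies a device-specific gain; the linear model, the gain value, and all names are illustrative assumptions, not the internals of any particular refractor:

```python
def pupil_intensity_slope(positions_mm, intensities):
    """Ordinary least-squares slope of pixel brightness versus
    vertical position across the pupil image."""
    n = len(positions_mm)
    mean_x = sum(positions_mm) / n
    mean_y = sum(intensities) / n
    num = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(positions_mm, intensities))
    den = sum((x - mean_x) ** 2 for x in positions_mm)
    return num / den

def refraction_from_slope(slope, gain_d_per_slope_unit):
    """Map the fitted brightness slope to a refractive state in diopters
    via an empirically calibrated gain."""
    return gain_d_per_slope_unit * slope

# Synthetic pupil profile with a slope of 12 brightness units per mm:
xs = [-2.0, -1.0, 0.0, 1.0, 2.0]
ys = [100 + 12 * x for x in xs]
slope = pupil_intensity_slope(xs, ys)              # 12.0
refraction = refraction_from_slope(slope, -0.25)   # -3.0 D (gain assumed)
```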
Refractor device 138 may record images of reflected light 143 at a capture rate of 50 Hz, i.e., once every 20 milliseconds (ms). Refractor device 138 may use the recorded images to determine the refractive state of the eye.
System 100 may also include an angled mirror 146 and a heat mirror 148 configured to direct the illumination beam 141 to the eye and the reflected light 143 back to refractor device 138. Figs. 1 and 2 illustrate that refractor device 138 may be positioned outside of the subject's line of sight 126. For example, refractor device 138 may be positioned vertically below chin rest 104, motorized stage 122, and near display assembly 106 (i.e., vertically below in the Y-direction). As shown in fig. 1, at least a portion of the beam path of the illumination beam 141 may be parallel to, but vertically offset from (i.e., vertically offset in the Y-direction), the line of sight 126 of the subject's eye. The angled mirror 146 and heat mirror 148 may cooperate to steer the illumination beam 141 to the subject's eye. In addition, the same heat mirror 148 and angled mirror 146 may be used to steer the reflected light 143 back to refractor device 138. At least a portion of the optical path of the reflected light 143 may also be parallel to, but vertically offset from, the line of sight 126 of the subject's eye.
The reflective surface of the angled mirror 146 may be positioned at an angle relative to the beam path of the illumination beam 141. For example, the reflective surface of the angled mirror 146 may be positioned at a 45 degree angle relative to the beam path of the illumination beam 141. The heat mirror 148 may be positioned at an angle relative to the beam path reflected from the angled mirror 146 and the eye's line of sight 126. For example, the heat mirror 148 may be positioned at a 45 degree angle relative to the beam path reflected from the angled mirror 146 and the line of sight 126 of the eye.
An angled mirror 146 may be positioned at the proximal end of the rail 140. The angled mirror 146 may be configured to reflect the illumination beam 141 from the refractor light source 142 (e.g., IR or NIR light source from the refractor apparatus 138) to the heat mirror 148.
The heat mirror 148 may be positioned at or beyond the proximal end of the motorized stage 122. The heat mirror 148 may be positioned between the near display assembly 106 and the subject. The heat mirror 148 may be positioned vertically above the angled mirror 146 (vertically above in the Y direction as shown in fig. 1). The heat mirror 148 may be configured to selectively reflect the illumination beam toward the subject's eye.
The heat mirror 148 may also be configured to reflect an image of the illuminated eye or IR/NIR light emitted from the pupil of the eye toward the angled mirror 146. In some embodiments, the heat mirror 148 may be a dichroic filter or be made in part of a dichroic material.
The angled mirror 146 may be configured to reflect an image of the illuminated eye or IR/NIR light emitted from the pupil of the eye back toward a refractor camera 144 (e.g., an infrared camera or detector within a photo refractor).
One technical problem faced by applicant is how to position or align refractor device 138 with respect to the subject's eye such that the refractor device 138 does not obstruct the subject's line of sight 126 and/or distract the subject when the subject attempts to focus on the stimulation target 200. One solution applicant has discovered and developed is the arrangement disclosed herein, in which refractor device 138 is positioned vertically below the subject's line of sight 126 (see, e.g., figs. 1 and 2) and both the angled mirror 146 and the heat mirror 148 are used to optically align refractor device 138 with the subject's eye under test. The subject may view the stimulation target 200 through the heat mirror 148.
The system 100 may determine the accommodation response of the eye based on the refractive state of the eye measured by the refractor device 138. More specifically, the system 100 can measure real-time changes in the refractive state of the eye as the subject is presented with stimulus targets 200 at different distances from the eye. For example, the controller 128 may be programmed to instruct the stimulation target 200 to appear on the far display 108 for a first duration and then on the near display 116 for a second duration. The stimulation target 200 displayed on the near display 116 may be projected onto the first beam splitter 118 for viewing by the subject.
In some embodiments, the first duration and the second duration may be a period of time lasting between about 5 seconds and 15 seconds (e.g., 10 seconds). In other embodiments, at least one of the first duration and the second duration may be less than 5 seconds or greater than 15 seconds.
Refractor device 138 may determine the refractive state of the eye while the subject is focused, or attempting to focus, on the stimulation target 200 during the first and second durations. Refractor device 138 may transmit the measured refractive states to controller 128.
The controller 128 may then determine an accommodation response of the eye based in part on the refractive states. As will be discussed in more detail in the following paragraphs, the controller 128 may first use a calibration factor 908 to scale or adjust the refractive states (see, e.g., fig. 9). The controller 128 may also calculate a first mean refractive state from the refractive states measured during the first duration and a second mean refractive state from the refractive states measured during the second duration. Finally, the controller 128 may use the first and second mean refractive states to determine the accommodation response.
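The controller's computation described above can be sketched as follows. The sign convention (accommodation reported as a positive dioptric change from the far condition to the near condition) and all names are illustrative assumptions:

```python
def mean_refraction(states_d, calibration_factor=1.0):
    """Scale each refractive-state sample (diopters) by the calibration
    factor, then average over the viewing interval."""
    scaled = [calibration_factor * s for s in states_d]
    return sum(scaled) / len(scaled)

def accommodation_response(far_states_d, near_states_d, calibration_factor=1.0):
    """Accommodative response = mean refraction while viewing the far
    target minus mean refraction while viewing the near target."""
    far_mean = mean_refraction(far_states_d, calibration_factor)
    near_mean = mean_refraction(near_states_d, calibration_factor)
    return far_mean - near_mean

# At 50 Hz, a 10 s interval would yield ~500 samples per condition;
# three are shown here for brevity:
far_samples = [0.1, -0.1, 0.0]     # eye near plano viewing the far target
near_samples = [-2.4, -2.5, -2.6]  # eye accommodating to the near target
response = accommodation_response(far_samples, near_samples)  # 2.5 D
```

Averaging over the full interval suppresses the sample-to-sample noise of the 20 ms measurements before the far/near difference is taken.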
Fig. 2 illustrates that the system 100 may further include a second beam splitter 150 (see also fig. 4) positioned in the line of sight 126 of the eye under test, between the near display assembly 106 and the far display 108. The second beam splitter 150 may be fastened or otherwise coupled to a raised platform 152 positioned at the distal end of the support assembly 124.
For example, the raised platform 152 may be a flat surface positioned at the top of a post that serves as one of the legs of the support assembly 124. In other embodiments, the raised platform 152 may be a separate structure from the support assembly 124.
The system 100 may also include an alignment camera 202 secured or otherwise coupled to the raised platform 152. The alignment camera 202 may be positioned laterally offset (e.g., offset in the X-direction shown in fig. 2) from the line of sight 126 of the eye. For example, the camera lens of the alignment camera 202 may be positioned substantially perpendicular to the line of sight 126 of the eye.
When the subject is properly positioned at the chin rest assembly 102, the second beam splitter 150 may reflect an image of the subject's eye toward the alignment camera 202. The second beam splitter 150 may be positioned at an angle (e.g., about 45 degrees) relative to the line of sight 126 of the eye. This may allow the second beam splitter 150 to reflect an image of the eye to the alignment camera 202 while allowing the subject to view the stimulus target 200 on the far display 108.
The alignment camera 202 may be configured to capture real-time images 806 of the eye (see, e.g., figs. 8, 9, and 10). The operator may use the images captured by the alignment camera 202 to determine whether the eye is properly aligned (i.e., optically aligned) with the various components of the system 100. For example, the operator may use the images captured by the alignment camera 202 to determine whether the eye is properly aligned with refractor device 138.
One technical problem faced by applicant is how to capture an image of the subject's eye during an accommodation procedure in such a way that the camera does not obstruct the subject's line of sight 126 and/or distract the subject when the subject attempts to focus on the stimulation target 200. One solution applicant has discovered and developed is the alignment camera 202 disclosed herein (see, e.g., figs. 2 and 3), which is positioned substantially perpendicular to and laterally offset from the subject's line of sight 126 (see, e.g., fig. 1). The alignment camera 202 utilizes the second beam splitter 150, positioned at an angle relative to the alignment camera 202 and the subject's line of sight 126, to reflect an image of the subject's eye to the alignment camera 202. The subject may view the stimulation target 200 displayed on the far display 108 through the second beam splitter 150.
The alignment camera 202 may also be configured to transmit a real-time image 806 of the eye to the controller 128. The alignment camera 202 may be communicatively coupled with the controller 128.
As previously discussed, the controller display 136 may be communicatively coupled with the controller 128. The controller 128 may generate one or more graphical user interfaces (GUIs) (e.g., an alignment GUI 800 (see fig. 8), a calibration GUI 900 (see fig. 9), or an accommodation GUI 1000 (see fig. 10)) to be displayed on the controller display 136. The GUI may display a real-time image 806 of the eye captured by the alignment camera 202. As will be discussed in more detail in the following paragraphs, each of these GUIs may further illustrate a fixed reticle pattern 804 (see, e.g., figs. 8, 9, and 10) overlaid on the real-time image 806 of the eye.
The operator may determine whether the subject's eyes are optically aligned with certain components of the refractor apparatus 138 (e.g., the refractor light source 142 and the refractor camera 144) based on whether the GUI shows anatomical features of the eyes within at least a portion of the fixed reticle pattern 804. The anatomical features of the eye may be anatomical structures or components of the eye selected based on their visibility and their location relative to the pupil. In one exemplary embodiment, the anatomical feature of the eye may be the limbus of the eye. In other embodiments, the anatomical feature of the eye may be the outer boundary of the iris.
The fixed reticle pattern 804 may include a number of circular shapes or voids. For example, the operator may adjust at least one of the chin rest assembly 102 or the subject's head until the limbus of the eye is within (or surrounded by) at least one of the circular shape or the void of the fixed reticle pattern 804.
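The "limbus within the reticle" criterion amounts to a circle-containment test, which can be sketched as follows (detecting and fitting the limbus circle is assumed to have happened upstream; all names are illustrative):

```python
import math

def circle_within(inner_cx, inner_cy, inner_r, outer_cx, outer_cy, outer_r):
    """True when the inner circle (e.g., a detected limbus) lies entirely
    inside the outer circle (e.g., a reticle void)."""
    dx = inner_cx - outer_cx
    dy = inner_cy - outer_cy
    center_offset = math.hypot(dx, dy)
    return center_offset + inner_r <= outer_r

# A centered limbus fits inside a slightly larger reticle ring;
# the same limbus shifted 2 units off-axis does not:
aligned = circle_within(0.0, 0.0, 5.75, 0.0, 0.0, 6.5)   # True
off_axis = circle_within(2.0, 0.0, 5.75, 0.0, 0.0, 6.5)  # False
```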
In some embodiments, the fixed reticle 804 may correspond to the center of the camera lens of the alignment camera 202. In other embodiments, fixed reticle 804 may correspond to another reference point aligned with refractor device 138 and the stimulation target 200 to be displayed. As will be discussed in later paragraphs, several pre-alignment steps may be taken to ensure that the alignment camera 202 is optically aligned with refractor device 138, with the stimulation target 200 displayed on the far display 108, and with the first beam splitter 118.
Fig. 1 also illustrates that the subject may view the stimulation target 200 (see, e.g., fig. 5B) displayed on the remote display 108 through the heat mirror 148, the first beam splitter 118, and the second beam splitter 150. As previously discussed, both the first beam splitter 118 and the second beam splitter 150 are angled to allow the subject to see through these beam splitters.
The system 100 may also include a far alignment light source 400 (see, e.g., fig. 4) positioned at the distal end of the support assembly 124. In some embodiments, the far alignment light source 400 may be positioned on the same raised platform 152 as the second beam splitter 150 and the alignment camera 202. As will be discussed in more detail in the following paragraphs, the far alignment light source 400 may project the optical marker 500 onto the far display 108 via the second beam splitter 150.
The light emitted by the far alignment light source 400 may be light in the visible spectrum (e.g., a colored laser, such as a red laser). For example, the far alignment light source 400 may be a laser pointer. The optical marker 500 may be used to determine where the stimulation target 200 is displayed on the far display 108. For example, the optical marker 500 may be a red laser spot. The far alignment light source 400 may be used to ensure that the stimulation target 200 displayed on the far display 108 is aligned with the line of sight 126 of the eye.
FIG. 1 also illustrates that the system 100 may include an input device 154. The input device 154 may be configured to receive user input from the subject regarding the stimulation target 200 displayed on the near display 116, the far display 108, or a combination thereof.
In some embodiments, the input device 154 may be a joystick (see, e.g., fig. 6). In other embodiments, the input device 154 may be a touch pad configured to receive touch input, a keyboard configured to receive key strokes, or a computer mouse configured to receive mouse clicks. The input device 154 may be another type of computer input device known to those skilled in the art.
In some embodiments, the controller 128 may be programmed to instruct the stimulation target 200 to appear on the far display 108 in a plurality of first rotational orientations (see fig. 6 for an example of a different rotational orientation) for a first duration. In response to the displayed stimulation target 200, the controller 128 may receive user input (e.g., joystick movement) from the input device 154 corresponding to the first rotational orientation. The controller 128 may be further programmed to instruct the stimulation target 200 to appear on the near display 116 (which is then projected onto the first beam splitter 118) in a plurality of second rotational orientations for a second duration. In response to the displayed stimulation target 200, the controller 128 may receive user input (e.g., joystick movement) from the input device 154 corresponding to the second rotational orientation.
As will be discussed in more detail in the following paragraphs, the stimulation target 200 may be the light letter "E" displayed on a dark background. When the light letter "E" is displayed on the far display 108 and then on the near display 116, it may be rotated in different directions (such that the letter "E" appears to tumble). In this example, the input device 154 may be a joystick. The subject may be instructed to coordinate the subject's joystick movement (e.g., push forward, pull backward, push left, or push right) with the rotational orientation of the displayed letter "E." The controller 128 may then match the joystick movements to the rotational orientations to ensure that the subject is engaged and focused during the assessment.
Refractor device 138 may determine the refractive state of the eye while the subject is focused, or attempting to focus, on the rotating stimulation target 200 during the first and second durations. Refractor device 138 may transmit the measured refractive states to controller 128. The controller 128 may then determine an accommodation response of the eye based in part on the refractive states.
Fig. 3A shows a perspective view of refractor alignment light source 300. When initially setting up system 100, an operator may use refractor alignment light source 300 to align refractor apparatus 138 with alignment camera 202 in a pre-alignment procedure.
Refractor alignment light source 300 may be coupled to the rail 140 below the support assembly 124 (see, e.g., figs. 1 and 2). The refractor alignment light source 300 may be positioned between refractor device 138 and the angled mirror 146. Refractor alignment light source 300 may be coupled to the rail via a hinged connector. This allows the refractor alignment light source 300 to be folded or otherwise rotated downward when not in use (i.e., such that the refractor alignment light source 300 is parallel to the rail 140) so that it does not block the refractor light source 142 (e.g., the IR/NIR light source) of refractor device 138.
Refractor alignment light source 300 may emit a refractor alignment light beam 302. The refractor alignment light beam 302 may be a visible light (e.g., green laser) beam directed toward the angled mirror 146 (see, e.g., figs. 1 and 2). The refractor alignment light source 300 may be sized and/or positioned such that the refractor alignment light beam 302 simulates the illumination beam 141 emitted by the refractor light source 142, with the beam path of the refractor alignment light beam 302 aligned with the beam path of the illumination beam 141.
When the operator's chin rests on chin rest 104 (see, e.g., figs. 1 and 2), the angled mirror 146 and heat mirror 148 may steer or otherwise direct the refractor alignment light beam 302 to the operator's eyes. If the operator does not see the refractor alignment light beam 302, the operator may adjust at least one of the angled mirror 146 and the heat mirror 148.
When the refractor alignment light beam 302 (e.g., a green laser) is visible to the operator (e.g., as a green laser spot on the heat mirror 148), the same operator or another operator may then adjust the alignment camera 202 (or the combination of the alignment camera 202 and the second beam splitter 150) until the refractor alignment light beam 302 is aligned with the center of the camera lens of the alignment camera 202. As previously discussed, the center of the camera lens of the alignment camera 202 may correspond to the fixed reticle pattern 804 (see, e.g., figs. 8, 9, and 10) overlaid on the real-time image 806 of the eye captured by the alignment camera 202. The real-time image 806 of the eye and the fixed reticle 804 may be shown on a graphical user interface displayed on the controller display 136.
The same operator or another operator may examine the graphical user interface to see whether the refractor alignment light beam 302 (e.g., shown as a laser spot, such as a green laser spot) is aligned with the center of the fixed reticle 804 or surrounded by the fixed reticle 804. To this end, the same operator or another operator may sit where the subject typically sits, with the operator's chin resting on chin rest 104. Alternatively, an opaque or reflective object or surface may be held behind the chin rest 104. When the refractor alignment light beam 302 is aligned with the center of the fixed reticle 804, this may indicate that the alignment camera 202 is optically aligned with refractor device 138. Once the alignment camera 202 is aligned with refractor device 138, the clinician need only confirm that an anatomical feature of the subject's measured eye (e.g., the limbus of the subject's eye) is aligned with the fixed reticle pattern 804 to ensure that the measured eye is optically aligned with the alignment camera 202 and refractor device 138.
As part of the pre-alignment procedure, the operator may also cause the controller 128 to instruct the near display 116 and the far display 108 to display the stimulation target 200 (one display at a time) while the refractor alignment light source 300 is turned on. The same operator or another operator may then adjust the positioning of at least one of the first beam splitter 118 (or the first beam splitter 118 and near display 116) and the far display 108 until the stimulation target 200 displayed on each of the first beam splitter 118 and the far display 108 is aligned with the refractor alignment light beam 302. In addition, the operator may also input certain commands into the controller 128 to cause the controller 128 to adjust where the stimulation target 200 is displayed on the near display 116 and/or the far display 108.
Figs. 3B and 3C illustrate examples of alignment of the stimulation target 200 with the refractor alignment light beam 302 (shown as a laser spot). The stimulation target 200 may be considered aligned with the refractor alignment light beam 302 when the laser spot representing the beam is seen at about the same location on each stimulation target 200 (e.g., when the stimulation target 200 is the letter "E," the laser spot can be seen at the midline of the vertical segment of the letter "E"). This may ensure not only that the two displayed stimulation targets 200 are aligned with each other, but also that the stimulation targets 200 are displayed along an axis aligned with the viewer's line of sight 126 when the viewer's eye (the viewer being an operator or subject) is aligned with the fixed reticle 804.
As will be discussed in more detail in the following paragraphs, the far alignment light source 400 (see, e.g., fig. 4) may also be used to facilitate alignment of the stimulation target 200 on the far display 108. Because the clinician or operator may move the far display 108 after or between test sessions (while the first beam splitter 118 and the near display 116 are fixed to the near display frame 120 and are less likely to be moved between test sessions), the far alignment light source 400 may ensure that the stimulation target 200 displayed on the far display 108 is aligned with the stimulation target 200 projected on the first beam splitter 118 and with other components of the system 100.
Fig. 4 illustrates a top-down perspective view of a portion of system 100 showing the far alignment light source 400. The far alignment light source 400 may be positioned at the distal end of the support assembly 124. In some embodiments, the far alignment light source 400 may be positioned on the same raised platform 152 as the second beam splitter 150 and the alignment camera 202.
The far aligned light source 400 may emit a far aligned light beam 402. The far-aligned beam 402 may be a beam in the visible spectrum (e.g., a colored laser, such as a red laser). For example, the far-alignment light source 400 may be a laser pointer (e.g., a red laser pointer).
The far alignment light beam 402 may ultimately be directed to the far display 108. The far alignment light beam 402 may be used as an optical marker 500 (see, e.g., fig. 5A) to mark the point on the far display 108 where the stimulation target 200 should be displayed. For example, a clinician or operator of the system 100 may use the optical marker 500 to determine the area or place on the far display 108 where the stimulation target 200 should be displayed. As a more specific example, the optical marker 500 may be a red laser spot.
As previously discussed, the far-alignment light source 400 may be used to ensure that the stimulus target 200 displayed on the far display 108 is aligned with the stimulus target 200 projected on the first beam splitter 118 and the line of sight 126 of the eye. The far-aligned light source 400 may be used as part of the final step in the pre-alignment procedure.
As shown in fig. 4, the far alignment light source 400 may face the alignment camera 202. The far alignment light source 400 may be positioned such that the far alignment light beam 402 is aligned with the center of the camera lens of the alignment camera 202 (e.g., when the far alignment light source 400 is turned on, a red laser spot is visible at the center of the camera lens of the alignment camera 202).
When the second beam splitter 150 is placed in position between the alignment camera 202 and the far alignment light source 400, at least a portion of the far alignment light beam 402 may be reflected or diverted by the second beam splitter 150 to the far display 108 distal to the platform 152. Since the second beam splitter 150 is already positioned at a 45 degree angle relative to the alignment camera 202, the far alignment light beam 402 may also strike the second beam splitter 150 at a 45 degree angle (because the far alignment light beam 402 is aligned with the center of the camera lens of the alignment camera 202). When the far alignment light source 400 is aligned in this manner, the far alignment light beam 402 reaching the far display 108 should also be aligned with the line of sight 126 (see, e.g., fig. 1) of the subject's eye and with the stimulation target 200 on the first beam splitter 118.
One technical advantage of the arrangement shown in fig. 4 is that the far alignment light source 400 and the alignment camera 202 share the same beam splitter (i.e., they both use the second beam splitter 150), although each uses the beam splitter for a different purpose. For example, the far alignment light source 400 uses the second beam splitter 150 to direct the far alignment light beam 402 to the far display 108 as part of a pre-alignment step, while the alignment camera 202 uses the second beam splitter 150 to capture an image of the subject's eye during a procedure (e.g., an alignment procedure, a calibration procedure, and/or an accommodation procedure).
Fig. 5A illustrates the far display 108 showing a plurality of possible regions 502 in which the stimulation target 200 may be displayed. For example, the far display 108 may be divided into 24 regions. The regions 502 may be displayed as squares formed by intersecting grid lines. It should be understood by one of ordinary skill in the art, and is contemplated by the present disclosure, that the number of regions 502 and the shape of such regions 502 may be adjusted based on the size of the far display 108 and the size and shape of the stimulation target 200.
As previously discussed, the far display 108 may be an LCD screen, and the regions 502 and the stimulation target 200 may be presented as graphics on the LCD screen.
The regions 502 may be displayed when the far alignment light beam 402 (see, e.g., fig. 4) is directed toward the far display 108 such that the optical marker 500 (e.g., a red laser spot) is projected into at least one of the regions 502. Once the optical marker 500 appears in a region 502, an operator or clinician may enter a command or apply user input to the controller 128 via an input device (e.g., a keyboard or mouse) to instruct the far display 108 to display the stimulation target 200 in that particular region 502 (in fig. 5A, the region labeled "3B"). For example, the operator or clinician may select the region number from a drop-down menu of all region numbers shown on the controller display 136.
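Mapping the optical marker's pixel coordinate to a region label can be sketched as below. The 6 x 4 = 24 grid and the "column number + row letter" labeling are assumptions inferred from the "3B" example in fig. 5A, not details stated in the disclosure:

```python
def region_label(x_px, y_px, screen_w_px, screen_h_px, cols=6, rows=4):
    """Return a grid-cell label such as '3B' for the cell containing
    the spot at pixel coordinate (x_px, y_px)."""
    col = min(int(x_px * cols / screen_w_px), cols - 1) + 1  # 1-based number
    row = min(int(y_px * rows / screen_h_px), rows - 1)      # letter index
    return f"{col}{chr(ord('A') + row)}"

# A spot left of center in the second row of a 1920x1080 screen:
label = region_label(800, 400, 1920, 1080)  # '3B'
```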
In the case where the optical marker 500 is present on a boundary between two regions 502 or a portion of the optical marker 500 is present in an adjoining region 502, the remote display 108 may be moved until the optical marker 500 is enclosed in only one region 502.
Fig. 5B illustrates that once a region 502 is selected, the stimulation target 200 may be displayed in that region 502. In some embodiments, the stimulation target 200 may be displayed such that the stimulation target 200 overlaps the optical marker 500 (shown in dashed lines in fig. 5B). In other embodiments, the stimulation target 200 may be displayed such that the optical marker 500 appears at the center of the stimulation target 200 or at the midline of a vertical or horizontal element of the stimulation target 200. The graphics showing the regions 502 may then be hidden when the stimulation target 200 is displayed.
The foregoing steps of using the far alignment light source 400 and the displayed regions 502 to align the stimulation target 200 on the far display 108 may be necessary when the far display 108 is moved after or between test sessions. For example, the far display 108 may be an LCD screen secured to a movable mount (e.g., a wheeled mount). The far display 108 may need to be moved after a session to allow the clinician to reconfigure the clinical office to run other ophthalmic tests.
One technical problem faced by applicant is how to prevent a clinician or another operator of the system 100 from having to realign other components of the system 100 each time the far display 108 is moved (e.g., to make room for other test equipment in the clinician's office). One technical solution applicant has discovered and developed is a quick alignment procedure involving the far alignment light source 400 and the far display 108 displaying a plurality of possible regions 502. As previously discussed, the stimulation target 200 may be displayed within the region 502 surrounding the optical marker 500 projected by the far alignment light source 400 onto the far display 108. Using these components and the methods disclosed herein, a clinician can quickly find where to display an aligned instance of the stimulation target 200 on the far display 108, even if the far display 108 was previously moved.
Fig. 6 illustrates that a joystick used as input device 154 may be controlled by a subject to match joystick movement to the rotational orientation 602 of the displayed stimulation target 200. Joystick movement may be an example of user input 600.
The controller 128 may be programmed to instruct the stimulation target 200 to appear on the display (either the far display 108 or the near display 116) in a plurality of rotational orientations 602. For example, the controller 128 may instruct these displays to show the stimulation target 200 as facing up, facing left, facing right, or facing down. In response to the displayed stimulation target 200, the subject may apply user input 600 (e.g., joystick movement) to the input device 154 corresponding to the rotational orientation 602.
In some embodiments, the stimulation target 200 may be rotated in a random pattern (i.e., not a series of predictable clockwise or counterclockwise rotations). In other embodiments, stimulation target 200 may be rotated in part in a set pattern (e.g., a clockwise or counter-clockwise rotation pattern) and in part in a random pattern.
For example, as shown in fig. 6, the stimulus target 200 may be a light letter "E" displayed on a dark background. When the light letter "E" is displayed on the display (either the far display 108 or the near display 116) for a period of time, it may be rotated in different directions (such that the letter "E" appears to scroll). In this example, the subject may coordinate the subject's joystick movement with the rotational orientation of the displayed letter "E". More specifically, a forward pushing motion applied to the joystick may match the example shown with the letter "E" facing upward, a backward pulling motion may match the example shown with "E" facing downward, a right pushing motion may match the example shown with "E" facing right (normal "E"), and a left pushing motion may match the example shown with "E" facing left (mirror image "E").
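The joystick-to-orientation correspondence described above can be sketched as a simple lookup. The following is an illustrative Python sketch, not code from the patent; all names are hypothetical.

```python
# Hypothetical sketch of the joystick-to-orientation matching described
# above. The mapping follows the text: push forward = "E" facing up,
# pull back = facing down, push right = normal "E" (facing right),
# push left = mirror-image "E" (facing left).
JOYSTICK_TO_ORIENTATION = {
    "forward": "up",
    "backward": "down",
    "right": "right",   # normal "E"
    "left": "left",     # mirror-image "E"
}

def input_matches_orientation(joystick_move, displayed_orientation):
    """Return True if the subject's joystick movement corresponds to the
    rotational orientation of the displayed stimulation target."""
    return JOYSTICK_TO_ORIENTATION.get(joystick_move) == displayed_orientation
```

In practice the controller would compare each received joystick event against the orientation that was on screen when the event arrived.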
Obtaining user input 600 from the subject in the form of joystick movements is one way to ensure that the subject is engaged in the accommodation test and remains attentive during the test session. For example, continuously rotating the stimulation target 200 may force the subject to remain alert and to continuously hold, or attempt to hold, the stimulation target 200 in focus.
In some embodiments, user input 600 received from the subject may be assessed and matched against the rotational orientation 602 of the stimulation target 200. In these embodiments, if enough of the subject's user inputs 600 fail to match the rotational orientation 602 of the displayed stimulation target 200, the test session may be stopped or aborted. In some embodiments, a threshold may be set such that the test session is stopped or aborted once a predetermined number of mismatches is detected (i.e., a maximum threshold is reached).
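The mismatch-threshold behavior described above can be sketched as follows. This is an illustrative Python sketch under assumed conventions (each response reduced to a matched/not-matched flag, and an example threshold of 3); the patent does not specify a threshold value.

```python
def session_should_abort(responses, max_mismatches=3):
    """Given a matched/not-matched flag for each displayed orientation,
    return True once the number of mismatches reaches the maximum
    threshold, i.e., the test session should be stopped or aborted.
    The default threshold of 3 is an illustrative assumption."""
    mismatches = 0
    for matched in responses:
        if not matched:
            mismatches += 1
            if mismatches >= max_mismatches:
                return True
    return False
```

A system that merely encourages matching (the non-counting embodiment) would simply never call such a check.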
In other embodiments, the system 100 does not count the user input 600, but still encourages the subject to match the user input 600 with the rotational orientation 602 to maintain the subject's participation during the test session.
One technical problem faced by the applicant is how to keep the subject alert and engaged during the accommodation assessment so that the subject makes enough of an effort to keep the stimulation target 200 in focus at all times. One technical solution that the applicant has discovered and developed is to continuously rotate the stimulation target 200 and ask the subject to match the user input 600 (e.g., joystick movement) applied to the input device 154 with the rotational orientation 602 of the stimulation target 200. In this way, the subject continuously makes an effort throughout the test session to focus on the stimulation target 200 in order to see the different rotational orientations 602.
Fig. 7 is a flowchart showing one embodiment of a method 700 for objectively assessing eye accommodation using the components of the system 100 disclosed herein. For example, certain steps of method 700 may be performed by controller 128 of fig. 1. The various outputs from method 700 may also be presented and displayed through a Graphical User Interface (GUI) as shown in fig. 8-10. It will also be appreciated by those of ordinary skill in the art that some method steps may be omitted for brevity and that method 700 need not be applied in the particular order shown and described herein.
The method 700 may begin with several pre-alignment steps 702. The method may then proceed to a series of alignment steps 704 (also referred to as an alignment procedure 704), calibration steps 706 (also referred to as a calibration procedure 706), and accommodation assessment steps 708 (also referred to as an accommodation procedure 708).
The pre-alignment steps 702 may involve aligning certain components of the system 100 with one another. For example, the pre-alignment steps 702 may include using the refractor alignment light source 300 to align the refractor device 138 with the alignment camera 202 (see, e.g., figs. 1, 2, and 3). If an operator of the system 100 (e.g., a clinician or technician) does not see the refractor alignment light source 300 while seated where the subject is to be seated, the operator may adjust at least one of the angled mirror 146 and the hot mirror 148 (see, e.g., figs. 1 and 2). When the refractor alignment beam 302 (e.g., a green laser) is visible to the operator (e.g., as a green laser spot on the hot mirror 148), the same operator or another operator may then adjust the alignment camera 202 (or the combination of the alignment camera 202 and the second beam splitter 150) until the refractor alignment beam 302 is aligned with the center of the camera lens of the alignment camera 202.
In some embodiments, the center of the camera lens of the alignment camera 202 may correspond to a fixed reticle 804 that is presented overlaid on a real-time or near real-time image 806 of the eye captured by the alignment camera 202. The fixed reticle graphic 804 and the real-time image or near real-time image 806 of the eye may be displayed as part of various Graphical User Interfaces (GUIs) (e.g., alignment GUI 800, calibration GUI 900, and adjustment GUI 1000) shown to the operator/clinician on the controller display 136 (see, e.g., fig. 8, 9, and 10).
Further, the pre-alignment step 702 may include an operator adjusting the positioning of at least one of the first beam splitter 118 (or the first beam splitter 118 and the near display 116) and the far display 108 until the stimulus targets 200 displayed on each of the first beam splitter 118 and the far display 108 are axially aligned with each other and with the line of sight 126 of the viewer when the eyes of the viewer are aligned with the fixed reticle pattern 804. In this case, the observer may be an operator or another individual that helps the operator to align the equipment.
Further, the operator may adjust the positioning of at least one of the first beam splitter 118 (or the first beam splitter 118 and the near display 116) and the far display 108 until the stimulation target 200 displayed on each of the first beam splitter 118 and the far display 108 is axially aligned with the refractor alignment beam 302. For example, the stimulation targets 200 may be considered aligned with the refractor alignment beam 302 (and with each other) when the laser spot representing the refractor alignment beam 302 is seen at about the same location on each stimulation target 200 (e.g., when the stimulation target 200 is the letter "E", the laser spot may be seen at the midline of the vertical segment of the letter "E") (see, e.g., figs. 3B and 3C).
The pre-alignment steps 702 may also include using the far alignment light source 400 (e.g., a red laser pointer; see fig. 4) to determine where the stimulation target 200 should be displayed on the far display 108. This may be done initially as part of the equipment alignment procedure and then again each time the clinician or operator moves the far display 108.
For example, the pre-alignment steps 702 may further include displaying or presenting on the far display 108 a plurality of possible areas 502 (see, e.g., fig. 5A) in which the stimulation target 200 may be displayed. The pre-alignment steps 702 may further include pointing the far alignment beam 402 toward the far display 108 using the far alignment light source 400. This may cause the optical marker 500 (e.g., a red laser spot) to be projected into at least one of the areas 502 (see, e.g., fig. 5A). Once the optical marker 500 appears in an area 502, an operator or clinician may enter a command or apply a user input to the controller 128 via an input device (e.g., a keyboard or mouse) to instruct the far display 108 to display the stimulation target 200 in that particular area 502 (see, e.g., fig. 5B). Once the area 502 is selected, the stimulation target 200 may be displayed in the selected area 502.
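Selecting the area 502 that contains the projected optical marker 500 amounts to a point-in-box lookup. The following Python sketch is illustrative only; the area identifiers and pixel bounding boxes are assumptions, not values from the patent.

```python
def region_containing_marker(marker_xy, regions):
    """Return the identifier of the candidate area whose bounding box
    contains the projected optical marker, or None if the marker fell
    outside every area. `regions` maps a hypothetical area id to an
    (x0, y0, x1, y1) pixel bounding box on the far display."""
    x, y = marker_xy
    for area_id, (x0, y0, x1, y1) in regions.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return area_id
    return None
```

In the described workflow the operator confirms the selection manually; a controller could equally use a camera-detected marker position with a lookup like this.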
In some embodiments, the pre-alignment step 702 may include displaying the stimulation target 200 such that the stimulation target 200 overlaps the optical marker 500. In other embodiments, the pre-alignment step 702 may include displaying the stimulation target 200 such that the optical marker 500 appears at the center of the stimulation target 200 or at the midline of a vertical or horizontal element of the stimulation target 200. Then, when the stimulation target 200 is displayed, the graphics showing these possible areas 502 may be hidden.
In some embodiments, all of the pre-alignment steps 702 described thus far may be performed without involving the subject (e.g., an operator of the system 100 may perform these steps). In other embodiments, at least some of the pre-alignment steps 702 may involve a subject.
Once the pre-alignment step 702 is performed, the method 700 may proceed to an alignment step 704. From this point on, method 700 involves the subject.
Alignment step 704 may begin with one or more initialization steps 710, such as an operator or clinician invoking alignment GUI 800 (see, e.g., fig. 8) on controller display 136 and activating or turning on various components of system 100, such as alignment camera 202 and refractor apparatus 138.
The subject may then be instructed to place the subject's chin on chin rest 104 and open both eyes to look toward far display 108.
The initialization steps 710 may then include an operator or clinician checking a display window 808 on the alignment GUI 800 to ensure that an anatomical feature of the subject's measured eye (e.g., the limbus of the eye) is located within at least a portion of the fixed reticle graphic 804 (e.g., the central circular shape of the fixed reticle graphic 804) shown in the display window 808. If this is the case, the subject's eye is considered aligned with the system 100, and the remainder of the alignment procedure 704 may begin at step 712.
Step 712 may include displaying certain visual stimuli to the subject and obtaining measurements of the pupil diameter 812 and gaze displacement 814 of the subject's eye from the refractor device 138 (see, e.g., fig. 8). For example, the controller 128 may be programmed to instruct the far display 108 to display the stimulation target 200 for a first adjustment duration and to instruct the near display 116 to display the stimulation target 200 (which is then projected onto the first beam splitter 118 for viewing by the subject) for a second adjustment duration.
In some embodiments, the first adjustment duration may be shorter than the amount of time for which the stimulation target 200 is displayed on the far display 108 during the actual accommodation procedure 708. For example, the first adjustment duration and the second adjustment duration may each be approximately 5 seconds, while during the actual accommodation procedure 708 the stimulation target 200 is displayed on each of the far display 108 and the near display 116 for 10 seconds.
The controller 128 may also be programmed to instruct the stimulation target 200 to appear (on the far display 108 or the near display 116) in a plurality of rotational orientations 602 (e.g., any combination of upward, leftward, rightward, or downward facing). In response to the displayed stimulation target 200, the subject may be instructed to apply user input 600 (e.g., joystick movement) to the input device 154 corresponding to the rotational orientation 602. In some embodiments, the stimulation target 200 may be rotated in a random pattern (i.e., not a series of predictable clockwise or counterclockwise rotations). In other embodiments, stimulation target 200 may be rotated in part in a set pattern (e.g., a clockwise or counter-clockwise rotation pattern) and in part in a random pattern. Rotating the stimulation target 200 and instructing the subject to apply the user input 600 may keep the subject engaged during the alignment procedure 704.
Before the stimulation target 200 is displayed, the near display assembly 106 (or at least one of the near display 116 and the first beam splitter 118) may be initially positioned at the stimulus position 110 furthest from the subject's eye (e.g., the 0.80-meter stimulus position 110). Once the stimulation targets 200 have been displayed in succession on the far display 108 and the near display 116, the near display assembly 106 can be moved to the next stimulus position 110 (e.g., the stimulus position 110 next closest to the subject's eye). This process may be repeated until the near display assembly 106 has been moved to the stimulus position 110 closest to the subject's eye. In this manner, the stimulation target 200 is presented in a way that mimics its presentation during the upcoming accommodation procedure 708, albeit in an abbreviated manner. During this time, the refractor device 138 measures the pupil diameter 812 and gaze displacement 814 of the subject's eye (see, e.g., fig. 8) and transmits such data to the controller 128.
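The stepping sequence described above (move stage, show far target, show near target, measure, repeat at the next position) can be sketched as a loop. This is an illustrative Python sketch; the callables stand in for hardware interfaces and the position list is an assumption (only the 0.80 m endpoint is from the text).

```python
def run_alignment_sweep(positions_m, move_stage, show_on_far, show_on_near, measure):
    """Step the near display assembly from the furthest stimulus position
    to the nearest. At each position, show the target on the far display
    and then on the near display, collecting refractor measurements
    (e.g., pupil diameter, gaze displacement) after each presentation."""
    samples = []
    for position in positions_m:          # e.g., [0.80, 0.50, 0.33, 0.25]
        move_stage(position)
        show_on_far()
        samples.extend(measure())
        show_on_near()
        samples.extend(measure())
    return samples
```

A controller would then check the collected samples against the predefined alignment criteria before advancing to calibration.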
Gaze displacement 814 may include a horizontal gaze displacement 816 and a vertical gaze displacement 818 (see, e.g., fig. 8). In some embodiments, the average pupil diameter or average gaze displacement may also be calculated by the controller 128.
Certain predefined criteria or thresholds may be set for the alignment measurements. Step 714 may include determining whether the subject's pupil diameter 812 and gaze displacement 814 meet the predefined criteria. The alignment procedure 704 may be configured such that the subject does not advance to the calibration stage (calibration procedure 706) unless the alignment measurements meet the predefined criteria.
The predefined criteria may include a maximum value for the gaze displacement 814 and a minimum value for the pupil diameter 812. In some embodiments, the maximum displacement threshold may be about 5 degrees or 4 degrees. In other embodiments, the maximum displacement threshold may be less than 4 degrees, such as 3 degrees or 2 degrees. The subject may be instructed to maintain focus on the stimulation target 200 and not let their gaze wander or drift. Further, a minimum diameter threshold may be set for the pupil diameter 812. In some embodiments, the minimum diameter threshold may be about 3.5 mm. In other embodiments, the minimum diameter threshold may be about 3.0 mm or 4.0 mm. If the subject's pupil diameter 812 is measured to be less than the minimum diameter threshold, the clinician may determine whether applying non-cycloplegic dilation drops could increase the subject's pupil size.
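The predefined criteria above reduce to two comparisons. The following Python sketch is illustrative; the default thresholds are example values taken from the ranges in the text, not fixed requirements.

```python
def alignment_criteria_met(pupil_diameter_mm, horizontal_gaze_deg,
                           vertical_gaze_deg,
                           min_pupil_mm=3.5, max_gaze_deg=4.0):
    """Apply the predefined alignment criteria: the pupil diameter must
    be at or above a minimum and both gaze-displacement components must
    be at or below a maximum (defaults are example values)."""
    return (pupil_diameter_mm >= min_pupil_mm
            and abs(horizontal_gaze_deg) <= max_gaze_deg
            and abs(vertical_gaze_deg) <= max_gaze_deg)
```

Per the text, failing this check would trigger repeating the alignment procedure rather than advancing to calibration.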
If the pupil diameter 812 and gaze displacement 814 of the subject meet the predefined criteria, the method 700 may proceed to a calibration phase or calibration step 706. If the pupil diameter 812 or gaze displacement 814 (horizontal gaze displacement 816 or vertical gaze displacement 818) of the subject does not meet the predefined criteria, the alignment procedure 704 may be repeated at least two more times before switching to the other eye of the subject.
The calibration procedure 706 is required because the subject's eye pigmentation and the ambient light in the test/clinical environment can affect the intensity of the light received and assessed by the refractor device 138. As such, the refractor device 138 must be calibrated for each subject and each time the subject returns to the test/clinical environment.
The calibration procedure 706 may begin with several pre-calibration steps 716. For example, the pre-calibration steps 716 may include an operator or clinician invoking the calibration GUI 900 (see, e.g., fig. 9) on the controller display 136 by applying a user input to the calibration tab 902. The pre-calibration steps 716 may also include an operator or clinician confirming that the measured eye is still aligned with the system 100 by examining the display window 808 on the calibration GUI 900 to ensure that the anatomical feature of the eye (e.g., the limbus of the eye) is located within at least a portion of the fixed reticle graphic 804 shown in the display window 808. The pre-calibration steps 716 may further include an operator or clinician placing an IR filter and a test lens of a particular diopter strength in front of the subject's eye.
Step 718 may include directing the subject to look at the far display 108 (with both eyes open) while the refractor device 138 measures the refractive state of the eye covered by the test lens. Once the controller 128 has received the measurement from the refractor device 138, a new test lens with a different diopter strength may be placed in front of the subject's eye and a new refraction measurement taken. These steps may be repeated until refraction measurements have been taken using test lenses of several different diopter strengths. In some embodiments, test lenses with the following diopter strengths may be used as part of the calibration procedure 706: +4.0 D, +3.0 D, +2.0 D, +1.0 D, 0.0 D, -1.5 D, and -2.5 D.
The controller 128 may then use the refraction measurements to determine a mean or average refraction value for each of the test lenses used. For example, the controller 128 may determine an average or mean refraction value for each test lens of a particular diopter strength or power.
The controller 128 may then plot the mean or average refraction values against the different diopter strengths of the test lenses. The controller 128 may then be programmed to fit a line 906 (see, e.g., fig. 9) to the plotted data points using a regression technique (e.g., a least-squares linear regression). Finally, the controller 128 may be programmed to calculate the slope of the line 906 for use as the calibration factor 908 (see, e.g., fig. 9).
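The slope-as-calibration-factor computation above can be sketched with an ordinary least-squares fit. This is an illustrative Python sketch, assuming a simple linear regression (the patent says only "regression techniques"); function and variable names are hypothetical.

```python
def calibration_factor(lens_diopters, mean_refraction_values):
    """Fit a least-squares line of mean measured refraction versus
    test-lens diopter strength and return its slope, which serves as
    the calibration factor (CF)."""
    n = len(lens_diopters)
    mean_x = sum(lens_diopters) / n
    mean_y = sum(mean_refraction_values) / n
    numerator = sum((x - mean_x) * (y - mean_y)
                    for x, y in zip(lens_diopters, mean_refraction_values))
    denominator = sum((x - mean_x) ** 2 for x in lens_diopters)
    return numerator / denominator
```

With the seven lens strengths listed in the text and one mean raw measurement per lens, this yields a single scalar CF for the subject and session.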
Step 720 may include determining whether the calibration factor 908 was successfully calculated. If the calibration factor 908 was successfully calculated from the slope of the regression-fitted line, the method 700 may proceed to the accommodation procedure 708. If the calibration factor 908 cannot be calculated (e.g., if a line cannot be fit to the data points), the calibration procedure 706 may be repeated at least two more times before the calibration procedure 706 is attempted on the subject's other eye.
The accommodation procedure 708 may begin with several pre-assessment steps 722. The pre-assessment steps may include an operator or clinician invoking the adjustment GUI 1000 (see, e.g., fig. 10) on the controller display 136 by applying a user input to the adjustment tab 1002. The pre-assessment steps 722 may also include the operator or clinician reconfirming that the measured eye is still aligned with the system 100 by examining the display window 808 on the adjustment GUI 1000 to ensure that the anatomical feature of the eye (e.g., the limbus of the eye) is located within at least a portion of the fixed reticle graphic 804 shown in the display window 808. The pre-assessment steps 722 may further include the operator or clinician removing the IR filter and test lens from in front of the subject's eye.
In step 724, the accommodation procedure 708 may further include translating the near display assembly 106 (including the near display 116 and the first beam splitter 118) to the stimulation location 110 furthest from the subject's eye (e.g., the 0.80-meter stimulation location 110). The near display assembly 106 may be translated automatically via a motorized stage 122 on a support assembly 124 (see, e.g., figs. 1, 2, and 4).
The accommodation procedure 708 may further include several stimulus-driven test steps 726 to assess the eye's accommodation. For example, the test steps 726 may include displaying the stimulation target 200 at the far display 108 for a first duration (e.g., about 10 seconds) while rotating the stimulation target 200 through the plurality of rotational orientations 602. The test steps 726 may also include displaying the stimulation target 200 at the near display 116 for a second duration (e.g., about 10 seconds) while rotating the stimulation target 200 through the plurality of rotational orientations 602. The subject may be instructed to focus on the rotating stimulation target 200 and to keep the stimulation target 200 as clear as possible.
The stimulation target 200 displayed by the near display 116 may be projected onto the first beam splitter 118 for viewing by the subject's eye. The refractor device 138 may measure the refractive state of the eye while the rotating stimulation target 200 is presented to the subject during the first and second durations. Further, during this same period, the controller 128 may receive user inputs 600 applied to the input device 154 by the subject attempting to match the user inputs 600 to the rotational orientations 602. For example, the user input 600 may be a joystick movement by the subject in the direction associated with the displayed rotational orientation 602 of the stimulation target 200.
Although rotating the stimulation target 200 is discussed and illustrated in fig. 6, the present disclosure contemplates, and one of ordinary skill in the art will appreciate, that the same accommodation procedure 708 may be performed without rotating the stimulation target 200.
In some embodiments, the refractor device 138 may use eccentric photorefraction to measure the refractive state of the eye. For example, the refractor device 138 may measure the refractive state of the eye by generating and directing an illumination beam 141 (e.g., an IR/NIR beam) toward the subject's eye (see, e.g., fig. 1). The illumination beam 141 may be generated by the refractor light source 142 and diverted to the subject's eye by the angled mirror 146 and the hot mirror 148 positioned above the angled mirror 146. The refractor device 138 may then use the refractor camera 144 to capture or detect light (e.g., IR/NIR light) reflected by the subject's eye in response to the illumination beam 141. Light reflected by the eye may be directed back toward the refractor device 138 via the hot mirror 148 and the angled mirror 146.
The controller 128 may obtain the measurements of the refractive state of the eye taken by the refractor device 138 during the first and second durations. The refractor device 138 may take measurements at a rate of 50 Hz, i.e., once every 20 ms. The test steps 726 may also include using the controller 128 to determine an accommodation response of the eye based in part on the refractive states obtained from the refractor device 138 (see, e.g., fig. 10).
In certain embodiments, the accommodation response of the eye is determined based in part on the refractive states obtained from the refractor device 138 and the user input 600. For example, user input 600 received from the subject may be assessed and matched against the rotational orientation 602 of the stimulation target 200. In these embodiments, if enough of the subject's user inputs 600 fail to match the rotational orientation 602 of the displayed stimulation target 200, the accommodation procedure 708 may be stopped or aborted. In some embodiments, a threshold may be set such that the accommodation procedure 708 is stopped or aborted only when a predetermined number of mismatches is detected (i.e., a maximum threshold is reached). In other embodiments, the system 100 does not count the user inputs 600 but still encourages the subject to match the user inputs 600 with the rotational orientations 602 to maintain the subject's engagement during the accommodation procedure 708.
In some embodiments, determining the accommodation response of the eye may include calculating average or mean refraction values from the measurements taken by the refractor device 138. For example, the controller 128 may calculate an average far refraction value (X_F) from all of the refraction measurements taken by the refractor device 138 during the first duration (when the stimulation target 200 is displayed at the far display 108). Likewise, the controller 128 may calculate an average near refraction value (X_N) from all of the refraction measurements taken by the refractor device 138 during the second duration (when the stimulation target 200 is displayed at the near display 116). The average far refraction value (X_F) and the average near refraction value (X_N) are raw, unscaled refraction measurements.
The controller 128 may then subtract the average near refraction value (X_N) from the average far refraction value (X_F) and multiply the result by the calibration factor (CF) 908 calculated in the calibration procedure 706 to obtain the accommodation response at that particular stimulus location 110 (the stimulus location 110 furthest from the eye) (see Formula I below).

Accommodation response = CF × (X_F − X_N)    (Formula I)
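Formula I can be sketched directly. This illustrative Python sketch assumes X_F and X_N are the unscaled mean refraction values defined above; the function name is hypothetical.

```python
def accommodation_response(mean_far_refraction, mean_near_refraction,
                           calibration_factor):
    """Formula I: accommodation response = CF * (X_F - X_N), where X_F
    and X_N are the raw (unscaled) mean refraction values measured while
    the subject viewed the far and near stimulation targets."""
    return calibration_factor * (mean_far_refraction - mean_near_refraction)
```

For example, with X_F = 0.2, X_N = -1.0, and CF = 2.5, the computed accommodation response is 2.5 × 1.2 = 3.0 (the values here are synthetic, chosen only to exercise the formula).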
Accommodation procedure 708 may further include determining in step 728 whether near display assembly 106 has reached its final stimulation location 110 (e.g., the stimulation location closest to the subject's eye). If the near display assembly 106 has not reached its final stimulation location 110, then the near display assembly 106 may be automatically translated to the next stimulation location 110 in step 730 and the test step 726 may be repeated at the new stimulation location 110 (i.e., the stimulation target 200 is displayed first on the far display 108 and then on the near display 116).
Once the near display assembly 106 has been moved to the final stimulation location 110 (e.g., the stimulation location 110 closest to the subject's eye, or the 4.0 D/0.25-meter stimulation location 110), the accommodation response calculated at each stimulation location 110 may be listed in an adjustment table 1010 included as part of the adjustment GUI 1000 (see, e.g., fig. 10).
The accommodation procedure 708 may further include determining, in step 732, whether another test cycle is desired. A cycle may refer to moving the near display assembly 106 through all of the stimulation locations 110 and determining an accommodation response at each stimulation location 110. In some embodiments, the accommodation procedure 708 may end when the system 100 has completed three cycles. In other embodiments, step 732 may include asking the clinician or operator (e.g., via a pop-up window displayed on the adjustment GUI 1000) whether another test cycle is desired. If another test cycle is desired, the near display assembly 106 is automatically moved back to the initial stimulation location 110 and the subject is notified that the test will begin again.
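The cycle structure described above (each cycle visits every stimulation location and records a response) can be sketched as a nested loop. This Python sketch is illustrative; `test_at_position` stands in for the per-position display/measure/compute sequence, and the three-cycle default follows the embodiment in the text.

```python
def run_test_cycles(stimulus_positions, test_at_position, cycles=3):
    """Run the requested number of test cycles. Each cycle visits every
    stimulus position and records the accommodation response computed
    there, producing one row per cycle for the adjustment table."""
    table = []
    for _ in range(cycles):
        table.append([test_at_position(p) for p in stimulus_positions])
    return table
```

The resulting rows correspond to the per-cycle entries that would populate the adjustment table 1010.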
When the last test cycle is complete, the entire method 700 may end. At this point, all of the calculated accommodation responses may be presented in the adjustment table 1010 of the adjustment GUI 1000. Furthermore, the accommodation responses may be used to generate a subject response curve 1202, and the subject response curve 1202 may be presented alongside an idealized response curve 1200 (see, e.g., fig. 12). In additional embodiments, patient data collected as part of the accommodation assessment may be made available for transmission to a cloud-based database or for printing.
FIG. 8 illustrates one embodiment of the alignment GUI 800 that may be displayed on the controller display 136 during the alignment procedure 704. An operator or clinician may access the alignment GUI 800 by selecting an alignment tab 802 on a main GUI presented to the operator or clinician for operating the system 100. In some embodiments, the main GUI may be displayed on the controller display 136.
As previously discussed, the alignment GUI 800 (along with the calibration GUI 900 and the adjustment GUI 1000 (see fig. 9 and 10, respectively)) may show a fixed reticle graphic 804 overlaid on a real-time image 806 of an eye captured by the alignment camera 202 and displayed via a display window 808.
An operator or clinician may determine whether the subject's eye is aligned with the refractor device 138 (including the refractor light source 142 and the refractor camera 144) based on whether the display window 808 shows the anatomical feature of the eye within at least a portion of the fixed reticle graphic 804. In addition, when the anatomical feature of the eye is located within at least a portion of the fixed reticle graphic 804, the line of sight 126 of the eye is axially aligned with the stimulation targets 200 displayed on the far display 108 and the first beam splitter 118.
For example, the anatomical feature of the eye may be an anatomical structure or component of the eye that is selected based on its visibility and its location relative to the pupil. In one exemplary embodiment, the anatomical feature of the eye may be the limbus of the eye. In other embodiments, the anatomical feature of the eye may be the outer boundary of the iris.
The fixed reticle pattern 804 may include a plurality of lines radially arranged around a central circular void. These radially arranged lines may define a central circular shape. The subject's eyes may be considered aligned when the anatomical features of the eyes, such as the limbus of the eyes, are within or surrounded by a central circular shape defined by these radially disposed lines. For example, an operator or clinician may adjust the chin rest assembly 102 (including the chin rest 104), the subject's head, or a combination thereof until the limbus of the subject's eye is within the central circular shape of the fixed reticle pattern 804.
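The "limbus within the central circular shape" criterion can be expressed as a circle-containment test. This Python sketch is illustrative only: it assumes the limbus and reticle are modeled as circles in image pixel coordinates, which the patent does not specify.

```python
import math

def limbus_within_reticle(limbus_center_px, limbus_radius_px,
                          reticle_center_px, reticle_radius_px):
    """Treat the eye as aligned when the limbus circle lies entirely
    inside the central circular shape of the fixed reticle graphic:
    center distance plus limbus radius must not exceed the reticle
    radius."""
    dx = limbus_center_px[0] - reticle_center_px[0]
    dy = limbus_center_px[1] - reticle_center_px[1]
    return math.hypot(dx, dy) + limbus_radius_px <= reticle_radius_px
```

In the described system this judgment is made visually by the operator; a controller performing it automatically would need an image-based estimate of the limbus circle.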
As previously discussed, the fixed reticle graphic 804 may correspond to the center of the camera lens of the alignment camera 202. Because the alignment camera 202 has been optically aligned with the refractor device 138 (including the refractor light source 142 and the refractor camera 144) through the series of pre-alignment steps, the subject's eye is considered aligned with the refractor device 138 when the anatomical feature of the eye (e.g., the limbus) is located within the fixed reticle graphic 804 (e.g., its central circular shape).
One technical problem faced by the applicant is how to provide a clinician or operator with the ability to quickly confirm that the subject's measured eye is aligned with the various components of the system 100 without having to physically adjust any of those components. One technical solution that the applicant has discovered and developed is the display window 808 shown in the alignment GUI 800 (as well as in the calibration GUI 900 and the adjustment GUI 1000; see figs. 9 and 10, respectively), in which the fixed reticle graphic 804 is overlaid on the real-time image 806 of the eye captured by the alignment camera 202. By providing this display window 808 on the alignment GUI 800 (as well as the calibration GUI 900 and the adjustment GUI 1000), the clinician or operator need only glance at the display window 808 to confirm that the subject's eye is aligned with the components of the system 100. If the clinician or operator finds that the subject's eye has become misaligned (e.g., the limbus is not within the central circular shape of the fixed reticle graphic 804), the clinician or operator may adjust the subject's head, the chin rest 104, or a combination thereof until the subject's eye is realigned.
Once the operator or clinician sees the anatomical features of the eye (e.g., limbus) within the fixed reticle graphic 804 (e.g., center circular shape) on the display window 808 of the alignment GUI 800, the operator or clinician can apply user input (e.g., mouse click or touch input) to the check alignment button 810 to begin the next portion of the alignment procedure 704.
In some embodiments, applying a user input to the check alignment button 810 may activate or turn on the refractor device 138. In other embodiments, the refractor device 138 may be activated or turned on by an operator or clinician before the check alignment button 810 is pressed.
Once the operator or clinician has applied a user input to the check alignment button 810, the controller 128 may be programmed to instruct the far display 108 to display the stimulation target 200 for a first adjustment duration and to instruct the near display 116 to display the stimulation target 200 (which is then projected onto the first beam splitter 118 for viewing by the subject) for a second adjustment duration. In some embodiments, the first adjustment duration may be shorter than the amount of time for which the stimulation target 200 is displayed on the far display 108 during the actual accommodation procedure 708. For example, the first adjustment duration may be about 5 seconds, while the stimulation target 200 is displayed on the far display 108 for 10 seconds during the actual accommodation procedure 708. The second adjustment duration may likewise be shorter than the amount of time for which the stimulation target 200 is displayed on the near display 116 during the actual accommodation procedure 708 (e.g., 5 seconds versus 10 seconds).
The controller 128 may also be programmed to instruct the stimulation target 200 to appear in a plurality of rotational orientations 602 (on the far display 108 or on the near display 116). For example, the controller 128 may instruct these displays to show the stimulation target 200 as facing up, facing left, facing right, or facing down. In response to the displayed stimulation target 200, the subject may be instructed to apply user input 600 (e.g., joystick movement) to the input device 154 corresponding to the rotational orientation 602.
In some embodiments, the stimulation target 200 may be rotated in a random pattern (i.e., not a series of predictable clockwise or counterclockwise rotations). In other embodiments, stimulation target 200 may be rotated in part in a set pattern (e.g., a clockwise or counter-clockwise rotation pattern) and in part in a random pattern. Rotating the stimulation target 200 and instructing the subject to apply the user input 600 may keep the subject engaged during the alignment procedure 704.
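The mixed set-plus-random rotation pattern described above can be sketched in code. The following Python snippet is illustrative only (the function name, orientation labels, and parameters are assumptions, not part of the disclosed system): it emits a short predictable clockwise run followed by randomly drawn orientations.

```python
import random

ORIENTATIONS = ["up", "down", "left", "right"]

def rotation_sequence(n_trials, n_fixed=4, seed=None):
    """Build a stimulus-orientation sequence that begins with a set
    (clockwise) pattern and continues with random orientations, as one
    possible realization of the mixed pattern described above."""
    rng = random.Random(seed)
    # Set portion: a predictable clockwise cycle up -> right -> down -> left.
    clockwise = ["up", "right", "down", "left"]
    fixed_part = [clockwise[i % 4] for i in range(min(n_fixed, n_trials))]
    # Random portion: uniformly drawn orientations for the remaining trials.
    random_part = [rng.choice(ORIENTATIONS)
                   for _ in range(n_trials - len(fixed_part))]
    return fixed_part + random_part
```

Seeding the generator makes a test sequence reproducible for the operator while still appearing unpredictable to the subject.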
The near display assembly 106 may be initially positioned at the stimulation location 110 furthest from the subject's eye (e.g., the 0.80 meter stimulation location 110) before the stimulation target 200 is displayed. Once the stimulation targets 200 have been displayed in succession on the far display 108 and the near display 116, the near display assembly 106 can be moved to the next stimulation location 110 (e.g., the stimulation location 110 that is next closest to the subject's eye). This process may be repeated until the near display assembly 106 is moved to the stimulation location 110 closest to the subject's eye. In this manner, the stimulation target 200 is presented in a manner that mimics its presentation during the accommodation procedure 708, albeit in an abbreviated manner. During this time, the refractor device 138 measures the pupil diameter 812 and gaze displacement 814 of the subject's eye and transmits such data to the controller 128.
The gaze displacement 814 of the eye may include a horizontal gaze displacement 816 (or displacement of the gaze of the eye along the X-axis, see figs. 1 and 2) and a vertical gaze displacement 818 (or displacement of the gaze of the eye along the Y-axis, see figs. 1 and 2). The horizontal gaze displacement 816 and the vertical gaze displacement 818 may be tracked by the refractor device 138, and the gaze displacement values may be graphically represented in real-time or near real-time on a gaze chart 820 presented as part of the alignment GUI 800.
In addition to the gaze chart 820, the pupil diameter 812 and gaze displacement 814 measured by the refractor device 138 may be presented in an alignment measurement window 822 presented as part of the alignment GUI 800. In some embodiments, an average pupil diameter or average gaze displacement may be calculated and displayed in the alignment measurement window 822.
As previously discussed, certain predefined criteria or thresholds may be set with respect to the alignment measurements such that the subject does not advance to the calibration phase or calibration procedure 706 unless the alignment measurements meet the predefined criteria. The predefined criteria may include a maximum value of gaze displacement 814 and a minimum value of pupil diameter 812.
For example, a maximum displacement threshold may be set for the gaze displacement 814. In some embodiments, the maximum displacement threshold may be about 5 degrees or 4 degrees. In other embodiments, the maximum displacement threshold may be less than 4 degrees, such as 3 degrees or 2 degrees. The subject may be instructed to maintain focus on the stimulation target 200 and not let their gaze wander or drift, to prevent the horizontal gaze displacement 816 or the vertical gaze displacement 818 from exceeding the maximum displacement threshold.
Additionally, for example, a minimum diameter threshold may be set for the pupil diameter 812. In some embodiments, the minimum diameter threshold may be about 3.5 mm. In other embodiments, the minimum diameter threshold may be about 3.0 mm or 4.0 mm. If the measured pupil diameter 812 of the subject is less than the minimum diameter threshold, the clinician or operator may determine whether applying non-cycloplegic dilating drops could increase the subject's pupil size. Non-cycloplegic dilating drops can dilate the pupil without paralyzing the muscles that help focus the eye.
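As a rough sketch, the predefined criteria above (minimum pupil diameter, maximum gaze displacement) could be checked as follows. The function name and default thresholds (3.5 mm, 4 degrees) are illustrative values drawn from the examples above, not fixed requirements of the system.

```python
def meets_alignment_criteria(pupil_diameter_mm, horizontal_gaze_deg,
                             vertical_gaze_deg, min_pupil_mm=3.5,
                             max_gaze_deg=4.0):
    """Return True only if the pupil diameter exceeds the minimum diameter
    threshold and both gaze-displacement components stay below the maximum
    displacement threshold, per the predefined criteria described above."""
    pupil_ok = pupil_diameter_mm > min_pupil_mm
    gaze_ok = (abs(horizontal_gaze_deg) < max_gaze_deg
               and abs(vertical_gaze_deg) < max_gaze_deg)
    return pupil_ok and gaze_ok
```

A controller sketch like this would gate advancement to the calibration procedure on the returned value.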
FIG. 9 illustrates one embodiment of a calibration GUI 900 that may be displayed on the controller display 136 during the calibration procedure 706. An operator or clinician may access the calibration GUI 900 by selecting the calibration tab 902 on the main GUI.
The calibration GUI 900 may also include an instance of the display window 808 showing the fixed reticle pattern 804 overlaid on the real-time image 806 of the eye captured by the alignment camera 202. Once the operator or clinician confirms that the anatomical features of the eye (e.g., the limbus of the eye) are within the fixed reticle pattern 804 (e.g., the central circular shape) shown in the display window 808, the operator or clinician may then calibrate the refractor device 138 for the subject's eye and the test/clinical environment. Calibration is desirable because the subject's eye pigmentation and the ambient light in the test/clinical environment can affect the intensity of light received and assessed by the refractor device 138. As such, the refractor device 138 must be calibrated for each subject and each time the subject returns to the test/clinical environment.
The calibration procedure 706 may include an operator or clinician placing an IR filter (a filter that blocks visible light and allows only IR light to pass) and a test lens having a particular diopter strength or power in front of the subject's eye. The subject is then instructed to look toward the stimulation target 200 (eyes open) shown on the far display 108. At this point, the operator or clinician may apply user input (e.g., a mouse click or touch input) to the measurement button 904 on the calibration GUI 900 to cause the refractor device 138 to make a measurement of the refractive state of the eye covered by the test lens.
A new test lens with a different diopter strength may then be placed in front of the subject's eye and a new refractive measurement made. These steps can be repeated until refractive measurements have been obtained using test lenses having several different diopter strengths. In some embodiments, test lenses with the following diopter strengths may be used as part of the calibration procedure: +4.0D, +3.0D, +2.0D, +1.0D, 0.0D, -1.5D, and -2.5D.
The refraction measurements made by the refractor device 138 may be transmitted to or otherwise obtained by the controller 128. The controller 128 may then determine an average or mean refractive value based on the refractive measurements taken by the refractor device 138. For example, the controller 128 may determine an average or mean refractive value for each test lens having a particular diopter strength or power.
The controller 128 may then plot the different diopter strengths of the test lenses against the average refractive values calculated from the refractive measurements made by the refractor device 138. The controller 128 may then be programmed to fit a line 906 to the plotted data points using regression techniques (e.g., least-squares regression). Finally, the controller 128 may be programmed to calculate the slope of the line 906 for use as the calibration factor 908. The controller 128 may use the calibration factor 908 to determine the accommodation response of the subject's eye.
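A minimal sketch of the calibration step just described, assuming an ordinary least-squares fit of mean measured refraction against test-lens power (the function and variable names are illustrative; the disclosure does not specify an implementation):

```python
def calibration_factor(lens_diopters, mean_refractions):
    """Fit a least-squares line of mean measured refraction (y) against
    test-lens diopter strength (x) and return its slope, which serves as
    the calibration factor (CF)."""
    n = len(lens_diopters)
    mean_x = sum(lens_diopters) / n
    mean_y = sum(mean_refractions) / n
    # Slope = covariance(x, y) / variance(x): the closed-form least-squares fit.
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(lens_diopters, mean_refractions))
    sxx = sum((x - mean_x) ** 2 for x in lens_diopters)
    return sxy / sxx
```

For example, with the lens set +4.0D through -2.5D listed above, a perfectly linear instrument response would return the slope of that line directly.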
As shown in fig. 9, the calibration GUI 900 may include a graph window 910 that presents the plotted data points and the fitted line 906. The calibration GUI 900 may also display the calculated slope of the line 906 that may be used as the calibration factor 908.
FIG. 10 illustrates one embodiment of an accommodation GUI 1000 that may be displayed on the controller display 136 during the accommodation procedure 708. The operator or clinician may access the accommodation GUI 1000 by selecting the accommodation tab 1002 on the main GUI. The operator or clinician may evaluate ocular accommodation only after the subject has successfully completed the alignment procedure 704 and the calibration procedure 706 (the output of which is the calculated calibration factor 908).
The accommodation GUI 1000 may also include an instance of the display window 808 showing the fixed reticle pattern 804 overlaid on the real-time image 806 of the eye captured by the alignment camera 202. Once the operator or clinician again confirms that the anatomical features of the eye (e.g., the limbus of the eye) are within the fixed reticle pattern 804 (e.g., the central circular shape) shown in the display window 808, the operator or clinician can proceed with the accommodation assessment by applying user input (e.g., a mouse click or touch input) to the start test button 1004 on the accommodation GUI 1000.
In response to user input to start test button 1004, controller 128 may be programmed to instruct stimulation target 200 to appear on far display 108 for a first duration and on near display 116 for a second duration. In some embodiments, before the stimulation target 200 is displayed, the near display assembly 106 including the near display 116 may be positioned or moved to a stimulation location 110 furthest from the subject's eye (e.g., a 0.80 meter stimulation location 110). The stimulation target 200 displayed on the near display 116 may be projected onto the first beam splitter 118 for viewing by the subject.
As previously discussed, the stimulation target 200 may be displayed in a plurality of rotational orientations 602 (e.g., the stimulation target 200 may be randomly rotated). The subject may be instructed to focus on the stimulation target 200 and keep the stimulation target 200 as clear as possible. The subject may also be instructed to apply user input 600 to the input device 154 (e.g., move the joystick in various directions) to match the rotational orientation 602 of the stimulation target 200.
During the first duration and the second duration, the refractor device 138 may continuously take measurements of the refractive state of the eye. For example, the refractor device 138 can take measurements at a rate of 50 Hz, or once every 20 ms. The refraction measurements may be transmitted to the controller 128 or otherwise obtained by the controller 128 from the refractor device 138.
In some embodiments, the refraction measurements received from the refractor device 138 may be raw or un-scaled refraction measurements (see, e.g., fig. 11). The raw or un-scaled refraction measurements may be multiplied by the calibration factor 908 (see, e.g., fig. 9), and the resulting scaled refraction measurements may be plotted against time (in seconds) and displayed in a refraction chart 1006 on the accommodation GUI 1000.
The controller 128 may also be programmed to calculate an average or mean refractive value from the refractive measurements taken by the refractor device 138 when the stimulation target 200 is displayed at the far display 108 and at the near display 116/first beam splitter 118.
In some embodiments, the accommodation response 1008 at a particular stimulation location 110 may be calculated by subtracting the average near refractive value (X_N) from the average far refractive value (X_F) and multiplying the result by the calibration factor (CF) 908 calculated during the calibration procedure 706 (see equation I above).
In equation I, the average near refractive value (X_N) and the average far refractive value (X_F) are raw refractive measurements that have not been scaled by the calibration factor 908.
The accommodation response at other stimulation locations 110 (e.g., at three or more other stimulation locations 110) may also be determined. For example, the near display assembly 106 may be initially positioned at the stimulation location 110 furthest from the subject's eye (e.g., the 0.80 meter stimulation location 110) before the stimulation target 200 is displayed. Once the stimulation targets 200 have been displayed in succession on the far display 108 and the near display 116, the near display assembly 106 can be moved to the next stimulation location 110 (e.g., the stimulation location 110 that is next closest to the subject's eye). This process may be repeated until the near display assembly 106 is moved to the stimulation location 110 closest to the subject's eye (e.g., the 4.0D or 0.25 meter stimulation location 110). The accommodation responses calculated at each stimulation location 110 may be listed in an accommodation table 1010 that is included as part of the accommodation GUI 1000 and/or presented as a report to be printed by the operator or clinician.
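The per-location calculation described by equation I reduces to a short computation. This sketch assumes the raw far and near refraction samples are available as lists (the function and parameter names are illustrative):

```python
def accommodation_response(far_raw, near_raw, calibration_factor):
    """Accommodation response per equation I: CF * (X_F - X_N), where X_F
    and X_N are the means of the raw (un-scaled) far and near refraction
    samples for one stimulation location."""
    x_f = sum(far_raw) / len(far_raw)    # average far refractive value, X_F
    x_n = sum(near_raw) / len(near_raw)  # average near refractive value, X_N
    return calibration_factor * (x_f - x_n)
```

Running this once per stimulation location yields the values listed in the accommodation table and plotted as the subject's response curve.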
FIG. 11 is a graph that visualizes the raw or un-scaled refraction measurements taken by the refractor device 138 over time. The vertical axis shows refractive values and the horizontal axis shows time in seconds. The refraction measurements may include a number of far refraction measurements 1100 made by the refractor device 138 during the first duration (when the stimulation target 200 is displayed at the far display 108) and a number of near refraction measurements 1102 made by the refractor device 138 during the second duration (when the stimulation target 200 is displayed at the near display 116 and projected onto the first beam splitter 118 for viewing). A transition region 1104, containing several measurements made during the transition period, may separate the far refraction measurements 1100 from the near refraction measurements 1102.
The first duration may begin at a first time 1106 and end at a second time 1108. For example, the first duration may be between about 5 seconds and 20 seconds (e.g., about 10 seconds). The second duration may begin at a third time 1110 and end at a fourth time 1112. For example, the second duration may also be between about 5 seconds and 20 seconds (e.g., about 10 seconds). The refractor device 138 is capable of taking measurements at a rate of 50 Hz, or once every 20 ms.
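The stated 50 Hz sampling rate implies a fixed number of refraction samples per measurement window. The trivial helper below illustrates the arithmetic (the function name is an assumption for illustration):

```python
def expected_sample_count(rate_hz, duration_s):
    """Number of refraction samples expected in a measurement window at a
    fixed sampling rate (50 Hz means one sample every 20 ms)."""
    return int(round(rate_hz * duration_s))
```

A 10-second far or near window at 50 Hz therefore yields 500 samples to average into the corresponding mean refractive value.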
The far refraction measurements 1100 may be averaged to obtain an average far refractive value (X_F). The near refraction measurements 1102 may be averaged to obtain an average near refractive value (X_N).
In some embodiments, the accommodation response at that particular stimulation location 110 may be calculated by subtracting the average near refractive value (X_N) from the average far refractive value (X_F) and multiplying the result by the calibration factor (CF) calculated during the calibration procedure 706 (see equation I above).
In other embodiments, the raw or un-scaled refraction measurements taken by the refractor device 138 may first be multiplied by the calibration factor 908, and the resulting scaled refraction measurements may be plotted against time (in seconds) and displayed in the refraction chart 1006 shown via the accommodation GUI 1000 (see, e.g., fig. 10). When scaled refraction measurements are calculated (using a negative slope/calibration factor), an average scaled near refractive value and an average scaled far refractive value may be calculated, and in this case the accommodation response is simply the average scaled near refractive value minus the average scaled far refractive value.
The accommodation response at these other stimulation locations 110 may also be determined by repeating the same calculations using the refraction measurements made by the refractor device 138 while the stimulation target 200 is displayed at the other stimulation locations 110 (e.g., at three or more of the other stimulation locations 110). A response curve 1202 (see, e.g., fig. 12) can then be plotted to assess the accommodative ability of the subject.
Fig. 12 is a graph showing both an idealized response curve 1200 (perfect accommodation) and a subject response curve 1202 based on the subject's accommodation responses calculated at each stimulation location 110. The subject response curve 1202 can be compared to the idealized response curve 1200 to assess the subject's accommodative ability. Additional response curves 1202 may also be generated based on additional accommodation response values calculated from additional test runs or test cycles.
The subject's accommodation responses calculated at each stimulation location 110 may also be stored as part of a matrix or table. The accommodation responses and subject response curves 1202 calculated at each stimulation location 110 may be stored for further assessment by a clinician or other medical professional and may also be compared to previous or future accommodation data obtained from the subject.
Several embodiments have been described. However, one of ordinary skill in the art will appreciate that various changes and modifications can be made to the present disclosure without departing from the spirit and scope of the embodiments. Elements of the systems, apparatuses, devices, and methods shown with respect to any embodiment are exemplary for that particular embodiment and may be used in combination or otherwise with other embodiments within the disclosure. For example, the steps of any method depicted in the figures or described in this disclosure need not be performed in the particular order shown or described, or in sequential order, to achieve desirable results. In addition, other steps or operations may be provided, or steps or operations may be eliminated or omitted from the described methods or processes, to achieve the desired results. Furthermore, any component or portion of any device or system described in this disclosure or depicted in the drawings may be removed, eliminated, or omitted to achieve the desired results. In addition, certain components or portions of the systems, devices, or apparatuses shown or described herein have been omitted for brevity and clarity.
Accordingly, other embodiments are within the scope of the following claims and the specification and/or drawings are to be regarded in an illustrative rather than a restrictive sense.
Each of the various variations or embodiments described and illustrated herein has discrete components and features that can be readily separated from or combined with the features of any of the other variations or embodiments. Modifications may be made to adapt a particular situation, material, composition of matter, process action or step, to the objective, spirit or scope of the present invention.
The methods recited herein may be performed in any order that is logically possible for the recited events and in any order for the recited events. Furthermore, additional steps or operations may be provided or may be eliminated to achieve the desired results.
Furthermore, where a range of values is provided, each intervening value, to the extent it is between the upper and lower limit of that range, and any other stated or intervening value in that stated range, is encompassed within the invention. Moreover, any optional feature of the inventive variations described may be set forth and claimed independently, or in combination with any one or more of the features described herein. For example, descriptions of ranges 1 to 5 should be considered as having disclosed sub-ranges (e.g., 1 to 3, 1 to 4, 2 to 5, 3 to 5, etc.) as well as individual numbers within the range (e.g., 1.5, 2.5, etc.) and any overall or partial increments therebetween.
All existing subject matter (e.g., publications, patents, patent applications) mentioned herein is incorporated by reference in its entirety, except where such subject matter may conflict with the subject matter of the present disclosure (in which case the present disclosure controls). The referenced items are provided only because they were disclosed before the filing date of the present application. Nothing herein is to be construed as an admission that the present application is not entitled to antedate such material by virtue of an earlier filing date.
Reference to an item in the singular includes the possibility that there are plural of the same item present. More specifically, as used herein and in the appended claims, the singular forms "a," "an," and "the" include plural referents unless the context clearly dictates otherwise. It is further noted that the claims may be drafted to exclude any optional element. Accordingly, this statement is intended to serve as antecedent basis for the use of such exclusive terminology as "solely," "only," and the like in connection with the recitation of claim elements, or for the use of a "negative" limitation. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.
The phrase "at least one of," when it modifies a plurality of items or components (or an enumerated list of items or components), refers to any combination of one or more of those items or components. For example, the phrase "at least one of A, B and C" refers to: (i) A; (ii) B; (iii) C; (iv) A, B, and C; (v) A and B; (vi) B and C; or (vii) A and C.
In understanding the scope of the present disclosure, the term "comprising" and its derivatives, as used herein, are intended to be open-ended terms that specify the presence of the stated features, elements, components, groups, integers, and/or steps, but do not exclude the presence of other unstated features, elements, components, groups, integers, and/or steps. The foregoing also applies to words having similar meanings, such as the terms "including" and "having" and their derivatives. Also, the terms "section," "segment," "portion," "member," "element," or "component" when used in the singular can have the dual meaning of a single part or a plurality of parts. As used herein, directional terms such as "forward," "rearward," "above," "downward," "vertical," "horizontal," "below," "transverse," and "lateral," as well as any other similar directional terms, refer to positions or directions of a device or apparatus as the device or apparatus is oriented, translated, or moved.
Finally, terms of degree such as "substantially," "about," and "approximately" as used herein mean the specified value or a reasonable amount of deviation from the specified value (e.g., a deviation of up to ±0.1%, ±1%, ±5%, or ±10%, as such variations are appropriate) such that the end result is not significantly or materially changed. For example, "about 1.0 m" may be interpreted to mean "1.0 m" or "between 0.9 m and 1.1 m." When a term of degree (such as "about" or "approximately") is used to refer to a number or value that is part of a range, the term can be used to modify both the minimum and maximum numbers or values.
The disclosure is not intended to be limited to the scope of the specific forms set forth, but is intended to cover alternatives, modifications, and equivalents of the variations or embodiments described herein. Further, the scope of the present disclosure fully encompasses other variations or embodiments that may become apparent to those skilled in the art in view of the present disclosure.

Claims (30)

1. A system for objectively assessing ocular accommodation in a subject, the system comprising:
a near display and a first beam splitter positioned at an oblique angle relative to the near display;
a far display positioned farther from the eye than the near display and the first beam splitter; and
A controller in communication with the near display and the far display, the controller comprising one or more processors and a tangible, non-transitory machine-readable medium comprising instructions stored thereon,
wherein execution of the instructions by the one or more processors causes the controller to:
instruct a stimulus target to appear on the far display for a first duration and on the near display for a second duration, wherein the near display is configured to project the stimulus target onto the first beam splitter, and wherein the stimulus target displayed on the far display is axially aligned with the stimulus target projected onto the first beam splitter,
obtain, from a refractor device in communication with the controller, measurements regarding the refractive state of the eye during the first and second durations, and
determine an accommodation response of the eye based in part on the refractive state.
2. The system of claim 1, further comprising:
a support assembly; and
a motorized stage coupled to a top of the support assembly, wherein the motorized stage is configured to translate the near display and the first beam splitter in a linear direction, wherein the first beam splitter is configured to translate to a plurality of stimulation locations, and wherein the near display is oriented in a downward direction.
3. The system of claim 2, wherein the stimulation locations are located approximately 0.80 meters, 0.37 meters, 0.33 meters, and 0.25 meters from the eye of the subject.
4. The system of claim 1, wherein the remote display is positioned between about 4 meters and 6 meters from the eye of the subject.
5. The system of claim 1, further comprising:
an angled mirror; and
a hot mirror positioned above the angled mirror and between the near display and the subject; and
wherein the refractor device comprises:
a refractor light source configured to generate an illumination beam, wherein the angled mirror and the hot mirror are configured to steer the illumination beam to the eye of the subject;
a refractor camera configured to capture or detect light reflected by the eye in response to the illumination beam, wherein light reflected by the eye is diverted back toward the refractor device via the hot mirror and the angled mirror; and
one or more refractor processors configured to determine a refractive state of the eye based on light reflected by the eye.
6. The system of claim 5, further comprising:
a second beam splitter positioned on the line of sight of the eye, at a distal end of the support assembly and between the near display assembly and the far display;
an alignment camera in communication with the controller and configured to capture a real-time image of the eye, wherein the alignment camera is positioned offset from a line of sight of the eye, and wherein the second beam splitter is configured to reflect an image of the eye of the subject toward the alignment camera; and
a controller display in communication with the controller and configured to display a graphical user interface (GUI) showing the real-time image of the eye captured by the alignment camera, wherein the GUI further shows a fixed reticle graphic overlaid on the real-time image of the eye, and wherein the eye of the subject is optically aligned with the refractor device when the GUI shows anatomical features of the eye within at least a portion of the fixed reticle graphic.
7. The system of claim 1, wherein the one or more processors of the controller are configured to execute further instructions to cause the controller to:
obtain, from the refractor device, measurements regarding a pupil diameter and a gaze displacement of the eye; and
determine the accommodation response of the eye only if the pupil diameter exceeds a minimum diameter threshold and the gaze displacement is less than a maximum displacement threshold.
8. The system of claim 6, wherein a line of sight of the eye of the subject extends through the hot mirror, the first beam splitter, and the second beam splitter such that the subject views the stimulus target displayed on the far display through the hot mirror, the first beam splitter, and the second beam splitter.
9. The system of claim 6, further comprising a far-alignment light source positioned at a distal end of the support assembly, wherein the far-alignment light source is configured to project optical markers onto the far display via the second beam splitter, and wherein the stimulation targets are displayed on the far display within an area surrounding the optical markers.
10. The system of claim 1, further comprising:
an input device configured to receive user input from the subject regarding the stimulation target displayed on at least one of the near display and the far display;
wherein the input device is in communication with the controller, and wherein the one or more processors are configured to execute further instructions to cause the controller to:
instruct the stimulation target to appear on the far display in a plurality of first rotational orientations for the first duration,
receive user input from the input device corresponding to the first rotational orientations,
instruct the stimulation target to appear on the near display in a plurality of second rotational orientations for the second duration, and
receive user input from the input device corresponding to the second rotational orientations.
11. The system of claim 10, wherein the input device is a joystick, and wherein each of the user inputs is a joystick movement initiated by the subject in a direction associated with a rotational orientation of the stimulation target displayed on the near display or the far display.
12. The system of claim 1, wherein the one or more processors of the controller are configured to execute further instructions to cause the controller to:
obtain, during a calibration procedure, refractive data about the eye of the subject from the refractor device, wherein the calibration procedure comprises:
placing an infrared filter in front of the eye,
placing each of a plurality of test lenses having different diopter strengths in sequence in front of the eye, and
directing the subject to look at the far display while measuring the refractive state of the eye for each of the test lenses using the refractor device;
fit a line, using regression techniques, to data points of the average values of the measured refractive states plotted against the different diopter strengths of the test lenses; and
calculate the slope of the line for use as a calibration factor.
13. The system of claim 12, wherein the one or more processors of the controller are configured to execute further instructions to cause the controller to calculate the accommodation response of the eye using the formula:
Accommodation Response = CF(X_F - X_N)
wherein CF is the calibration factor, wherein X_F is an average far refractive value calculated using measurements taken during the first duration when the stimulus target is displayed on the far display, and wherein X_N is an average near refractive value calculated using measurements taken during the second duration when the stimulus target is displayed on the near display.
14. The system of claim 1, wherein the stimulus target is an optotype letter.
15. The system of claim 1, wherein the stimulation target has a height dimension of between about 1.5 cm and 2.0 cm.
16. A method for objectively assessing ocular accommodation in a subject, the method comprising:
displaying a stimulation target at a far display for a first duration;
displaying the stimulation target at a near display for a second duration, wherein the stimulation target displayed at the near display is projected onto a first beam splitter positioned at an oblique angle relative to the near display, and wherein the stimulation target displayed on the far display is axially aligned with the stimulation target projected onto the first beam splitter;
obtaining, at a controller, measurements regarding a refractive state of the eye during the first and second durations from a refractor device in communication with the controller; and
determining, using the controller, an accommodation response of the eye based in part on the respective refractive states.
17. The method of claim 16, further comprising translating the near display and the first beam splitter to a stimulation position via a motorized stage prior to displaying the stimulation target.
18. The method of claim 16, further comprising:
generating an illumination beam using a refractor light source of the refractor device, wherein the illumination beam is diverted to the eye of the subject by an angled mirror and a hot mirror positioned above the angled mirror;
capturing or detecting, using a refractor camera of the refractor device, light reflected by the eye in response to the illumination beam, wherein the light reflected by the eye is diverted back toward the refractor device via the hot mirror and the angled mirror; and
determining, using one or more processors of the refractor device, a refractive state of the eye based on the light reflected by the eye.
19. The method of claim 16, further comprising:
capturing a real-time image of the eye using an alignment camera positioned offset from a line of sight of the eye, wherein the image of the eye is reflected toward the alignment camera by a second beam splitter positioned on the line of sight of the eye;
displaying a graphical user interface (GUI) using a controller display in communication with the controller, the GUI showing the real-time image of the eye captured by the alignment camera, wherein the GUI further shows a fixed reticle graphic overlaid on the real-time image of the eye; and
aligning the eye with the refractor device by adjusting at least one of a chin rest assembly and a position of the subject's head until the GUI shows an anatomical feature of the eye within at least a portion of the fixed reticle graphic.
20. The method of claim 16, further comprising:
obtaining, from the refractor device, measurements regarding a pupil diameter and a gaze displacement of the eye; and
determining the accommodation response of the eye only if the pupil diameter exceeds a minimum diameter threshold and the gaze displacement is less than a maximum displacement threshold.
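The validity gating in the claim above (accept a measurement only when the pupil is large enough and the eye has not drifted off axis) can be sketched as a simple predicate. This is illustrative only, not part of the claims; the threshold values shown are placeholders, not values specified in the patent:

```python
def measurement_is_valid(pupil_diameter_mm, gaze_displacement_mm,
                         min_pupil_mm=3.0, max_gaze_mm=1.0):
    """Gate one refraction sample before it contributes to X_F or X_N.

    Accept only if the pupil diameter exceeds the minimum diameter threshold
    (so the photorefractor has enough aperture for a reliable reading) AND
    the gaze displacement is below the maximum displacement threshold (so the
    subject was actually fixating the stimulus target). Thresholds here are
    assumed placeholders.
    """
    return (pupil_diameter_mm > min_pupil_mm
            and gaze_displacement_mm < max_gaze_mm)
```

Filtering samples this way keeps blinks, off-axis glances, and small-pupil artifacts out of the averages used to compute the accommodation response.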
21. The method of claim 16, further comprising:
placing an infrared filter in front of the eye;
placing each of a plurality of test lenses having different diopter strengths in sequence in front of the eye;
directing the subject to look at the far display while measuring the refractive state of the eye for each of the test lenses using the refractor device;
fitting, using a regression technique, a line to data points of the average measured refractive state plotted against the diopter strength of each test lens; and
calculating the slope of the line for use as a calibration factor.
22. The method of claim 21, further comprising calculating, using the controller, the accommodation response of the eye using the formula:

accommodation response = CF × (X_F − X_N)

wherein CF is the calibration factor, wherein X_F is an average far refractive value calculated using measurements taken during the first duration when the stimulus target is displayed on the far display, and wherein X_N is an average near refractive value calculated using measurements taken during the second duration when the stimulus target is displayed on the near display.
23. A method for objectively assessing ocular accommodation in a subject, the method comprising:
displaying a stimulation target at a far display in a plurality of first rotational orientations for a first duration;
displaying the stimulation target at a near display in a plurality of second rotational orientations for a second duration, wherein the stimulation target displayed at the near display is projected onto a first beam splitter;
receiving, at a controller, a plurality of user inputs applied by the subject to an input device, wherein the user inputs correspond to the first rotational orientations and the second rotational orientations, and wherein the input device is in communication with the controller;
obtaining, at the controller, measurements regarding a refractive state of the eye during the first and second durations from a refractor device in communication with the controller; and
determining, using the controller, an accommodation response of the eye based in part on the respective refractive states and the user inputs.
24. The method of claim 23, further comprising translating the near display and the first beam splitter to a stimulation position via a motorized stage prior to displaying the stimulation target.
25. The method of claim 23, further comprising:
generating an illumination beam using a refractor light source of the refractor device, wherein the illumination beam is diverted to the eye of the subject by an angled mirror and a hot mirror positioned above the angled mirror;
capturing or detecting, using a refractor camera of the refractor device, light reflected by the eye in response to the illumination beam, wherein the light reflected by the eye is diverted back toward the refractor device via the hot mirror and the angled mirror; and
determining, using one or more processors of the refractor device, a refractive state of the eye based on the light reflected by the eye.
26. The method of claim 23, further comprising:
capturing a real-time image of the eye using an alignment camera positioned offset from a line of sight of the eye, wherein the image of the eye is reflected toward the alignment camera by a second beam splitter positioned on the line of sight of the eye;
displaying a graphical user interface (GUI) using a controller display in communication with the controller, the GUI showing the real-time image of the eye captured by the alignment camera, wherein the GUI further shows a fixed reticle graphic overlaid on the real-time image of the eye; and
aligning the eye with the refractor device by adjusting at least one of a chin rest assembly and a position of the subject's head until the GUI shows an anatomical feature of the eye within at least a portion of the fixed reticle graphic.
27. The method of claim 23, further comprising:
obtaining, from the refractor device, measurements regarding a pupil diameter and a gaze displacement of the eye; and
determining the accommodation response of the eye only if the pupil diameter exceeds a minimum diameter threshold and the gaze displacement is less than a maximum displacement threshold.
28. The method of claim 23, further comprising:
placing an infrared filter in front of the eye;
placing each of a plurality of test lenses having different diopter strengths in sequence in front of the eye;
directing the subject to look at the far display while measuring the refractive state of the eye for each of the test lenses using the refractor device;
fitting, using a regression technique, a line to data points of the average measured refractive state plotted against the diopter strength of each test lens; and
calculating the slope of the line for use as a calibration factor.
29. The method of claim 28, further comprising calculating, using the controller, the accommodation response of the eye using the formula:

accommodation response = CF × (X_F − X_N)

wherein CF is the calibration factor, wherein X_F is an average far refractive value calculated using measurements taken during the first duration when the stimulus target is displayed on the far display, and wherein X_N is an average near refractive value calculated using measurements taken during the second duration when the stimulus target is displayed on the near display.
30. The method of claim 23, wherein the input device is a joystick, and wherein each of the user inputs is a joystick movement initiated by the subject in a direction associated with a rotational orientation of the stimulation target displayed on the near display or the far display.
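The joystick-response scheme in claims 23 and 30 — the subject reports the rotational orientation of each displayed stimulus target with a matching joystick movement — implies a simple attention check: compare each reported direction to the orientation actually shown. A minimal sketch of that comparison, illustrative only and not part of the claims (function name and direction labels are assumptions):

```python
def attention_score(displayed_orientations, joystick_directions):
    """Fraction of trials on which the subject's joystick movement matched
    the rotational orientation of the displayed stimulus target.

    displayed_orientations: orientation shown on each trial, e.g. 'up', 'left'
    joystick_directions: direction of the subject's joystick movement per trial
    """
    matches = sum(1 for shown, moved
                  in zip(displayed_orientations, joystick_directions)
                  if shown == moved)
    return matches / len(displayed_orientations)
```

A low score suggests the subject was not fixating (or could not resolve) the target, so the refraction samples from those trials could be excluded when determining the accommodation response, consistent with the claim's "based in part on ... the user input" language.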
CN202280025971.4A 2021-04-06 2022-03-21 Apparatus, system and method for objectively assessing ocular accommodation Pending CN117119946A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US63/171,320 2021-04-06
US202163261801P 2021-09-29 2021-09-29
US63/261,801 2021-09-29
PCT/US2022/021212 WO2022216451A1 (en) 2021-04-06 2022-03-21 Apparatus, systems, and methods for objectively assessing accommodation in an eye

Publications (1)

Publication Number Publication Date
CN117119946A true CN117119946A (en) 2023-11-24

Family

ID=88797080

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280025971.4A Pending CN117119946A (en) 2021-04-06 2022-03-21 Apparatus, system and method for objectively assessing ocular accommodation

Country Status (1)

Country Link
CN (1) CN117119946A (en)

Similar Documents

Publication Publication Date Title
CN110573061B (en) Ophthalmologic examination method and apparatus
US11497561B2 (en) Real-time surgical reference indicium apparatus and methods for astigmatism correction
US7478911B2 (en) Perimeter
EP2152143B1 (en) System and method for illumination and fixation with ophthalmic diagnostic instruments
JP4503354B2 (en) Optometry equipment
US20220313080A1 (en) Apparatus, systems, and methods for objectively assessing accommodation in an eye
JP2023506515A (en) Systems and methods for determining refractive characteristics of both first and second eyes of a subject
US9572486B2 (en) Device and method for checking human vision
WO2016046202A1 (en) Visual field measuring device and system
WO2004086952A2 (en) Application of neuro-ocular wavefront data in vision correction
JP2017124173A (en) Optometry system and optometry method
US20160095512A1 (en) Method of evaluating quality of vision in examinee's eye and storage medium
JP4494075B2 (en) Optometry equipment
CN117119946A (en) Apparatus, system and method for objectively assessing ocular accommodation
US20230255473A1 (en) Integrated apparatus for visual function testing and method thereof
JP2022066325A (en) Ophthalmologic apparatus
JP2022000168A (en) Ophthalmologic device
US20220192482A1 (en) Vision Testing System and Method
JP7166080B2 (en) ophthalmic equipment
KR101133255B1 (en) Simulator for training of pupillary reflex and fundus examination
WO2023214274A1 (en) A device and a method for automated measurements of eyeball deviation and/or vertical and horizontal viewing angles
IL305329A (en) Method, system and computer program product for determining optometric parameters
JP2009118881A (en) Ophthalmic apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination