EP4294251A1 - System for measuring binocular alignment with adjustable displays and eye trackers - Google Patents

System for measuring binocular alignment with adjustable displays and eye trackers

Info

Publication number
EP4294251A1
Authority
EP
European Patent Office
Prior art keywords
eye
infrared
display
binocular alignment
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP22756883.9A
Other languages
German (de)
English (en)
Inventor
Jeffrey P. Krall
Aric Plumley
Ronnie Barnard
Zachary Dios
Thomas Henry Holt
Vivek Labhishetty
Ali Jiong-Fung Lee
Ferenc Raksi
Jason Robert Ryan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Neurolens Inc
Original Assignee
Neurolens Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Priority claimed from US 17/179,402 (published as US20210169322A1)
Application filed by Neurolens Inc filed Critical Neurolens Inc
Publication of EP4294251A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/02 Subjective types, i.e. testing apparatus requiring the active assistance of the patient
    • A61B 3/08 Subjective types, i.e. testing apparatus requiring the active assistance of the patient for testing binocular or stereoscopic vision, e.g. strabismus
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/113 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/0093 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 30/00 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B 30/20 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/0016 Operational features thereof
    • A61B 3/0041 Operational features thereof characterised by display arrangements
    • A61B 3/005 Constructional features of the display
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/1015 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for wavefront analysis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/103 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining refraction, e.g. refractometers, skiascopes

Definitions

  • This invention relates generally to methods and systems for measuring vision acuity, and more particularly, to measuring binocular alignment.
  • Presbyopia is a natural deterioration of near vision, caused by loss of flexibility in the eye's crystalline lenses as one ages. Presbyopia can be partially compensated by wearing “reading” glasses that correct near-vision refraction errors, so that the eye does not have to focus as strongly when gazing at near objects. Presbyopic persons need different optical corrections for near vision and for distance vision. However, using two pairs of eyeglasses and changing them frequently is distracting. To avoid continually exchanging eyeglasses, bifocals may be used that offer different optical corrections for near vision and for distance vision. The transition between these two vision regions can be abrupt or gradual. The latter eyeglasses are called Progressive Addition Lenses (PALs). Abrupt-change bifocals have a visible line separating the two vision regions, while PALs have no lines or edges visible between the regions with different dioptric powers.
  • FIGS. 1-4 illustrate the basic problem of binocular misalignment.
  • FIG. 1A illustrates that when we look at a near object, like the shown cross, our vision accommodates in two ways. First, we accommodate the optical power of our eyes 1-1 and 1-2 to image the near object at a distance L onto the retina of each eye. This is often called the accommodative response A. Second, we rotate our eyes 1-1 and 1-2 inward by an angle α, so that the visual axes 2-1 and 2-2 of the eyes are pointing at the same near object. This response is often called the accommodative convergence AC.
  • the ratio of the accommodative convergence AC to the accommodative response A, AC/A is a geometrically well-defined function, depending on the object distance L and the pupil distance PD of the two eyes.
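  • As an illustrative sketch only (not part of the patent text), the purely geometric relation between AC, A, the object distance L, and the pupil distance PD can be written out as follows; the function name, the convention choice, and the example values are assumptions for illustration.

```python
import math

def geometric_ac_over_a(distance_m: float, pupil_distance_m: float) -> float:
    """Purely geometric AC/A for perfectly aligned eyes fixating a midline object.

    Each eye must rotate inward by alpha = arctan((PD/2) / L).  Here AC is the
    total convergence of both eyes expressed in prism diopters (100 * tan), and
    A is the accommodative demand in diopters (1 / L).  Conventions vary
    (per-eye vs. total convergence); this sketch uses the total-convergence form.
    """
    alpha = math.atan((pupil_distance_m / 2.0) / distance_m)  # per-eye convergence angle
    ac_prism_diopters = 2.0 * 100.0 * math.tan(alpha)         # both eyes together
    a_diopters = 1.0 / distance_m                             # accommodative demand
    return ac_prism_diopters / a_diopters

# Hypothetical example: 60 mm pupillary distance, object at 0.5 m -> AC/A of about 6
print(round(geometric_ac_over_a(0.5, 0.060), 2))
```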
  • FIGS. 1B-C illustrate that eyes often display various forms of accommodative misalignments.
  • the two eyes each turn inward, but to a lesser degree than geometry would require. This leads to the accommodative convergence angle α being less than geometrically necessary by a misalignment angle φ.
  • the visual axes of the eyes 2-1 and 2-2 should point into the direction denoted as the necessary accommodative alignment to properly see the near object, but, instead, they turn inward to a lesser degree and instead point to the direction denoted as relaxed or natural accommodative alignment.
  • FIG. 1C illustrates a case, when this lesser turn is asymmetrical.
  • the visual axis 2-1 of the first eye 1-1 properly points to the direction of the necessary accommodative alignment, while the visual axis 2-2 of the second eye 1-2 is turned inward only to the direction of the relaxed or natural accommodative alignment, that is misaligned by the accommodative misalignment angle φ.
  • the projections into the two eyes start to differ.
  • the discrepancy between the visual perceptions of the two eyes exceeds a threshold, and the brain stops fusing the two images into a single perception.
  • Objects with such difference in distance, angle, or shape are called non-fusible objects, presenting non-fusible images.
  • FIGS. 2A-D illustrate the concept of fixation disparity, as measured by a test device often called the Mallett box.
  • the Mallett box displays two vertically aligned bars, and an “X O X” horizontal “anchor”.
  • the two bars can be shifted sideways.
  • adjustable mirrors or prisms are placed in front of the patient’s eye to achieve the same horizontal shift.
  • the anchor and only one of the bars is shown for the first eye 1-1 as a centered bar 5-1-c
  • the same anchor plus only the other bar is shown for the second eye 1-2 as a centered bar 5-2-c.
  • the anchor and the centered bars 5-1-c and 5-2-c are clearly fusible. Accordingly, the brains of patients without accommodative misalignment problems will properly fuse these images.
  • FIG. 2B illustrates that patients with accommodative misalignments will not fuse the images properly. What is typically observed is that, while the images of the anchor, seen by both eyes, are properly fused into a single image, the bars are perceived as shifted. The first eye 1-1 perceives a shifted bar 5-1-s, while the second eye 1-2 perceives a shifted bar 5-2-s. The angle γ between the line to the image center and one of the visual axes 2-1 and 2-2 is called fixation disparity.
  • FIGS. 2C-D illustrate ways to measure the angle needed to counteract, or compensate, the fixation disparity.
  • the two bars are counter-shifted.
  • a counter- shifted bar 5-1-x is shown for the first eye 1-1
  • a counter-shifted bar 5-2-x is shown for the second eye 1-2.
  • the bars are counter-shifted until the patient perceives the two bars as aligned.
  • the angle corresponding to these counter-shifts, γ*, between the visual axes and line to the counter-shifted bars is measured and is typically referred to as an associated phoria.
  • the bars are not counter-shifted. Instead, adjustable, or exchangeable prisms 7 are inserted in front of the patient’s eyes. These prisms are adjusted or exchanged until the two bars are perceived as aligned by the patient. Then the prism angles, or the refraction angles of the refracted visual axes, are reported as the associated phoria γ*.
  • FIG. 3 illustrates how increasing a partial associated phoria partially compensates fixation disparity. Strictly speaking, the (full) associated phoria, that fully compensates fixation disparity, is given by the intercept of this curve with the partial associated phoria axis. If human vision were a purely optical process, the partial associated phoria would be simply equal to the negative of the partially compensated fixation disparity. Accordingly, the curve would be a straight line through the origin, tilted by -45 degrees, pointing from the upper left corner to the lower right corner. However, FIG. 3 illustrates that human vision is much more complex, and perception and image processing play crucial roles in it.
  • FIGS. 4A-C illustrate a related visual misalignment called disassociated phoria.
  • To characterize disassociated phoria, an experiment similar to that in FIGS. 2A-D can be carried out, with the difference that instead of showing fusible images 5-1 and 5-2, the optometrist shows non-fusible images 6-1-s and 6-2-s for the first eye 1-1 and the second eye 1-2.
  • these non-fusible images are the cross and the bar.
  • FIG. 4B illustrates that, once the eyes are unable to fuse the images, often one or both of the visual axes rotate outward.
  • the visual axis 2-2 of the second eye 1-2 rotates outward by an accommodative misalignment angle φ.
  • This angle φ of the outward rotation is measured and called disassociated phoria.
  • the disassociated phoria is distributed over the two eyes evenly, so the disassociated phoria per eye equals φ/2.
  • the disassociated phoria φ may manifest itself unevenly and has to be distributed between the eyes accordingly.
  • FIG. 4C shows a particularly clear case, in which no image is shown for the second eye 1-2: the view of the second eye 1-2 is blocked. This is an extreme case of non-fusible images.
  • the visual axis 2-2 of the second eye 1-2 rotates outward by a measurable disassociated phoria angle φ.
  • the AC/A is a ratio of the accommodative convergence angle reduced by the fixation disparity, α − γ/2 (expressed with its tangent, in terms of “prism diopters” Δ), divided by the accommodative distance L, expressed in diopters D.
  • a startling fact of the corresponding field of optometry is that the associated phoria angles and the disassociated phoria angles, determined by experienced practitioners, show remarkably wide variations.
  • a prism diopter of 1Δ corresponds to a 1 cm prism refraction at 1 meter distance.
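  • For illustration only (not part of the patent text), the prism diopter conversion and the AC/A ratio described above can be sketched as follows; the function names and the example numbers are assumptions.

```python
import math

def prism_diopters(angle_deg: float) -> float:
    """1 prism diopter = 1 cm of deviation at 1 m, i.e. 100 * tan(angle)."""
    return 100.0 * math.tan(math.radians(angle_deg))

def ac_over_a(alpha_deg: float, gamma_deg: float, distance_m: float) -> float:
    """AC/A as described above: the convergence angle reduced by the fixation
    disparity term (alpha - gamma/2), expressed in prism diopters, divided by
    the accommodative distance expressed in diopters (1/L)."""
    ac = prism_diopters(alpha_deg - gamma_deg / 2.0)
    a = 1.0 / distance_m
    return ac / a

# Hypothetical example: alpha = 3.4 deg, gamma = 0.2 deg, L = 0.5 m
print(round(prism_diopters(1.0), 2))       # ~1.75 prism diopters per degree
print(round(ac_over_a(3.4, 0.2, 0.5), 2))  # ~2.88
```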
  • the large variability of these methods precludes the effective determination and compensation of accommodative misalignments.
  • some embodiments include a system to determine a binocular alignment, comprising a first optical unit, including a first display, to display images for a first eye, actuatable along a longitudinal direction according to a simulated distance and an optical power of the first eye, and a first eye tracker assembly, to track a gaze direction of the first eye, adjustable in a horizontal lateral direction to accommodate a pupillary distance of the first eye; and a second optical unit, including a second display, to display images for a second eye, actuatable along the longitudinal direction according to a simulated distance and an optical power of the second eye, and a second eye tracker assembly, to track a gaze direction of the second eye, adjustable in the horizontal lateral direction to accommodate a pupillary distance of the second eye; and a computer, coupled to the first optical unit and the second optical unit, to determine the binocular alignment based on the gaze directions of the first eye and the second eye.
  • FIGS. 1A-C illustrate various accommodative misalignments.
  • FIGS. 2A-D illustrate methods to determine types of accommodative misalignments.
  • FIGS. 4A-C illustrate methods to determine disassociated phoria.
  • FIG. 7 illustrates an IR image by the eye tracker.
  • FIGS. 18A-B illustrate embodiments of a first optical unit.
  • FIG. 19 illustrates a view of the system for determining binocular alignment.
  • FIG. 20 illustrates a perspective view of the first optical unit.
  • FIG. 22 illustrates an embodiment of the system for determining binocular alignment with a graphical user interface and a patient communication interface.
  • FIG. 23 illustrates an embodiment with an auto-refractor.
  • FIG. 24 illustrates an embodiment of a system for determining binocular alignment.
  • the systems described in the present patent document address the above articulated medical needs at least in the following aspects.
  • the described system and method determine the prismatic corrections only by objective measurements, without subjective input from the patient. This aspect alone greatly reduces the patient-to-patient and practitioner-to-practitioner variations of the results.
  • studies on large samples of patients using Applicant’s system and method determined prismatic corrections with a standard deviation reduced from the above-mentioned 3Δ to well below 1Δ. This significant reduction of the results’ standard deviation alone elevated the here-described method to the status of a quantitatively predictive diagnostic method.
  • the system and method use both central and peripheral test images, because of a newly developed understanding of how the peripheral and the central prismatic corrections are connected.
  • the described system and method are a promising platform to determine a prismatic prescription that strikes the best compromise for compensating both central and peripheral accommodative misalignments.
  • the described method has two stages: it determines the eventual prismatic correction in the second stage by building on the important misalignment information acquired in the first stage. As such, the method integrates knowledge determined by different methods and benefits from the information determined by all of them.
  • One of the stages of the method involves moving test images. Therefore, the eventually determined prismatic corrections capture and integrate the dynamic prismatic response of the eye as well.
  • the reliable repeatability and small variability of the above- mentioned large scale study provided a compelling argument that Applicants’ method combined the outputs of different methods in an objective and effective manner to produce a single optimized and objective prismatic correction.
  • the here-described five aspects provide advantages individually and in combinations.
  • FIGS. 5-10 illustrate a system 10 for determining a binocular alignment
  • FIGS. 11-16 illustrate a corresponding method 100 for determining the binocular alignment.
  • FIG. 5 illustrates that in some embodiments, the system 10 for determining a binocular alignment can comprise a stereo display 20, to project visible images for a first eye 1-1 and a second eye 1-2; an accommodation optics 30, to modify the projected visible images according to an apparent distance; an eye tracker 40, to track an orientation of the first eye 1-1 and the second eye 1-2; and a computer 50, coupled to the stereo display 20, the accommodation optics 30 and the eye tracker 40, to manage a determination of the binocular alignment.
  • the eyes will be labeled as first eye 1-1 and second eye 1-2. This labeling can correspond to a left eye and a right eye, or vice versa.
  • FIG. 6A shows a detailed illustration of some embodiments of the system 10.
  • the eye tracker 40 can include infrared light emitting diodes, or IR LEDs, 42-1 and 42-2, positioned close to a front of the system 10, to project infrared eye-tracking beams on the first eye 1-1 and the second eye 1-2, as well as infrared light sources 44-1 and 44-2, to illuminate the first eye 1-1 and the second eye 1-2 with an infrared imaging light.
  • the infrared eye-tracking beams and the infrared imaging light are both reflected from the eyes 1-1 and 1-2.
  • the eye tracker 40 can further include infrared (IR) telescopes 46-1 and 46-2, with infrared (IR) cameras 48-1 and 48-2, to detect the infrared eye-tracking beams and the infrared imaging light, reflected from the first eye 1-1 and the second eye 1-2.
  • elements of the system 10 are often included in pairs, e.g., the infrared telescopes 46-1 and 46-2.
  • In what follows, a single label, e.g., “the infrared telescope 46,” abbreviates such a pair, e.g., “the infrared telescopes 46-1 and 46-2.”
  • FIG. 7 illustrates a resulting IR image 49, as detected, or sensed, by the IR camera 48.
  • the “-1”... “-4” notation here refers to the four IR LEDs, all projecting IR eye tracking beams onto the same eye.
  • the four IR LEDs 42-1, ... 42-4 project four IR eye-tracking beams onto the eye, which reflect from the cornea, creating four so called Purkinje spots P1-1, ... P1-4 in the IR image 49.
  • the accommodation optics lenses 34 — mirror 24 — IR telescope 46 axis for each eye is typically referred to as the main optical pathway in this embodiment. Also, for clarity’s sake, in figures where the optical paths and beam are shown, some labels have been simplified.
  • FIG. 8A illustrates another embodiment, where the locations of the stereo display screens 22 and the IR telescopes 46 are exchanged.
  • FIG. 8B illustrates that this embodiment can include visible-transmissive infrared (IR) mirrors 24’-1 and 24’-2, to redirect the reflected infrared eye tracking beam and the reflected infrared imaging light, together 45-1 and 45-2, reflected from the first eye 1-1 and the second eye 1-2, toward the IR telescopes 46-1 and 46-2.
  • the visible-transmissive infrared mirrors 24’-1 and 24’-2 can transmit the projected visible images 26-1 and 26-2, from the stereo display screens 22-1 and 22-2 of the stereo display 20 to the first eye 1-1 and the second eye 1-2.
  • the stereo display 20 can be positioned in the main optical pathway of the system 10, and the infrared telescopes 46 of the eye tracker 40 can be positioned peripheral to the main optical pathway of the system 10.
  • the accommodation optics lenses 34 -- mirror 24 - stereo display screen 22 axis for each eye is typically referred to as the main optical pathway in this embodiment.
  • the eye tracker 40 may include small implementations of the IR cameras 48, positioned close to the front of the system 10, slanted at a sufficiently large angle so that the IR cameras 48 do not block the projections by the stereo display screens 22.
  • the image recognition system 52 of such implementations of the eye tracker 40 can include a geometric transformation unit to determine the direction of the eye visual axes from a substantially slanted IR image 49 and Purkinje spots P1, ... P4, possibly some spots even being obscured by the slant.
  • the accommodation optics 30 can include, in place of the phoropter wheel 32, or in combination with the phoropter wheel 32, curved mirrors, trial lenses, flip in/flip out lenses, adjustable liquid lenses, deformable mirrors, z-directionally movable mirrors, rotating diffractive optical elements, translating diffractive optical elements, variable focus Moire lenses, or focusing lens groups.
  • FIGS. 10A-B illustrate that for the second technical solution, the accommodation optics 30 can include a pair of rotatable deflectors 36, rotatable prisms 38, or adjustable prisms 38 (only one shown), to deflect the projection of the images 26-1 and 26-2 to the first eye 1-1 and the second eye 1-2, to simulate a vergence of the apparent distance for the first eye and the second eye.
  • the vergence can be simulated not by the above optical elements, but by shifting the projected visible images 26-1 and 26-2 on the stereo display screens 22-1 and 22-2 towards each other, in other words, projecting them closer to each other.
  • the accommodation optics 30 and the stereo display 20 can be combined into a single light field display that includes a microlens array, where the projected visible images 26-1 and 26-2 shown on the stereo display screens 22-1 and 22-2, combined with the optical characteristics of the microlens array can be used to vary the apparent distance of the projected visible images 26-1 and 26-2 as seen by a patient.
  • the accommodation optics 30 and the stereo display 20 can be combined into a single light field display that includes a MEMS scanner, a focus modulator, or a light source.
  • FIGS. 11-16 illustrate a method 100 of how to use the above-described embodiments of the system 10 to determine a binocular alignment of the eyes 1-1 and 1-2.
  • FIG. 11 illustrates that some embodiments of the method 100 can include a measuring 120 of a disassociated phoria of the first eye 1-1 and the second eye 1-2 of a patient at an apparent distance, and a determining 140 of an accommodative convergence of the first eye 1-1 and the second eye 1-2 at the apparent distance using the measured disassociated phoria.
  • the method 100 is a two-stage method, and thus its results integrate the information and knowledge revealed by the two different stages.
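  • Purely as an illustrative outline (not the patent’s implementation), the two stages of the method 100 can be pictured as the following sketch; every helper name is a hypothetical placeholder.

```python
def method_100(system, apparent_distance_m):
    """Schematic outline of the two-stage measurement; every helper is a
    hypothetical placeholder standing in for the hardware described above."""
    # Stage 1 (measuring 120): project non-fusible images, wait for the eyes
    # to reach a relaxed state, then read the disassociated phoria from the
    # eye tracker.
    system.project_non_fusible_images(apparent_distance_m)
    system.wait_for_relaxed_state()
    disassociated_phoria = system.measure_disassociated_phoria()

    # Stage 2 (determining 140): present fusible images with the vergence
    # corrected by the measured phoria and iterate to the accommodative
    # convergence.
    system.present_fusible_images(apparent_distance_m,
                                  correction=disassociated_phoria)
    return system.determine_accommodative_convergence(disassociated_phoria)
```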
  • the measuring 120 can include projecting non-fusible visible images 26-1 and 26-2 for the first eye 1-1 and the second eye 1-2 using the stereo display 20 of the system 10.
  • the visible images 26-1 and 26-2 of FIGS. 5-10 will be simply referred to as images 26-1 and 26-2 in what follows.
  • Examples of projecting non-fusible images in order to determine a disassociated phoria have been described, e.g., in relation to FIGS. 2C-D.
  • the two non-fusible images 6-1-s and 6-2-s were of comparable appearance, or dominance.
  • Some embodiments of the method 100 also involve projecting such non-fusible images of comparable dominance.
  • Achieving this relaxed state can be inferred, for example, by the eye tracker 40 determining that the movement of the eye 1-2 slowed below a threshold, or changed from a directional movement to a random jitter, or came to a halt.
  • the disassociated phoria can be measured by measuring an orientation of at least one of the first eye 1-1 and the second eye 1-2 by the eye tracker 40.
  • FIG. 12 describes implementations of these steps in more detail, and FIGS. 13A-D illustrate these steps in a particular embodiment.
  • the measuring 120 can include the following steps.
  • FIG. 13A left panel illustrates that the projecting of a centered image step 122 can include projecting a centered image 201-1, a cross in this case, on the stereo display screen 22-1 of the stereo display 20 of the system 10.
  • the projecting 122 can be done with an apparent distance vergence 206.
  • a reference axis 202-1 is introduced for reference as a central normal that connects a center of the first eye 1-1 with a center of the stereo display screen 22-1.
  • the centered image 201-1 is centered in the sense that it is moved off the center of the stereo display screen 22-1 only by the apparent distance vergence angle α(L) to simulate the apparent distance vergence 206. For brevity’s sake, this angle will sometimes be referred to simply as the vergence angle α.
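  • As an illustrative sketch only (assuming a simple on-axis geometry without the accommodation optics; the function names and numbers are hypothetical), the lateral shift that simulates the vergence of an apparent distance can be estimated as follows.

```python
import math

def vergence_angle_deg(simulated_distance_m: float, mono_pd_m: float) -> float:
    """alpha(L): inward rotation needed to fixate a midline target at distance L,
    for an eye sitting mono_pd_m off the midline."""
    return math.degrees(math.atan(mono_pd_m / simulated_distance_m))

def centered_image_offset_m(display_distance_m: float,
                            simulated_distance_m: float,
                            mono_pd_m: float) -> float:
    """Lateral shift of the centered image on the screen that produces the
    vergence angle alpha(L) (thin, on-axis geometry; no accommodation optics)."""
    alpha_rad = math.atan(mono_pd_m / simulated_distance_m)
    return display_distance_m * math.tan(alpha_rad)

# Hypothetical numbers: mono-PD 31 mm, simulated distance 0.6 m, screen 0.1 m away
print(round(centered_image_offset_m(0.1, 0.6, 0.031) * 1000, 1), "mm")  # ~5.2 mm
```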
  • the definition of the first eye visual axis 204-1 can incorporate a lens or any other relevant portion of the accommodation optics 30-1, through which the first eye 1-1 is observing the centered image 201-1.
  • FIG. 13A right panel illustrates the projecting of a distributed image step 124 for the second eye 1-2, in this case, a set of irregularly placed balls or spheres of random size and position, without an apparent center.
  • the centered image 201-1 is an example of a dominant image
  • the distributed image 201-2 is an example of a non-dominant image.
  • the centered, dominant image 201-1 and the distributed, non-dominant image 201-2 are examples of non- fusible images.
  • the stereo display screen 22-2 can be simply darkened as another embodiment of the non-fusible distributed image 201-2, instead of the irregularly placed balls, in analogy to the block in FIG. 4C.
  • FIG. 13B illustrates that, as described earlier, the second eye 1-2 will initially also turn inward by approximately the same apparent distance vergence angle α as the first eye 1-1, but, after the brain fails to fuse the non-fusible central image 201-1 and distributed image 201-2, the second eye 1-2 wanders away.
  • the eye tracker 40 can execute the tracking step 126 of the second eye 1-2 until the optometrist, or an automated program, determines that the wandering second eye 1-2 reached a relaxed state from a stabilization of the tracked rotation in the identifying step 128.
  • This stabilization can be defined in various ways: from the eye coming to a stop, or an amplitude of the eye’s jitter becoming less than a threshold, or a directional rotation of the eye evolving into a directionless wandering.
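  • A minimal sketch of such a stabilization test is given below, assuming hypothetical gaze samples from the eye tracker and illustrative thresholds; none of these values or names come from the patent.

```python
def is_relaxed(gaze_samples_deg, drift_threshold_deg=0.05, jitter_threshold_deg=0.2):
    """Heuristic stabilization test over a short window of tracked gaze angles.

    gaze_samples_deg: recent horizontal gaze angles of the wandering eye (degrees).
    The eye is treated as relaxed when the net drift across the window is small
    (no directional rotation) and the residual jitter stays below a threshold.
    Both thresholds are illustrative placeholders.
    """
    drift = abs(gaze_samples_deg[-1] - gaze_samples_deg[0])
    mean = sum(gaze_samples_deg) / len(gaze_samples_deg)
    jitter = max(abs(s - mean) for s in gaze_samples_deg)
    return drift < drift_threshold_deg and jitter < jitter_threshold_deg

# Example: a nearly flat window of samples is reported as relaxed
print(is_relaxed([1.31, 1.33, 1.30, 1.32, 1.31]))
```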
  • the tracking step 126 may involve tracking a rotation of the first eye 1-1, the second eye 1-2, or both.
  • the disassociated phoria 208 can be defined from measuring 130 a first eye phoria angle φ-1, a second eye phoria angle φ-2, and determining the disassociated phoria φ as some type of a mean of φ-1 and φ-2.
  • FIGS. 13A-B illustrated that the steps 122-130 of the overall measuring step 120 can be performed at a near vision distance, e.g., L being in the range of 40 cm-100 cm.
  • L can be in the 1 m-10 m range.
  • the method 100 can be performed at near vision distances corresponding to 1-3 D, and at distance vision distances corresponding to 0-0.5 D.
  • the result of the measuring step 120, the first stage of the method 100, is the disassociated phoria 208, with its disassociated phoria angle φ.
  • the second stage of the method 100, the determining step 140, carries out additional tests of the prismatic misalignment that build on the just determined disassociated phoria 208. Therefore, the overall method 100 is a combination of the first and second stages and thus the method 100 integrates two distinct tests of prismatic misalignments, and thus integrates knowledge and data about two different types of the binocular alignment. Doing so promises a qualitatively more complete treatment and a qualitatively better improvement of the visual acuity.
  • FIG. 14 illustrates that the determining step 140 can include a presenting step 142 of a first image for the first eye and a second image for the second eye, with the apparent distance vergence, corrected with the measured disassociated phoria, using the stereo display; wherein the first image and the second image are fusible.
  • tracking 146 an adjustment of the first eye in response to the projecting of the first added central image, and tracking an adjustment of the second eye in response to the projecting of the second added central image, using an eye tracker; projecting 148 a shifted first added central image with a first iterative associated phoria, to reduce the adjustment of the first eye, and projecting a shifted second added central image with a second iterative associated phoria, to reduce the adjustment of the second eye, in an alternating manner, using the stereo display and a computer; tracking 150 an adjustment of the first eye in response to the projecting of the shifted first added central image, and tracking an adjustment of the second eye in response to the projecting of the shifted second added central image, using the eye tracker; determining 152 whether an effective adjustment of the first and second eye is less than an adjustment threshold, and returning to the projecting the shifted first added central image step if the effective adjustment of the first and second eye is greater than the adjustment threshold; and identifying 154 a stabilized associated phoria from the last first iterative associated phoria and the last second iterative associated phoria.
  • the angles will be referenced to the apparent distance vergence corrected by the disassociated phoria, having the angle α − φ/2, instead of the reference axis 202.
  • the determining step 152 can be performed to determine whether an effective adjustment of the first and second eye is less than an adjustment threshold. Using the above framework, the determining step 152 may evaluate whether the change of the adjustment angle is less than the adjustment threshold.
  • the effective adjustment can be defined in various ways. It can involve the change of the adjustment angle of only one of the eyes; one possible form is sketched below.
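  • The following is only an illustrative sketch of the iterative loop of steps 146-154, with hypothetical callbacks standing in for the eye tracker and stereo display, and one assumed definition of the effective adjustment.

```python
def stabilized_associated_phoria(track_adjustments, project_shifted,
                                 adjustment_threshold_deg=0.1):
    """Sketch of steps 146-154.  track_adjustments() returns the measured
    adjustment angles of the two eyes; project_shifted() counter-shifts the
    added central images by the current iterative associated phorias.
    Both callbacks and the threshold are hypothetical placeholders."""
    iterative_phoria_first = 0.0
    iterative_phoria_second = 0.0
    while True:
        adj_first, adj_second = track_adjustments()        # steps 146 / 150
        effective = max(abs(adj_first), abs(adj_second))   # one assumed definition
        if effective < adjustment_threshold_deg:           # step 152
            return iterative_phoria_first, iterative_phoria_second  # step 154
        # step 148: shift the images further to reduce the measured adjustments
        iterative_phoria_first += adj_first
        iterative_phoria_second += adj_second
        project_shifted(iterative_phoria_first, iterative_phoria_second)
```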
  • the binocular alignment can again be characterized by the AC/A ratio, the ratio of the accommodative convergence AC to the accommodative response A.
  • This AC/A ratio can be determined for a single distance, or can be formed from AC and A values for multiple distances.
  • the fully corrected accommodative convergence AC will be simply referred to as accommodative convergence AC.
  • the method 100 can include determining a distance vision accommodative convergence AC(Ld) as an accommodative convergence resulting from performing the method 100 at a distance vision apparent distance Ld; and determining a near vision accommodative convergence AC(Ln) as an accommodative convergence resulting from performing the method at a near vision apparent distance Ln.
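  • One common way of forming an AC/A ratio from two such measurements is the change in accommodative convergence per change in accommodative demand; the sketch below is for illustration only, and the formula and numbers are assumptions rather than text taken from the patent.

```python
def ac_a_from_two_distances(ac_near_pd: float, ac_dist_pd: float,
                            l_near_m: float, l_dist_m: float) -> float:
    """Change in accommodative convergence (prism diopters) divided by the
    change in accommodative demand (diopters) between two apparent distances."""
    delta_ac = ac_near_pd - ac_dist_pd
    delta_a = 1.0 / l_near_m - 1.0 / l_dist_m
    return delta_ac / delta_a

# Hypothetical example: AC(0.5 m) = 10 prism diopters, AC(4 m) = 1 prism diopter
print(round(ac_a_from_two_distances(10.0, 1.0, 0.5, 4.0), 2))  # ~5.14
```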
  • the method 100 does not use the patient’s subjective responses as key inputs.
  • Embodiments of the system for determining binocular alignment 310 do not involve phoropter wheels 32-1 and 32-2: they perform both of the above functions by having made the displays 322-1 and 322-2 actuatable along the longitudinal direction according to the simulated distance and the optical power of the eyes 1-1 and 1-2.
  • the elimination of the phoropter wheels 32-1 and 32-2 makes the physical size of the system for determining binocular alignment 310 notably smaller than that of the system for determining binocular alignment 10 which uses phoropter wheels 32-1 and 32-2. This is an advantage in an optometrist’s crowded office where physical space is at a premium.
  • the first display 322-1 and the second display 322-2 can travel over a longitudinal range in the 50-200 mm range, in some embodiments, in the 75-125 mm range.
  • the closest longitudinal distance of the first and second displays 322-1 and 322-2 to the first and second eye tracker assemblies 340-1 and 340-2 can be in the 5-40 mm range, in others, in the 10-30 mm range.
  • the system for determining binocular alignment 310 can simulate prescription optical powers in a range of -20D to +20D, or less, in others in a range of -10D to +10D, or less, in yet other embodiments in an asymmetric range, such as -10D to +20D, or less.
  • when a nearer object is simulated by displaying images for the first and second eyes 1-1 and 1-2 shifted closer to the center of the system 310, the first and second eye tracker assemblies 340-1 and 340-2, together with their frontal lenses, can be horizontally laterally actuated so that the patient is still looking at the nearer objects through a center of the frontal lens of the system for determining binocular alignment 310, thereby avoiding an unintended prismatic effect.
  • FIG. 18A illustrates that in some embodiments of the system for determining binocular alignment 310, within the first optical unit 315-1, the first eye tracker assembly 340-1 can include one or more first infrared light emitting diodes (IR LEDs) 342-1, to project an infrared (IR) eye-tracking beam 342b-1 on the first eye 1-1. Further, the first eye tracker assembly 340-1 can also include a first infrared (IR) light source 344-1, to illuminate the first eye 1-1 with an infrared (IR) imaging light 344b-1.
  • the second eye tracker assembly 340-2 can include one or more second infrared (IR) light emitting diodes 342-2, to project an infrared (IR) eye-tracking beam 342b-2 on the second eye 1-2, a second infrared (IR) light source 344-2, to illuminate the second eye 1-2 with an infrared imaging light 344b-2, and a second infrared (IR) camera 348-2, to detect the IR eye-tracking beam 342b-2 after reflection from the eye 1-2, and the IR imaging light 344b-2, after reflection from the second eye 1-2, collectively labeled reflected IR beam and IR light 345b-2, through a second IR optics 346-2.
  • since the second eye tracker assembly 340-2 is analogous to the first eye tracker assembly 340-1, there is no need to show it expressly.
  • the xyz coordinate system of FIG. 17 is also shown, from a perspective rotated relative to that of FIG. 17.
  • the number of the first and second IR LEDs 342-1 and 342-2 can be in the range of 1-10, in some embodiments in the range of 2-4.
  • the first infrared light source 344-1 can include a set of individual infrared light emitting diodes, spatially distributed in order to illuminate the first eye 1-1 with a dispersed infrared imaging light 344b- 1; and the second infrared light source 344-2 can include a set of individual infrared light emitting diodes, spatially distributed in order to illuminate the second eye 1-2 with a dispersed infrared imaging light 344b-2.
  • FIGS. 18A-B illustrate that the one or more first infrared (IR) light emitting diodes 342-1 can be positioned at different positions in the first eye-tracker assembly 340-1.
  • in FIG. 18A, the first infrared (IR) light emitting diodes 342-1 are positioned at a frontal area of the first eye tracker assembly 340-1, close to the first eye 1-1.
  • the IR eye tracking beam 342b-1 may make a larger angle with the main optical axis of the first optical unit 315-1, possibly complicating the centering of the reflected IR light.
  • in FIG. 18B, the one or more first infrared (IR) light emitting diodes 342-1 are positioned much higher upstream along the optical path, in the proximity of the first infrared (IR) camera 348-1, often close to its central first IR optics 346-1.
  • the IR eye tracking beam 342b-1 can be well aligned with the main optical axis of the first optical unit 315-1.
  • the IR beam 342b- 1 is often directed by the first IR LEDs 342-1 to reflect from the apex of the cornea to yield a central P1 Purkinje reflection.
  • the determination of the gaze direction can also involve determining one of the pupillary attributes, such as the location of the pupil center, or how much ellipticity the image of the pupil has.
  • if the eye optical axis is aligned with a main optical axis of the first eye tracker assembly 340-1, then the pupil of the eye 1-1 will appear as a circle for typical eyes.
  • if the gaze direction of the eye 1-1 turns away from this main optical axis by a rotation angle, the same pupil will appear as an ellipse.
  • Analyzing the ellipticity of this ellipse, as, e.g., given by the ratio of its minor axis to its major axis, and determining the directions of these axes delivers important information about the gaze direction’s rotation angle.
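  • As an illustrative sketch only (the cosine model and the example numbers are assumptions, not the patent’s algorithm), the rotation angle implied by the pupil’s ellipticity can be estimated as follows.

```python
import math

def rotation_from_ellipticity(minor_axis: float, major_axis: float) -> float:
    """For a circular pupil viewed off-axis, the image foreshortens along the
    rotation direction, so minor/major is approximately cos(rotation angle).
    Returns the unsigned rotation angle in degrees; the directions of the
    ellipse axes indicate the direction of the rotation."""
    ratio = max(0.0, min(1.0, minor_axis / major_axis))
    return math.degrees(math.acos(ratio))

print(round(rotation_from_ellipticity(0.97, 1.00), 1))  # ~14.1 degrees
```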
  • Yet other pupillary attributes can involve imaging the iris and recording the location of a specific feature of the iris. Determining the pupillary attributes can involve edge recognition software to identify the precise edges of the pupils.
  • the image analysis system 352 is often operated by first instructing the patient to look straight ahead, and then registering and recording the location of the Purkinje reflection P1 and the pupil center of the patient by the first and second IR cameras 348-1 and 348-2. (As at other loci in this document, since the second eye tracker assembly 340-2 is analogous to the first eye tracker assembly 340-1, for brevity it is not illustrated in a separate, repetitive figure.) In addition, the ellipticity and other pupillary attributes of the eye can be also recorded.
  • the image analysis system 352 can use the location of the centers of the pupil in the xy plane, as determined from the IR image, formed from the reflected IR lights 344b-1 and 344b-2, and the locations of the Purkinje reflections P1 from the apex of the cornea, as determined from the reflected IR beams 342b-1 and 342b-2. If the pupil centers overlap, or coincide, with the corneal apexes in the xy plane, then the eye is looking straight forward, as in the reference IR images. When the pupil centers and the corneal apexes are offset in the xy plane, then from the direction and magnitude of the offsets the image analysis system 352 can determine the rotational angle of the gaze direction of each eye relative to the reference direction.
  • the image analysis system 352 can take the locations of the pupil center and corneal apex in an image of a rotated eye, then subtract the reference locations of these, and from the so-constructed differences, determine the rotational angle of the gaze direction of the eyes 1-1 and 1-2 by which the eyes responded to the projected visible images 326-1 and 326-2.
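  • A minimal sketch of this pupil-center / corneal-reflection comparison is given below; the coordinate handling is simplified and the degrees-per-millimetre gain is an assumed calibration constant, not a value from the patent.

```python
def gaze_rotation_deg(pupil_xy, purkinje_xy, ref_pupil_xy, ref_purkinje_xy,
                      deg_per_mm=11.0):
    """All positions are (x, y) in millimetres in the IR camera's image plane.

    The pupil-center-to-Purkinje vector of the rotated eye is compared against
    its value in the straight-ahead reference image; the residual offset is
    scaled by an assumed calibration gain to horizontal and vertical rotation
    angles of the gaze direction."""
    dx = (pupil_xy[0] - purkinje_xy[0]) - (ref_pupil_xy[0] - ref_purkinje_xy[0])
    dy = (pupil_xy[1] - purkinje_xy[1]) - (ref_pupil_xy[1] - ref_purkinje_xy[1])
    return dx * deg_per_mm, dy * deg_per_mm

# Hypothetical example: the pupil-to-Purkinje vector grew by 0.2 mm horizontally
print(gaze_rotation_deg((1.2, 0.0), (1.0, 0.0), (0.5, 0.0), (0.5, 0.0)))  # (~2.2, 0.0)
```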
  • Other embodiments can determine the gaze directions by other methods, such as other pupillary attributes and/or other Purkinje reflections.
  • Yet other embodiments can use multiple pupillary attributes without Purkinje reflections. Yet others can do the opposite: use multiple Purkinje reflections without pupillary attributes.
  • since the eyes perform quick saccadic motions many times a second, the gaze directions rapidly vary in time. Therefore, the above-mentioned Purkinje reflections and pupil centers, and possibly other pupillary attributes, are representative of a specific gaze direction only if they are measured close to each other in time. And in reverse: if they are measured with a substantial time difference, bigger than 0.1 second, or 1 second, or more, then the gaze direction computed by the image analysis system 352 may be less and less accurate.
  • the one or more first infrared light emitting diodes 342-1 project the infrared eye-tracking beam (IR beam) 342b-1 in an alternating manner with the first infrared light source 344-1 illuminating with the infrared imaging light 344b-1; and the one or more second infrared light emitting diodes 342-2 project the infrared eye-tracking beam 342b-2 in an alternating manner with the second infrared light source 344-2 illuminating with the infrared imaging light 344b-2.
  • the frequency of the alternation can be in the 1-1,000 Hz range, in some embodiments in the 10-150 Hz range, in some embodiments in the 60-120 Hz range.
  • the first and second IR cameras 348-1 and 348-2 can determine the Purkinje reflections and pupil centers, and possibly other pupillary attributes, within 1-1,000 milliseconds of each other, in other embodiments within 6-100 milliseconds, in yet others 8-16 milliseconds. Determining the Purkinje reflections and pupil centers, and possibly other pupillary attributes, so close to each other advantageously increases the accuracy of the computation of the gaze direction by the image analysis system 352. As mentioned before, in some embodiments of the system for determining binocular alignment 310, only multiple pupillary attributes are determined, in other embodiments of system 310 only multiple Purkinje reflections. Determining either of these with the above repetition rates also increases the accuracy of the determination of the gaze directions.
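  • A trivial illustrative check of this pairing requirement could be written as follows; the 16 ms value is just one example from the ranges quoted above, and the function itself is not from the patent.

```python
def paired_measurement_valid(t_purkinje_s: float, t_pupil_s: float,
                             max_separation_s: float = 0.016) -> bool:
    """The Purkinje-spot frame and the pupil-image frame describe the same gaze
    direction only if they were captured close together in time."""
    return abs(t_purkinje_s - t_pupil_s) <= max_separation_s

print(paired_measurement_valid(0.000, 0.012))  # True: 12 ms apart
print(paired_measurement_valid(0.000, 0.150))  # False: 150 ms apart
```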
  • the first eye tracker assembly 340-1 also includes a first visible-transmissive infrared mirror 324-1, positioned to transmit images from the first display 322-1 along the longitudinal direction to the first eye 1-1; and to redirect the reflected infrared eye-tracking beam 342b-1 and the infrared imaging light 344b-1, together labeled 345b-1, from the first eye 1-1 to the first infrared camera 348-1 in a lateral direction; and the second eye tracker assembly 340-2 includes a second visible-transmissive infrared mirror 324-2, positioned to transmit images from the second display 322-2 along the longitudinal direction for the second eye 1-2; and to redirect the reflected infrared eye-tracking beam and the infrared imaging light, together 345b-2, from the second eye 1-2 to the second infrared camera 348-2 in the lateral direction.
  • the first infrared camera 348-1 is positioned relative to the first visible-transmissive infrared mirror 324-1 in one of a vertical lateral and a horizontal lateral direction; and the second infrared camera 348-2 is positioned relative to the second visible-transmissive infrared mirror 324-2 in one of the vertical lateral and the horizontal lateral direction.
  • the horizontal lateral direction corresponds to the x axis
  • the vertical lateral direction corresponds to the y axis of the xyz coordinate system of FIGS. 17-18.
  • Such occlusion problems by the eyelashes are avoided in the present system for determining binocular alignment 310 by making the reflected IR beams and IR imaging lights 345b-1 and 345b-2 share the main optical path, leaving the eye in the normal z/longitudinal direction, and then being redirected by the first and second visible-transmissive IR mirrors 324-1 and 324-2.
  • the first display 322-1 is actuatable to a first longitudinal position according to the simulated distance, wherein the first longitudinal position is dynamically corrected according to the optical power of the first eye 1-1; and the second display 322-2 is actuatable to a second longitudinal position according to the simulated distance, wherein the second longitudinal position is dynamically corrected according to the optical power of the second eye 1-2.
  • the first and second displays 322-1 and 322-2 are actuatable continuously along the longitudinal/z direction, which allows for a more precise correction of the simulated distance according to the optical power, or prescription, of the eyes 1-1 and 1-2 of the patient.
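  • The dependence of the display’s longitudinal position on both the simulated distance and the eye’s prescription can be pictured with a rough thin-lens sketch; the focal length, the sign conventions, and the function name below are illustrative assumptions only and are not taken from the patent.

```python
def display_position_m(simulated_distance_m: float,
                       eye_spherical_error_d: float,
                       lens_focal_length_m: float = 0.05) -> float:
    """Rough thin-lens sketch of the longitudinal display placement.

    Desired vergence arriving at the eye:  V = -1/L_sim + Rx
    (Rx is the eye's spherical error, negative for myopia; the display moves so
    that the uncorrected eye sees the image sharply at the simulated distance.)
    For a display at distance d behind a thin lens of power P = 1/f:
        -1/d + P = V   =>   d = 1 / (P - V)
    All values and conventions here are illustrative assumptions."""
    desired_vergence = -1.0 / simulated_distance_m + eye_spherical_error_d
    lens_power = 1.0 / lens_focal_length_m
    return 1.0 / (lens_power - desired_vergence)

# Emmetropic eye, 0.5 m simulated distance, assumed f = 50 mm: ~45.5 mm behind the lens
print(round(display_position_m(0.5, 0.0) * 1000, 1), "mm")
```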
  • the horizontal lateral position of the images can be moved accordingly on the first and second displays 322-1 and 322-2 by the computer 350.
  • FIGS. 18A-B also illustrate that the first optical unit 315-1 can include a first lens assembly 360-1 to receive and guide the infrared eye-tracking beam and the infrared imaging light, both reflected from the first eye and together labeled 345b-1, towards the first infrared camera 348-1, and to reduce at least one of a chromatic aberration, an optical aberration, an optical astigmatism, and a wavefront distortion; and the second optical unit 315-2 can include a second lens assembly 360-2 to receive and guide the infrared eye-tracking beam and the infrared imaging light, both reflected from the second eye and together labeled 345b-2, towards the second infrared camera 348-2, and to reduce at least one of a chromatic aberration, an optical aberration, an optical astigmatism, and a wavefront distortion.
  • the elements of the second optical unit 315-2 are not shown explicitly for brevity; they are analogous to those of the first optical unit 315-1.
  • the first infrared camera 348-1 and the first lens assembly 360-1 are adjustable together; and the second infrared camera 348-2 and the second lens assembly 360-2 are adjustable together.
  • without such joint adjustability, the infrared cameras 348-1 and 348-2 would need to be much larger, so as to be able to retain the high resolution and low distortion of the images even if the first and second lens assemblies 360-1 and 360-2 have been adjusted to an off-center, misaligned position.
  • the first and second infrared cameras 348-1 and 348-2 can be made much smaller since the collinearity with the first and second lens assemblies 360-1 and 360-2 is maintained in spite of the adjustments.
  • the smaller size of the first and second infrared cameras 348-1 and 348-2 advantageously reduces the size of the entire system to determine a binocular alignment 310.
  • FIG. 19 illustrates an embodiment of the system for determining binocular alignment 310. It shows the same elements as FIGS. 17-18, from the top, y direction, or vertical lateral direction looking down, similarly to FIG. 17. In particular, the directions of the longitudinal/z directional actuation, and the horizontal lateral/x direction are well-demonstrated.
  • FIG. 20 illustrates an embodiment of the first optical unit 315-1 of the system for determining binocular alignment 310 from a perspective view. Besides the previously described elements, the further element of a first z actuator 347-1 is visible, configured to actuate the first display 322-1 along the longitudinal/z direction. Further, a first coupling 354-1 to the computer 350 is also visible, coupling the first display 322-1 to the computer 350 with a set of flexible or deformable communication lines.
  • the first display 322-1 can be configured to display images for the first eye 1-1 modified according to at least one of an optical power, a cylinder, and a prism of the first eye 1-1; and the second display 322-2 can be configured to display images for the second eye 1-2 modified according to at least one of an optical power, a cylinder, and a prism of the second eye 1-2.
  • the first display 322-1 and the second display 322-2 may include a liquid crystal display, a light emitting diode (LED) display, an organic LED display, a quantum dot LED display, a microlens array, a digital mirror device, or a scanning projector micro-electrical-mechanical system.
  • FIG. 21 illustrates a frontal, z directional view of the system for determining binocular alignment 310. This is what is visible for the patient.
  • the first and second lens assemblies 360- 1 and 360-2 are shown. Beyond that, some embodiments include a nose bridge 370, located centrally between the first optical unit 315-1 and the second optical unit 315-2, configured to receive and immobilize a patient’s nose.
  • Such embodiments provide progress relative to related diagnostic systems. Quite a few related diagnostic systems intend to immobilize the patient’s head and eyes with a variant of a chin rest, where the patient rests her/his chin. However, the chin still acts as an axis of rotation for the patient’s head, and therefore the eyes can still rotate around the rested chin with the chin-eye distance as a radius, causing rotational misalignment with the diagnostic apparatus. This remaining rotational misalignment can be minimized or eliminated by immobilizing the patient’s head and eyes at the nose instead of at the chin.
  • Another advantage is demonstrated by FIGS. 17-21. Denoting the center of the system for determining binocular alignment 310 as center 311, for a fraction of patients the pupil center of their first eye 1-1 and that of their second eye 1-2 are not at an equal distance from the center of symmetry of their heads. These differences can be 1-2 mm, enough to cause notable errors if the measurements are analyzed assuming a symmetric positioning of the eyes 1-1 and 1-2.
  • for this reason, instead of using a single overall pupillary distance PD, some embodiments use a first/left eye mono-pupillary distance 4-1, defined relative to the center 311, adjustable independently from the second/right eye mono-pupillary distance 4-2, again defined relative to the center 311.
  • this is realized by making the first eye optical unit 315-1 adjustable in the horizontal lateral/x direction relative to the nose bridge 370 to accommodate the mono-pupillary distance 4-1 of the first eye 1-1, as indicated with a block arrow; and making the second optical unit 315-2 adjustable in the horizontal lateral/x direction relative to the nose bridge 370 to accommodate the mono-pupillary distance 4-2 of the second eye 1-2.
  • FIG. 22 illustrates further features of embodiments of the system for determining binocular alignment 310.
  • Some embodiments can include a graphical user interface 380, configured for a medical operator to interact with the computer 350 to manage the determination of the binocular alignment.
  • This graphical user interface 380 can show to the medical operator, such as an optometrist or technician, the infrared images captured by the first and second IR cameras 348-1 and 348-2, the movement of the eyes 1-1 and 1-2, the available diagnostic steps to choose from, and parameters of the diagnostic procedure to set, among others.
  • Yet other embodiments of the system for determining binocular alignment 310 can include a patient communication interface 385, such as a loudspeaker, to instruct a patient to follow steps of the determination of the binocular alignment. These instructions can come from a remote operator, or they can be pre-recorded, and synchronized with the computer 350 projecting specific visible images 326-1 and 326-2.
  • Other embodiments of the patient communication interface 385 can include a patient feedback portal, to receive feedback from the patient. Examples include a push-button, a track wheel, a touchpad, a microphone, and an audio- interactive device. With any of these patient feedback portals, the patient can select feedback in response to a step of the diagnostic process.
  • the computer 350 may start adjusting the longitudinal/z position of the first display 322-1, and the loudspeaker of the patient communication interface 385 can convey the pre-recorded instruction to the patient: “indicate when the image is clear by pushing the button”.
  • the computer 350 can record the longitudinal/z position of the first display 322-1 that is informative regarding the patient’s prescription, or optical power of the eye 1-1.
  • the computer can move projected visible images 326-1 and 326-2 in a horizontal lateral/x direction on the first and second displays 322-1 and 322-2, and ask the patient to indicate through a push-button when the two images 326-1 and 326-2 are fused, or when the fusion of the two images is broken.
  • the horizontal lateral/x positions of the two images 326-1 and 326-2 are informative regarding the binocular alignment of the patient’s eyes 1-1 and 1-2.
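  • For illustration only, such a horizontal image shift can be expressed as an equivalent prismatic deviation; the conversion below assumes the images are seen at a known apparent optical distance, and the function and numbers are assumptions rather than text from the patent.

```python
def offset_to_prism_diopters(x_offset_m: float, apparent_distance_m: float) -> float:
    """1 prism diopter = 1 cm of deviation per 1 m of viewing distance, so a
    lateral shift x seen at an apparent distance d corresponds to 100 * x / d."""
    return 100.0 * x_offset_m / apparent_distance_m

# Hypothetical example: a 5 mm shift seen at an apparent distance of 0.5 m
print(offset_to_prism_diopters(0.005, 0.5))  # 1.0 prism diopter
```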
  • FIG. 23 illustrates that in some embodiments, the first eye-tracker assembly 340-1 can include a first auto-refractor 400-1, to determine refractive information about the first eye 1-1; and the second eye-tracker assembly 340-2 can include a second auto-refractor 400-2, to determine refractive information about the second eye 1-2.
  • the second auto-refractor 400-2 can be analogous to the first auto-refractor 400-1 and thus does not need to be shown expressly.
  • the refractive information can be simply the refractive power of the investigated eye, needed to perform the method 100.
  • the prescription of the patient may have changed unbeknownst to her/him since the last examination by the optometrist.
  • the optometrist may want to track the degree of accommodation in response to moving the first display 322- 1 in the longitudinal/z direction.
  • the optometrist may want to check a higher order astigmatism or aberration.
  • the first auto-refractor 400-1 can include a first wavefront (WF) infrared (IR) light source 402-1, to project a WF IR light 402b- 1 into the first eye 1-1.
  • This first WF IR light source 402-1 can have many different embodiments, including an LED, an LED array, a superluminescent LED (SLED), and an expanded-beam laser, among others.
  • the WF IR light 402b- 1 can be guided through a first collimator 404-1, and a first polarizing beam splitter 406-1, whose transmitting polarization plane is aligned with the polarization plane of the first WF IR light source 402-1.
  • the first WF IR light 402b-1 can be coupled into the optical pathway of the first eye tracker assembly 340-1 through a first beam splitter 410-1, optionally through a first refractor lens 408-1. From here, the WF IR light 402b-1 can be guided to the first eye 1-1 via the main optical pathway of the first eye tracker assembly 340-1 that includes the first visible-transmissive IR mirror 324-1 and the first lens assembly 360-1, as shown in FIG. 23. The (typically pencil-beam-like) WF IR light 402b-1 then reflects from the retina of the first eye 1-1 into a wider spatial angle as a reflected WF IR light 402r-1.
  • the reflected WF IR light 402r-1 propagates through the lens and cornea of the first eye 1-1, its expanding wavefront gets modified by refraction through the lens and the cornea, and thus acquires information about the refractive properties of the lens and cornea of the first eye 1-1.
  • the reflected WF IR light 402r- 1 propagates back through the main optical pathway of the first eye tracker assembly 340-1, gets split out of it by the first beam splitter 410-1, and is eventually guided by the first polarizing beam splitter 406-1 towards a first microlens array 412-1.
  • This first microlens array 412-1 is configured to receive and split the reflected WF IR light 402r-1 from the first eye 1-1 into beamlets.
  • the beamlets are then captured by a first wavefront camera 414-1 to be analyzed to determine the refractive information they carry about the first eye 1-1.
  • the above-described embodiment of the auto-refractor 400-1 broadly follows the design of the Shack-Hartmann wavefront analyzers.
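  • A highly simplified, illustrative sketch of the Shack-Hartmann idea follows, estimating only the defocus term from assumed spot displacements; a real analyzer fits Zernike terms and accounts for the relay optics between eye and sensor, and nothing below is taken from the patent.

```python
import statistics

def defocus_from_spot_shifts(spot_shifts_mm, lenslet_radii_mm, lenslet_focal_mm):
    """Estimate the defocus (sphere) term of a wavefront from Shack-Hartmann spots.

    spot_shifts_mm:   radial displacement of each beamlet's spot from its
                      reference position on the wavefront camera.
    lenslet_radii_mm: radial position of the corresponding lenslet from the
                      array center.
    lenslet_focal_mm: focal length of the microlenses.
    For a purely defocused wavefront the local slope grows linearly with radius,
    slope = shift / f_lenslet = C * r, and the curvature C converted from 1/mm
    to 1/m gives the defocus in diopters (ignoring pupil-relay magnification).
    """
    curvatures = [
        (shift / lenslet_focal_mm) / r
        for shift, r in zip(spot_shifts_mm, lenslet_radii_mm) if r != 0.0
    ]
    return statistics.mean(curvatures) * 1000.0  # 1/mm -> 1/m = diopters

# Hypothetical spots: 1 micrometre of extra shift per mm of radius, f_lenslet = 5 mm
print(round(defocus_from_spot_shifts([0.001, 0.002, 0.003], [1.0, 2.0, 3.0], 5.0), 2))  # ~0.2 D
```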
  • Other embodiments can use other wavefront analyzing designs, such as Talbot-Moire interferometry, slit lamp technology, Tscherning aberrometry, lensometer technology, and the like.
  • Lensometer devices can, in fact, capture optical characteristics of the eye beyond the sphere/refractive power. These characteristics include the cylinder power and axis information, among others.
  • a class of binocular alignment problems is called “accommodation lag”. This refers to the phenomenon when a patient is presented with an object at a presentation distance d1, but the patient’s eyes focus at a different distance d2 that does not equal d1. Often d2 is larger than d1: d2 > d1. Systems 310 with an auto-refractor 400-1 can recognize and diagnose such an accommodation lag.
  • a primary goal of the systems for determining binocular alignment 310 is to diagnose and characterize the cooperation and crosslinking of the two systems that control human vision: the focusing system, which focuses the crystalline lens on objects at their actual distance by engaging the ciliary muscles; and the vergence system, which rotates both eyes to look at objects at their actual distance by engaging the six extraocular muscles.
  • embodiments of the system 310 to determine a binocular alignment are configured to determine a vergence response and an accommodative response in an integrated manner: the first display 322-1 and the first eye tracker assembly 340-1, together with the second display 322-2 and the second eye tracker assembly 340-2, are configured to determine the vergence response; and the first display 322-1 and the first auto-refractor 400-1, together with the second display 322-2 and the second auto-refractor 400-2, are configured to determine the accommodative response.
  • the computer 350 can be configured to carry out steps of this method 100.
  • the computer 350 can be configured to determine a Fixation Disparity of a patient as an amount of angular misalignment between a central target and a peripheral fusion lock of moving targets around an image with a blank center, as part of the determining of the binocular alignment.
  • the computer 350 can also be configured to determine a Gross Phoria as an average amount of angular misalignment between the first eye 1-1 and the second eye 1-2 when the first display 322-1 and the second display 322-2 display dissimilar images with one of the eyes fixated on a target at a time, as part of the determining of the binocular alignment (an illustrative misalignment-averaging sketch follows this list).
  • FIG. 25 illustrates this latter vertical embodiment in more detail, concentrating on the first eye 1-1.
  • the elements of the system to determine a binocular alignment 310 that are related to the second eye 1-2 are analogous and are not shown for clarity.
  • the first eye tracker assembly 340-1 can include one or more first infrared light emitting diodes 342-1, to project an infrared eye-tracking beam 342b-1 on the first eye 1-1; a first infrared light source 344-1 - possibly including several individual LEDs - to illuminate the first eye 1-1 with an infrared imaging light 344b-1; a first infrared camera 348-1, positioned along a longitudinal direction to detect the infrared eye-tracking beam and the infrared imaging light, both reflected from the first eye and collectively labeled 345b-1; and a first infrared-transmissive visible mirror 324’-1, to transmit the reflected infrared eye-tracking beam and the infrared imaging light 345b-1 from the first eye 1-1 to the first infrared camera 348-1 along the longitudinal direction, and to redirect images from the lateral actuation direction of the first display 322-1 to the longitudinal direction towards the first eye 1-1 (a simplified gaze-estimation sketch follows this list).
  • the second eye tracker assembly 340-2 can include (not shown for clarity) one or more second infrared light emitting diodes 342-2, to project an infrared eye-tracking beam 342b-2 on the second eye 1-2; a second infrared light source 344-2, to illuminate the second eye 1-2 with an infrared imaging light 344b-2; a second infrared camera 348-2, positioned along the longitudinal direction to detect the infrared eye-tracking beam and the infrared imaging light, both reflected from the second eye, collectively labeled 345b-2; and a second infrared-transmissive visible mirror 324’-2, to transmit the reflected infrared eye-tracking beam and the infrared imaging light 345b-2 from the second eye 1-2 to the second infrared camera 348-2 along the longitudinal direction; and to redirect images from the lateral actuation direction of the second display 322-2 to the longitudinal direction towards the second eye 1-2.
  • the beams to and from the eyes 1-1 and 1-2 propagate through the first and second lens assemblies 360-1 and 360-2.
  • the many variants and modifications of the embodiments of FIGS. 17-23 can have analogous implementations in the embodiment of FIGS. 24-25.
  • the horizontal adjustability can be implemented only for the first and second eye tracker assemblies 340-1 and 340-2, or for these assemblies together with the first and second displays 322-1 and 322-2, with or without the first and second lens assemblies 360-1 and 360-2, just as was described for the embodiments of FIGS. 17-23.
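
The bullets above describe the wavefront-analysis step performed by the first microlens array 412-1 and the first wavefront camera 414-1. The following Python sketch shows one conventional Shack-Hartmann style reduction of such data: the focal-spot displacements behind the lenslet array are converted into local wavefront slopes, and a single defocus term is fitted in diopters as a sphere-equivalent refractive estimate. The function name, the defocus-only least-squares fit, and the numerical values are illustrative assumptions, not the specific algorithm of the system 310.

```python
import numpy as np

def estimate_defocus_diopters(lenslet_xy_m, spot_shift_m, lenslet_focal_m):
    """Fit a single defocus term, in diopters, to Shack-Hartmann spot data.

    lenslet_xy_m    : (N, 2) lenslet centers in the pupil plane, in meters
    spot_shift_m    : (N, 2) focal-spot displacements on the camera, in meters
    lenslet_focal_m : focal length of the microlens array, in meters
    """
    # Local wavefront slopes (small-angle): spot shift divided by lenslet focal length.
    slopes = spot_shift_m / lenslet_focal_m

    # A purely defocused wavefront W(x, y) = (D/2)(x^2 + y^2) has slopes
    # dW/dx = D*x and dW/dy = D*y, with D in diopters when x, y are in meters.
    # Solve slopes ~ D * coordinates for D by least squares.
    A = lenslet_xy_m.reshape(-1, 1)
    b = slopes.reshape(-1)
    D, *_ = np.linalg.lstsq(A, b, rcond=None)
    return float(D[0])

# Synthetic check: 1.5 D of defocus over a 4 mm pupil, 5 mm lenslet focal length.
rng = np.random.default_rng(0)
xy = rng.uniform(-2e-3, 2e-3, size=(200, 2))
shifts = 1.5 * xy * 5e-3          # ideal spot displacements for 1.5 D of defocus
print(estimate_defocus_diopters(xy, shifts, 5e-3))   # prints approximately 1.5
```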
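
The accommodation-lag and vergence/accommodation bullets above lend themselves to a small numerical illustration. The sketch below computes the accommodative demand of a target, the accommodation lag when the measured focus distance d2 differs from the presentation distance d1, and the geometric vergence demand for a given interpupillary distance. The function names and sign conventions are assumptions made for this illustration only.

```python
import math

def accommodative_demand_D(presentation_distance_m):
    """Accommodative demand of a target at distance d1, in diopters (1/d1)."""
    return 1.0 / presentation_distance_m

def accommodation_lag_D(presentation_distance_m, focus_distance_m):
    """Accommodation lag: demand (1/d1) minus measured response (1/d2), in diopters.

    A positive value corresponds to the common case d2 > d1, i.e. the eyes
    focusing farther away than the presented target."""
    return 1.0 / presentation_distance_m - 1.0 / focus_distance_m

def vergence_demand_deg(interpupillary_distance_m, presentation_distance_m):
    """Total convergence angle, in degrees, to binocularly fixate a midline target."""
    return math.degrees(2.0 * math.atan(interpupillary_distance_m /
                                        (2.0 * presentation_distance_m)))

# Target presented at 40 cm; the auto-refractors report the eyes focused at 50 cm.
print(accommodative_demand_D(0.40))       # 2.5 D of accommodative demand
print(accommodation_lag_D(0.40, 0.50))    # 0.5 D of accommodation lag (d2 > d1)
print(vergence_demand_deg(0.062, 0.40))   # ~8.9 degrees for a 62 mm interpupillary distance
```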
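
For the Fixation Disparity and Gross Phoria bullets above, the following sketch illustrates, under assumed conventions, how the computer 350 could in principle turn tracked gaze directions into angular misalignments and average them. The vector representation, the unsigned angle, and the plain averaging are illustrative simplifications, not the patented procedure.

```python
import numpy as np

def misalignment_deg(gaze_dir, reference_dir):
    """Unsigned angle, in degrees, between a measured gaze direction and a
    reference direction (e.g. toward a displayed target), both given as 3-vectors."""
    g = np.asarray(gaze_dir, dtype=float)
    r = np.asarray(reference_dir, dtype=float)
    g /= np.linalg.norm(g)
    r /= np.linalg.norm(r)
    return float(np.degrees(np.arccos(np.clip(np.dot(g, r), -1.0, 1.0))))

def average_misalignment_deg(gaze_samples, reference_dir):
    """Average angular misalignment over a series of gaze samples, e.g. recorded
    while the displays show dissimilar images and one eye fixates a target."""
    return float(np.mean([misalignment_deg(g, reference_dir) for g in gaze_samples]))

# Example: the non-fixating eye drifts about 2 degrees away from the target direction.
target = [0.0, 0.0, 1.0]
drift = [np.sin(np.radians(2.0)), 0.0, np.cos(np.radians(2.0))]
print(average_misalignment_deg([drift] * 5, target))   # prints approximately 2.0
```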
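
The eye tracker bullets above mention an infrared eye-tracking beam, infrared illumination, and an infrared camera. One common way to convert such images into a gaze direction is the pupil-center/corneal-reflection (PCCR) approximation sketched below; it is offered only as a generic illustration under an assumed per-pixel calibration gain, not as the gaze-estimation method of the eye tracker assemblies 340-1 and 340-2.

```python
import numpy as np

def gaze_angles_deg(pupil_center_px, glint_center_px, gain_deg_per_px):
    """Simplified PCCR gaze estimate from one infrared camera frame.

    pupil_center_px : (x, y) pupil center in the infrared image, in pixels
    glint_center_px : (x, y) corneal reflection of the eye-tracking beam, in pixels
    gain_deg_per_px : calibration gain mapping the pupil-glint vector to degrees

    Returns (horizontal, vertical) gaze angles in degrees relative to the
    calibrated straight-ahead direction.
    """
    offset = np.asarray(pupil_center_px, dtype=float) - np.asarray(glint_center_px, dtype=float)
    dx, dy = gain_deg_per_px * offset
    return float(dx), float(dy)

# Example with a hypothetical calibration gain of 0.05 degrees per pixel:
print(gaze_angles_deg((322.0, 241.0), (310.0, 239.0), 0.05))   # approximately (0.6, 0.1)
```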

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Ophthalmology & Optometry (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Optics & Photonics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The present invention relates to a system for determining a binocular alignment, the system comprising a first optical unit, comprising a first display, for displaying images for a first eye, actuatable along a longitudinal direction according to a simulated distance and an optical power of the first eye, and a first eye tracker assembly, for tracking a gaze direction of the first eye, adjustable in a horizontal lateral direction to accommodate an interpupillary distance of the first eye; a second optical unit, comprising a second display, for displaying images for a second eye, actuatable along the longitudinal direction according to a simulated distance and an optical power of the second eye, and a second eye tracker assembly, for tracking a gaze direction of the second eye, adjustable in the horizontal lateral direction to accommodate an interpupillary distance of the second eye; and a computer, for determining the binocular alignment based on the gaze directions.
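
As a back-of-the-envelope illustration of the longitudinal display actuation described in this abstract, the following thin-lens sketch estimates how far behind a lens assembly of a given power a display could be placed so that its virtual image appears at the simulated distance, with the eye's spherical refractive error folded into the target vergence. The thin-lens model, the neglected lens-to-eye separation, and the sign conventions are simplifying assumptions for illustration only, not the actuation rule of the claimed system.

```python
def display_distance_m(lens_power_D, simulated_distance_m, eye_sphere_D=0.0):
    """Thin-lens estimate of the display-to-lens distance for a simulated target.

    lens_power_D         : power of the lens assembly in front of the eye, in diopters
    simulated_distance_m : distance at which the displayed image should appear, in meters
    eye_sphere_D         : spherical refractive error folded into the target vergence
                           (0 for an emmetropic eye), in diopters
    """
    # Require the vergence leaving the lens to be eye_sphere_D - 1/simulated_distance,
    # then invert the thin-lens relation V_out = lens_power_D - 1/x for the distance x.
    required_vergence_D = eye_sphere_D - 1.0 / simulated_distance_m
    return 1.0 / (lens_power_D - required_vergence_D)

# A 10 D lens assembly simulating a target at 0.5 m for an emmetropic eye:
print(display_distance_m(10.0, 0.5))         # ~0.083 m, i.e. about 8.3 cm from the lens
# The same simulated distance while compensating a -2.0 D myopic eye:
print(display_distance_m(10.0, 0.5, -2.0))   # ~0.071 m, the display moves closer to the lens
```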
EP22756883.9A 2021-02-19 2022-02-17 Système pour mesurer un alignement binoculaire au moyen de dispositifs d'affichage et de suiveurs oculaires réglables Pending EP4294251A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US17/179,402 US20210169322A1 (en) 2017-09-05 2021-02-19 System for measuring binocular alignment with adjustable displays and eye trackers
PCT/US2022/016694 WO2022178055A1 (fr) 2021-02-19 2022-02-17 Système pour mesurer un alignement binoculaire au moyen de dispositifs d'affichage et de suiveurs oculaires réglables

Publications (1)

Publication Number Publication Date
EP4294251A1 true EP4294251A1 (fr) 2023-12-27

Family

ID=82931686

Family Applications (1)

Application Number Title Priority Date Filing Date
EP22756883.9A Pending EP4294251A1 (fr) 2021-02-19 2022-02-17 Système pour mesurer un alignement binoculaire au moyen de dispositifs d'affichage et de suiveurs oculaires réglables

Country Status (7)

Country Link
EP (1) EP4294251A1 (fr)
JP (1) JP2024510104A (fr)
CN (1) CN116981391A (fr)
AU (1) AU2022223284A1 (fr)
CA (1) CA3208519A1 (fr)
MX (1) MX2023009664A (fr)
WO (1) WO2022178055A1 (fr)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090153796A1 (en) * 2005-09-02 2009-06-18 Arthur Rabner Multi-functional optometric-ophthalmic system for testing diagnosing, or treating, vision or eyes of a subject, and methodologies thereof
US8684526B2 (en) * 2010-07-02 2014-04-01 Amo Wavefront Sciences, Llc Compact binocular adaptive optics phoropter
US10420467B2 (en) * 2017-09-05 2019-09-24 eyeBrain Medical, Inc. Method and system for measuring binocular alignment

Also Published As

Publication number Publication date
JP2024510104A (ja) 2024-03-06
WO2022178055A1 (fr) 2022-08-25
CN116981391A (zh) 2023-10-31
AU2022223284A1 (en) 2023-08-17
MX2023009664A (es) 2023-08-25
CA3208519A1 (fr) 2022-08-25

Similar Documents

Publication Publication Date Title
AU2018330035B2 (en) Method and system for measuring binocular alignment
US20210169322A1 (en) System for measuring binocular alignment with adjustable displays and eye trackers
EP1444945B1 Optometry device
AU2004291042B2 Ophthalmic binocular wavefront measurement system
US8684526B2 (en) Compact binocular adaptive optics phoropter
US11903645B2 (en) Method and system for measuring binocular alignment
US20110134389A1 (en) Ophthalmic diagnostic instrument
IL298199A (en) Methods and systems for diagnosing and treating diseases
US9693679B2 (en) Miniature simultaneous vision simulator instrument
US20230414100A1 (en) Headset-based system for measuring binocular alignment
KR20140111263A Device for determining at least one visual parameter of a subject in a plurality of gaze directions
KR102474483B1 Joint determination of accommodation and vergence
JP4494075B2 Optometry apparatus
WO2022178055A1 System for measuring binocular alignment with adjustable displays and eye trackers
US20210338077A1 (en) Means and Methods of Measuring Refraction
EP4181761A1 Means and methods of measuring refraction
WO2023233411A1 Eye examination device and eye examination method
Joubert The excess of automatic refraction over subjective refraction: Dependence on age

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20230912

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)