WO2024103084A1 - Process and system for determining an eye position data set

Process and system for determining an eye position data set

Info

Publication number
WO2024103084A1
Authority
WO
WIPO (PCT)
Prior art keywords
eye
data
person
light source
image data
Prior art date
Application number
PCT/AT2022/060398
Other languages
French (fr)
Inventor
Eren CERMAN
Original Assignee
Cerman Eren
Priority date
Filing date
Publication date
Application filed by Cerman Eren
Priority to PCT/AT2022/060398
Publication of WO2024103084A1

Links

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/113: Objective types for determining or recording eye movement
    • A61B 3/0008: Apparatus for testing the eyes provided with illuminating means
    • A61B 3/0016: Operational features thereof
    • A61B 3/0025: Operational features thereof characterised by electronic signal processing, e.g. eye models
    • A61B 3/0041: Operational features thereof characterised by display arrangements

Abstract

Process and device for determining an eye position data set (1) of a person (2), in particular an ocular motility disturbance data set like a Hess diagram, wherein an optical receiving device (3) simultaneously captures image data (13, 15, 17) of each eye (4, 5) and of the face (6) of the person (2), and wherein a computational device (11) generates the eye position data set (1) by combining the first eye gaze direction data (12), the corresponding second eye gaze direction data (14) and the corresponding head direction data (16) for different eye directions.

Description

Process and system for determining an eye position data set
The invention relates to a process and a system according to the independent claims.
Eye position data sets, for example so-called Hess diagrams or Hess screen diagrams, are used to analyze a person's eye movements and can be used to detect different kinds of strabismus.
Strabismus is a disorder in which both eyes do not line up in the same direction. Therefore, they do not look at the same object at the same time.
Tests to determine an individual eye position data set, like the Hess test or Hess screen test, are useful for quantifying limited eye movements by measuring ocular deviations in all gaze directions of the eye and for identifying paretic muscles; they are frequently used in clinical ophthalmology.
The original Hess test uses a screen with a square-meter tangent chart. The test relies on color dissociation using red/green glasses. A red target is illuminated or projected at each junction where the tangent lines cross. The patient projects a green light onto the perceived location of the target, and each plot is recorded. Then the test is repeated for the opposite eye, resulting in a chart showing an inner and outer range of ocular rotation for each eye.
In digital Hess tests, the subject still has to fixate on a Hess diagram screen. In some versions of the digital Hess test, the subject must press a button indicating at which location they see the shown target with the other eye.
On other digital eye movement analyzing devices, at least one camera is attached to the head of the subject, or the head is fixed with a head-chin rest apparatus, so that the relative position of head to camera is fixed.
Conventional processes and devices for determining an individual eye position data set of a person therefore are complicated and require assistance and cooperation of the person. This might not be possible, for example, when the person is a child or a person lacking cognitive or physical skills.
It is now an object of the invention to facilitate the determination of an individual eye position data set. This might comprise a simple device, which is easy to use and easy to manufacture. This might comprise a simple test procedure for the person.
The object of the invention is solved by the features defined in the independent claims.
The invention relates to a process for determining an eye position data set of a person, in particular an ocular motility disturbance data set like a Hess diagram or Hess screen diagram, in particular without requiring the active cooperation of the person.
Preferably, an optical receiving device, like a digital camera, simultaneously captures image data of each eye and of the face of the person.
Preferably, the image data of each eye and of the face of the person is captured for different eye gaze directions and possibly for different head directions. Preferably, while capturing the image data, a dedicated light source is directed at the person’s eyes to produce a glint on the person’s eye surface through reflection of the light source.
Preferably, in all embodiments, the light source can be an infrared light source. Preferably, in all embodiments, the optical receiving device can receive or detect infrared light, in particular the glint, produced by an infrared light source.
Preferably, the image data is being processed by a computational device.
Preferably, the computational device determines a first eye gaze direction data for several eye gaze directions of the first eye of the person by processing the image data of the first eye.
Preferably, the computational device determines a second eye gaze direction data for several eye gaze directions of the second eye of the person by processing the image data of the second eye.
Preferably, the computational device determines a head direction data for several head directions by processing the image data of the face.
Preferably, the computational device generates the eye position data set by combining the first eye gaze direction data, the corresponding second eye gaze direction data and the corresponding head direction data for different eye directions.
These steps are preferably executed simultaneously.
Optionally, a graphical representation of the eye position data set is shown on a display device.
Optionally, the eye gaze direction data of an eye in a specific eye gaze direction is determined by processing image data, in particular by processing a captured image pattern, of the respective eye in the computational device. Optionally, the eye gaze direction data of the eye is determined by processing the captured image pattern:
- of the contour of the corneal ring, in particular of the contour of the limbus,
- and of the glint of the light source reflection on the cornea of the respective eye.
Optionally, the eye gaze direction data is determined by using the centre of the corneal ring, in particular the centre of the limbus, and the relative location of the glint with regard to this centre.
Optionally, the centre is defined by the geometric centre of a rectangle, whose sides run tangential to the contour of the corneal ring or the limbus.
The two sides of the rectangle preferably run horizontally and two sides preferably run vertically.
Optionally, there is only one single glint on each eye produced by the light source.
Optionally, the light source is a light source ring arranged around the optical receiving device.
Optionally, the head direction data is determined by processing a captured image pattern of the face, like a face landmark pattern.
Optionally, when generating the eye position data set, the first and the second eye gaze direction data are transformed to a head based coordinate system, by compensating the first and the second eye gaze direction data with the head direction data.
Optionally, the first and the second eye gaze direction data are defined as points on a virtual surface extending perpendicular to the eye gaze directions in front of the person.
The invention can relate to a system for determining an eye position data set of a person, in particular an ocular motility disturbance data set like a Hess diagram or Hess screen diagram. The system might be used to execute the process described in the claims and/or the description.
Preferably, the system comprises an optical receiving device, like a digital camera, that simultaneously captures image data of each eye and of the face of the person, wherein the image data of each eye and of the face of the person is captured for different eye gaze directions and possibly for different face directions.
Preferably, the system comprises a dedicated light source, which is directed at the person’s eyes to produce a glint on the person’s eye surface through reflection of the light source, while capturing the image data.
Preferably, the system comprises a computational device, which processes the image data.
Preferably, the computational device determines a first eye gaze direction data for several eye gaze directions of the first eye of the person by processing the image data of the first eye.
Preferably, the computational device determines a second eye gaze direction data for several eye gaze directions of the second eye of the person by processing the image data of the second eye.
Preferably, the computational device determines a head direction data for several head directions by processing the image data of the face.
Preferably, the computational device generates the eye position data set by combining the first eye gaze direction data, the corresponding second eye gaze direction data and the corresponding head direction data for different eye directions.
The system might comprise a display device for showing a graphical representation of the eye position data set.
Optionally, the light source is configured to produce only one single glint on each eye. Optionally, the optical receiving device is a part of a hand-held device which can be manually pointed towards the person.
Optionally, the light source is also a part of the same hand-held device.
Optionally, the optical receiving device is connected to the computational device via a data connection such as a data cable or a wireless data connection.
Optionally, the display device is connected to the computational device via a data connection such as a data cable or a wireless data connection.
Optionally, the system comprises only one single optical receiving device, in particular one single camera.
Preferably, the process and the device can automatically detect heterophoria and heterotropia of the person, without the need of active involvement and participation of the person, by using a visible light blocker and infrared pass filter as a cover.
In particular, the present invention might relate to a computerized device and to a method for processing real-time data coming from a live camera that gives a real-time eye movement diagram of a subject.
The invention is now described further by means of exemplary embodiments.
Fig. 1 shows a schematic drawing of an eye and a part of the system.
Fig. 2 shows a schematic drawing of an eye with a pupil and a corneal ring.
Fig. 3 shows a schematic drawing of patterns of an eye with a corneal ring.
Fig. 4 shows a schematic drawing of a part of the system and an eye.
Fig. 5 shows another schematic drawing of parts of the system and an eye.
Fig. 6 shows another schematic drawing with the head of the person and a possible visual representation of the data set.
Fig. 7 shows a schematic drawing of the system and a head of a person.
Fig. 8 shows the schematic flow chart of an embodiment of the process. Unless otherwise indicated, the numerals correspond to the following components or features: eye position data set 1, person 2, optical receiving device 3, first eye 4, second eye 5, face 6, eye gaze direction 7, head direction 8, light source 9, glint 10, computational device 11, first eye gaze direction data 12, image data (of the first eye) 13, second eye gaze direction data 14, image data (of the second eye) 15, head direction data 16, image data (of the face) 17, display device 18, corneal ring 19, limbus 20, centre (of the corneal ring or the limbus) 21, centre (of the rectangle) 22, rectangle 23, sides (of the rectangle) 24, hand-held device 25, data connection 26, optical axis 27, iris 28, pupil 29, middle point 30, fovea 31, target 32, visual axis 33, raw image data 34, processing unit 35, corneal ring pattern 36, glint pattern 37, angle 38, glint distance 39, distance Fx 40, field of view 41, distance (camera to eye) 42, diameter (of the limbus) 43.
As shown in Fig. 1, the optical axis 27 is an imaginary line that defines the path along which light propagates through an optical system of a person 2, where the axis passes through the center of curvature of each surface, and coincides with the axis of rotational symmetry.
The front part of the eye 4, 5 is composed of the transparent cornea, which has the limbus 20 as the outer border and the iris 28, which has an aperture in the middle that is called pupil 29. The optical axis 27 of the eye 4, 5 is therefore the imaginary line which propagates along the middle of the pupil 29, and the imaginary middle point 30 of the posterior part of the retina.
As the main functional point of the central vision, the fovea 31 is located temporally or outside relative to the imaginary middle point 30 of the retina. The axis, where the eye 4, 5 is looking at the actual target 32, is different than the optical axis 27 and is called the visual axis 33.
As the visual axis 33 is directed to the target 32 that the eye 4, 5 is looking at, if the eye 4, 5 is looking at an optical receiving device 3 like a camera, from the viewpoint of the optical receiving device 3 the eye appears turned rotationally outward. This angle 38 between the visual axis 33 and the optical axis 27 is called the angle kappa (κ). With the current process and system, a dedicated light source 9 is used during gaze tracking, in particular a ring light around the optical receiving device 3. Thereby, it is possible to create a reflection on the cornea, called glint 10.
If the optical axis 27 were directed towards an optical receiving device 3 with a light source 9, preferably a ring light around it, and if the eye 4, 5 and pupil 29 were perfectly symmetrical, the glint 10 would lie at the middle point 30 of the pupil 29.
As the fovea 31 is not placed in the symmetrical middle of the retina but laterally, the optical axis 27, the imaginary line between the middle point 30 of the posterior of the eye 4, 5 and the middle of the pupil 29, is turned outward relative to the visual axis 33.
If the eye 4, 5 is looking at the camera with a ring light on it, the visual axis 33 targets the camera and the light rays create the glint point medially relative to the pupil 29 middle point 30.
To determine the gaze angle of an eye 4, 5, a computational device 11 analyzes the image data 13, 15 of this eye 4, 5.
According to a known method, the pupil 29 is used to determine the gaze angle.
According to this method, θ denotes the gaze angle (the direction of the visual axis 33) and o the optical angle (the direction of the optical axis 27).
κ = angle kappa = the angle between the visual axis 33 and the optical axis 27. For any given gaze: θ = o - κ. During calculation of the gaze angle with Hirschberg and other similar methods, absolute units (such as mm) are used. Hirschberg function: θ = … These previously described methods and publications claim it to be a linear function. These methods need a calibration for accurate results.
According to the invention, the following method might be used. With this method, the corneal ring 19 and in particular the limbus 20 is used to determine the optical axis 27 and the gaze angle. Even in situations with great pupil 29 deformity, as shown in fig. 2, the optical axis 27 goes through the centre 21 of the corneal ring 19 or the limbus 20, as the symmetry of the eyeball does not change and the location of the fovea 31 is fixed.
Therefore, using the limbus centre 21 instead of the middle of the pupil 29 can result in better estimations. Even in normal subjects, it delivers better results, as the pupil 29 is a dynamic shape and it does change during dim or bright light conditions, which may create pupil 29 decentralization.
Even the variation of anatomy of the pupil 29 can influence the calculation of the middle point 30 of the pupil 29 and therefore the gaze angle.
In some methods, a best-fit ellipse estimation has been used to find the middle point 30 of the pupil 29 or the corneal ring 19. To determine the best-fit ellipse, artificial intelligence can be used, or other regression- or shape-based methods can give an estimation of the gaze angle.
However, artificial intelligence models for such shape-based methods must be trained with a large number of pictures, and to be accurate these methods require fast computing performance.
According to fig. 3, another method might be used. The eye gaze direction data 12, 14 is determined by using the centre 21 of the corneal ring 19, in particular the centre of the limbus 20, and the relative location of the glint 10 with regard to this centre.
The centre 21 of the limbus 20 is defined by the geometric centre 22 of a rectangle 23, whose sides 24 Rx 24’ and Ry 24” run tangential to the contour of the corneal ring 19 or the limbus 20, wherein two sides 24 of the rectangle 23 preferably run horizontally and two sides 24 preferably run vertically.
The main function used in the method preferably includes the middle of the horizontal side Rx 24’ and the middle of the vertical side Ry 24” axes of the cornea, the horizontal glint distance Gx 39’ and the vertical glint distance Gy 39”, in particular the distance of the glint 10 from the centre 21 of the cornea:
θ = f(Rx, Ry, Gx, Gy). The distances and also the radius are preferably measured in pixels.
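As an illustration of this main function's inputs, the following Python sketch derives Rx, Ry, Gx and Gy from an already segmented limbus contour and glint position; the segmentation itself, the function name and the array layout are assumptions, not part of the patent.

```python
import numpy as np

def gaze_features(limbus_contour: np.ndarray, glint: np.ndarray):
    """Hypothetical helper: derive Rx, Ry, Gx, Gy (in pixels) from a
    segmented limbus contour (N, 2) and a glint position (2,), using the
    tangential-rectangle construction described above."""
    # Axis-aligned rectangle tangential to the limbus contour:
    # two sides run horizontally, two run vertically.
    x_min, y_min = limbus_contour.min(axis=0)
    x_max, y_max = limbus_contour.max(axis=0)

    # Rx, Ry: side lengths of the rectangle (horizontal and vertical).
    rx = x_max - x_min
    ry = y_max - y_min

    # Centre 21 of the limbus = geometric centre 22 of the rectangle 23.
    centre = np.array([(x_min + x_max) / 2.0, (y_min + y_max) / 2.0])

    # Gx, Gy: glint location relative to that centre.
    gx, gy = np.asarray(glint, dtype=float) - centre
    return rx, ry, gx, gy
```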
As illustrated in fig. 4, the calculation of the distance from the eye 4, 5 to the optical receiving device 3, in particular the camera, is preferably done automatically. There is no need for multiple cameras or multiple light sources 9.
The method can calculate the distance d as follows.
The camera creates an image with a dimension Fx 40, preferably in pixels, which is related to the field of view 41.
For a known field of view 41, the diameter 43 of the limbus is related to the distance 42. As the diameter 43 in mm is essentially a fixed anatomical value, it might also give a scale for pixel-to-mm conversion.
As:
R = f(Rx,Ry) and d = f(Fx, R)
Using this method eliminates the need to fix the head or the camera; there is no need to use two or more cameras or light sources 9 to determine the distance 42. As the function of d is already included in the main function, it gives accurate results while the head, the eyes 4, 5 or the camera move freely. This might be applicable for all embodiments.
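A minimal pinhole-camera sketch of this distance relation d = f(Fx, R) follows; the 11.7 mm limbus diameter default, the pinhole model and all names are illustrative assumptions, since the patent leaves the concrete function open.

```python
import math

def eye_camera_distance(limbus_px: float, image_width_px: int,
                        horizontal_fov_deg: float,
                        limbus_mm: float = 11.7) -> float:
    """Estimate the camera-to-eye distance 42 from the pixel diameter 43
    of the limbus, the image width Fx 40 and the field of view 41."""
    # Focal length in pixels from the image width and field of view.
    f_px = (image_width_px / 2.0) / math.tan(math.radians(horizontal_fov_deg) / 2.0)
    # Similar triangles: distance = focal_px * physical_size / pixel_size.
    return f_px * limbus_mm / limbus_px

# The same ratio also yields a pixel-to-mm scale at the eye's depth:
# mm_per_px = limbus_mm / limbus_px.
```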
As the anatomy of the eye 4, 5 does not change during the measurement, the fovea 31 lateralization and the kappa angle 38 remain the same during the test.
Therefore, it might be adequate to measure the angle 38 kappa only once for a given subject. Using the function described above: when the subject is looking at the camera, the visual axis 33 is directed to the camera, so the gaze angle is θ = 0. Since
θ = o - κ, it follows that κ = o, i.e. the measured optical-axis angle equals the angle kappa.
Therefore, preferably any gaze angle detected while the subject is looking at the camera is the angle 38 kappa.
As it is usually not possible for a person 2 to use the smooth pursuit function of the eye 4, 5 without looking at the target 32, the device can determine the angle 38 kappa while the person 2 tracks a moving camera, as illustrated in fig. 5.
If θ < 7° and it remains θ < 7° despite the movement of the camera relative to the subject, then the subject is using smooth pursuit to track the camera; during this movement the measured gaze angle must be kappa, for anatomical and physiological reasons, and the angle 38 kappa can be determined. Therefore, the calibration process is preferably done automatically.
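The following sketch shows one way this automatic kappa calibration could be expressed; the 7° threshold comes from the text, while the (N, 2) angle layout, the averaging and the function name are assumptions.

```python
import numpy as np

def estimate_kappa(gaze_angles_deg, pursuit_threshold_deg: float = 7.0):
    """Average the measured gaze over frames in which the subject keeps
    tracking the moving camera (gaze magnitude stays below the pursuit
    threshold); there the visual axis targets the camera, so the measured
    angle is kappa (theta = o - kappa = 0 implies kappa = o).

    gaze_angles_deg: (N, 2) per-frame horizontal/vertical gaze angles.
    """
    gaze = np.asarray(gaze_angles_deg, dtype=float)
    tracking = np.linalg.norm(gaze, axis=1) < pursuit_threshold_deg
    if not tracking.any():
        return None  # no smooth-pursuit episode detected yet
    return gaze[tracking].mean(axis=0)
```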
The process and the device can be used as a replacement for a traditional Hess test and can be used to create an eye position data set 1 as a Hess diagram, as illustrated in fig. 6. The advantages are that there is no need for cooperation of the person 2, no need for looking at known targets 32 and no need for red-green glasses.
As the method described here can measure the point where the first eye 4 is directed and, at that moment, where the second eye 5 is directed, it can calculate the difference of the gaze angles, i.e. the strabismus, for a given moment and for a given direction.
One difference, possibly the only one, is that it measures the tropia for the given direction.
When the light source 9 is infrared and the person 2 has a visual spectrum blocker in front of one eye 4, 5, then the same conditions as in the Hess test apply. For a given direction of gaze of the right eye, the person 2 would have the same Xf, Xc, Yc, Yf and Y.
If Xf < Xt, then Xc = Xf + Xdiff, and therefore Yc would also be increased; as the infrared light and camera can see through the visible light blocker, the method can determine the Y for X for every direction.
With a traditional Hess test, the patient's input eye (the right eye in this example) has a decrease in muscle force (Xf), which is followed by an increased command (Xc) from the brain, as only with an increased command would the eye turn to the target. An increased command on the right eye increases the left eye command (Yc), according to Hering's law. An increased command (Yc) causes an increased force on the output eye (the left eye in this example) (Yf), which causes an increased movement that results in a gaze direction of the output eye (Y) different from that of the input eye (X). The traditional Hess test diagram is a diagram of all Y for all X for both eyes, while binocular vision is blocked (phoria) with red-green glasses in front of the eyes. The proposed method determines a Y for all given X for both eyes simultaneously. If no visible light blocker is used, the proposed method draws a diagram of tropia. If a visible light blocker is used as a cover on an eye, then the proposed method draws all Y of that covered (output) eye, as the infrared light of the proposed device can still see the output eye through the visible light blocker and determine its directions.
The proposed method of this application can therefore draw a diagram of phoria, exactly like the traditional Hess test.
Therefore, this method can determine the same results as a Hess test, without the need for the person 2 to perceive a target 32 and indicate it while wearing red-green glasses.
As this method also does not require the lengthy procedure of a doctor showing targets 32 and drawing Hess test results manually, it is much easier to perform.
As the method might not draw Hess test results for every given angle 38 separately, as there are infinitely many directions, it uses a separate function of weighted results for certain directions, for example the 25 directions of the standard Hess test. The weighted function calculates, for every such direction, a statistically weighted estimation of the phoria in which look directions nearer to the given point on the Hess chart receive more weight.
As a result, θ = f(Rx, Ry, Gx, Gy) and d = f(Fx, R).
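One plausible form of the weighting step described above is sketched below: deviations are estimated at the fixed chart directions from free-gaze samples, with samples nearer a chart point weighted more strongly. The Gaussian kernel, its sigma and all names are assumptions; the text only requires higher weight nearer the given point.

```python
import numpy as np

def weighted_hess_points(samples_xy, samples_dev, chart_points, sigma_deg=5.0):
    """For each chart direction (e.g. the 25 points of a standard Hess
    chart), compute a weighted estimate of the measured deviation.

    samples_xy:   (N, 2) free-gaze directions of the input eye, degrees.
    samples_dev:  (N, 2) corresponding deviations of the output eye.
    chart_points: (M, 2) fixed Hess chart directions to evaluate.
    """
    xy = np.asarray(samples_xy, dtype=float)
    dev = np.asarray(samples_dev, dtype=float)
    estimates = []
    for p in np.asarray(chart_points, dtype=float):
        d2 = ((xy - p) ** 2).sum(axis=1)          # squared distance to chart point
        w = np.exp(-d2 / (2.0 * sigma_deg ** 2))  # nearer samples weigh more
        if w.sum() < 1e-9:
            estimates.append(np.full(2, np.nan))  # no samples near this point
        else:
            estimates.append((w[:, None] * dev).sum(axis=0) / w.sum())
    return np.array(estimates)
```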
The horizontal axis of the eye gaze direction can be determined, for example, by the following formula:
Target_eye_x = a0 + a1 · Corneal_Diameter_x^2.5 + a2 · GlareDistX · cos^2(GlareDistY) + a3 · GlareDistX · Corneal_Diameter_y^2 + a4 · sin^2(GlareDistX) · sin^2(GlareDistY) · sin(Corneal_Diameter_x) + a5 · sin(GlareDistX) + a6 · GlareDistY^3
The vertical axis of the eye gaze direction can be determined, for example, by the following formula:
Target_eye_y = a0 + a1 · Corneal_Diameter_y^2.5 + a2 · GlareDistY · cos^2(GlareDistX) + a3 · GlareDistY · Corneal_Diameter_x^2 + a4 · sin^2(GlareDistY) · sin^2(GlareDistX) · sin(Corneal_Diameter_x) + a5 · sin(GlareDistY) + a6 · GlareDistX^3
The values a0 to a6 are system-specific values or constants, which can be determined in a simple calibration sequence. Corneal_Diameter_x corresponds to Rx, GlareDistX corresponds to Gx, Corneal_Diameter_y corresponds to Ry, and GlareDistY corresponds to Gy.
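Transcribed directly, the two example formulas read as follows in Python; whether the trigonometric terms take radians or normalized pixel values is not stated in the text, so plain radians are assumed here.

```python
import math

def target_eye(rx, ry, gx, gy, ax, ay):
    """Evaluate the example gaze formulas. ax and ay each hold the seven
    calibration constants (a0..a6) for the horizontal and vertical axis;
    rx, ry, gx, gy correspond to Corneal_Diameter_x/_y and GlareDistX/Y."""
    x = (ax[0] + ax[1] * rx ** 2.5
         + ax[2] * gx * math.cos(gy) ** 2
         + ax[3] * gx * ry ** 2
         + ax[4] * math.sin(gx) ** 2 * math.sin(gy) ** 2 * math.sin(rx)
         + ax[5] * math.sin(gx)
         + ax[6] * gy ** 3)
    y = (ay[0] + ay[1] * ry ** 2.5
         + ay[2] * gy * math.cos(gx) ** 2
         + ay[3] * gy * rx ** 2
         + ay[4] * math.sin(gy) ** 2 * math.sin(gx) ** 2 * math.sin(rx)
         + ay[5] * math.sin(gy)
         + ay[6] * gx ** 3)
    return x, y
```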
Even in situations with great pupil 29 deformity, the optical axis 27 goes through the point L, as the symmetry of the eyeball does not change and the location of the fovea 31 is fixed.
Therefore, using the limbus 20 middle point 30 (point L) instead of the pupil middle point P results in better estimations. Even in normal subjects it gives better results, as the pupil 29 is a dynamic shape and changes during dim or bright light conditions, which may create pupil 29 decentralization.
It is also important to note that even the variation of anatomy of the pupil 29 can affect the calculation of the middle point 30 of the pupil 29 and therefore the gaze angle.
Fig. 7 shows a schematic drawing of the system and the face 6 of a person 2 for determining an eye position data set 1 . The eye position data set 1 can be displayed on a display device 18. The display device 18 is connected to a computational device 11. To determine the eye position data set 1 , the system comprises an optical receiving device 3, which preferably is a digital camera. The optical receiving device 3 preferably is part of a hand-held device 25. This hand-held device 25 can be operated by a person who conducts the process, in particular a specialist for conducting such eye tests, like a doctor.
Preferably, in all embodiments, the hand-held device 25 can be freely moved by the person who is conducting the test, in particular the doctor. The hand-held device 25 can be connected to the computational device 11 via a data connection 26. The data connection 26 can be a cable connection or a wireless connection.
To perform the test, the specialist directs the optical receiving device 3 to the face 6 of the person 2, in particular towards the face 6, the first eye 4 and the second eye 5. The optical receiving device 3 records image data 13, 15 and transmits this data 13, 15 to the computational device 11. The computational device 11 analyses the image data 13, 15 and calculates or determines an eye position data set 1 . In this embodiment, the eye position data set 1 , in particular the optical representation of this eye position data set 1 is a Hess diagram.
In this embodiment, preferably in all embodiments, the optical receiving device 3 can be freely moved. Preferably in all embodiments, also the person 2 and the head of the person 2 can be freely moved. In particular, the person 2 can freely look in different directions using the mobility of their head and their eyes 4, 5, without the need of following specific instructions. When moving the head and the eyes 4, 5, the face 6 is observed by the optical receiving device 3, generating image data of different face and eye directions.
Preferably, the system further comprises a light source 9, which is a dedicated light source 9 of the system and preferably not the normal light source of the environment.
Preferably, the light source 9 is located symmetrically to the optical receiving device 3. For example, the light source 9 can be a ring light source, namely a light source which has the form of a ring. This ring can be located surrounding the optical receiving device 3, as schematically shown in Fig. 8. This light source 9 produces a glint 10 on the eyes 4, 5 of the person 2, as described before.
The computational device 11 determines different eye gaze directions 7 from the image data 13, 15. Apart from that, the computational device 11 determines a head direction 8 from the image data 13, 15.
To determine the eye gaze directions 7 of both eyes 4, 5, the optical receiving device 3 receives image data 13, 15 of both eyes 4, 5 and image data 17 of the face 6, to calculate the head direction 8. This head direction 8 is used to compensate the eye gaze direction 7 of the eyes 4, 5. Through this compensation, the eye gaze direction 7 can be transformed to a head-based coordinate system.
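A compact sketch of this compensation step follows, representing the head direction 8 as a rotation matrix; that representation is an assumption, since the patent does not fix how the head direction data 16 is parameterized.

```python
import numpy as np

def to_head_coordinates(gaze_dir_cam, head_rotation_cam):
    """Express a camera-frame gaze direction 7 in a head-based coordinate
    system by removing the head direction 8.

    gaze_dir_cam:      (3,) unit gaze vector in camera coordinates.
    head_rotation_cam: (3, 3) head rotation relative to the camera,
                       e.g. estimated from facial landmarks.
    """
    R = np.asarray(head_rotation_cam, dtype=float)
    # The inverse (= transpose) of the head rotation maps camera-frame
    # vectors into the head frame, cancelling head movement.
    return R.T @ np.asarray(gaze_dir_cam, dtype=float)
```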
Preferably, the system comprises only one single optical receiving device 3. Preferably, the system comprises only one single light source 9, which might be arranged symmetrically to the optical receiving device 3, in particular the light source 9 can be a ring surrounding the optical receiving device 3.
Fig. 8 shows a flow chart of a possible embodiment of the process. The face 6, the first eye 4 and the second eye 5 of the person 2 are being monitored by an optical receiving device 3 to generate raw image data 34.
This raw image data 34 is being transmitted to the computational device 11 , particularly via a data connection 26.
A first processing unit 35 processes the data flow. Image data 13 of the first eye 4 is processed by a processing unit 35 to analyze the corneal ring pattern 36 and the glint pattern 37. These two patterns are input data for another processing unit 35, which determines the first eye gaze direction data 12. The same processing is performed for the second eye 5. In particular, the image data 15 of the second eye 5 is processed by a processing unit 35, and the extracted data, in particular the corneal ring pattern 36 and the glint pattern 37, are used to calculate the second eye gaze direction data 14. The image data 17 of the face 6 of the person 2 is processed by a processing unit 35 to calculate the head direction data 16.
In particular, the head direction data 16 describes the orientation of the face 6 of the person 2. The determined data, in particular the first eye gaze direction data 12, the second eye gaze direction data 14 and the head direction data 16, are combined to calculate the eye position data set 1. This eye position data set 1, in particular a graphical representation of it, can be displayed on a display device 18. This display device 18 can be connected to the computational device 11 by a data connection 26.
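Putting the stages of Fig. 8 together, one frame of this flow might look like the sketch below; the decomposition into callables and every name are placeholders, since the patent leaves the internal detectors of the processing units 35 open.

```python
from dataclasses import dataclass
from typing import Callable, Tuple

Vec2 = Tuple[float, float]

@dataclass
class ProcessingUnits:
    """Stand-ins for the processing units 35 of Fig. 8."""
    eye_patterns: Callable[[object], Tuple[Vec2, Vec2]]  # corneal ring pattern 36 centre, glint pattern 37
    gaze_from_patterns: Callable[[Vec2, Vec2], Vec2]     # -> eye gaze direction data 12 / 14
    head_direction: Callable[[object], Vec2]             # -> head direction data 16

def process_frame(first_eye_img, second_eye_img, face_img,
                  units: ProcessingUnits):
    """One pass: raw image data 34 in, one head-compensated sample of the
    eye position data set 1 out."""
    first_gaze = units.gaze_from_patterns(*units.eye_patterns(first_eye_img))
    second_gaze = units.gaze_from_patterns(*units.eye_patterns(second_eye_img))
    head = units.head_direction(face_img)

    def compensate(g: Vec2) -> Vec2:
        # Transform to the head-based coordinate system.
        return (g[0] - head[0], g[1] - head[1])

    return compensate(first_gaze), compensate(second_gaze)
```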
Preferably, in all embodiments, the data can be processed in real-time. In particular, the eye position data set 1 of the person 2 can be determined in real-time, when the optical receiving device 3 is actively directed at the person 2. Although it is not necessary that the person 2 follows special instructions, this real-time analysis allows the person conducting the test to perform special movements and special tests in real-time.

Claims

1. Process for determining an eye position data set (1 ) of a person (2), in particular an ocular motility disturbance data set like a Hess diagram,
- wherein an optical receiving device (3), like a digital camera, simultaneously captures image data (13, 15, 17) of each eye (4,5) and of the face (6) of the person (2),
- wherein the image data (13, 15, 17) of each eye (4, 5) and of the face (6) of the person (2) is captured for different eye gaze directions (7) and possibly for different head directions (8),
- wherein, while capturing the image data (13, 15, 17), a dedicated light source (9) is directed at the person’s (2) eyes (4, 5) to produce a glint (10) on the person’s (2) eye surface through reflection of the light source (9),
- wherein the image data (13, 15, 17) is being processed by a computational device (11 ),
- wherein the computational device (11 ) determines a first eye gaze direction data (12) for several eye gaze directions (7) of the first eye (4) of the person (2) by processing the image data (13) of the first eye (4),
- wherein the computational device (11 ) determines a second eye gaze direction data (14) for several eye gaze directions (7) of the second eye (5) of the person (2) by processing the image data (15) of the second eye (5),
- wherein the computational device (11 ) determines a head direction data (16) for several head directions (8) by processing the image data (17) of the face (6),
- wherein the computational device (11 ) generates the eye position data set (1 ) by combining the first eye gaze direction data (12), the corresponding second eye gaze direction data (14) and the corresponding head direction data (16) for different eye directions.
2. The process according to claim 1 , wherein a graphical representation of the eye position data set (1 ) is shown on a display device (18).
3. The process according to one of claims 1 or 2, wherein the eye gaze direction data (12, 14) of an eye (4, 5) in a specific eye gaze direction (7) is determined by processing image data (13, 15, 17), in particular by processing a captured image pattern, of the respective eye in the computational device (11 ). The process according to one of claim 1 to 3, wherein the eye gaze direction data (12, 14) of the eye (4, 5) is determined by processing the captured image pattern:
- of the contour of the corneal ring (19), in particular of the contour of the limbus
(20),
- and of the glint (10) of the light source (9) reflection on the cornea of the respective eye (4, 5). The process according to claim 4, wherein the eye gaze direction data (12, 14) is determined by using the centre (21 ) of the corneal ring (19), in particular the centre
(21 ) of the limbus (20), and the relative location of the glint (10) with regard to this centre. The process according to claim 5,
- wherein the centre (22) is defined by the geometric centre of a rectangle (23), whose sides (24) run tangential to the contour of the corneal ring (19) or the limbus (20),
- wherein two sides (24) of the rectangle (23) preferably run horizontally and two sides (24) preferably run vertically. The process according to claims 1 to 6,
- wherein there is only one single glint (10) on each eye (4, 5) produced by the light source (9),
- and/or wherein, the light source (9) is a light source ring arranged around the optical receiving device (3),
- and/or wherein the light source is an infrared light source. The process according to one of claims 1 to 7, wherein the head direction data (16) is determined by processing a captured image pattern of the face (6), like a face landmark pattern. The process according to one of claims 1 to 8, wherein, when generating the eye position data set (1 ), the first and the second eye gaze direction data (12, 14) are transformed to a head based coordinate system, by compensating the first and the second eye gaze direction data (12, 14) with the head direction data (16). The process according to one of claims 1 to 9, wherein the first and the second eye gaze direction data (12, 14) are defined as points on a virtual surface extending perpendicular to the eye gaze directions (7) in front of the person (2). System for determining an eye position data set (1 ) of a person (2), in particular an ocular motility disturbance data set like a Hess diagram, comprising:
- an optical receiving device (3), like a digital camera, that simultaneously captures image data (13, 15, 17) of each eye (4, 5) and of the face (6) of the person (2), wherein the image data (13, 15, 17) of each eye (4, 5) and of the face (6) of the person (2) is captured for different eye gaze directions (7) and eventually for different face directions,
- a dedicated light source (9), which is directed at the person’s (2) eyes (4, 5) to produce a glint (10) on the person’s (2) eye surface through reflection of the light source (9), while capturing the image data (13, 15),
- a computational device (11 ), which processes the image data (13, 15, 17),
- wherein the computational device (11 ) determines a first eye gaze direction data (12) for several eye gaze directions (7) of the first eye (4) of the person (2) by processing the image data (13) of the first eye (4),
- wherein the computational device (11 ) determines a second eye gaze direction data (14) for several eye gaze directions (7) of the second eye (5) of the person (2) by processing the image data (15) of the second eye (5),
- wherein the computational device (11 ) determines a head direction data (16) for several head directions (8) by processing the image data (17) of the face (6),
- wherein the computational device (11 ) generates the eye position data set (1 ) by combining the first eye gaze direction data (12), the corresponding second eye gaze direction data (14) and the corresponding head direction data (16) for different eye directions. The system according to claim 11 , comprising a display device (18) for showing a graphical representation of the eye position data set (1 ). The system according to claims 11 or 12,
- wherein the light source (9) is configured to produce only one single glint (10) on each eye (4, 5), - and/or wherein the light source (9) is a light source ring arranged around the optical receiving device (3)
- and/or wherein the light source is an infrared light source. The system according to one of claims 11 to 13,
- wherein the optical receiving device (3) is a part of hand-held device (25) which can be manually pointed towards the person (2),
- wherein preferably the light source (9) is also a part of the same hand-held device (25). The system according to one of claims 11 to 14,
- wherein the optical receiving device (3) is connected to the computational device (11 ) via a data connection (26) such as a data cable or a wireless data connection (26). The system according to one of claims 11 to 15, wherein the display device (18) is connected to the computational device (11 ) via a data connection (26) such as a data cable or a wireless data connection (26). The system according to one of claims 11 to 16, wherein it comprises only one single optical receiving device (3), in particular one single camera.
PCT/AT2022/060398 2022-11-15 2022-11-15 Process and system for determining an eye position data set WO2024103084A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/AT2022/060398 WO2024103084A1 (en) 2022-11-15 2022-11-15 Process and system for determining an eye position data set


Publications (1)

Publication Number Publication Date
WO2024103084A1 true WO2024103084A1 (en) 2024-05-23

Family

ID=84359699

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AT2022/060398 WO2024103084A1 (en) 2022-11-15 2022-11-15 Process and system for determining an eye position data set

Country Status (1)

Country Link
WO (1) WO2024103084A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6595641B1 (en) * 1998-06-23 2003-07-22 Plusoptix Ag Device for examining ocular motility
US20140044321A1 (en) * 2012-08-10 2014-02-13 EyeVerify LLC Spoof Detection for Biometric Authentication
US9775512B1 (en) * 2014-03-19 2017-10-03 Christopher W. Tyler Binocular eye tracking from video frame sequences
WO2018000020A1 (en) * 2016-06-29 2018-01-04 Seeing Machines Limited Systems and methods for performing eye gaze tracking



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22805757

Country of ref document: EP

Kind code of ref document: A1