WO2020184775A1 - Device and method for measuring a strabismus angle - Google Patents

Device and method for measuring a strabismus angle

Info

Publication number
WO2020184775A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
eyeball
target
eye
gaze
Prior art date
Application number
PCT/KR2019/004103
Other languages
English (en)
Korean (ko)
Inventor
허환
박규해
이난미아오
전준영
Original Assignee
전남대학교산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 전남대학교산학협력단
Publication of WO2020184775A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement

Definitions

  • Embodiments of the present invention relate to a deviation angle measurement technique.
  • Strabismus is a disease in which the two eyes cannot both point straight at an object, and its exact cause has not yet been identified. Strabismus is common in childhood, appearing in about 4 out of 100 people, and is divided into esotropia, in which the eye turns inward; exotropia, in which the eye turns outward; superior strabismus, in which the eye turns upward; and inferior strabismus, in which the eye turns downward.
  • If strabismus is not treated early, it can lead to amblyopia, in which the strabismic eye is rarely used, so early diagnosis and correction of strabismus are very important.
  • In strabismus surgery, since accurate measurement of the deviation angle determines the success or failure of the operation, it is essential to derive an objective and accurate deviation angle by analyzing the movement of the eyeball.
  • Embodiments of the present invention provide an apparatus and method for measuring a deviation angle using a 3D stereoscopic image displaying a virtual target.
  • An apparatus for measuring a deviation angle includes: an image display unit that displays a 3D target image including a target; an eye photographing unit that photographs an eyeball of a test subject gazing at the 3D target image; a gaze tracking unit that determines the gaze position of the first eyeball in a reference image, taken of the test subject's first eye while blocking the second image displayed for the test subject's second eye among the 3D target images, and in each of the comparison images, taken of the first eyeball while blocking the first image displayed for the first eye; an image control unit that determines whether the gaze positions of the first eyeball in the comparison image and the reference image coincide and, if not, moves the position of the target until the gaze position of the first eyeball in the comparison image coincides with that in the reference image; and a deviation angle determination unit that determines a deviation angle based on an initial position and a final position of the target when the gaze position of the first eyeball in the comparison image coincides with that in the reference image.
  • the gaze position of the first eyeball may be a pupil center position of the first eyeball.
  • the gaze tracker may determine an image of the first eyeball photographed while the first image is blocked as the comparison image.
  • the gaze tracker may determine the comparison image based on a difference between each image of the first eyeball and the reference image in a state in which the first image is blocked.
  • the image controller may move the position of the target based on fuzzy logic.
  • the deviation angle determination unit may calculate a movement distance of the target based on the initial position and the final position, and determine the deviation angle using the movement distance.
  • a method for measuring a deviation angle includes the steps of: (a) displaying a 3D target image including a target; (b) acquiring a reference image by photographing the first eye of the test subject while blocking the second image displayed for the second eye of the test subject among the 3D target images; (c) determining a gaze position of the first eyeball in the reference image; (d) releasing the blocking of the second image and acquiring a comparison image of the first eyeball while blocking the first image displayed for the first eye among the 3D target images; (e) determining a gaze position of the first eyeball in the comparison image; (f) determining whether the gaze positions of the first eyeball in the comparison image and the reference image coincide; (g) moving the position of the target when the gaze positions of the first eyeball in the comparison image and the reference image do not coincide; and (h) determining a deviation angle based on an initial position and a final position of the target when the gaze positions of the first eyeball in the comparison image and the reference image coincide.
  • the gaze position of the first eyeball may be a pupil center position of the first eyeball.
  • an image of the first eyeball being photographed while the first image is blocked may be determined as the comparison image.
  • the comparison image may be determined based on a difference between each image of the first eyeball and the reference image in a state in which the first image is blocked.
  • the position of the target may be moved based on a fuzzy logic.
  • the step (h) may include calculating a moving distance of the target based on the initial position and the final position, and determining the deviation angle using the moving distance.
  • by photographing the test subject's eyeball to determine the gaze position while alternately blocking the left-eye image and the right-eye image of a three-dimensional target image displaying a virtual target, and determining the deviation angle from the change in the eyeball's gaze position, the deviation angle can be measured accurately and quickly.
  • FIG. 1 is a configuration diagram of a deviation angle measuring apparatus according to an embodiment of the present invention
  • FIG. 2 is an exemplary view showing a form of an image display unit according to an embodiment of the present invention
  • FIG. 3 is a diagram for explaining a comparison image determination process according to an embodiment of the present invention
  • FIGS. 4 to 7 are exemplary diagrams for explaining a process of moving a target according to a gaze position
  • FIG. 8 is an exemplary diagram for explaining determination of a deviation angle according to an embodiment of the present invention
  • FIG. 9 is a flowchart of a method for measuring a deviation angle according to an embodiment of the present invention
  • FIG. 10 is a block diagram illustrating and describing a computing environment including a computing device suitable for use in example embodiments.
  • FIG. 1 is a block diagram of an apparatus for measuring a deviation angle according to an embodiment of the present invention.
  • a deviation angle measuring apparatus 100 includes an image display unit 110, an eye photographing unit 120, a gaze tracking unit 130, an image control unit 140, and a deviation angle determination unit 150.
  • the image display unit 110 displays a 3D target image including a target.
  • the 3D target image may be a 3D stereoscopic image configured to be displayed in 3D using a left-eye image displayed for the left eye of the test subject and a right-eye image displayed for the right eye of the test subject.
  • the image display unit 110 may be implemented in the form of a Head Mounted Display (HMD) 210 configured to display a 3D stereoscopic image using an image for the left eye and an image for the right eye, as shown in FIG. 2.
  • the image display unit 110 may display a three-dimensional image through two display screens: a display screen 220 that displays the left-eye image to the test subject's left eye and a display screen 230 that displays the right-eye image to the test subject's right eye.
  • the shape of the image display unit 110 is not necessarily limited to the example illustrated in FIG. 2, and may be implemented in various forms capable of displaying a 3D stereoscopic image in addition to the example illustrated in FIG. 2.
  • the eye photographing unit 120 photographs the eyeball of the test subject gazing at the 3D target image.
  • the eye photographing unit 120 may include one or more cameras for continuously photographing the left eye of the test subject and one or more cameras for continuously photographing the right eye of the test subject.
  • each camera may be, for example, an infrared camera, but is not necessarily limited to a specific type of camera.
  • each camera may be disposed in front of the left eye or the right eye to generate an eye photographing image for each eye.
  • the camera for photographing the right eye may be disposed within the HMD 210 between the display screen 230 displaying the right-eye image and the test subject's right eye, and the camera for photographing the left eye may be disposed within the HMD 210 between the display screen 220 displaying the left-eye image and the test subject's left eye.
  • the position of the camera is not necessarily limited to a specific position, and may be disposed at various positions according to exemplary embodiments in consideration of the shape of the image display unit 110.
  • the gaze tracking unit 130 determines the gaze position in the eyeball images continuously photographed for each eye while one of the left-eye image and the right-eye image of the 3D target image displayed by the image display unit 110 is alternately blocked.
  • blocking of one of the left-eye image and the right-eye image may be performed by, for example, displaying a black image instead of an image to be blocked.
  • blocking of the left-eye image may be performed by displaying a black image instead of the left-eye image.
  • blocking of one of the left-eye image and the right-eye image may be performed through a separate shielding film that opens and closes between a display screen displaying an image to be blocked and an eye of a subject.
  • blocking of the left eye image may be performed through a shielding film that blocks the gaze of the left eye of the examinee between the display screen displaying the left eye image and the left eye of the examinee.
  • the blocking method for the left-eye image and the right-eye image may be performed in various ways according to embodiments in addition to the above-described examples, and is not necessarily limited to a specific method.
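As a concrete sketch of the black-image blocking option described above: assuming each per-eye frame is an 8-bit RGB numpy array sent to one of the HMD's two display screens, blocking one eye's image amounts to substituting an all-black frame. The function name and frame size below are illustrative, not from the patent.

```python
import numpy as np

def block_eye_image(frame: np.ndarray) -> np.ndarray:
    """Block one eye's image by replacing it with an all-black frame
    of the same shape (one of the blocking options in the text)."""
    return np.zeros_like(frame)

# Illustrative use: block the left-eye image while the right-eye
# image continues to show the target.
left_eye_frame = np.full((480, 640, 3), 128, dtype=np.uint8)
blocked_left = block_eye_image(left_eye_frame)

assert blocked_left.shape == left_eye_frame.shape
assert int(blocked_left.max()) == 0  # the left eye now sees only black
```

The shielding-film alternative in the text is a hardware mechanism and has no software counterpart here.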
  • the gaze tracking unit 130 may determine a reference image from among the images continuously photographed of the first eyeball while blocking only the image displayed for the second eye of the test subject among the left-eye image and the right-eye image constituting the 3D target image, and may determine the gaze position of the first eye in the determined reference image.
  • the gaze tracker 130 determines one or more of the images continuously photographed of the first eyeball while blocking only the image displayed for the first eye among the left-eye image and the right-eye image as a comparison image, In each of the determined comparison images, a gaze position of the first eye may be determined.
  • for example, the gaze tracking unit 130 may determine, as the reference image, the image in which the first eyeball appears largest among the images continuously photographed of the first eyeball while blocking only the image displayed for the second eye among the left-eye image and the right-eye image.
  • the gaze tracking unit 130 may determine, as comparison images, one or more of the images continuously photographed of the first eyeball while blocking only the image displayed for the first eye among the left-eye image and the right-eye image.
  • for example, the gaze tracking unit 130 may determine a comparison image based on a difference between the reference image and each of the images continuously photographed of the first eyeball while only the image displayed for the first eye is blocked.
  • FIG. 3 is a diagram illustrating a process of determining a comparison image according to an embodiment of the present invention.
  • the gaze tracking unit 130 may generate difference images 340 and 350 between the reference image 310 and the images 320 and 330 of each frame continuously photographed of the first eye while only the image displayed for the first eye is blocked.
  • the gaze tracking unit 130 may calculate the sum (or average) of the intensity values of all pixels in each of the generated difference images 340 and 350, and may determine the frame whose calculated value is less than a preset threshold as a comparison image.
  • for example, since the sum of the intensity values of all pixels calculated in the difference image 340 between the reference image 310 and the image 320, which was taken while the first eyeball was completely closed, exceeds the threshold value, the image 320 may be discarded as noise, whereas the image 330 may be determined as a comparison image.
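The filtering rule above (discard frames whose difference from the reference is too large, such as a blink, and keep the rest as comparison images) can be sketched with numpy. The threshold, image sizes, and pixel values below are illustrative assumptions, not values from the patent.

```python
import numpy as np

def select_comparison_frames(reference, frames, threshold):
    """Keep frames whose summed absolute intensity difference from the
    reference image is below the threshold; frames with a larger
    difference (e.g. the eye fully closed mid-blink) are dropped as noise."""
    kept = []
    for frame in frames:
        diff = np.abs(frame.astype(np.int32) - reference.astype(np.int32))
        if diff.sum() < threshold:
            kept.append(frame)
    return kept

# Synthetic example: a reference eye image, a slightly deviated frame,
# and a "blink" frame that differs strongly from the reference.
reference = np.full((8, 8), 100, dtype=np.uint8)
deviated = np.full((8, 8), 105, dtype=np.uint8)  # small difference: kept
blink = np.zeros((8, 8), dtype=np.uint8)         # large difference: dropped

kept = select_comparison_frames(reference, [deviated, blink], threshold=1000)
assert len(kept) == 1 and (kept[0] == deviated).all()
```

Casting to `int32` before subtracting avoids the wrap-around that unsigned 8-bit subtraction would produce.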
  • the gaze tracking unit 130 may determine the center of the pupil as the gaze position in each of the reference image and the comparison image.
  • for example, the gaze tracking unit 130 may estimate an iris candidate region from each of the reference image and the comparison image using a fast radial symmetry transform technique. Thereafter, the gaze tracking unit 130 may extract the outline of each image using an edge detection technique such as a Canny edge detector, and then generate a plurality of triangles connecting the points of the outline using, for example, a Delaunay triangulation technique. Thereafter, the gaze tracking unit 130 may detect a portion having an ellipse shape among the lines constituting the generated triangles and estimate the detected portion as the iris position. When the iris position is estimated, the gaze tracking unit 130 may determine the pupil region by applying an ellipse fitting technique based on the estimated iris position, and may determine the center of the determined pupil region as the gaze position.
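The full pipeline above (fast radial symmetry transform, Canny edges, Delaunay triangulation, ellipse fitting) is involved; as a deliberately simplified stand-in for its final step, taking the center of the detected pupil region as the gaze position, the centroid of the darkest pixels can serve. This is an illustrative approximation, not the patent's method, and the darkness threshold is an assumption.

```python
import numpy as np

def pupil_center(gray: np.ndarray, dark_threshold: int = 50):
    """Approximate the gaze position as the centroid (row, col) of
    dark pupil pixels; a simplified stand-in for the FRST + Canny +
    Delaunay + ellipse-fitting pipeline described in the text."""
    ys, xs = np.where(gray < dark_threshold)
    if len(ys) == 0:
        return None  # no pupil-like region found
    return float(ys.mean()), float(xs.mean())

# Synthetic eye image: bright sclera with a dark square "pupil".
eye = np.full((100, 100), 200, dtype=np.uint8)
eye[40:60, 30:50] = 10  # dark block whose centroid is (49.5, 39.5)

assert pupil_center(eye) == (49.5, 39.5)
```

On real infrared eye images, the ellipse-fitting approach in the text is far more robust to glints and eyelid occlusion than a raw centroid.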
  • the image controller 140 determines whether the gaze positions of the first eyeball in the reference image and the comparison image for the first eye coincide and, if they do not, moves the position of the target until the gaze position of the first eyeball in the comparison image coincides with the gaze position of the first eyeball in the reference image.
  • FIGS. 4 to 7 are exemplary diagrams for explaining a process of moving a target according to a gaze position.
  • as shown in FIG. 4, the gaze tracking unit 130 blocks the right-eye image 420 in a state in which the target 410 is located at an initial position (e.g., directly in front of the test subject) in the 3D target image, and determines the gaze position 441 of the left eye from the acquired reference image 440 for the left eye.
  • as shown in FIG. 5, the gaze tracking unit 130 then blocks the left-eye image 430 while the target 410 remains at the initial position, and determines the gaze position of the left eye in the obtained comparison image 450 for the left eye.
  • for a test subject with normal vision, the gaze position of the left eye does not change even if the left-eye image 430 is blocked; for a strabismus patient, however, when the left-eye image 430 is blocked, the gaze position of the left eye changes as shown in FIG. 5, resulting in an eyeball deviation.
  • the image controller 140 moves the position of the target 410 as in the example shown in FIG. 6.
  • the gaze tracking unit 130 determines the gaze position 461 of the left eye from the comparison image 460 for the left eye obtained after the target 410 is moved.
  • if the gaze position 461 still does not coincide with the gaze position 441 of the reference image 440, the image controller 140 additionally moves the position of the target 410.
  • as shown in FIG. 7, the image controller 140 moves the position of the target 410 until the gaze position 471 of the left eye coincides with the gaze position 441 of the reference image 440.
  • the image control unit 140 may repeatedly move the position of the target based on fuzzy logic until the gaze position of the first eyeball in the comparison image coincides with the gaze position of the first eyeball in the reference image.
  • for example, the image control unit 140 may first move the target based on preset eyeball deviation information (for example, an eyeball deviation angle based on the average eyeball diameter of an adult or infant), and then repeatedly adjust the position of the target up, down, left, and right, reducing the moving distance each time, until the difference between the gaze positions in the reference image and the comparison image (for example, the distance between the pupil centers) converges to 0.
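The patent uses fuzzy logic for the target movement; the simpler "move, then shrink the step until the gaze difference converges" behavior described above can be sketched as follows. The linear gaze model, step schedule, and function names are invented for illustration and are not the patent's controller.

```python
def converge_target(initial_target, reference_gaze, gaze_of, step=10.0, tol=1e-3):
    """Repeatedly move the target, halving the step each time, until the
    deviated eye's gaze position matches the reference gaze position.
    `gaze_of` maps a target position to a measured gaze position."""
    target = initial_target
    while step > tol:
        error = gaze_of(target) - reference_gaze
        if abs(error) < tol:
            break
        target -= step if error > 0 else -step  # move against the error
        step *= 0.5                              # shrink the moving distance
    return target

# Illustrative strabismic eye: gaze = 0.5 * target + 4 (not from the patent).
gaze_model = lambda t: 0.5 * t + 4.0
final = converge_target(0.0, reference_gaze=0.0, gaze_of=gaze_model)
assert abs(gaze_model(final)) < 0.01  # gaze now matches the reference
```

Halving the step each iteration is a bisection-like stand-in for the fuzzy-logic controller's gradually finer adjustments.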
  • the deviation angle determination unit 150 determines the deviation angle based on the initial position and the final position of the target when the gaze positions of the first eyeball in the comparison image and the reference image for the first eye coincide.
  • as described above, when an eyeball deviation occurs, the image controller 140 repeatedly moves the position of the target until the gaze positions in the reference image and the comparison image coincide.
  • here, the initial position of the target refers to the position of the target before the position movement starts, and the final position refers to the position of the target when the gaze positions in the reference image and the comparison image coincide after the position movement starts.
  • the deviation angle determination unit 150 may calculate the moving distance of the target based on the initial position and the final position of the target, and determine the deviation angle based on the calculated moving distance.
  • FIG. 8 is an exemplary diagram for explaining determination of a deviation angle according to an embodiment of the present invention.
  • in FIG. 8, d 1 is the lateral (left-right) movement distance between the initial position and the final position of the target
  • d 2 is the distance between the center point between the two eyes and the initial position of the target
  • PD is the distance between the two eyes (interpupillary distance)
  • the deviation angle may be calculated using, for example, Equation 1 below.
  • d 2 and PD may use preset values.
  • for example, based on a commonly performed strabismus diagnosis manual, d 2 may be set to 6 m or 0.33 m, and PD may be set to the average adult value.
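Equation 1 itself is not reproduced in this text. Given the quantities defined above, a plausible reconstruction (an assumption, not the patent's verbatim formula) treats the deviation angle as the angle subtended by the target's lateral movement d 1 at the viewing distance d 2, and optionally expresses it in prism diopters as is conventional in strabismus measurement.

```python
import math

def deviation_angle_deg(d1: float, d2: float) -> float:
    """Deviation angle in degrees: theta = arctan(d1 / d2). A plausible
    reconstruction of Equation 1, which is not reproduced in the text."""
    return math.degrees(math.atan2(d1, d2))

def deviation_prism_diopters(d1: float, d2: float) -> float:
    """The same deviation in prism diopters (100 * d1 / d2), the unit
    conventionally used for strabismus angles."""
    return 100.0 * d1 / d2

# Example at the 0.33 m near-fixation distance.
assert abs(deviation_angle_deg(0.33, 0.33) - 45.0) < 1e-9
assert abs(deviation_prism_diopters(0.033, 0.33) - 10.0) < 1e-9
```

The patent's actual Equation 1 may additionally involve PD; without the equation's text, only this geometry-based sketch can be offered.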
  • FIG. 8 explains an example of the horizontal deviation angle for left-right strabismus, but the vertical deviation angle for superior and inferior strabismus may be determined in a similar manner.
  • FIG. 9 is a flowchart of a method for measuring a deviation angle according to an embodiment of the present invention.
  • the method shown in FIG. 9 may be performed, for example, by the deviation angle measuring apparatus 100 shown in FIG. 1.
  • the apparatus 100 for measuring a deviation angle first displays a 3D target image including a target (910).
  • the deviation angle measuring apparatus 100 acquires a reference image by photographing the first eye of the test subject while blocking the second image displayed for the second eye of the test subject among the 3D target images (920).
  • the deviation angle measuring apparatus 100 determines the gaze position of the first eyeball in the reference image (930).
  • the deviation angle measuring apparatus 100 releases the blocking of the second image and acquires a comparison image by photographing the first eyeball in a state in which the first image displayed for the first eye of the test subject among the 3D target images is blocked (940).
  • the deviation angle measuring apparatus 100 determines the gaze position of the first eyeball in the obtained comparison image (950).
  • the deviation angle measuring apparatus 100 determines whether the gaze positions of the first eyeball in the comparison image and the reference image coincide (960).
  • if they do not coincide, the deviation angle measuring apparatus 100 moves the position of the target (970) and repeats steps 940 to 970 until the gaze positions coincide.
  • if they coincide, the deviation angle measuring apparatus 100 determines the deviation angle based on the initial position and the final position of the target (980).
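The flow of steps 910 to 980 can be condensed into a driver loop. Everything below is simulation scaffolding (the SimRig class, its gaze model, and the proportional step are invented for illustration); it shows only the control flow of FIG. 9, not the patent's implementation.

```python
class SimRig:
    """Minimal stand-in for display + camera + gaze tracker: one
    strabismic eye whose gaze under cover deviates from the target."""
    def __init__(self):
        self.target_position = 0.0
        self.blocked = None
    def block(self, which):          # 920 / 940: block one eye's image
        self.blocked = which
    def move_target(self, delta):    # 970: move the target
        self.target_position += delta
    def gaze(self):                  # 930 / 950: measured gaze position
        if self.blocked == "second":
            return 0.0               # fixing eye: reference gaze
        return 0.5 * self.target_position + 4.0  # deviated eye (invented model)

def measure_d1(rig, max_iters=200, tol=1e-3):
    """FIG. 9 as a loop: take the reference gaze with the second image
    blocked (920-930), then block the first image and move the target
    until the comparison gaze matches (940-970); return the target's
    net movement d1, the input to the angle calculation (980)."""
    initial = rig.target_position
    rig.block("second")
    reference = rig.gaze()
    rig.block("first")
    for _ in range(max_iters):
        error = rig.gaze() - reference
        if abs(error) < tol:
            break
        rig.move_target(-0.5 * error)  # simple proportional step
    return rig.target_position - initial

d1 = measure_d1(SimRig())
assert abs(d1 + 8.0) < 0.01  # the invented gaze model crosses zero at t = -8
```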
  • although the method is described as a series of steps, at least some of the steps may be performed in a different order, combined with other steps, performed together, omitted, divided into sub-steps, or supplemented with one or more steps not shown.
  • each component may have functions and capabilities other than those described below, and additional components may be included beyond those described below.
  • the illustrated computing environment 10 includes a computing device 12.
  • the computing device 12 may be one or more components included in the deviation angle measuring apparatus 100.
  • the computing device 12 includes at least one processor 14, a computer-readable storage medium 16 and a communication bus 18.
  • the processor 14 may cause the computing device 12 to operate according to the exemplary embodiments mentioned above.
  • the processor 14 may execute one or more programs stored in the computer-readable storage medium 16.
  • the one or more programs may include one or more computer-executable instructions, and the computer-executable instructions may be configured to cause the computing device 12 to perform operations according to an exemplary embodiment when executed by the processor 14.
  • the computer-readable storage medium 16 is configured to store computer-executable instructions or program code, program data, and/or other suitable form of information.
  • the program 20 stored in the computer-readable storage medium 16 includes a set of instructions executable by the processor 14.
  • the computer-readable storage medium 16 may be a memory (volatile memory such as random-access memory, nonvolatile memory, or a suitable combination thereof), one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, other types of storage media that can be accessed by the computing device 12 and can store desired information, or a suitable combination thereof.
  • the communication bus 18 interconnects the various other components of the computing device 12, including the processor 14 and computer-readable storage medium 16.
  • Computing device 12 may also include one or more input/output interfaces 22 and one or more network communication interfaces 26 that provide interfaces for one or more input/output devices 24.
  • the input/output interface 22 and the network communication interface 26 are connected to the communication bus 18.
  • the input/output device 24 may be connected to other components of the computing device 12 through the input/output interface 22.
  • exemplary input/output devices 24 include input devices such as a pointing device (a mouse, trackpad, etc.), a keyboard, a touch input device (a touch pad, touch screen, etc.), a voice or sound input device, and various types of sensor devices and/or photographing devices, and/or output devices such as display devices, printers, speakers, and/or network cards.
  • the exemplary input/output device 24 may be included in the computing device 12 as a component constituting the computing device 12, or may be connected to the computing device 12 as a separate device distinct from the computing device 12.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Biomedical Technology (AREA)
  • Human Computer Interaction (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The present invention relates to a device for measuring a strabismus angle, comprising: an image display unit for displaying a three-dimensional target image including a target; an eyeball imaging unit for photographing the eyeballs of a subject gazing at the three-dimensional target image; a gaze tracking unit for photographing the subject's first eyeball while blocking a second image displayed for the subject's second eyeball in the three-dimensional target image, thereby obtaining a reference image, photographing the first eyeball while blocking a first image displayed for the first eyeball in the three-dimensional target image, thereby obtaining a comparison image, and determining the gaze position of the first eyeball in each of the reference image and the comparison image; an image control unit for determining whether or not the gaze position of the first eyeball in the comparison image is identical to that in the reference image and, if they are not identical, moving the position of the target until the gaze position of the first eyeball in the comparison image is identical to that in the reference image; and a strabismus angle determination unit for determining a strabismus angle on the basis of the initial position of the target and its final position when the gaze position of the first eyeball in the comparison image is identical to that in the reference image.
PCT/KR2019/004103 2019-03-11 2019-04-05 Device and method for measuring a strabismus angle WO2020184775A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020190027754A KR102224209B1 (ko) 2019-03-11 2019-03-11 Apparatus and method for measuring a strabismus angle
KR10-2019-0027754 2019-03-11

Publications (1)

Publication Number Publication Date
WO2020184775A1 true WO2020184775A1 (fr) 2020-09-17

Family

ID=72427641

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2019/004103 WO2020184775A1 (fr) 2019-03-11 2019-04-05 Device and method for measuring a strabismus angle

Country Status (2)

Country Link
KR (1) KR102224209B1 (fr)
WO (1) WO2020184775A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102683166B1 (ko) 2022-01-28 2024-07-09 (주)뉴로니어 Instrument set for strabismus measurement

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013035086A1 (fr) * 2011-09-07 2013-03-14 Improved Vision Systems (I.V.S.) Ltd. Method and system for treating visual impairment
WO2016139662A1 (fr) * 2015-03-01 2016-09-09 Improved Vision Systems (I.V.S.) Ltd. System and method for measuring ocular motility
KR101761635B1 (ko) * 2015-04-27 2017-08-04 한양대학교 산학협력단 Method and apparatus for measuring a strabismus angle
KR101825830B1 (ko) * 2016-08-26 2018-02-05 한양대학교 산학협력단 System and method for measuring a strabismus angle, and computer-readable storage medium
KR20180083069A (ko) * 2017-01-12 2018-07-20 고려대학교 산학협력단 Ophthalmic examination system and method using virtual reality

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100612586B1 (ko) 1998-07-21 2007-12-04 엘지전자 주식회사 Strabismus test apparatus for a head-mounted display


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
P(E-POSTER)-144, Retrieved from the Internet <URL:http://www.ophthalmology.org/abstract/2018_119/green/view.html?exp=%ED%8F%AC%EC%8A%A4%ED%84%BO%28e-poster%29no=1190287&en=> *

Also Published As

Publication number Publication date
KR102224209B1 (ko) 2021-03-05
KR20200108721A (ko) 2020-09-21

Similar Documents

Publication Publication Date Title
  • EP3651457B1 (fr) Method for measuring pupillary distance, wearable eye equipment, and storage medium
  • CN109690553A (zh) System and method for performing eye gaze tracking
  • KR102099223B1 (ko) Strabismus diagnosis system and method, gaze image acquisition system, and computer program
  • CN103927250B (zh) Method for detecting the posture of a user of a terminal device
  • WO2016114496A1 (fr) Method for providing a user interface through a head-mounted display using eye recognition and a biosignal, apparatus using same, and computer-readable recording medium
  • KR101255219B1 (ko) Gaze tracking method and apparatus applying the same
  • JP2014530730A (ja) System and method for identifying eye conditions
  • JP7168953B2 (ja) Gaze measurement device, gaze measurement method, and gaze measurement program with automatic calibration
  • TWI570638B (zh) Gaze analysis method and device
  • WO2019190076A1 (fr) Eye tracking method and terminal for performing same
  • CN105765608A (zh) Method and apparatus for eye detection from glints
  • CN113662506A (zh) Method, apparatus, medium, and electronic device for measuring corneal surface topography
  • WO2020184775A1 (fr) Device and method for measuring a strabismus angle
  • CN115590462A (zh) Camera-based vision testing method and apparatus
  • JP2014061085A (ja) Method for detecting an ellipse approximating a pupil portion
  • WO2012002601A1 (fr) Method and apparatus for recognizing a person using 3D image information
  • WO2020230908A1 (fr) Strabismus diagnosis application and strabismus diagnosis apparatus including the same
  • KR101526557B1 (ko) Multi-camera-based gaze tracking apparatus and method
  • JP3726122B2 (ja) Gaze detection system
  • KR20140014868A (ko) Gaze tracking apparatus and gaze tracking method thereof
  • JP2006285531A (ja) Gaze direction detection apparatus, gaze direction detection method, and program for causing a computer to execute the gaze direction detection method
  • WO2014011014A1 (fr) Ophthalmic apparatus and method for measuring a treatment site of the apparatus
  • JP4354067B2 (ja) Iris image input device
  • KR100686517B1 (ko) Pupil shape modeling method
  • Sun et al. An auxiliary gaze point estimation method based on facial normal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19919218

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19919218

Country of ref document: EP

Kind code of ref document: A1