WO2021125992A1 - Device for determining the direction of gaze - Google Patents

Device for determining the direction of gaze

Info

Publication number
WO2021125992A1
Authority
WO
WIPO (PCT)
Prior art keywords
eye
gaze
camera
coordinate system
coordinates
Prior art date
Application number
PCT/RU2019/000950
Other languages
English (en)
Russian (ru)
Inventor
Андрей Владимирович НОВИКОВ
Владимир Николаевич ГЕРАСИМОВ
Роман Александрович ГОРБАЧЕВ
Никита Евгеньевич ШВИНДТ
Владимир Иванович НОВИКОВ
Андрей Евгеньевич ЕФРЕМЕНКО
Дмитрий Леонидович ШИШКОВ
Михаил Нилович ЗАРИПОВ
Филипп Александрович КОЗИН
Алексей Михайлович СТАРОСТЕНКО
Original Assignee
федеральное государственное автономное образовательное учреждение высшего образования "Московский физико-технический институт (национальный исследовательский университет)"
Общество С Ограниченной Ответственностью "Нейроассистивные Технологии"
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by федеральное государственное автономное образовательное учреждение высшего образования "Московский физико-технический институт (национальный исследовательский университет)", Общество С Ограниченной Ответственностью "Нейроассистивные Технологии" filed Critical федеральное государственное автономное образовательное учреждение высшего образования "Московский физико-технический институт (национальный исследовательский университет)"
Priority to PCT/RU2019/000950 priority Critical patent/WO2021125992A1/fr
Publication of WO2021125992A1 publication Critical patent/WO2021125992A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • the present invention relates to devices and methods for determining the direction of gaze and can be used in various fields of technology, including robotics.
  • RF patent RU2696042 describes an invention related to technologies for determining the areas of gaze fixation during eye movement; it can be used for objective assessment of visual attention processes, control of computer interfaces by gaze direction, in operator activity, marketing, etc.
  • The device for determining the areas of gaze fixation during eye movement includes an infrared emitter covering the area of the respondent's eye; a reflective filter that transmits visible light and reflects infrared radiation, installed at the respondent's eye level; a stimulating-image presentation unit located behind the reflective filter; a unit for recording the infrared radiation reflected from the pupil in the form of an image of the pupil; a video camera; and an image-processing unit for the video camera images.
  • a disadvantage of this technical solution is that gaze fixation during eye movement is carried out with the respondent's head fixed relative to the presented image.
  • moreover, the accuracy of determining the direction of gaze is not high enough, which is especially important when the direction to a specific object located among many other objects must be determined.
  • US patent US10234940 describes a gaze tracking device, as well as a gaze tracking method comprising the following steps: recording video images of a human eye such that the pupil and the glints on the eyeball caused by the light source are captured; processing the video images to calculate an offset between the position of a predetermined spatial feature and a predetermined position relative to the glint; and emitting light toward the human eye, by a light source such as a display, from a light pattern at a location selected from a plurality of preconfigured light-pattern locations.
  • the location is controlled by a feedback signal: the location of the light pattern among the predetermined locations is adjusted in response to the displacement, so that the predetermined position relative to the glint caused by the light source tracks the spatial feature of the human eye.
  • the gaze tracker is configured to filter the video images to identify one or more glints that may arise from the light pattern, wherein the predetermined position relative to the glint is computed relative to the identified one or more glints.
  • this device improves the accuracy of recognizing the gaze position and forming the resulting image, but places increased demands on the hardware.
  • US patent US9830512 describes a method and apparatus for eye tracking.
  • the method comprises the following steps: determining the position of the central point of the cornea using at least two points of light reflection detected in the region of the eyeball of the first image of the user's face; calculating the first vector with respect to at least two fixed feature points detected from the first face image and the position of the center point of the cornea; calculating the position of the center point of the cornea relative to the eyeball region of the second face image using the position of the feature point detected from the second face image and the first vector when at least two light reflection points are not detected from the eyeball area of the second face image of the user; calculating a second vector using the calculated position of the center point of the cornea and the position of the center point of the pupil detected from the eyeball region of the second face image; and tracking the user's gaze using the second vector.
  • This known method and system for eye tracking can improve the accuracy of recognition of the position of the gaze and the formation of the resulting image, but are quite complex and demanding on computational resources.
  • as a prototype, a gaze tracking system based on adaptive homography matching according to US patent US9684827 can be selected.
  • the system contains at least four light sources for generating corneal reflections in the form of glints from the subject's eye; a camera for capturing the current image containing the glints; and a gaze detector for receiving the current glint-containing image and evaluating the gaze of the subject's eye.
  • the system also includes a head-misalignment corrector that corrects misalignment by comparing features corresponding to the glints with data related to the subject's pupil to obtain corrected gaze information, the corrector using one or more variables representing head positions relative to the calibration position.
  • this known gaze tracking system based on adaptive homography matching also improves the accuracy of gaze-position recognition and the formation of the resulting image, but it is quite complex, demanding on computational resources and, in addition, requires additional actions from the user.
  • the technical result of the claimed invention is improved accuracy in determining the direction of gaze, reduced weight of the wearable device, and the possibility of using high-performance, high-resolution cameras to further improve the accuracy of determining the direction of gaze.
  • the problem is solved, and the technical result is achieved, in the claimed device for determining the direction of the user's gaze, which contains a housing, a camera of the left eye and a camera of the right eye, a scene camera, a plurality of light sources of the left eye and of the right eye, a computing module, a control module and a power supply unit.
  • the body is essentially made in the form of glasses and has a left rim and a right rim, a nosepiece and two side pressure bars.
  • the left-eye camera and the right-eye camera are installed in the lower parts of the respective rims, below the corresponding eye, and are designed to acquire images of the left and right eye; i.e., unlike analogous devices, the claimed device obtains images of both eyes rather than one.
  • the scene camera is mounted on the nosepiece and is designed to obtain an image of the surrounding scene. A plurality of light sources are installed on the left and right rims to create glints on the eyes.
  • the computational module is configured to determine the gaze vector in the coordinate system of the cameras of the left and right eyes (hereinafter also referred to as local coordinates for brevity); converting the gaze vector in local coordinates to the gaze vector in the scene camera coordinate system (hereinafter also called global coordinates for brevity); performing calibration procedures and transmitting information about the coordinates of the gaze direction vector in the scene camera coordinate system to an external device, for example, a robotic arm, which can be controlled by the claimed gaze direction determination device.
  • the control module provides control of the declared device and interaction between the components of the device.
  • the power supply supplies power to the components of the device.
  • the claimed design of the device for determining the direction of gaze is simple and light enough for constant use, and at the same time provides high accuracy in determining the direction of the user's gaze in global coordinates (the coordinate system of the scene camera), which is in effect the coordinate system in which the observed object is located in space.
  • the device employs six left-eye light sources and six right-eye light sources.
  • the sources for each eye are installed at substantially equal distances from each other along the perimeter of the corresponding rim, which ensures uniform illumination of the eye and, therefore, increased accuracy of determining the direction of gaze and the possibility of determining it at large angles of rotation of the eyeball.
  • the computing module is configured to determine the local coordinates of the glints and of the pupil center. Additionally or alternatively, for converting the local coordinates of the gaze vector into global coordinates, the computing module can be configured to match the local coordinates of the gaze vector with the image of the surrounding scene obtained by the scene camera.
  • the control module can control the brightness of the left-eye and right-eye light sources to provide optimal brightness depending on ambient conditions and, hence, the required accuracy of determining the gaze vector.
  • the device may additionally contain an information storage unit, in particular, for storing the results of calibration procedures for a specific user, which are necessary, in particular, for converting the local coordinates of the gaze vector into global coordinates of the gaze vector.
  • FIG. 1 shows a general view of the claimed device
  • in fig. 2 schematically shows a front view of the claimed device
  • in fig. 3 is a schematic rear view of the claimed device
  • Fig. 4 schematically shows the relative position of the left or right eye, the camera of the left or right eye and one light source of the left or right eye, as well as a diagram of the path of the rays.
  • P is the center of the pupil
  • R is the point of refraction
  • Qi is the point of reflection of the i-th light source 8
  • V - the image of the center of the pupil
  • the device 1 for determining the direction of the user's gaze comprises a body 2, which is essentially made in the form of glasses having a left rim 3.1 and a right rim 3.2, a nosepiece 4 and two lateral pressure bars 5.1, 5.2.
  • the body 2 is preferably made of a durable and lightweight material suitable for permanent or long-term wearing of the device 1, taking into account the anthropometric data of the user's head.
  • replaceable nosepieces 4 can be used, which adjust the height of the body 2 and fit the bridge of the nose without pinching it. Correct selection of the distance from the eye to the rims 3.1, 3.2 contributes to correct operation of the eye-image recognition system.
  • in the lower parts of the rims 3.1, 3.2, the camera 6.1 of the left eye and the camera 6.2 of the right eye are installed, hereinafter also referred to as eye cameras 6.1, 6.2.
  • eye cameras 6.1, 6.2 are designed to register and record a video sequence reflecting the process of eye movement and to obtain images of the corresponding eye. The movement of both the left and right eyes of the user is recorded, which is necessary to determine the direction of gaze accurately and to eliminate false detections.
  • eye cameras 6.1, 6.2 contain photosensitive matrices and corresponding optical systems (lenses) that form an enlarged, focused image of the eyes on the matrices.
  • the images of the pupil and glints are recorded, and the coordinates of the pupil center and of each glint are determined in the coordinate system of eye cameras 6.1, 6.2 (local coordinates).
  • one eye camera (6.1 or 6.2) registers the image of each eye 11 from a single angle, from below (Fig. 4).
  • the chosen angle is determined by the fact that the lower eyelashes are significantly shorter than the upper ones and do not create obstacles for image registration.
  • the angle between the optical axis of the lens and the axis of the direction of view is chosen based on considerations of a compromise between the best image registration angle (corresponds to 0 ° angle) and the angle at which the camera is guaranteed not to fall into the user's field of view (corresponds to 90 ° angle).
  • the most preferred is the value of the specified angle from 45 ° to 60 °.
  • the advantage of using two eye cameras 6.1, 6.2 is increased reliability of the device 1, since it can remain operable if one of the recording channels, including its eye camera, fails.
  • in that case the direction of gaze is determined from the data of the eye camera 6.1 or 6.2 belonging to the working recording channel.
  • the device 1 then remains operational with an acceptable loss of quality for a number of indicators.
  • the photosensitive matrices of eye cameras 6.1, 6.2 are CMOS (complementary metal-oxide-semiconductor) matrices.
  • the choice of a CMOS matrix is due to the fact that this type of matrix is distinguished by a small pixel size, low power consumption, the ability to implement a number of processing functions on the chip, low smearing and blurring of the image, as well as the ability to read information both from the entire matrix and from a separate area.
  • the use of a color matrix is due to the fact that the display of an eye in a color palette allows using the procedure for binarization of a color image by the levels of each color component, which, in turn, makes it possible to reliably highlight the detected objects due to their achromatic colors.
  • the format of the photosensitive matrix is selected based on the requirements for light sensitivity.
  • a larger sensor size with the same number of pixels has a higher sensitivity due to a larger pixel area, thereby introducing a lower noise level, providing better image quality and simplifying and speeding up the processing of obtained eye images.
  • the larger matrix size also provides the maximum viewing angle of cameras 6.1, 6.2 at small lens focal lengths.
  • the optimal solution is a 1/3″ photosensitive matrix with side dimensions of 4.8 × 3.6 mm, which ensures sufficient light sensitivity.
  • Such a matrix can be, for example, OmniVision OV4688.
  • the optical system of eye cameras 6.1, 6.2 (not shown in the figures) is designed to form an image in the plane of the photosensitive matrix.
  • the optical system contains several lenses and a light filter. The distance from the exit pupil of the lens to the eye changes insignificantly and can be considered fixed.
  • the optical scheme of a compact four-component lens with an output aspherical lens can be used to compensate for distortion. There are no special requirements for the depth of field of the image.
  • a scene camera 7 (Fig. 2) is installed on the nosepiece 4 and is used to obtain an image of the surrounding scene.
  • the scene camera 7 is a front-facing high-resolution digital video camera to ensure the fixation of the environment, to which the gaze vector is subsequently linked.
  • the scene camera 7 contains a photosensitive matrix and is preferably located on the upper part of the glasses body 2. Since there are no special requirements for the scene camera 7, any standard compact camera based on a color CMOS sensor, without autofocus, can be used.
  • light sources 8 are installed to form glints on the corresponding eye (Fig. 3).
  • light sources 8 are designed to create the minimum required level of illumination and to form point glints by reflection from the cornea of the eye.
  • as a light source 8, an IR radiation source, in particular an IR LED, can be used.
  • Off-axis IR illumination of the eye creates a dark pupil effect and forms images of light sources 8 by reflecting radiation from the cornea of the eye.
  • the images of light sources 8 formed by reflection from the cornea are called first Purkinje images, or glints.
  • the dark pupil and the glints are then imaged by the optical system of eye cameras 6.1, 6.2 and captured by photosensitive matrices with sufficient sensitivity in the near-infrared range.
  • the images of the pupil and of the glints move in proportion to the rotation of the eyeball, but along different trajectories.
  • the difference vector between these two features is used to determine the gaze vector.
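As a minimal illustration of this pupil-glint principle, the sketch below maps the pixel-space difference vector to a pair of gaze angles through a linear gain. The coefficients kx and ky are hypothetical calibration constants, not values from the patent; a real device would obtain an equivalent mapping from its calibration procedure.

```python
# Hedged sketch: linear mapping from the pupil-glint difference vector
# to gaze angles. kx, ky are hypothetical calibration gains (deg/px);
# the patent itself does not specify this functional form.

def gaze_angles(pupil, glint, kx=0.12, ky=0.12):
    """Return (horizontal, vertical) gaze angles in degrees from the
    pixel coordinates of the pupil centre and one corneal glint."""
    dx = pupil[0] - glint[0]
    dy = pupil[1] - glint[1]
    return kx * dx, ky * dy

# Pupil 30 px to the right of the glint -> horizontal eye rotation only.
h, v = gaze_angles((330.0, 240.0), (300.0, 240.0))
```

In practice the mapping is nonlinear and user-specific, which is why the calibration stage described later is needed.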
  • the central emission wavelength of the light sources 8 is preferably selected from the wavelength range from 860 to 990 nm, which in the best embodiment of the claimed device 1 corresponds to the working range of cameras 6.1, 6.2 of the eye (near infrared range).
  • the choice of near-infrared radiation for light sources 8 is due to several reasons, in particular: - near-infrared radiation is invisible to the human eye, does not distract the user's attention and does not cause pupil dilation;
  • the image of the pupil has a high contrast due to reflection from the retina (illumination scheme with a dark pupil);
  • the number of light sources 8 determines the number of glints on the cornea relative to which the distance to the pupil center is measured. To improve the accuracy of determining the direction of gaze, two or more glints are usually used. The claimed device 1 measures six glints for each eye, which improves not only the accuracy but also the reliability of the device.
  • some of the glints fall on the sclera and are not detected by eye cameras 6.1, 6.2, but because six glints are used, at least four of them fall on the pupil or iris 12 (Fig. 4), and their coordinates are determined with sufficiently high reliability.
  • light sources 8 are installed in the rims 3.1, 3.2 in the area of the left and right light openings and are positioned relative to the light opening so that the six glints on the cornea of each eye form a control pattern in the form of a hexagon.
  • Light sources 8 can be installed on platforms located in special grooves on rims 3.1, 3.2. The angles of the pads are calculated to ensure uniform illumination of the eye area. The depth of installation of the grooves in the rims 3.1, 3.2 is chosen in such a way as to ensure the unimpeded passage of radiation.
  • the irradiation is performed with non-collimated divergent beams for uniform illumination of the entire analyzed area.
  • the angle between the optical axis of the lens of eye cameras 6.1, 6.2 and the normal to the light-emitting area of light sources 8 lies in the range from 0 to 90° and is substantially greater than zero (see Fig. 4).
  • this arrangement allows the use of six separate light sources 8 for each light opening and simplifies assembly of the device 1. It is preferable to place the light sources 8 symmetrically about the horizontal and vertical axes of the light opening, at equal distances from the optical axes of the lenses of eye cameras 6.1, 6.2.
  • the light sources 8 operate in a modulation mode according to a periodic law to increase the efficiency of the process of extracting useful information against the background of external illumination, extend the service life, reduce power consumption and reduce the irradiation of the cornea and retina of the human eye.
  • the intensity control of the light sources 8 is carried out by modulating the supply current.
  • a single IR LED with a corresponding standard lens, in particular, SFH-4053, can be used as a light source 8.
  • the device 1 also contains a computing module 9, a control module 10, and a power supply unit (not shown in the figures).
  • the location of the computing module 9 and the control module 10 can be any, for example on the clamping bars 5.1, 5.2, as shown in Figs. 1 and 3.
  • Computing module 9 is designed to implement an algorithm for determining the direction of gaze and performs the following functions:
  • the computing module 9 contains a detector of the position and size of the pupil (not shown in the figures; hereinafter DPRZ, after its Russian abbreviation) or is connected to it as to an external device.
  • the DPRZ is designed to process the eye image in real time at a frame rate of at least 50 Hz in order to construct an ellipse coinciding with the contour of the pupil and then determine the coordinates of its center. The DPRZ also calculates the positions of the glints. The small dimensions, weight and low power consumption of the DPRZ unit make it possible to place the computing module 9 on the housing 2, in particular on the clamping bar 5.1 or 5.2 (as shown in Fig. 2 as an example).
  • the DPRZ can be implemented as an FPGA, which not only reduces power consumption but also makes it possible to adapt the shape of the computing module 9 to the size and shape of the housing 2, instead of building a bulkier and less convenient enclosure around the housing 2, whose form factor often cannot be changed.
  • the control module 10 provides two functions:
  • data from eye cameras 6.1, 6.2 are fed through the control module 10 to the DPRZ.
  • the exchange of data between the DPRZ and the computing module 9 is performed via any interface suitable for these purposes, for example, USB 2.0.
  • the parameters of the ellipse and the glints, determined by the DPRZ from the image of the pupil, are transmitted to the computing module 9, where they are used to calculate the direction of gaze.
  • the power supply can be any suitable power source, rechargeable (battery) or non-rechargeable. It is preferable if the power supply is remote, connected to the components of the device 1 located on the housing 2 by means of a cable.
  • the device 1 may further comprise an information storage unit (not shown in the figures), in particular for storing the results of the gaze direction calibration, which will be discussed below.
  • the device 1 continuously tracks the user's gaze direction and transmits the coordinates of the intersection of the gaze vector with the plane of the image obtained from the scene camera 7, provided that the gaze vector lies in the field of view of the scene camera 7. A point (area) of attention can be fixed by volitional blinking, by holding attention on that point (area) for a fixed time, or by another method.
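Fixation by dwell time, one of the confirmation methods mentioned above, can be sketched as follows. The 15-pixel radius and 25-sample window (about 0.5 s at a 50 Hz frame rate) are illustrative assumptions, not parameters from the patent.

```python
# Hedged sketch: dwell-time fixation detection on a stream of gaze points.
# radius and min_samples are assumed thresholds, not patent values.

def detect_fixation(samples, radius=15.0, min_samples=25):
    """samples: gaze points (x, y) at a fixed frame rate (e.g. 50 Hz,
    so 25 samples ~ 0.5 s of dwell). Returns the fixation centre if the
    trailing samples all stay within `radius` pixels of their mean,
    otherwise None."""
    if len(samples) < min_samples:
        return None
    tail = samples[-min_samples:]
    cx = sum(p[0] for p in tail) / min_samples
    cy = sum(p[1] for p in tail) / min_samples
    if all((p[0] - cx) ** 2 + (p[1] - cy) ** 2 <= radius ** 2 for p in tail):
        return (cx, cy)
    return None

# A slowly drifting gaze (0.1 px per frame) counts as a fixation.
steady = [(100.0 + 0.1 * i, 200.0) for i in range(30)]
fix = detect_fixation(steady)
```

Volitional-blink confirmation would instead watch for a deliberate interruption of the pupil signal; the dwell variant is shown because it follows directly from the gaze-point stream.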
  • to determine the direction of gaze, the method of digital infrared video-oculography can be used, with subsequent binding of the found gaze direction vector to the image of the environment.
  • the eye is illuminated with light, in particular infrared light, which is reflected from the cornea and lens of the eye 11 and then recorded by eye cameras 6.1, 6.2.
  • the position of the pupil is calculated as the center of an area of sharp contrast within the iris, observed under illumination by light sources 8. The corneal glint produced by corneal reflection is used as a reference point for measuring the direction of gaze.
  • the difference vector between the coordinates of the pupil center and the glint changes as the direction of gaze changes and is related to the gaze direction vector by geometric relationships.
  • the gaze direction vector is bound to the environment by overlaying its found coordinates onto the image of the surrounding scene obtained by the scene camera 7.
  • Image processing is performed in real time.
  • Device 1 can operate in two modes: calibration mode and operating mode.
  • An example of a calibration option includes the following steps:
  • conversion factors from the local coordinate system to the global coordinate system are calculated.
  • using the images of the eyes, the gaze direction vector is determined in the local coordinate system (i.e., in the coordinates of eye cameras 6.1, 6.2); the local coordinates of the gaze direction vector are then recalculated into global coordinates (i.e., into the coordinate system of the scene camera 7), and the corresponding fixation point of the gaze on the scene image is obtained.
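The local-to-global recalculation can be illustrated with a 2-D affine transform applied to a gaze point. The matrix A and offset t below are placeholder calibration results (assumptions for the example); the patent does not commit to this particular transform model.

```python
# Hedged sketch: local (eye-camera) -> global (scene-camera) coordinate
# conversion as an affine map. A and t are hypothetical calibration
# outputs, not values from the patent.

def local_to_global(p, A, t):
    """Apply p' = A @ p + t for a 2x2 matrix A and a 2-vector t."""
    x, y = p
    return (A[0][0] * x + A[0][1] * y + t[0],
            A[1][0] * x + A[1][1] * y + t[1])

A = [[1.05, 0.02], [-0.01, 1.04]]   # hypothetical scale/shear terms
t = [12.0, -8.0]                    # hypothetical pixel offset
g = local_to_global((100.0, 50.0), A, t)
```

The calibration stage would estimate A and t (or a richer nonlinear model) from the user fixating known targets.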
  • the following describes a variant of the main steps for determining the position of the pupil of the eye by means of the claimed device 1.
  • First stage. Obtaining images of the left and right eyes from eye cameras 6.1, 6.2.
  • the image of the eye arrives from eye cameras 6.1, 6.2 at a frequency of at least 50 Hz.
  • the field of view of eye cameras 6.1, 6.2 is chosen so that the eye always remains in the frame, regardless of variations in the position of the device 1 on the user's head.
  • the position of the eye within the frame can therefore change; it is not always located in the center of the frame.
  • Second stage. The pupil is found as the largest connected dark area in the eye image.
  • a preliminary position of the pupil center is determined, as well as the number of pixels in the pupil area, which preliminarily characterizes its size. Because the illumination of the dark pupil area is not entirely uniform, part of the pupil may not be included in the detected area, so the pupil center and size are not determined entirely accurately at this stage.
  • Third stage. By analyzing the histogram of pixel brightness values in the area selected in the second stage, the computing module 9 finds the binarization threshold for accurately constructing the pupil ellipse. Binarization is then used to determine the pupil boundary, and a preliminary pupil ellipse is constructed from that boundary.
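A histogram-based threshold choice for this step can be sketched with Otsu's method. The patent does not name a specific algorithm, so Otsu is an assumption here, and the pixel values below form a toy bimodal "dark pupil vs. brighter iris" example.

```python
# Hedged sketch: histogram-based binarization threshold via Otsu's
# method (one common choice; the patent does not specify an algorithm).

def otsu_threshold(pixels, levels=256):
    """Return the threshold maximising between-class variance."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    w_b = 0          # background (dark) pixel count so far
    sum_b = 0.0      # background intensity sum so far
    best_t, best_var = 0, -1.0
    for t in range(levels):
        w_b += hist[t]
        if w_b == 0:
            continue
        w_f = total - w_b
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b
        m_f = (sum_all - sum_b) / w_f
        var_between = w_b * w_f * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Toy bimodal region: dark pupil pixels (~20-25) and brighter iris (~115-125).
pix = [20] * 300 + [25] * 200 + [115] * 250 + [125] * 250
th = otsu_threshold(pix)
```

Pixels at or below the returned threshold would be labelled "pupil" before the boundary is extracted.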
  • Fourth stage. Exact construction of the pupil ellipse.
  • the exact pupil ellipse is constructed from the nodal points of the pupil boundary determined in the third stage, filtered so that the nodal points form a convex figure.
  • the pupil ellipse is fitted by the least-squares method by means of the computing module 9.
  • simulations of the third- and fourth-stage methods carried out by the authors show a high probability of visual coincidence between the precisely constructed pupil ellipse and the pupil boundary visible in the eye images obtained from eye cameras 6.1, 6.2 in the first stage.
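The least-squares fit of the fourth stage can be illustrated in a simplified, dependency-free form. The patent fits a full ellipse; the sketch below instead fits a circle (a special case of the ellipse) with the algebraic Kåsa method, which reduces the least-squares problem to a 3×3 linear system. This is a deliberate simplification, not the patent's actual ellipse fit.

```python
# Hedged sketch: least-squares boundary fit, simplified from an ellipse
# to a circle (Kasa method). Solves x^2 + y^2 = a*x + b*y + c.
import math

def solve3(M, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    A = [row[:] + [bi] for row, bi in zip(M, b)]
    for i in range(3):
        piv = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            for c in range(i, 4):
                A[r][c] -= f * A[i][c]
    x = [0.0, 0.0, 0.0]
    for i in range(2, -1, -1):
        x[i] = (A[i][3] - sum(A[i][c] * x[c] for c in range(i + 1, 3))) / A[i][i]
    return x

def fit_circle(points):
    """Algebraic least-squares circle fit: centre = (a/2, b/2),
    radius^2 = c + centre_x^2 + centre_y^2."""
    n = len(points)
    Sx  = sum(x for x, _ in points)
    Sy  = sum(y for _, y in points)
    Sxx = sum(x * x for x, _ in points)
    Syy = sum(y * y for _, y in points)
    Sxy = sum(x * y for x, y in points)
    z   = [x * x + y * y for x, y in points]
    Sz  = sum(z)
    Sxz = sum(x * zi for (x, _), zi in zip(points, z))
    Syz = sum(y * zi for (_, y), zi in zip(points, z))
    a, b, c = solve3([[Sxx, Sxy, Sx],
                      [Sxy, Syy, Sy],
                      [Sx,  Sy,  float(n)]], [Sxz, Syz, Sz])
    cx, cy = a / 2.0, b / 2.0
    return (cx, cy), math.sqrt(c + cx * cx + cy * cy)

# Noise-free boundary points on a circle of radius 40 centred at (160, 120).
pts = [(160 + 40 * math.cos(2 * math.pi * k / 12),
        120 + 40 * math.sin(2 * math.pi * k / 12)) for k in range(12)]
(cx, cy), r = fit_circle(pts)
```

A full conic (ellipse) fit follows the same pattern with a 5×5 normal-equation system over the terms x², xy, y², x, y.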
  • the glints are numbered in order to determine the gaze direction vector in the global coordinate system defined by the scene camera 7, for example in the following order: starting from the upper glint of the pair closest to the bridge of the nose and proceeding in a circle away from the bridge of the nose (clockwise for the right eye and counter-clockwise for the left).
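One possible implementation of such a circular numbering is to sort the detected glint centres by angle around their centroid. The sketch below starts from the topmost direction rather than from the glint nearest the bridge of the nose, an assumption for simplicity; choosing the required start glint is then just a rotation of the resulting list.

```python
# Hedged sketch: circular ordering of glint centres by angle around
# their centroid. Starting point and handedness are simplifications of
# the patent's nose-referenced numbering.
import math

def order_glints(glints, clockwise=True):
    """Return the glint centres sorted in circular order, starting from
    the topmost direction ("12 o'clock" in the image)."""
    cx = sum(x for x, _ in glints) / len(glints)
    cy = sum(y for _, y in glints) / len(glints)
    def key(p):
        dx = p[0] - cx
        dy = cy - p[1]                                # image y grows downward
        a = math.atan2(dx if clockwise else -dx, dy)  # 0 rad = straight up
        return a % (2.0 * math.pi)
    return sorted(glints, key=key)

# Six glint centres forming a hexagon in image coordinates (y grows down).
hexagon = [(100.0, 60.0), (135.0, 80.0), (135.0, 120.0),
           (100.0, 140.0), (65.0, 120.0), (65.0, 80.0)]
numbered = order_glints([hexagon[3], hexagon[0], hexagon[5],
                         hexagon[2], hexagon[1], hexagon[4]])
```

Sorting by centroid angle keeps the numbering stable even when one or two glints fall on the sclera and are missing from a frame.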
  • the angles of the gaze direction vector are determined in the local coordinate system and then recalculated into the global coordinate system defined by the scene camera 7. This takes into account the individual characteristics of the user (the angles of deviation of the area of best vision from the eye-direction vector) and the design features of the device (in particular, the relative position of eye cameras 6.1, 6.2 and the scene camera 7), which are determined at the calibration stage.
  • Calibration stage. Calibration of the user's gaze direction consists of two steps: in the first step the actual calibration is carried out, and in the second step the calibration is verified. Calibration is performed using, for example, a monitor or tablet screen on which an ArUco marker moves. During development of the calibration procedure it was found that a moving marker gives approximately three times more accurate results than a stationary one and also holds the user's attention better.
  • the present invention makes it possible to achieve high accuracy in determining the direction of gaze even with small changes in the position of the system on the user, while remaining compact and consuming little power.

Abstract

The present invention relates to devices for determining the direction of gaze and can be used in various fields of technology, including robotics. The device comprises a body, a left-eye camera and a right-eye camera for obtaining images of the eyes, a scene camera for obtaining an image of the surrounding scene, a plurality of left-eye and right-eye light sources for generating glints on the eyes, a computing module for determining a gaze direction vector, performing calibration procedures and transmitting information about the gaze direction vector to an external device, a control module and a power supply module. The technical result is increased accuracy in determining the direction of gaze, high data-processing speed, low hardware requirements and reduced weight of the wearable device.
PCT/RU2019/000950 2019-12-16 2019-12-16 Dispositif pour déterminer la direction du regard WO2021125992A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/RU2019/000950 WO2021125992A1 (fr) 2019-12-16 2019-12-16 Dispositif pour déterminer la direction du regard

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/RU2019/000950 WO2021125992A1 (fr) 2019-12-16 2019-12-16 Dispositif pour déterminer la direction du regard

Publications (1)

Publication Number Publication Date
WO2021125992A1 true WO2021125992A1 (fr) 2021-06-24

Family

ID=76477662

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/RU2019/000950 WO2021125992A1 (fr) 2019-12-16 2019-12-16 Dispositif pour déterminer la direction du regard

Country Status (1)

Country Link
WO (1) WO2021125992A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114067420A (zh) * 2022-01-07 2022-02-18 深圳佑驾创新科技有限公司 一种基于单目摄像头的视线测量方法及装置

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180184958A1 (en) * 2011-05-20 2018-07-05 Google Llc Systems and methods for measuring reactions of head, eyes, eyelids and pupils
RU2678478C2 (ru) * 2014-04-29 2019-01-29 МАЙКРОСОФТ ТЕКНОЛОДЖИ ЛАЙСЕНСИНГ, ЭлЭлСи Обращение с бликами в среде отслеживания движения глаз



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19956472

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19956472

Country of ref document: EP

Kind code of ref document: A1