CN110554777A - Eyeball gaze angle measuring method based on electromagnetic wave tracking - Google Patents

Eyeball gaze angle measuring method based on electromagnetic wave tracking

Info

Publication number
CN110554777A
CN110554777A (application CN201910903872.1A)
Authority
CN
China
Prior art keywords
point
sensor
tracking
eye
electromagnetic wave
Prior art date
Legal status
Pending
Application number
CN201910903872.1A
Other languages
Chinese (zh)
Inventor
李杰森 (Li Jiesen)
王志强 (Wang Zhiqiang)
李国锋 (Li Guofeng)
Current Assignee
Dalian University of Technology
Original Assignee
Dalian University of Technology
Priority date
Filing date
Publication date
Application filed by Dalian University of Technology
Priority to CN201910903872.1A
Publication of CN110554777A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/19Sensors therefor

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Multimedia (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The invention belongs to the technical field of electromagnetic tracking, and relates to an eyeball gaze angle measuring method based on electromagnetic wave tracking. The first step: fix sensors on the two sides of the left and right eyes; the sensors are two planar circuit boards, each carrying two groups of OCSRR sensors, and the two groups of square receiving patches are parallel. The second step: collect, through the sensors, the disturbance of the electric field distribution caused by the pupil and the eyelid. The third step: apply differential processing to the signals obtained in the second step. The fourth step: construct an initial position information base. The fifth step: confirm the position of the received signal by comparison with the initial position information base. The present invention adopts the electromagnetic wave principle, so the subsequent signal processing and hardware can use mature, low-cost radio chips and related circuit designs. The sensor is a simple flat printed circuit board rather than a glass optical lens module, so its cost is low.

Description

Eyeball gaze angle measuring method based on electromagnetic wave tracking
Technical Field
The invention belongs to the technical field of electromagnetic tracking, and relates to a method for measuring eyeball gaze angle based on electromagnetic wave tracking.
Background
Virtual Reality (VR) and Augmented Reality (AR) technologies have received considerable attention. These technologies represent the next innovative dimension of gaming, education, surgery and daily life. Eye tracking, a key part of the future of AR/VR, is driving the development of next-generation human-computer technology, not only because it enables new forms of interaction but also because it improves the user experience. Eye tracking uses precision trackers to capture the content of interest to a user in the field of view and provides data or information for adjusting the image. This is very useful information for scientists or manufacturers who try to understand eye movements when people engage in different activities while using AR glasses or VR Head-Mounted Displays (HMDs).
Eye tracking in Virtual Reality (VR) Head-Mounted Displays (HMDs) typically involves directing invisible near-infrared light toward the center of the eye (cornea), where detectably different amounts of light are reflected by the pupil and the cornea within the confined space. These reflections are then captured by a high-definition camera array (as the gaze direction changes, the cornea presents a different angle to the light source, so the intensity of the corneal reflection varies). An array of cameras is placed in front of the user's eyes to track the scattered glints. This is optical tracking of the corneal reflection, known as pupil center corneal reflection (PCCR). In most studies, scattered glints combined with various pupil positions have been used as reference points for gaze estimation. Video-based eye trackers operate by illuminating the eye with an infrared light source in the restricted HMD space. Advanced algorithms calculate the position of the eye and accurately determine its point of focus. Because the eye position can be mapped many times per second, this enables the measurement and study of visual behavior and fine eye movements. Such systems therefore require a near-infrared light source and a high-definition camera to capture the light falling on the eye and record its direction, complex post-processing image algorithms, and reliable light sources with high-definition cameras or arrays of multiple light-source detection modules surrounding the eye.
Disclosure of Invention
In order to solve the above technical problems, the present invention provides a method for measuring the eyeball gaze angle based on electromagnetic wave tracking.
The technical scheme of the invention is as follows:
An eyeball gaze angle measurement method based on electromagnetic wave tracking comprises the following steps:
the first step is as follows: open Complementary Split Ring Resonator (OCSRR) sensors are fixed on each of the left and right eyes. The open-loop-resonator (OCSRR) sensor is a planar resonant cavity for electromagnetic waves, and the resonant cavity is used as a sensor.
The sensor designed by the present invention consists of two planar circuit boards (as shown in Fig. 1) located on the two sides of the left and right eyes. Each planar circuit board carries two groups of OCSRR sensors, and the two groups of square receiving patches are parallel, with resonant frequencies fL and fH. The two OCSRR sensors are connected to the electromagnetic wave signal source through a coplanar waveguide (CPW), which transmits the electromagnetic wave.
The OCSRR sensor is formed from a square receiving patch, with ports 1 and 3 arranged on the corresponding sides of the square. The sensor uses only the reflection characteristic of the differential-mode port, namely the reflections of ports 1 and 3 at the resonance frequency: the reflection coefficient of port 1 is S11 and that of port 3 is S33. Owing to the resonance phenomenon, the minimum of the reflection coefficient occurs at the resonance frequency point; this minimum is defined as S11_MIN or S33_MIN, in dB.
The second step is that: the perturbations of the electric field distribution caused by the pupil and the eyelid are collected by the sensor.
Differential techniques can improve the performance of wireless communication and sensor measurement systems. The differential method is based on the measurement of cross-mode insertion loss, which is highly sensitive to asymmetric loads. The designed OCSRR sensor uses a coplanar waveguide (CPW) transmission line to feed the electromagnetic energy.
The differences between the two eyes in the ocular protrusion point (the cornea over the pupil, facing sensors 1 and 2) and in eyelid tissue loss create an asymmetry that yields a larger measurable variation for detection. When one OCSRR sensor has a reference sample (eye pupil or eyelid) close to it while the corresponding sample is far from the other sensor surface, the difference in dielectric constant and electromagnetic loss between the two samples creates asymmetry and causes mode conversion. Furthermore, electromagnetic measurements with resonant particles near the surface of the body are significantly affected by the surrounding medium, such as heartbeat and respiration (common mode). Differential sensors are an advantageous solution for reducing the influence of environmental conditions and lowering environmental sensitivity. The proposed sensing method is based on the differential-mode variation of the resonator's S-parameter dB value. Moreover, the use of the differential mode suppresses the body-volume-variation noise that would otherwise directly affect the dB value.
The third step: carrying out differential processing on the signals obtained in the second step;
The cross-mode insertion loss may indicate a degree of asymmetry associated with the difference between left-eye and right-eye perception. It can therefore be used as an output variable for tracking purposes.
SDC11 = S11_MIN − S33_MIN (1)
SDC11 is the angle-correspondence function defined here; it is composed of the reflection coefficient values measured at the two ports: S11_MIN is the minimum reflection coefficient of port 1 at its resonance frequency point, and S33_MIN is the minimum reflection coefficient of port 3 at its resonance frequency point.
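For illustration only (not part of the claimed method), the following minimal Python sketch shows one way Eq. (1) could be evaluated from a measured reflection sweep; the array names, the synthetic data and the use of NumPy are our assumptions, not part of this disclosure.

```python
import numpy as np

def notch_min_db(s_db: np.ndarray) -> float:
    """Depth (dB) of the reflection notch: the minimum of |S| in dB,
    which by the resonance phenomenon occurs at the resonance frequency point."""
    return float(np.min(s_db))

def sdc11(s11_db: np.ndarray, s33_db: np.ndarray) -> float:
    """Eq. (1): SDC11 = S11_MIN - S33_MIN, both in dB."""
    return notch_min_db(s11_db) - notch_min_db(s33_db)

# Example with synthetic sweeps (illustrative values only):
freq = np.linspace(1e9, 5e9, 401)                      # Hz
s11 = -3 - 17 * np.exp(-((freq - 2.4e9) / 5e7) ** 2)   # notch near an assumed f_H
s33 = -3 - 12 * np.exp(-((freq - 2.4e9) / 5e7) ** 2)
print(sdc11(s11, s33))                                 # -> -5.0 dB
```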
The results for four cases are shown in Fig. 2. In case 1, when the eye looks to the right, the cornea moves closer to sensor 2 and the dB value at fH of sensor 2 increases, while the cornea moves away from sensor 1 and the fH response of sensor 1 decreases; the opposite occurs in case 2. In cases 3 and 4, the gaze motion brings the eye into interaction with the eyelids when looking up and down. That is, when the user gazes downward (case 4), the eyelid covers the sclera of the eye, and this "squeezed skin" affects the fL resonator, which is placed next to the fH resonator, so the downward motion is sensed. Conversely, in case 3, when the user looks up, the eyelid skin moves away from the fL resonator of sensor 1 but approaches the fL resonator of sensor 2. Finally, SDC11 from Eq. (1) can be used as the eye-tracking sensitivity function.
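A toy decision sketch of the four cases of Fig. 2, under our assumption (for illustration only) that the gaze direction can be read from the sign of the notch-depth change at fH and fL of sensors 1 and 2 relative to a straight-ahead baseline; the 0.5 dB threshold is arbitrary.

```python
def classify_gaze(d_fH_s1, d_fH_s2, d_fL_s1, d_fL_s2, thr=0.5):
    """d_* : change (dB) of the notch depth versus the straight-ahead baseline
    at f_H / f_L of sensor 1 / sensor 2. Threshold thr is illustrative."""
    if d_fH_s2 > thr and d_fH_s1 < -thr:
        return "case 1: gaze right (cornea approaches sensor 2)"
    if d_fH_s1 > thr and d_fH_s2 < -thr:
        return "case 2: gaze left (cornea approaches sensor 1)"
    if d_fL_s2 > thr and d_fL_s1 < -thr:
        return "case 3: gaze up (eyelid skin nears sensor 2)"
    if d_fL_s1 > thr and d_fL_s2 < -thr:
        return "case 4: gaze down (eyelid covers the sclera)"
    return "center / undetermined"
```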
Using two sensors for binocular tracking gives a better acquisition rate: if one sensor fails to acquire the eye motion because of the shape of the user's face, or loses its fit because the glasses are worn poorly, the other OCSRR sensor on the same glasses can still operate.
The fourth step: constructing an initial position information base;
4.1 Place a 2D degree-marked target wall in front of the user. Horizontal marker angles and vertical marker angles are laid out in the 2D gaze target plane with a variation of m degrees between every two points.
4.2 Measure S11 and S33 of the two ports with the OCSRR sensors. Two sets of position information bases for tracking are obtained: a vertical and a horizontal position information base.
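As an illustrative sketch (the helper names are assumptions and the measurement routine is abstracted away), step 4 can be viewed as building two lookup tables that map each marked angle to its recorded SDC11 value:

```python
import numpy as np

def build_info_base(measure_sdc11, angles_deg):
    """Record SDC11 (dB) while the user fixates each marked angle (degrees)."""
    return {float(a): float(measure_sdc11(a)) for a in angles_deg}

# Stand-in measurement models borrowed from Eqs. (6) and (7) of the embodiment;
# in practice measure_sdc11 would trigger a real S11/S33 sweep.
horizontal_base = build_info_base(lambda a: 0.0776 * a - 7.262,
                                  np.arange(50, 131, 10))   # points 1..9, m = 10
vertical_base = build_info_base(lambda a: -0.0011 * a**2 + 0.1008 * a - 0.0191,
                                np.arange(40, 141, 10))     # points A..K, m = 10
```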
The fifth step: and confirming the received signal position by comparing with the initial position information base.
The electromagnetic properties and the typical equivalent circuit model of a planar OCSRR in the sensor are shown in Fig. 3. The two independent resonant frequencies (fL and fH) of the sensor are controlled independently within the multiplexed region of the sensor platform so that each fulfils its respective function.
The sensor designed by the present invention is placed close to the face, and discrete analysis of each OCSRR pixel enables the sensor to extract perturbation information from the multiplexed area according to the location of a unit pixel in the prototype. The fL and fH resonators are connected on the CPW line. The lengths of the resonating particles, l1 and l2, are adjusted, and the electrical parameter characteristics are quantified independently using the two resonance frequencies of the unit pixel (fL and fH). As shown in Fig. 4, in the equivalent circuit of the fL (fH) resonator, the OCSRR is modeled as a resonant loop formed by a capacitance Cres,L (Cres,H) and an inductance Lres,L (Lres,H), where Lres,L (Lres,H) represents the inductance of the metal trace between the rings of each gap, and Cres,L (Cres,H) represents the capacitance of each gap surrounded by the metal square plane. Accordingly, based on resonator theory, Eqs. (2) and (3) give the resonance frequencies fL and fH of the sensor in eye tracking:
fL = 1/(2π·(Lres,L·(Cres,L + Clid))^0.5) (2)

fH = 1/(2π·(Lres,H·Cres,H)^0.5) (3)
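A small numeric sketch of Eqs. (2) and (3); the component values below are invented for illustration, since the disclosure gives no numeric L/C values.

```python
import math

def lc_resonance(L_henry: float, C_farad: float) -> float:
    """f = 1 / (2*pi*sqrt(L*C)) for a parallel LC resonant loop."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L_henry * C_farad))

Lres_L, Cres_L, Clid = 5e-9, 2e-12, 0.3e-12   # assumed values (H, F, F)
Lres_H, Cres_H = 2e-9, 1e-12                  # assumed values (H, F)

fL = lc_resonance(Lres_L, Cres_L + Clid)  # Eq. (2): Clid detunes the f_L resonator
fH = lc_resonance(Lres_H, Cres_H)         # Eq. (3)
print(f"fL = {fL/1e9:.2f} GHz, fH = {fH/1e9:.2f} GHz")
```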
The S11 (dB) reflection amplitude of the sensor depends on the loss conductance in the network, which includes metal, substrate and radiation losses. The metal and substrate losses (Gres,L and Gres,H) are fixed when the same sensor plate is used. When the sensor is close to the face, the S-parameter dB value contains information from the human body and the eye tissue; however, between sensors 1 and 2 the contribution of the face is regarded as common mode. Clid represents the eyelid information, i.e. the equivalent capacitive perturbation component in the fL resonator. Thus, the fL resonator detects eyelid motion through Clid and Glid, while the fH resonator detects the cornea position through Gcornea.
The S-parameter dB value of the reflection coefficient can be used for contactless sensing resonators in biomedical applications. To simplify the calculation, the GLC shunt circuit model can be reduced to a total loss conductance according to two-port network theory, given by Eq. (5):
Gtotal loss = 10^(S_MIN/20) / (2·Z0·(1 − 10^(S_MIN/20))) (5), where S_MIN is S11_MIN or S33_MIN (in dB) and Z0 is the port characteristic impedance.
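For illustration, the following sketch evaluates Eq. (5), converting a measured reflection minimum (in dB) into the total loss conductance; Z0 = 50 Ω is our assumption for the port impedance.

```python
def total_loss_conductance(s_min_db: float, z0: float = 50.0) -> float:
    """Eq. (5): G_total = 10^(S_MIN/20) / (2*Z0*(1 - 10^(S_MIN/20))),
    with S_MIN = S11_MIN or S33_MIN in dB."""
    gamma = 10.0 ** (s_min_db / 20.0)        # linear reflection magnitude
    return gamma / (2.0 * z0 * (1.0 - gamma))

# e.g. a -20 dB notch gives roughly 1.1e-3 S:
print(total_loss_conductance(-20.0))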
The invention has the beneficial effects that:
The present invention adopts the electromagnetic wave principle, so the subsequent signal processing and hardware can use mature, low-cost radio chips and related circuit designs. The sensor is a simple flat printed circuit board rather than a glass optical lens module, so its cost is low. Electromagnetic wave transmission and reception share the same circuit board, whereas an optical system requires a separate light source and a separate receiver (lens).
Drawings
Fig. 1 is a prototype diagram of the proposed mirror-symmetric OCSRR sensor for eye tracking.
The uncoupled sensors 1 and 2 are respectively adjacent to (but not touching) each canthus and eyelid to track the movement of the vertical and horizontal eye fixation points; each sensor has its own two resonant frequencies, fL and fH.
Fig. 2 shows four cases of eye movement and the corresponding S11 and S33 responses: case 1 is the eyeball looking right (at fH), case 2 looking left (at fH), case 3 looking up (at fL), and case 4 looking down (at fL).
Fig. 3 shows the design of the OCSRR circuit size and layout thereof.
Fig. 4 is a schematic diagram of sensors 1 and 2 in the eye tracker, the eye-on-head structure, and the equivalent T-impedance circuit model.
Fig. 5 is a diagram of the eye tracking sensor in the 2D test screen plane. The user wears the tracking glasses with 50-ohm loads on ports 2 and 4 (S22 and S44).
Fig. 6 shows the horizontal SDC11 tracking measurements for multiple users.
Fig. 7 shows the vertical SDC11 tracking measurements for multiple users.
Detailed Description
The following further describes a specific embodiment of the present invention with reference to the drawings and the technical solution.
First, a 2D degree-marked target wall is placed in front of the user. The horizontal markers span 50 to 130 degrees and the vertical markers span 40 to 140 degrees, measured from the user's eye center (90 degrees), with the wall 1.5 meters from the face. In this 2D gaze target plane, the line from point A to point K (Fig. 5) represents a vertical series with a 10-degree change between every two points; similarly, the line from point 1 to point 9 is a horizontal series with a 10-degree change between every two points.
Two data sets were collected separately for the tracking experiments: vertical and horizontal tracking. For horizontal tracking, the dB amplitude corresponding to the change in degrees (Fig. 6, user 1) was plotted; note the effect of the angle on the amplitude of the S11 and S33 notches at the resonant frequency fH.
Results of the implementation
1) Horizontal tracking
To simplify the calculation and the subsequent tracking, a linear model (6) was used to relate the degree and the amplitude of horizontal tracking, with a correlation coefficient R2 = 0.984. The result shows that although the resonance frequency shift is not significant, the reflection notch depends on the horizontal transfer of the gaze point between sensors 1 and 2.
SDC11 (dB) = 0.0776·(H.Degree) − 7.262 (6)
The SDC11 line results are normalized because point 5 (90°) and point F (90°), horizontal and vertical respectively, represent the central intercept point. The line is shifted to 0 dB at point 5 and used for calibration at point F. The S-parameter dB values shift toward higher levels (from point 9 to point 1 for S11, and from point 1 to point 9 for S33), which is consistent with the results of the numerical analysis.
For users 2 to 4, the same analysis procedure was followed. We plot the relationship between the dB amplitude and the eye gaze point, which shows that the maximum dB variation (50 to 140 degrees) differs for each user. The tracking degree and the reflection amplitude exhibit a linear relationship SDC11(H.Degree). The results show that each individual data set follows a linear trend. The eye-scan degree mapping indicates the change in the reflection response as the corner of the eye approaches the sensing point. The slope of each fitted line depends on the proximity of the cornea and eyelid to the sensor surface, so the facial characteristics of each user directly affect the measurement. The slopes (dB/degree) of the SDC11(H.Degree) lines for users 2 to 4 were 0.06 (R2 = 0.975), 0.182 (R2 = 0.987) and 0.126 (R2 = 0.972), respectively. For the tracking system to predict eye motion, we can estimate the initial dB values of the boundary points, e.g., points 1 and 9, to determine the line. From these results, a suitable linear function can be applied for each user.
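A minimal sketch of the per-user calibration implied here: fit the linear model of Eq. (6) to the nine horizontal calibration points, then invert it to predict the gaze angle from a new SDC11 reading. The synthetic data below merely re-uses the user-1 coefficients with added noise; it is not measured data.

```python
import numpy as np

rng = np.random.default_rng(0)
deg = np.arange(50, 131, 10, dtype=float)                    # calibration points 1..9
sdc = 0.0776 * deg - 7.262 + rng.normal(0, 0.05, deg.size)   # synthetic readings (dB)

k, b = np.polyfit(deg, sdc, 1)     # least-squares line in the form of Eq. (6)

def predict_h_degree(sdc11_db: float) -> float:
    """Invert SDC11(H.Degree) = k*degree + b to estimate the horizontal angle."""
    return (sdc11_db - b) / k
```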
2) Vertical tracking
For vertical tracking, the cornea moves vertically from 40° to 140° and produces a corresponding dB change at fL for user 1. Based on the proposed differential-mode method, S11_MIN and S33_MIN at fL from Eq. (1) are used to calculate SDC11 (the verticality of each test point). To compare these calculations, a normalized curve function (7) (normalized for the same reason and by the same method as the horizontal tracking function) is then used to relate degree and amplitude.
SDC11 (dB) = −0.0011·(V.Degree)^2 + 0.1008·(V.Degree) − 0.0191 (7)
In multi-user vertical tracking of users 2 through 4, eye movement from 40 degrees to 140 degrees produces changes of 13.01, 9.44 and 5.81 dB at fL, respectively. In addition, the SDC11(V.Degree) differential mode is estimated from the resonance notch at fL at each vertical point using Eq. (1) and the same evaluation procedure.
As expected, different users produce different maximum dB changes over the 40-to-140-degree vertical translation of the cornea. The curve curvature for the various users is represented by a second-order polynomial function SDC11(V.Degree). For users 2 to 4, the curvatures of the SDC11(V.Degree) second-order polynomial curves were −0.099 (R2 = 0.993), −0.031 (R2 = 0.981) and −0.016 (R2 = 0.989), clearly indicating that a suitable curve can be found from the initial points (points A to K) for calculation. A suitable second-order polynomial curve is then obtained and used to predict eye movement for each individual eye condition. A slight difference between the measured data and the fitted line equation can be seen in the curve fit; this is due to measurement uncertainty or the user's eyelashes affecting the sensing.
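Likewise, a sketch (under the same illustrative assumptions) of the vertical case: fit the second-order model of Eq. (7) to points A to K and invert it with the quadratic formula, keeping the root inside the calibrated 40–140° range; near the vertex of the parabola the inversion is ambiguous and would need the direction of recent motion or a second feature to disambiguate.

```python
import numpy as np

deg = np.arange(40, 141, 10, dtype=float)        # calibration points A..K
sdc = -0.0011 * deg**2 + 0.1008 * deg - 0.0191   # Eq. (7) used as stand-in data

a, b, c = np.polyfit(deg, sdc, 2)                # second-order fit

def predict_v_degree(sdc11_db: float) -> float:
    """Solve a*d^2 + b*d + (c - SDC11) = 0 for the angle d in [40, 140]."""
    roots = np.roots([a, b, c - sdc11_db])
    real = roots[np.isreal(roots)].real
    valid = real[(real >= 40.0) & (real <= 140.0)]
    return float(valid[0]) if valid.size else float("nan")
```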

Claims (4)

1. An eyeball gaze angle measurement method based on electromagnetic wave tracking is characterized by comprising the following steps:
The first step is as follows: fixing sensors on two sides of the left eye and the right eye respectively;
the sensors are two planar circuit boards, each of which carries two sets of OCSRR sensors; the two sets of square receiving patches are parallel, with resonance frequencies fL and fH; the two OCSRR sensors are connected to the electromagnetic wave signal source through a coplanar waveguide, which transmits the electromagnetic wave;
the OCSRR sensor is formed from a square receiving patch, with ports 1 and 3 arranged on the corresponding sides of the square; the sensor uses only the reflection characteristic of the differential-mode port, namely the reflections of ports 1 and 3 at the resonance frequency, where the reflection coefficient of port 1 is S11 and that of port 3 is S33; owing to the resonance phenomenon, the minimum reflection coefficient occurs at the resonance frequency point and is defined as S11_MIN or S33_MIN, in dB;
the second step is that: collecting disturbance of electric field distribution caused by pupils and eyelids through a sensor;
The third step: carrying out differential processing on the signals obtained in the second step;
The cross-mode insertion loss may indicate a degree of asymmetry associated with the difference between left-eye and right-eye perception; thus, it can be used as an output variable for tracking purposes;
SDC11 = S11_MIN − S33_MIN (1)
SDC11 is the angle-correspondence function defined here, composed of the reflection coefficient values measured at the two ports: S11_MIN is the minimum reflection coefficient of port 1 at its resonance frequency point, and S33_MIN is the minimum reflection coefficient of port 3 at its resonance frequency point;
the fourth step: constructing an initial position information base;
4.1 placing a 2D degree-marked target wall in front of the user; horizontal marker angles and vertical marker angles have a variation of m degrees between every two points in the 2D gaze target plane;
4.2 measuring S11 and S33 of the two ports with the OCSRR sensors; two sets of position information bases for tracking are obtained: a vertical and a horizontal position information base;
the fifth step: confirming the position of the received signal by comparing it with the initial position information base.
2. The method as claimed in claim 1, wherein discrete analysis of each OCSRR pixel of the sensor enables the sensor to extract perturbation information from the multiplexed region according to the position of a unit pixel in the prototype.
3. The method as claimed in claim 1, wherein, in the 2D gaze target plane, the line from point A to point K represents a vertical series with a 10-degree variation between every two points, and the line from point 1 to point 9 is a horizontal series with a 10-degree variation between every two points.
4. The method as claimed in claim 2, wherein, in the 2D gaze target plane, the line from point A to point K represents a vertical series with a 10-degree variation between every two points, and the line from point 1 to point 9 is a horizontal series with a 10-degree variation between every two points.
CN201910903872.1A 2019-09-24 2019-09-24 Eyeball gaze angle measuring method based on electromagnetic wave tracking Pending CN110554777A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910903872.1A CN110554777A (en) 2019-09-24 2019-09-24 Eyeball gaze angle measuring method based on electromagnetic wave tracking

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910903872.1A CN110554777A (en) 2019-09-24 2019-09-24 Eyeball gaze angle measuring method based on electromagnetic wave tracking

Publications (1)

Publication Number Publication Date
CN110554777A true CN110554777A (en) 2019-12-10

Family

ID=68741210

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910903872.1A Pending CN110554777A (en) 2019-09-24 2019-09-24 Eyeball gaze angle measuring method based on electromagnetic wave tracking

Country Status (1)

Country Link
CN (1) CN110554777A (en)



Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106709398A (en) * 2015-07-29 2017-05-24 财团法人资讯工业策进会 Gaze analysis method and device
CN107014378A (en) * 2017-05-22 2017-08-04 中国科学技术大学 A kind of eye tracking aims at control system and method
CN108968907A (en) * 2018-07-05 2018-12-11 四川大学 The bearing calibration of eye movement data and device
CN109597489A (en) * 2018-12-27 2019-04-09 武汉市天蝎科技有限公司 A kind of method and system of the eye movement tracking interaction of near-eye display device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CHIEH-SEN LEE et al.: "Open Complementary Split-Ring Resonator for Eye Tracking", 2019 IEEE MTT-S International Microwave Symposium (IMS) *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113589534A (en) * 2021-08-13 2021-11-02 谷东科技有限公司 Diopter self-adaptive adjustment near-to-eye display device and augmented reality display equipment
CN115933172A (en) * 2022-11-29 2023-04-07 大连海事大学 Human eye sight tracking device and method based on polarization multispectral imaging
CN115933172B (en) * 2022-11-29 2023-09-12 大连海事大学 Human eye sight tracking device and method based on polarized multispectral imaging

Similar Documents

Publication Publication Date Title
US10635900B2 (en) Method for displaying gaze point data based on an eye-tracking unit
US9070017B2 (en) Methods and apparatus for estimating point-of-gaze in three dimensions
Lee et al. 3D gaze tracking method using Purkinje images on eye optical model and pupil
Lai et al. Hybrid method for 3-D gaze tracking using glint and contour features
Hennessey et al. Noncontact binocular eye-gaze tracking for point-of-gaze estimation in three dimensions
Coutinho et al. Improving head movement tolerance of cross-ratio based eye trackers
Huang et al. Screenglint: Practical, in-situ gaze estimation on smartphones
KR20180072559A (en) Capacitive sensing circuits and methods for determining eyelid position using the same
Rantanen et al. A Wearable, Wireless Gaze Tracker with Integrated Selection Command Source for Human‐Computer Interaction
EP2979156A1 (en) Eye tracking calibration
US11181978B2 (en) System and method for gaze estimation
CN110554777A (en) Eyeball gaze angle measuring method based on electromagnetic wave tracking
Schnieders et al. Reconstruction of display and eyes from a single image
Mestre et al. Robust eye tracking based on multiple corneal reflections for clinical applications
CN105354825A (en) Intelligent device for automatically identifying position of reading material in read-write scene and application of intelligent device
Schofield et al. Sun and sky: Does human vision assume a mixture of point and diffuse illumination when interpreting shape-from-shading?
Nakazawa et al. Non-calibrated and real-time human view estimation using a mobile corneal imaging camera
CN105354822A (en) Intelligent apparatus for automatically identifying position of read-write element in read-write scene and application
Lee et al. Microwave resonator for eye tracking
Park A real-time gaze position estimation method based on a 3-D eye model
Li et al. SmartLens: Sensing eye activities using zero-power contact lens
Bozomitu et al. Methods of control improvement in an eye tracking based human-computer interface
Changyuan et al. The line of sight to estimate method based on stereo vision
Hennessey Point-of-gaze estimation in three dimensions
Zhao Micro-Scanning Mirror based Eye-tracking Technology

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication (application publication date: 20191210)