CN107729871A - Infrared light-based human eye movement track tracking method and device - Google Patents

Info

Publication number
CN107729871A
CN107729871A (application CN201711067388.7A)
Authority
CN
China
Prior art keywords
pupil
human eye
image
light spot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201711067388.7A
Other languages
Chinese (zh)
Inventor
邹建成
张宏根
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
North China University of Technology
Original Assignee
North China University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by North China University of Technology filed Critical North China University of Technology
Priority to CN201711067388.7A priority Critical patent/CN107729871A/en
Publication of CN107729871A publication Critical patent/CN107729871A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18: Eye characteristics, e.g. of the iris
    • G06V40/19: Sensors therefor
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013: Eye tracking input arrangements

Abstract

The invention discloses an infrared light-based human eye movement track tracking method and device. The device comprises a face image capture module, a human eye image extraction module, a pupil and light spot extraction module, a fixation point determination module and a track determination module. The face image capture module collects a face image in which the pupil carries a light spot; the human eye image extraction module extracts a human eye region image from the face image; the pupil and light spot extraction module extracts a pupil image and a light spot image from the human eye region image and determines the relation between the centre coordinates of the pupil and light spot and the fixation point; the fixation point determination module determines fixation point coordinates from that relation; and the track determination module determines the human eye movement track from a plurality of fixation point coordinates. The invention can track the human eye movement track and improve tracking accuracy.

Description

Infrared light-based human eye movement track tracking method and device
Technical field
The present invention discloses an infrared light-based human eye movement track tracking method and device, belonging to the technical fields of image processing and artificial intelligence.
Background technology
Human eye movement track tracking has high research value and broad application prospects in fields such as human-computer interaction, psychology and behavioural research, pattern recognition, market research, medical research, highway engineering research, and driver training and evaluation.
The pupil-canthus vector method is one way of determining the human eye movement track: the coordinate mapping between the pupil-canthus vector and the screen yields calibration parameters, from which the gaze position on the screen is then computed. This method is widely used in eye-tracking research, but because the pupil and light spot are small and can be occluded by the surrounding environment, hair or glasses, a more stable and accurate pupil and spot localisation algorithm is extremely important for improving the robustness of the whole eye-tracking system.
Summary of the invention
In view of the foregoing, the object of the invention is to provide an infrared light-based human eye movement track tracking method and device that can track the human eye movement track and improve tracking accuracy.
To achieve the above object, the present invention adopts the following technical scheme:
An infrared light-based human eye movement track tracking device comprises a face image capture module, a human eye image extraction module, a pupil and light spot extraction module, a fixation point determination module and a track determination module, wherein:
the face image capture module collects a face image in which the pupil carries a light spot;
the human eye image extraction module extracts a human eye region image from the face image;
the pupil and light spot extraction module extracts a pupil image and a light spot image from the human eye region image, and determines the relation between the centre coordinates of the pupil and light spot and the fixation point;
the fixation point determination module determines fixation point coordinates from the relation between the centre coordinates of the pupil and light spot and the fixation point; and
the track determination module determines the human eye movement track from several fixation point coordinates.
The face picture is captured with an infrared camera, or with an ordinary optical lens fitted with an infrared filter, to obtain the face image in which the pupil carries a light spot.
The pupil image is extracted from the human eye region image as follows:
a gradient algorithm locates the point of maximum pixel-value change in the human eye region image, taken as the pupil point; a pupil region is delimited around this point; and the OTSU algorithm is applied to the pupil region to generate the pupil image.
The light spot image is extracted from the human eye region image as follows:
the human eye region image is inverted to generate an inverted image; a gradient algorithm locates the point of maximum pixel-value change in the inverted image, taken as the light spot point; a spot region is delimited around this point; and the OTSU algorithm is applied to the spot region to generate the light spot image.
The centre coordinates of the pupil and light spot are determined by the gravity (centroid) method, and the mapping between these centre coordinates and the fixation point is established by the pupil-cornea reflection vector method.
An infrared light-based human eye movement track tracking method comprises:
collecting a face image in which the pupil carries a light spot;
extracting a human eye region image from the face image;
extracting a pupil image and a light spot image from the human eye region image;
determining the centre coordinates of the pupil and light spot;
determining the relation between the centre coordinates of the pupil and light spot and the fixation point coordinates; and
obtaining the human eye movement track from consecutive fixation point coordinates.
The face picture is captured with an infrared camera, or with an ordinary optical lens fitted with an infrared filter, to obtain the face image in which the pupil carries a light spot.
The pupil image is extracted from the human eye region image as follows:
a gradient algorithm locates the point of maximum pixel-value change in the human eye region image, taken as the pupil point; a pupil region is delimited around this point; and the OTSU algorithm is applied to the pupil region to generate the pupil image.
The light spot image is extracted from the human eye region image as follows:
the human eye region image is inverted to generate an inverted image; a gradient algorithm locates the point of maximum pixel-value change in the inverted image, taken as the light spot point; a spot region is delimited around this point; and the OTSU algorithm is applied to the spot region to generate the light spot image.
Using the pupil-cornea reflection vector method, the mapping between the centre coordinates of the pupil and light spot and the fixation point is established as

Xgaze = a0 + a1·xe + a2·ye + a3·xe^2 + a4·xe·ye + a5·ye^2
Ygaze = b0 + b1·xe + b2·ye + b3·xe^2 + b4·xe·ye + b5·ye^2

where (Xgaze, Ygaze) are the fixation point coordinates and

xe = xp - xc
ye = yp - yc

where (xp, yp) are the centre coordinates of the pupil and (xc, yc) are the centre coordinates of the light spot, both determined by the gravity method.
The advantages of the invention are:
the infrared light-based human eye movement track tracking method and device realise eye movement track tracking based on the pupil-canthus vector method, achieve high tracking accuracy, and can be applied in miniature terminal equipment.
Brief description of the drawings
Fig. 1 is a structural block diagram of the device of the present invention.
Fig. 2 is a schematic flow chart of the method of the present invention.
Embodiment
The present invention is described in further detail below with reference to the drawings and embodiments.
As shown in Fig. 1, the infrared light-based human eye movement track tracking device disclosed by the invention comprises a face image capture module, a human eye image extraction module, a pupil and light spot extraction module, a fixation point determination module and a track determination module.
The face image capture module collects a face image in which the pupil carries a light spot;
the human eye image extraction module extracts a human eye region image from the face image;
the pupil and light spot extraction module extracts a pupil image and a light spot image from the human eye region image, and determines the relation between the centre coordinates of the pupil and light spot and the fixation point;
the fixation point determination module determines fixation point coordinates from the relation between the centre coordinates of the pupil and light spot and the fixation point; and
the track determination module determines the human eye movement track from several fixation point coordinates.
As shown in Fig. 2, the infrared light-based human eye movement track tracking method disclosed by the invention comprises:
S1: Collect a face image in which the pupil carries a light spot.
The face picture is captured with an infrared camera, or with an ordinary optical lens fitted with an infrared filter, to obtain a face image in which the pupil carries a light spot.
S2: Process the face image to obtain a human eye region image.
A human eye region of interest is extracted from the face image using a human eye extraction algorithm or a deep learning method, giving the human eye region image.
S3: From the human eye region image, extract the pupil image and the light spot image respectively.
S31: Extract the pupil image from the human eye region image.
A gradient algorithm locates the point of maximum pixel-value change in the human eye region image, taken as the pupil point.
A pupil region is delimited around the pupil point.
The OTSU algorithm is applied to the pupil region to generate the pupil image.
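As an illustrative sketch of step S31 (not part of the patent text): the snippet below implements the OTSU threshold in pure NumPy and applies it to a synthetic eye-region image whose size and grey values are invented for the example; the patent's gradient-based seeding and pupil-region division are omitted for brevity.

```python
import numpy as np

def otsu_threshold(img):
    """Otsu's threshold for an 8-bit grayscale image: the value t
    that maximizes the between-class variance of the two classes
    (pixels <= t and pixels > t)."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    total = hist.sum()
    sum_all = np.dot(np.arange(256), hist)
    best_t, best_var = 0, -1.0
    w0, sum0 = 0.0, 0.0
    for t in range(256):
        w0 += hist[t]          # weight of the dark class
        if w0 == 0:
            continue
        w1 = total - w0        # weight of the bright class
        if w1 == 0:
            break
        sum0 += t * hist[t]
        m0 = sum0 / w0
        m1 = (sum_all - sum0) / w1
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Synthetic eye-region image: bright background (value 200) with a
# dark pupil disc (value 30); the pupil is the darkest region.
img = np.full((60, 60), 200, dtype=np.uint8)
yy, xx = np.mgrid[:60, :60]
img[(yy - 30) ** 2 + (xx - 30) ** 2 <= 100] = 30  # radius-10 pupil

t = otsu_threshold(img)
pupil_mask = img <= t          # binary pupil image
print(t, int(pupil_mask.sum()))  # prints: 30 317
```

On this two-valued test image the threshold lands on the pupil grey level and the mask recovers exactly the 317 pixels of the radius-10 disc.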
S32: Extract the light spot image from the human eye region image.
The human eye region image is inverted to generate an inverted image.
A gradient algorithm locates the point of maximum pixel-value change in the inverted image, taken as the light spot point.
A spot region is delimited around the light spot point.
The OTSU algorithm is applied to the spot region to generate the light spot image.
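Step S32 can be sketched the same way (again not part of the patent text). In this simplified example the patent's gradient search is replaced by taking the darkest point of the inverted image as the seed, and a plain mean threshold stands in for OTSU inside the spot window; all image sizes and values are invented.

```python
import numpy as np

# Synthetic eye region: mid-gray iris (value 120) with a small bright
# corneal glint (value 250); names and values are illustrative only.
img = np.full((40, 40), 120, dtype=np.uint8)
img[18:22, 18:22] = 250            # 4x4 glint

inv = 255 - img                    # invert: the glint becomes the darkest area
seed = tuple(int(v) for v in np.unravel_index(inv.argmin(), inv.shape))

# Delimit a small window (the "spot region") around the seed point.
r = 5
y0, y1 = max(seed[0] - r, 0), seed[0] + r
x0, x1 = max(seed[1] - r, 0), seed[1] + r
region = img[y0:y1, x0:x1]

# A simple global threshold stands in for OTSU inside the window.
spot_mask = region > region.mean()
print(seed, int(spot_mask.sum()))  # prints: (18, 18) 16
```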
S4: From the pupil image and the light spot image, determine the centre coordinates of the pupil and of the light spot respectively.
The centre coordinates of the pupil and of the light spot are each solved by the gravity (centroid) method.
S41: Calculate the centre coordinates of the pupil.
The calculation formula is:

xp = (1/n)·Σ xn,  yp = (1/n)·Σ yn    (1)

where (xp, yp) are the centre coordinates of the pupil, xn and yn are the abscissa and ordinate of pixel In, and n is the number of pixels.
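The gravity (centroid) method of step S4 amounts to averaging the coordinates of the pixels in the segmented binary image. A minimal sketch, with an invented test mask:

```python
import numpy as np

def centroid(mask):
    """Centre of gravity of a binary mask: the mean of the member
    pixel coordinates, as in formula (1)."""
    ys, xs = np.nonzero(mask)
    n = len(xs)
    return float(xs.sum()) / n, float(ys.sum()) / n  # (x_p, y_p)

# A 3x3 square of "pupil" pixels centred at x=5, y=4.
mask = np.zeros((10, 10), dtype=bool)
mask[3:6, 4:7] = True
print(centroid(mask))  # prints: (5.0, 4.0)
```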
S42: Calculate the centre coordinates of the light spot.
The calculation formula is the same:

xc = (1/n)·Σ xn,  yc = (1/n)·Σ yn    (2)

where (xc, yc) are the centre coordinates of the light spot.
S5: Establish the mapping between the centre coordinates of the pupil and light spot and the fixation point.
The mapping is established using the pupil-cornea reflection vector method, comprising:
First, the P-CR vector between the centre coordinates of the pupil and of the light spot is calculated:

xe = xp - xc
ye = yp - yc    (3)

where (xp, yp) are the centre coordinates of the pupil, (xc, yc) are the centre coordinates of the light spot, and (xe, ye) is the P-CR (pupil-corneal reflection) vector.
Then, using the six-parameter pupil-cornea reflection vector method, the mapping between the fixation point and the P-CR vector is established as:

Xgaze = a0 + a1·xe + a2·ye + a3·xe^2 + a4·xe·ye + a5·ye^2
Ygaze = b0 + b1·xe + b2·ye + b3·xe^2 + b4·xe·ye + b5·ye^2    (4)

where (Xgaze, Ygaze) are the fixation point coordinates in the calibration plane.
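Formulas (3) and (4) can be exercised directly. The sketch below is illustrative only: the parameter values and pupil/glint coordinates are invented; in practice the parameters come from the calibration of step S6.

```python
def pcr_vector(pupil, glint):
    # Formula (3): P-CR vector between pupil and glint centres.
    return (pupil[0] - glint[0], pupil[1] - glint[1])

def gaze_point(a, b, xe, ye):
    # Formula (4): six-parameter quadratic mapping to screen coordinates.
    basis = (1.0, xe, ye, xe * xe, xe * ye, ye * ye)
    X = sum(ai * t for ai, t in zip(a, basis))
    Y = sum(bi * t for bi, t in zip(b, basis))
    return X, Y

# Invented parameters: a simple affine-like mapping with zero quadratic terms.
a = (100.0, 50.0, 0.0, 0.0, 0.0, 0.0)
b = (200.0, 0.0, 40.0, 0.0, 0.0, 0.0)
xe, ye = pcr_vector((320.0, 240.0), (318.0, 243.0))
print(gaze_point(a, b, xe, ye))  # prints: (200.0, 80.0)
```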
S6: Calibrate known point coordinates on the screen and determine the mapping between the centre coordinates of the pupil and light spot and the fixation point.
The coordinates of known points are calibrated on the screen, and the parameters of formula (4) are solved, determining the mapping between the centre coordinates of the pupil and light spot and the fixation point.
In a specific embodiment of the invention, the parameters are solved by least squares, minimising over the calibration points the squared residuals of formula (4):

min Σi (Xi - a0 - a1·xe,i - a2·ye,i - a3·xe,i^2 - a4·xe,i·ye,i - a5·ye,i^2)^2 + Σi (Yi - b0 - b1·xe,i - b2·ye,i - b3·xe,i^2 - b4·xe,i·ye,i - b5·ye,i^2)^2    (5)

where (Xi, Yi) are the screen coordinates of calibration point i, and i = 1, 2, …, 9.
The parameters computed from formula (5) are substituted into formula (4) to obtain the fixation point coordinates.
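Because formula (4) is linear in the twelve parameters a0…a5, b0…b5, the calibration of S6 is an ordinary linear least-squares fit. The sketch below recovers known parameters from nine synthetic calibration points; all data are invented for the example.

```python
import numpy as np

def calibrate(pcr, screen):
    """Solve the six a-parameters and six b-parameters of formula (4)
    by linear least squares over the calibration points."""
    pcr = np.asarray(pcr, dtype=float)
    screen = np.asarray(screen, dtype=float)
    xe, ye = pcr[:, 0], pcr[:, 1]
    A = np.column_stack([np.ones_like(xe), xe, ye, xe**2, xe * ye, ye**2])
    a, *_ = np.linalg.lstsq(A, screen[:, 0], rcond=None)
    b, *_ = np.linalg.lstsq(A, screen[:, 1], rcond=None)
    return a, b

# Nine calibration points on a 3x3 grid; gaze coordinates generated
# from a known quadratic mapping so the fit should recover it exactly.
true_a = np.array([10.0, 2.0, -1.0, 0.5, 0.2, -0.3])
true_b = np.array([-5.0, 1.5, 3.0, -0.2, 0.1, 0.4])
pts = [(x, y) for x in (-1.0, 0.0, 1.0) for y in (-1.0, 0.0, 1.0)]
basis = np.array([[1, x, y, x * x, x * y, y * y] for x, y in pts])
screen = np.column_stack([basis @ true_a, basis @ true_b])

a, b = calibrate(pts, screen)
print(np.allclose(a, true_a), np.allclose(b, true_b))  # prints: True True
```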
S7: Obtain the human eye movement track from consecutive fixation point coordinates.
By computing over successive frames, consecutive fixation point coordinates are obtained from formulas (4) and (5), yielding the human eye movement track.
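Step S7 then applies the calibrated mapping frame by frame. A minimal sketch with invented linear-only parameters and three frames of P-CR vectors:

```python
def gaze(a, b, xe, ye):
    # Formula (4), evaluated per frame.
    t = (1.0, xe, ye, xe * xe, xe * ye, ye * ye)
    return (sum(p * q for p, q in zip(a, t)),
            sum(p * q for p, q in zip(b, t)))

# Illustrative parameters with only linear terms (a1, b2) non-zero.
a = (0.0, 100.0, 0.0, 0.0, 0.0, 0.0)
b = (0.0, 0.0, 100.0, 0.0, 0.0, 0.0)

# One P-CR vector per successive frame (invented values).
frames = [(0.0, 0.0), (0.25, 0.0), (0.5, 0.25)]
trajectory = [gaze(a, b, xe, ye) for xe, ye in frames]
print(trajectory)  # prints: [(0.0, 0.0), (25.0, 0.0), (50.0, 25.0)]
```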
The human eye movement track tracking method of the present invention can be applied to desktop computers, notebooks, tablet computers, smartphones and other operating terminals equipped with an infrared camera, realising human-computer interaction.
The above describes the preferred embodiments of the present invention and the technical principles employed. For those skilled in the art, any equivalent change or obvious substitution based on the technical solution of the present invention, made without departing from its spirit and scope, falls within the protection scope of the present invention.

Claims (10)

1. An infrared light-based human eye movement track tracking device, characterised by comprising a face image capture module, a human eye image extraction module, a pupil and light spot extraction module, a fixation point determination module and a track determination module, wherein:
the face image capture module collects a face image in which the pupil carries a light spot;
the human eye image extraction module extracts a human eye region image from the face image;
the pupil and light spot extraction module extracts a pupil image and a light spot image from the human eye region image, and determines the relation between the centre coordinates of the pupil and light spot and the fixation point;
the fixation point determination module determines fixation point coordinates from the relation between the centre coordinates of the pupil and light spot and the fixation point; and
the track determination module determines the human eye movement track from several fixation point coordinates.
2. The infrared light-based human eye movement track tracking device according to claim 1, characterised in that the face picture is captured with an infrared camera, or with an ordinary optical lens fitted with an infrared filter, to obtain the face image in which the pupil carries a light spot.
3. The infrared light-based human eye movement track tracking device according to claim 1, characterised in that the pupil image is extracted from the human eye region image as follows:
a gradient algorithm locates the point of maximum pixel-value change in the human eye region image, taken as the pupil point; a pupil region is delimited around this point; and the OTSU algorithm is applied to the pupil region to generate the pupil image.
4. The infrared light-based human eye movement track tracking device according to claim 3, characterised in that the light spot image is extracted from the human eye region image as follows:
the human eye region image is inverted to generate an inverted image; a gradient algorithm locates the point of maximum pixel-value change in the inverted image, taken as the light spot point; a spot region is delimited around this point; and the OTSU algorithm is applied to the spot region to generate the light spot image.
5. The infrared light-based human eye movement track tracking device according to claim 4, characterised in that the centre coordinates of the pupil and light spot are determined by the gravity method, and the mapping between these centre coordinates and the fixation point is established by the pupil-cornea reflection vector method.
6. An infrared light-based human eye movement track tracking method, characterised by comprising:
collecting a face image in which the pupil carries a light spot;
extracting a human eye region image from the face image;
extracting a pupil image and a light spot image from the human eye region image;
determining the centre coordinates of the pupil and light spot;
determining the relation between the centre coordinates of the pupil and light spot and the fixation point coordinates; and
obtaining the human eye movement track from consecutive fixation point coordinates.
7. The infrared light-based human eye movement track tracking method according to claim 6, characterised in that the face picture is captured with an infrared camera, or with an ordinary optical lens fitted with an infrared filter, to obtain the face image in which the pupil carries a light spot.
8. The infrared light-based human eye movement track tracking method according to claim 6, characterised in that the pupil image is extracted from the human eye region image as follows:
a gradient algorithm locates the point of maximum pixel-value change in the human eye region image, taken as the pupil point; a pupil region is delimited around this point; and the OTSU algorithm is applied to the pupil region to generate the pupil image.
9. The infrared light-based human eye movement track tracking method according to claim 6, characterised in that the light spot image is extracted from the human eye region image as follows:
the human eye region image is inverted to generate an inverted image; a gradient algorithm locates the point of maximum pixel-value change in the inverted image, taken as the light spot point; a spot region is delimited around this point; and the OTSU algorithm is applied to the spot region to generate the light spot image.
10. The infrared light-based human eye movement track tracking method according to claim 6, characterised in that the pupil-cornea reflection vector method establishes the mapping between the centre coordinates of the pupil and light spot and the fixation point as

Xgaze = a0 + a1·xe + a2·ye + a3·xe^2 + a4·xe·ye + a5·ye^2
Ygaze = b0 + b1·xe + b2·ye + b3·xe^2 + b4·xe·ye + b5·ye^2    (4)

where (Xgaze, Ygaze) are the fixation point coordinates and

xe = xp - xc
ye = yp - yc    (3)

where (xp, yp) are the centre coordinates of the pupil and (xc, yc) are the centre coordinates of the light spot, determined by the gravity method.
CN201711067388.7A 2017-11-02 2017-11-02 Infrared light-based human eye movement track tracking method and device Pending CN107729871A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711067388.7A CN107729871A (en) 2017-11-02 2017-11-02 Infrared light-based human eye movement track tracking method and device

Publications (1)

Publication Number Publication Date
CN107729871A true CN107729871A (en) 2018-02-23

Family

ID=61222047

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711067388.7A Pending CN107729871A (en) 2017-11-02 2017-11-02 Infrared light-based human eye movement track tracking method and device

Country Status (1)

Country Link
CN (1) CN107729871A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103530618A (en) * 2013-10-23 2014-01-22 哈尔滨工业大学深圳研究生院 Non-contact sight tracking method based on corneal reflex
CN103679180A (en) * 2012-09-19 2014-03-26 武汉元宝创意科技有限公司 Sight tracking method based on single light source of single camera
US20160081547A1 (en) * 2013-05-15 2016-03-24 The Johns Hopkins University Eye tracking and gaze fixation detection systems, components and methods using polarized light

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
谢波 (XIE Bo): "Research on Pupil Localization Algorithms Based on Video Images", China Master's Theses Full-text Database, Information Science and Technology Series *
陈秋香 (CHEN Qiuxiang): "Research on Gaze Tracking Algorithms Based on Pupil-Corneal Reflection", China Master's Theses Full-text Database, Information Science and Technology Series *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109613984A (en) * 2018-12-29 2019-04-12 歌尔股份有限公司 Processing method, equipment and the system of video image in VR live streaming
CN109613984B (en) * 2018-12-29 2022-06-10 歌尔光学科技有限公司 Method, device and system for processing video images in VR live broadcast
CN111061372A (en) * 2019-12-18 2020-04-24 Oppo广东移动通信有限公司 Equipment control method and related equipment
CN111443804A (en) * 2019-12-27 2020-07-24 安徽大学 Method and system for describing fixation point track based on video analysis
CN111443804B (en) * 2019-12-27 2022-08-19 安徽大学 Method and system for describing fixation point track based on video analysis
CN111528788A (en) * 2020-05-27 2020-08-14 温州医科大学 Portable detecting instrument for evaluating visual fatigue degree
CN112464829A (en) * 2020-12-01 2021-03-09 中航航空电子有限公司 Pupil positioning method, pupil positioning equipment, storage medium and sight tracking system
CN112464829B (en) * 2020-12-01 2024-04-09 中航航空电子有限公司 Pupil positioning method, pupil positioning equipment, storage medium and sight tracking system
CN113362775A (en) * 2021-06-24 2021-09-07 东莞市小精灵教育软件有限公司 Display screen control method and device, electronic equipment and storage medium
CN114022946A (en) * 2022-01-06 2022-02-08 深圳佑驾创新科技有限公司 Sight line measuring method and device based on binocular camera

Similar Documents

Publication Publication Date Title
CN107729871A (en) Infrared light-based human eye movement track tracking method and device
Angelopoulos et al. Event based, near eye gaze tracking beyond 10,000 hz
CN104866105B Eye movement and head movement interaction method for head-mounted display device
US11556741B2 (en) Devices, systems and methods for predicting gaze-related parameters using a neural network
US11194161B2 (en) Devices, systems and methods for predicting gaze-related parameters
CN105094300B (en) A kind of sight line tracking system and method based on standardization eye image
CN107193383A Two-stage gaze tracking method based on face orientation constraint
WO2019033569A8 (en) Eyeball movement analysis method, device and storage medium
Anilkumar et al. Pose estimated yoga monitoring system
US20200364453A1 (en) Devices, systems and methods for predicting gaze-related parameters
WO2020020022A1 (en) Method for visual recognition and system thereof
Chang et al. A pose estimation-based fall detection methodology using artificial intelligence edge computing
Cho Automated mental stress recognition through mobile thermal imaging
CN109063545A (en) A kind of method for detecting fatigue driving and device
CN105912126A (en) Method for adaptively adjusting gain, mapped to interface, of gesture movement
Wei et al. Real-time limb motion tracking with a single imu sensor for physical therapy exercises
Sangeetha A survey on deep learning based eye gaze estimation methods
CN106618479A (en) Pupil tracking system and method thereof
Hori et al. Silhouette-based synthetic data generation for 3D human pose estimation with a single wrist-mounted 360° camera
CN107832699A (en) Method and device for testing interest point attention degree based on array lens
Kar et al. Eye-gaze systems-An analysis of error sources and potential accuracy in consumer electronics use cases
Chen et al. Deep-learning-based human motion tracking for rehabilitation applications using 3D image features
Madhusanka et al. Concentrated gaze base interaction for decision making using human-machine interface
US20220198789A1 Systems and methods for determining one or more parameters of a user's eye
Skowronek et al. Eye Tracking Using a Smartphone Camera and Deep Learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20180223