CN107784280A - A dynamic pupil tracking method - Google Patents
A dynamic pupil tracking method
- Publication number
- CN107784280A (application CN201710973082.1A)
- Authority
- CN
- China
- Prior art keywords
- pupil
- tracking
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/193—Preprocessing; Feature extraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/207—Analysis of motion for motion estimation over a hierarchy of resolutions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/24—Aligning, centring, orientation detection or correction of the image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/62—Extraction of image or video features relating to a temporal dimension, e.g. time-based feature extraction; Pattern tracking
Abstract
The present invention discloses a dynamic pupil tracking method comprising the following steps: 1) in an image sequence, assume the coordinate of the tracking target at time t is (x_t, y_t); motion prediction with a Kalman filter then gives the coordinate at time t+1 as (x_{t+1}, y_{t+1}), with covariance Σ(x_k, y_k); 2) during pupil tracking, the motion of the pupil between two consecutive frames is regarded as uniform, so pupil motion can be described by position and velocity: let (c_t, r_t) be the pupil position at time t and (u_t, v_t) its velocity in the c and r directions; the state vector of the pupil at time t is then x_t = (c_t, r_t, u_t, v_t)^T, and the state model of the system is x_{t+1} = Φx_t + w_t, where w_t is the system noise. The method offers high detection accuracy and real-time performance.
Description
Technical field
The present invention relates in particular to a dynamic pupil tracking method.
Background technology
In a computer vision system, the process of matching features from one image to the next across an image sequence is called image feature tracking. Feature tracking techniques include motion-based methods and template-based methods: the former track the target's motion using techniques such as motion segmentation and Kalman filtering; the latter first obtain prior knowledge of the target and construct a target model, then perform template matching on each input frame with a sliding window.
Pupil tracking continuously estimates the position of the pupil in a sequence of images. A large body of research already exists on face and pupil detection and tracking, but some tracking algorithms achieve high precision at the cost of poor real-time performance, while others run in real time but lack precision.
Summary of the invention
It is an object of the present invention to provide a dynamic pupil tracking method with high detection accuracy and real-time performance.
To solve the above technical problem, the technical scheme of the present invention is:
A dynamic pupil tracking method comprises the following steps:
1) In an image sequence, assume the coordinate of the tracking target at time t is (x_t, y_t). Motion prediction with a Kalman filter then gives the coordinate at time t+1 as (x_{t+1}, y_{t+1}), with covariance Σ(x_k, y_k).
2) During pupil tracking, the motion of the pupil between two consecutive frames is regarded as uniform, so pupil motion can be described by position and velocity. Let (c_t, r_t) be the pupil position at time t and (u_t, v_t) its velocity in the c and r directions; the state vector of the pupil at time t is then x_t = (c_t, r_t, u_t, v_t)^T, and the state model of the system is x_{t+1} = Φx_t + w_t, where w_t is the system noise.
3) When the pupil moves uniformly between two frames, the state-transition matrix is set as:
Φ = | 1  0  Δt 0  |
    | 0  1  0  Δt |
    | 0  0  1  0  |
    | 0  0  0  1  |
The observation z_t = (c_t, r_t)^T is the pupil position at time t, and the measurement model of the system is z_t = Hx_t + v_t, where v_t is zero-mean white noise.
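As an illustration, the prediction half of steps 2) and 3) can be sketched in a few lines of NumPy (a minimal sketch, assuming Δt = 1 between frames and an arbitrary small process-noise covariance Q; the matrix Φ is the one given above, while the concrete state and covariance values are made up for the example):

```python
import numpy as np

dt = 1.0  # assumed time step between consecutive frames

# State-transition matrix Phi of the constant-velocity model,
# acting on the state x_t = (c_t, r_t, u_t, v_t)^T
Phi = np.array([[1, 0, dt, 0],
                [0, 1, 0, dt],
                [0, 0, 1,  0],
                [0, 0, 0,  1]], dtype=float)

# One prediction step: x_{t+1} = Phi x_t, P_{t+1} = Phi P_t Phi^T + Q
x = np.array([100.0, 120.0, 2.0, -1.0])  # position (100, 120), velocity (2, -1)
P = np.eye(4)                            # current state covariance
Q = 0.01 * np.eye(4)                     # assumed covariance of the noise w_t

x_pred = Phi @ x
P_pred = Phi @ P @ Phi.T + Q
print(x_pred[:2])  # predicted pupil position: [102. 119.]
```

The prediction simply advances each coordinate by its velocity, which is exactly the uniform-motion assumption of step 3).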
The technical effects of the present invention are mainly embodied as follows. When tracking, the pupil is first detected and located in the initial frame and a pupil template is constructed; the position of the tracked target in the next frame is then estimated from the motion information of the image, and the region where the target is likely to appear is scanned. The candidate whose color distribution is most similar to the target template is taken as the tracked target. Because motion estimation has been performed, the search range is greatly reduced, so the method is faster and more effective than an exhaustive search.
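The search procedure above (predict, then scan only a small region and keep the window whose color distribution best matches the template) might be sketched as follows; the gray-level histogram, L1 distance, window size, and search radius are illustrative assumptions rather than details fixed by the patent:

```python
import numpy as np

def histogram(patch, bins=16):
    """Normalized gray-level histogram of an image patch."""
    h, _ = np.histogram(patch, bins=bins, range=(0, 256))
    return h / max(h.sum(), 1)

def match_in_region(frame, template, center, radius=8):
    """Scan a (2*radius+1)^2 neighborhood of the predicted center and
    return the top-left position whose histogram is closest (L1 distance)
    to the template's."""
    th, tw = template.shape
    t_hist = histogram(template)
    best_pos, best_dist = center, np.inf
    cr, cc = center
    for r in range(cr - radius, cr + radius + 1):
        for c in range(cc - radius, cc + radius + 1):
            patch = frame[r:r + th, c:c + tw]
            if patch.shape != template.shape:
                continue  # window falls outside the frame
            d = np.abs(histogram(patch) - t_hist).sum()
            if d < best_dist:
                best_dist, best_pos = d, (r, c)
    return best_pos

# Toy frame: a dark "pupil" blob on a bright background
frame = np.full((60, 60), 200, dtype=np.uint8)
frame[30:38, 34:42] = 20                       # pupil top-left at (30, 34)
template = np.full((8, 8), 20, dtype=np.uint8)
print(match_in_region(frame, template, center=(28, 32)))  # → (30, 34)
```

Note that only (2·8+1)² = 289 candidate windows are scored instead of every position in the frame, which is the speed-up the passage attributes to motion estimation.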
Embodiment
In the present embodiment, it should be noted that relational terms such as "first" and "second" are used merely to distinguish one entity or operation from another, and do not necessarily require or imply any actual relation or order between such entities or operations. Moreover, the terms "comprise", "include", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device that includes a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device.
In addition, unless otherwise specified in this embodiment, connections or fixings between parts may be made by bolts or pins commonly used in the prior art, or by means such as pin-shaft connections, and are therefore not described in detail in this embodiment.
Embodiment
A dynamic pupil tracking method: in an image sequence, assume the coordinate of the tracking target at time t is (x_t, y_t). Motion prediction with a Kalman filter then gives the coordinate at time t+1 as (x_{t+1}, y_{t+1}), with covariance Σ(x_k, y_k). During pupil tracking, the motion of the pupil between two consecutive frames is regarded as uniform, so pupil motion can be described by position and velocity. Let (c_t, r_t) be the pupil position at time t and (u_t, v_t) its velocity in the c and r directions; the state vector of the pupil at time t is then x_t = (c_t, r_t, u_t, v_t)^T, and the state model of the system is x_{t+1} = Φx_t + w_t, where w_t is the system noise. Assuming the displacement of the pupil between two frames is very small and the motion is uniform, the state-transition matrix is set as:
Φ = | 1  0  Δt 0  |
    | 0  1  0  Δt |
    | 0  0  1  0  |
    | 0  0  0  1  |
The observation z_t = (c_t, r_t)^T is the pupil position at time t, and the measurement model of the system is z_t = Hx_t + v_t, where v_t is zero-mean white noise.
Since z_t depends only on position, H can be set as:
H = | 1 0 0 0 |
    | 0 1 0 0 |
Given the initial position and velocity of the target, the above method can estimate the state vector of the pupil in the next frame, together with its covariance matrix Σ_{t+1}(x_{t+1}, y_{t+1}).
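Putting Φ and H together, one full predict-update cycle of the filter can be sketched as below. This is a minimal sketch: the noise covariances Q and R, the simulated measurements, and the standard Kalman gain and covariance update are assumptions filled in around the patent's state and measurement models, which the patent relies on but does not spell out.

```python
import numpy as np

dt = 1.0
Phi = np.array([[1, 0, dt, 0],
                [0, 1, 0, dt],
                [0, 0, 1,  0],
                [0, 0, 0,  1]], dtype=float)
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)
Q = 0.01 * np.eye(4)  # assumed covariance of the system noise w_t
R = np.eye(2)         # assumed covariance of the measurement noise v_t

def kalman_step(x, P, z):
    """One predict-update cycle for the pupil state x = (c, r, u, v)^T."""
    # Predict: x' = Phi x, P' = Phi P Phi^T + Q
    x_pred = Phi @ x
    P_pred = Phi @ P @ Phi.T + Q
    # Update with the measured pupil position z = (c, r)
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(4) - K @ H) @ P_pred
    return x_new, P_new

# Track a pupil moving at a constant (2, -1) pixels per frame
x = np.array([100.0, 120.0, 0.0, 0.0])  # initial state: unknown velocity
P = 10.0 * np.eye(4)                    # high initial uncertainty
for t in range(1, 6):
    z = np.array([100.0 + 2 * t, 120.0 - 1 * t])  # simulated measurement
    x, P = kalman_step(x, P, z)
print(np.round(x[:2]))  # estimate converges toward (110, 115)
```

Even though the filter starts with zero velocity, the cross terms of P created by Φ let the position innovations pull the velocity estimate toward the true (2, -1) within a few frames.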
The foregoing is only a specific embodiment of the present invention, but the protection scope of the present invention is not limited thereto; any change or replacement that can be conceived without creative work shall be included within the protection scope of the present invention.
Claims (1)
- 1. A dynamic pupil tracking method, characterized by comprising the following steps: 1) in an image sequence, assume the coordinate of the tracking target at time t is (x_t, y_t); motion prediction with a Kalman filter then gives the coordinate at time t+1 as (x_{t+1}, y_{t+1}), with covariance Σ(x_k, y_k); 2) during pupil tracking, the motion of the pupil between two consecutive frames is regarded as uniform, so pupil motion can be described by position and velocity: let (c_t, r_t) be the pupil position at time t and (u_t, v_t) its velocity in the c and r directions; the state vector of the pupil at time t is then x_t = (c_t, r_t, u_t, v_t)^T, and the state model of the system is x_{t+1} = Φx_t + w_t, where w_t is the system noise; 3) when the pupil moves uniformly between two frames, the state-transition matrix is set as:
Φ = | 1  0  Δt 0  |
    | 0  1  0  Δt |
    | 0  0  1  0  |
    | 0  0  0  1  |
The observation z_t = (c_t, r_t)^T is the pupil position at time t, and the measurement model of the system is z_t = Hx_t + v_t, where v_t is zero-mean white noise.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710973082.1A CN107784280A (en) | 2017-10-18 | 2017-10-18 | A kind of dynamic pupil tracking method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107784280A true CN107784280A (en) | 2018-03-09 |
Family
ID=61434738
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710973082.1A Withdrawn CN107784280A (en) | 2017-10-18 | 2017-10-18 | A kind of dynamic pupil tracking method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107784280A (en) |
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101699510A (en) * | 2009-09-02 | 2010-04-28 | 北京科技大学 | Particle filtering-based pupil tracking method in sight tracking system |
CN105373766A (en) * | 2014-08-14 | 2016-03-02 | 由田新技股份有限公司 | Pupil positioning method and device |
Non-Patent Citations (1)
Title |
---|
DANG Zhi: "Research on Real-time Pupil Detection and Tracking Methods", China Master's Theses Full-text Database, Information Science and Technology * |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108595008A (en) * | 2018-04-27 | 2018-09-28 | 北京计算机技术及应用研究所 | Man-machine interaction method based on eye movement control |
CN112749604A (en) * | 2019-10-31 | 2021-05-04 | Oppo广东移动通信有限公司 | Pupil positioning method and related device and product |
CN113838086A (en) * | 2021-08-23 | 2021-12-24 | 广东电网有限责任公司 | Attention assessment test method, attention assessment test device, electronic equipment and storage medium |
CN113838086B (en) * | 2021-08-23 | 2024-03-22 | 广东电网有限责任公司 | Attention assessment test method, device, electronic equipment and storage medium |
CN115147462A (en) * | 2022-07-08 | 2022-10-04 | 浙江大学 | Gaze characteristic tracking method based on three-dimensional eyeball model and Kalman filtering |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Henein et al. | Dynamic SLAM: The need for speed | |
CN107784280A (en) | A kind of dynamic pupil tracking method | |
US9607401B2 (en) | Constrained key frame localization and mapping for vision-aided inertial navigation | |
Nerurkar et al. | C-KLAM: Constrained keyframe-based localization and mapping | |
CN103383776B (en) | A kind of laddering Stereo Matching Algorithm based on two stage cultivation and Bayesian Estimation | |
US20210150755A1 (en) | Device and method with simultaneous implementation of localization and mapping | |
Dani et al. | Globally exponentially stable observer for vision-based range estimation | |
CN102405483A (en) | Object tracking device, object tracking method, and object tracking program | |
Yoon et al. | Object tracking from image sequences using adaptive models in fuzzy particle filter | |
JP6061770B2 (en) | Camera posture estimation apparatus and program thereof | |
Shin et al. | Hybrid approach for facial feature detection and tracking under occlusion | |
KR20120094102A (en) | Similarity degree calculation device, similarity degree calculation method, and program | |
Koledić et al. | MOFT: Monocular odometry based on deep depth and careful feature selection and tracking | |
Nitsche et al. | Constrained-covisibility marginalization for efficient on-board stereo SLAM | |
Shang et al. | Lane detection using steerable filters and FPGA-based implementation | |
CN108731683B (en) | Unmanned aerial vehicle autonomous recovery target prediction method based on navigation information | |
Huang et al. | MC-VEO: A Visual-Event Odometry With Accurate 6-DoF Motion Compensation | |
Lin et al. | A New Approach for Vision-based Rear Vehicle Tracking | |
CN101976446A (en) | Tracking method of multiple feature points of microscopic sequence image | |
CN111784680B (en) | Detection method based on consistency of key points of left and right eye views of binocular camera | |
JP2012215549A (en) | Tracking device | |
Leykin et al. | Real-time estimation of human attention field in LWIR and color surveillance videos | |
Pan et al. | Correlation tracking algorithm based on adaptive template update | |
Teshima et al. | Estimation of FOE Without Optical Flow for Vehicle Lateral Position Detection. |
Askar et al. | Optimized uav object tracking framework based on integrated particle filter with ego-motion transformation matrix |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WW01 | Invention patent application withdrawn after publication |
Application publication date: 20180309 |
|