CN103279751A - Eye movement tracking method on the basis of accurate iris positioning

Info

Publication number
CN103279751A
Authority
CN
China
Prior art keywords
iris
eye movement
formula
target
tracking
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2013102431642A
Other languages
Chinese (zh)
Inventor
郝宗波
桑楠
黄园刚
江维
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China
Priority to CN2013102431642A
Publication of CN103279751A


Abstract

The invention discloses an eye movement tracking method based on accurate iris positioning. The method mainly comprises a face detection module, an eye detection module, and an eye movement tracking module. A face region is detected with a detector based on the AdaBoost algorithm; the eyes are then detected within the face region, and the iris region is accurately located. A tracking template based on the combined color and edge features of the target is built, and the influence of the saturation component is taken into account when the color feature template is built. In the tracking stage, the method allows for iris deformation caused by eye closure: a self-defined criterion decides whether the tracking template must be adaptively adjusted, after which eye movement tracking proceeds. Compared with other methods, this method tracks faster and more accurately, has a wide application range, meets real-time, accuracy, and robustness requirements, and has good application prospects on handheld mobile devices.

Description

Eye movement tracking method based on accurate iris positioning
Technical field
The present invention relates to a non-intrusive eye movement tracking method based on accurate iris positioning, and in particular to a method that builds a tracking template from the combined color and edge features of the iris region and completes eye movement tracking according to a self-defined criterion. The invention belongs to the field of computer vision and information processing.
Background technology
Background technology
The eyes are the window of the soul, and through this window many rules of human psychological activity can be explored. Human information processing depends to a great extent on vision, so scientific research into "how people see things" has never stopped, and the study of eye movement is considered the most effective means of investigating visual information processing. Eye movement tracking (eye tracking) uses a feature that stays approximately stationary during eye movement as a reference, from which the open/closed state of the eyes, the gaze direction, the motion track of the fixation point over time, and the movement speed can be obtained. It has good application prospects in fields such as mobile terminals, medicine, interface design and evaluation, product testing, scene research, performance analysis, aerospace, human-computer interaction, virtual reality, and games.
Commonly used methods at present include motion-based algorithms, feature- and template-based algorithms, eye-tracker instruments, and methods based on nonlinear filtering theory. Motion-based eye movement tracking is intrusive: it usually requires external auxiliary devices, as in the electro-oculogram method and the scleral search coil method. Such methods tend to cause ocular discomfort for the subject and are therefore unsuitable for routine eye movement tracking. Feature- and template-based eye movement tracking builds a template from biological and geometric features after accurately locating the eye and then performs tracking; it includes template matching, optimal linear approximation, and median algorithms. These methods, however, are strongly affected by external interference, and their accuracy and robustness still need improvement. Eye movement tracking with dedicated eye-tracker instruments requires complex and expensive large-scale precision equipment. Its development began abroad at the beginning of the 20th century, and with the rapid development of computer technology many companies have since developed related products, such as Seeing Machines in Australia, ASL in the United States, ISCN in Japan, SMI in Germany, and SR Research in Canada. Domestic work started comparatively late; Professor Fu Rui of Chang'an University and Xidian University have carried out related research, yet precision, usability, and miniaturization still need further improvement. Eye movement tracking algorithms based on nonlinear filtering theory have developed rapidly along with control theory and have become a research focus in recent years; commonly used methods include the Kalman filter, the extended Kalman filter (EKF), and the unscented Kalman filter (UKF). How such filters can improve precision, real-time performance, and robustness in eye movement tracking still needs deeper research. A comparative analysis of tracking methods is shown in Table 1.
Table 1. Comparative analysis of eye movement tracking methods (the table is reproduced as an image in the original publication)
Therefore, a fast, non-intrusive eye movement tracking method is urgently needed. Fig. 1 shows a typical non-intrusive eye movement tracking flow; its core is eye detection and tracking. Eye detection is the critical link: it determines the accuracy of target tracking and is the foundation of the whole eye movement tracking system. In the tracking phase, many problems must be solved to meet accuracy, robustness, and real-time requirements.
Improving tracking speed while truly meeting accuracy, robustness, and real-time requirements is thus the pressing problem that this class of eye movement tracking methods must solve.
Summary of the invention
The purpose of the present invention is to solve the above problems by providing a method that accurately locates the iris in the eye detection stage, builds a target tracking template, and adaptively adjusts the search window size by a self-defined criterion, thereby realizing eye movement tracking.
To achieve the above purpose, the present invention adopts the following technical scheme:
The eye movement tracking method based on accurate iris positioning of the present invention accurately locates the iris, builds a tracking template from the combined color and edge features of the iris region, and completes eye movement tracking according to a self-defined criterion. The method specifically comprises the following steps:
1. Accurately locate the iris region in a background video or a real-time video provided by the user.
Accurately locating the iris region further comprises the following flow:
1) Load a pre-recorded eye movement video and process it frame by frame;
2) Analyze each frame, load the face detector based on the AdaBoost algorithm, and locate the face region;
3) Within the extracted face region, load the eye detector and coarsely locate the eye rectangle;
4) Within the extracted eye region, accurately locate the iris by noise elimination, smoothing, and similar operations.
Accurately locating the iris by noise elimination, smoothing, and similar operations comprises the following concrete steps:
① Binarize the extracted eye region;
② Introduce the SUSAN operator to eliminate most interference from eyeball-like dark blobs such as the eyebrows;
③ Scan and analyze the processed binary image to obtain its connected components, and eliminate small disconnected blobs, further suppressing noise;
④ Smooth the resulting iris region with a median filter to accurately locate the iris.
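The sub-steps above (binarization, connected-component filtering, median smoothing) can be sketched in plain NumPy as follows. This is a minimal illustration rather than the patented implementation: the gray threshold, minimum blob area, 4-connectivity, and 3x3 median window are all assumptions, and the SUSAN stage is omitted here.

```python
import numpy as np
from collections import deque

def locate_iris_blob(gray, thresh=80, min_area=20):
    """Binarize an eye-region image, keep only large dark blobs
    (candidate iris), and smooth with a 3x3 median filter."""
    binary = (gray < thresh).astype(np.uint8)  # dark pixels -> 1

    # Connected-component labeling by BFS (4-connectivity),
    # discarding blobs smaller than min_area.
    h, w = binary.shape
    seen = np.zeros_like(binary, dtype=bool)
    kept = np.zeros_like(binary)
    for sy in range(h):
        for sx in range(w):
            if binary[sy, sx] and not seen[sy, sx]:
                comp, q = [], deque([(sy, sx)])
                seen[sy, sx] = True
                while q:
                    y, x = q.popleft()
                    comp.append((y, x))
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
                if len(comp) >= min_area:
                    for y, x in comp:
                        kept[y, x] = 1

    # 3x3 median filter: a pixel survives iff the majority of its
    # 3x3 neighborhood is set.
    padded = np.pad(kept, 1)
    windows = np.stack([padded[dy:dy+h, dx:dx+w]
                        for dy in range(3) for dx in range(3)])
    smooth = (np.median(windows, axis=0) > 0).astype(np.uint8)
    return smooth
```

On a synthetic eye image, a dark iris-sized disc survives this pipeline while isolated dark noise pixels are removed.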
2. Complete eye movement tracking based on the accurately located iris.
The eye movement tracking process comprises the following flow:
1) Initialize the tracking target center and build the combined color and edge feature template of the iris.
The concrete steps are as follows:
① Initialize the iris center (x, y) and adjust the tracking bounding box; the template size and target center position are given by formulas (1), (2), and (3).
Formula (1) [reproduced as an image in the original publication] determines the bounds x_oright, x_oleft, y_oup, y_odown of the target area, where white_num and black_num are the white and black pixel counts of the binary-image target area, obtained by scanning the binary image array with the color judgment criterion [also reproduced as an image].
The template size and target center coordinates are adjusted as:
Model_h = y_nup - y_ndown
Model_w = x_nright - x_nleft    Formula (2)
x = x + (x_nright + x_nleft) / 2
y = y + (y_ndown + y_nup) / 2    Formula (3)
② Unlike the traditional CamShift algorithm, which does not account for noise, the present invention considers the influence of noise when building the color feature template. An S (saturation) component variation curve of the image in HSV space is built; analysis shows that noise has relatively low saturation, so a relation between the saturation lower bound s_min and the search window saturation s is established, as shown in formula (4):
η = s_{k-1} / s_k,  s_min^{k+1} = s_min^k · η    Formula (4)
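Formula (4) can be sketched directly; the function name and the example saturation values are invented for illustration:

```python
def update_s_min(s_min_k, s_prev, s_cur):
    """Formula (4): eta = s_{k-1} / s_k, then
    s_min^{k+1} = s_min^k * eta.  When the window saturation drops
    (s_cur < s_prev, e.g. low-saturation noise enters the window),
    eta > 1 and the lower bound rises, filtering the noise out."""
    eta = s_prev / s_cur
    return s_min_k * eta
```

When the mean saturation of the search window halves, the lower bound doubles, so low-saturation noise pixels fall below s_min and are excluded from the color template.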
③ Compute the gray-level histogram distribution hisArray of the search region (quantized into N levels) and the gray probability distribution hisPro; calculate the gray mean μ of the search region; record the histogram statistics globalHis; and build the combined color and edge feature template of the iris.
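The histogram statistics of step ③ (hisArray, hisPro, and the gray mean μ) can be sketched as follows; the function name and the 16-level quantization are assumptions:

```python
import numpy as np

def build_gray_template(region, n_levels=16):
    """Quantize a gray-level search region into n_levels bins and
    return (hisArray, hisPro, mu): the histogram, the probability
    distribution, and the gray mean of the region."""
    region = np.asarray(region, dtype=np.float64)
    # Map gray values 0..255 onto bins 0..n_levels-1.
    bins = np.minimum((region / 256.0 * n_levels).astype(int), n_levels - 1)
    his_array = np.bincount(bins.ravel(), minlength=n_levels)
    his_pro = his_array / his_array.sum()
    mu = region.mean()
    return his_array, his_pro, mu
```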
2) Predict the target center position in the next frame and complete eye movement tracking by the self-defined edge feature criterion.
The concrete steps are as follows:
① Compute the maximum between-class variance segmentation threshold T_max of the search region, as shown in formula (5), and record the search matrix s(x, y) [its defining expression is reproduced as an image in the original publication]. Compute the mean distance of the edge points,
μ_d = Σ_{x_edge} Σ_{y_edge} sqrt((x - x_edge)² + (y - y_edge)²) / total,
and record it in the array array_dis.
μ₁ = Σ_{i=0}^{T} i · hispro_i
d₁ = Σ_{i=0}^{T} (i - μ₁)² · hispro_i / pros₁
pros₂ = 1 - pros₁
μ₂ = μ - μ₁
d₂ = Σ_{i=T+1}^{N} (i - μ₂)² · hispro_i / pros₂
d_w = pros₁ · d₁ + pros₂ · d₂
d_b = pros₁ · pros₂ · (μ₂ - μ₁)²
θ = d_b / d_w    Formula (5)
where T ∈ (1, N-1); the T at which θ attains its maximum θ_max is the desired segmentation threshold T_max, and hispro is the probability distribution of the gray-level image template region.
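Formula (5) reads as a variant of Otsu's maximum between-class variance criterion. The sketch below follows the patent's symbols literally (μ₁ and μ₂ are probability-weighted sums rather than class means, exactly as written); it is an illustration, not the patented code:

```python
import numpy as np

def otsu_threshold(his_pro, mu):
    """Scan T over the gray levels and maximize theta = d_b / d_w
    (formula (5)): d_b is the between-class variance and d_w the
    within-class variance of the two classes split at T."""
    n = len(his_pro)
    i = np.arange(n)
    best_theta, best_t = -1.0, 1
    for t in range(1, n - 1):                        # T in (1, N-1)
        pros1 = his_pro[:t + 1].sum()
        pros2 = 1.0 - pros1
        if pros1 == 0 or pros2 == 0:                 # degenerate split
            continue
        mu1 = (i[:t + 1] * his_pro[:t + 1]).sum()    # class-1 weighted sum
        mu2 = mu - mu1                               # class-2 weighted sum
        d1 = (((i[:t + 1] - mu1) ** 2) * his_pro[:t + 1]).sum() / pros1
        d2 = (((i[t + 1:] - mu2) ** 2) * his_pro[t + 1:]).sum() / pros2
        d_w = pros1 * d1 + pros2 * d2
        d_b = pros1 * pros2 * (mu2 - mu1) ** 2
        if d_w > 0 and d_b / d_w > best_theta:
            best_theta, best_t = d_b / d_w, t
    return best_t
```

For a bimodal histogram the maximizing T falls between the two modes, separating iris pixels from background.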
② Predict the tracking target center position (x, y), as shown in formula (6), and record it:
x_off = Σ_x Σ_y x · s(x, y) / total
y_off = Σ_x Σ_y y · s(x, y) / total
x = x + x_off - Model_w / 2
y = y + y_off - Model_h / 2    Formula (6)
where Model_w and Model_h are respectively the template width and height.
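Formula (6) can be sketched as a centroid shift over the search matrix; here s(x, y) is assumed to be a binary mask, and the function signature is illustrative:

```python
import numpy as np

def predict_center(s, x, y, model_w, model_h):
    """Formula (6): the centroid (x_off, y_off) of the search
    matrix s(x, y) shifts the window corner so that the template
    is re-centered on the mass of the target."""
    total = s.sum()
    ys, xs = np.nonzero(s)               # coordinates of the set pixels
    x_off = (xs * s[ys, xs]).sum() / total
    y_off = (ys * s[ys, xs]).sum() / total
    return x + x_off - model_w / 2, y + y_off - model_h / 2
```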
③ Judge whether the iris has deformed, i.e., whether the template needs updating. The criterion is whether the change between the histogram statistics of two frames exceeds 3 times the edge-point standard deviation status_edge, together with the predicted mean distance status_dis from the edge points to the center point; status_edge and status_dis are computed as in formula (7). When deformation occurs, the target center position is predicted from the motion law of the previous LISTMAX frames, as in formula (8).
μ = Σ_{i=0}^{index} array_edge[i] / index
status_edge = sqrt( Σ_{i=0}^{index} (array_edge[i] - μ)² / index )
status_dis = array_dis[num-1] + (array_dis[num-1] - array_dis[0]) / (num-1)    Formula (7)
Formula (8) [reproduced as an image in the original publication] predicts the center from the recorded tracking coordinates, where x_pd and y_pd are respectively the x and y components of the recorded tracking coordinate matrix.
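Formula (7) and the 3-standard-deviation criterion can be sketched as follows; the function names and the split into two helpers are mine, not the patent's:

```python
import math

def deformation_stats(array_edge, array_dis):
    """Formula (7): status_edge is the standard deviation of the
    recorded per-frame edge statistics; status_dis linearly
    extrapolates the edge-to-center mean distance one frame ahead."""
    index = len(array_edge)
    mu = sum(array_edge) / index
    status_edge = math.sqrt(sum((v - mu) ** 2 for v in array_edge) / index)
    num = len(array_dis)
    status_dis = array_dis[num - 1] + (array_dis[num - 1] - array_dis[0]) / (num - 1)
    return status_edge, status_dis

def needs_update(hist_prev, hist_cur, status_edge):
    """Self-defined criterion: the template is refreshed when the
    histogram statistic jumps by more than 3 standard deviations."""
    return abs(hist_cur - hist_prev) > 3 * status_edge
```

A gradual drift in the edge statistics stays inside the 3-sigma band and leaves the template untouched; an abrupt jump (e.g. the eye closing) triggers the update of step ③.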
④ Iterate within the search region until the tracking target is matched, as shown in formulas (9) and (10):
p_te(b_in) = C_e Σ_{x_i ∈ w} k(||x_i||²) M_{x_i} δ(α_{x_i} - b_in)
p_ce(b_in) = C_c Σ_{x_i ∈ w} k(||(y - x_i)/h||²) M_{x_i} δ(α_{x_i} - b_in)    Formula (9)
ρ₁ = ρ(p_th, p_ch) = Σ_{l=1}^{16} sqrt(p_th^l · p_ch^l)
ρ₂ = ρ(p_te, p_ce) = Σ_{m=1}^{16} sqrt(p_te^m · p_ce^m)    Formula (10)
The final similarity measure coefficient is ρ_final = ε·ρ₁ + β·ρ₂, where ε + β = 1 and ε, β ∈ [0, 1]. The iteration stops when the convergence condition [reproduced as an image in the original publication] is reached. Here C is the normalization constant, δ is the Kronecker delta function, and k(x) is the Epanechnikov kernel function.
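Assuming formula (10) is the standard Bhattacharyya coefficient over 16-bin histograms, the combined similarity ρ_final can be sketched as follows (ε = 0.5 is an illustrative choice, not a value from the patent):

```python
import math

def bhattacharyya(p, q):
    """Bhattacharyya coefficient between two 16-bin distributions,
    as in formula (10); equals 1 for identical distributions."""
    return sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))

def combined_similarity(p_th, p_ch, p_te, p_ce, eps=0.5):
    """rho_final = eps*rho1 + beta*rho2 with eps + beta = 1:
    rho1 compares the color histograms, rho2 the edge histograms."""
    beta = 1.0 - eps
    return eps * bhattacharyya(p_th, p_ch) + beta * bhattacharyya(p_te, p_ce)
```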
Repeat steps ① to ④ until the video is fully processed or the user interrupts eye movement tracking.
The beneficial effects of the present invention are:
The eye movement tracking method based on accurate iris positioning given by the present invention is non-intrusive: the iris is accurately located in the eye detection stage, a target tracking template is built, and the search window size is adaptively adjusted by a self-defined criterion, thereby realizing eye movement tracking. Compared with other methods, the method of the present invention tracks faster and more accurately, is more convenient to use, has a wide application range, and satisfies real-time, accuracy, and robustness requirements.
Description of drawings
Fig. 1 is the non-intrusive eye movement tracking flowchart described in the background of the present invention;
Fig. 2 is the S-component variation curve of the image in HSV space according to the present invention;
Fig. 3 is the main processing flowchart of the system of the embodiment of the invention.
Embodiment
The present invention is further described in detail below with reference to the drawings and a specific embodiment.
The following takes an eye movement tracking experiment with accurate iris positioning under indoor conditions as an example to describe the present invention concretely.
Although the experiment is carried out indoors, situations such as relatively violent illumination changes or camera shake may occur, so it is a typical example of eye movement tracking based on accurate iris positioning. The concrete processing flow is shown in Fig. 3, and the implementation steps are as follows:
Step 1: Input video
The input video here can be a prepared video or a real-time video transmitted by a camera; each frame is processed. Go to step 2.
Step 2: Load the face cascade classifier
The face cascade classifier here is a pre-trained face detector based on the AdaBoost algorithm. Analyze a frame, load the AdaBoost-based face detector, and detect the face. Go to step 3.
Step 3: Detect whether a face exists
If no face exists, acquire the next input frame; otherwise go to step 4.
Step 4: Detect the face and set the region of interest
Detect and mark the face region in the image, set the region of interest ROI1, and go to step 5.
Step 5: Load the eye cascade classifier
Load the eye detector on the acquired face region of interest ROI1.
Step 6: Detect the eyes and set the region of interest
Detect the eye positions in the acquired face region of interest ROI1, coarsely locate and mark the eye rectangle, set the eye region of interest ROI2, and go to step 7.
Step 7: Binarize the image and apply the SUSAN operator
Convert the eye rectangle to grayscale, apply the SUSAN operator and binarization to the gray-level image in parallel, AND the two resulting images together to eliminate interference from noise-like points such as the eyebrows, and go to step 8.
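The SUSAN principle used in this step can be sketched as follows: each pixel's USAN area (the count of circular-neighborhood pixels similar in intensity to the nucleus) is small on edges and corners. The radius, similarity threshold t, and the 3/4 geometric threshold below are illustrative assumptions, not values from the patent:

```python
import numpy as np

def susan_mask(gray, radius=2, t=20, g_ratio=0.75):
    """Minimal SUSAN-style detector: for each pixel, count the
    pixels in a circular neighborhood whose intensity is within t
    of the nucleus (the USAN area).  Pixels whose USAN area falls
    below g_ratio of the mask size lie on edges/corners."""
    h, w = gray.shape
    offsets = [(dy, dx) for dy in range(-radius, radius + 1)
               for dx in range(-radius, radius + 1)
               if dy * dy + dx * dx <= radius * radius]
    g = g_ratio * len(offsets)           # geometric threshold
    out = np.zeros((h, w), dtype=np.uint8)
    gi = gray.astype(np.int32)           # avoid uint8 wraparound
    for y in range(radius, h - radius):
        for x in range(radius, w - radius):
            usan = sum(abs(gi[y + dy, x + dx] - gi[y, x]) < t
                       for dy, dx in offsets)
            out[y, x] = 1 if usan < g else 0
    return out
```

ANDing this edge mask with the binarized eye image, as the step describes, keeps only dark regions with genuine boundaries and suppresses flat eyebrow-like smudges.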
Step 8: Connected component analysis and smoothing denoising
Obtain the connected components of the processed binary image, eliminate small disconnected blobs, and finally smooth the resulting iris region with a median filter.
Step 9: Accurately locate the iris
Accurately locate the obtained iris region, mark it, and save it.
Step 10: Build the tracking template
Initialize the tracking target center from the obtained iris region, adjust the template size and tracking bounding box, and build the combined color and edge feature template. The concrete steps are as follows:
① Initialize the iris center (x, y) and adjust the tracking bounding box; the template size and target center position are given by formulas (1), (2), and (3).
Formula (1) [reproduced as an image in the original publication] determines the bounds x_oright, x_oleft, y_oup, y_odown of the target area, where white_num and black_num are the white and black pixel counts of the binary-image target area, obtained by scanning the binary image array with the color judgment criterion [also reproduced as an image].
The template size and target center coordinates are adjusted as:
Model_h = y_nup - y_ndown
Model_w = x_nright - x_nleft    Formula (2)
x = x + (x_nright + x_nleft) / 2
y = y + (y_ndown + y_nup) / 2    Formula (3)
② Unlike the traditional CamShift algorithm, which does not account for noise, the present invention considers the influence of noise when building the color feature template. An S (saturation) component variation curve of the image in HSV space is built, as shown in Fig. 2; because of noise the saturation is low, with the noise concentrated between frames 510 and 520. A relation between the saturation lower bound s_min and the search window saturation s is established, as shown in formula (4):
η = s_{k-1} / s_k,  s_min^{k+1} = s_min^k · η    Formula (4)
③ Compute the gray-level histogram distribution hisArray of the search region (quantized into N levels) and the gray probability distribution hisPro; calculate the gray mean μ of the search region; record the histogram statistics globalHis; and build the combined color and edge feature template of the iris.
Step 11: Predict the next-frame tracking target center
Compute the maximum between-class variance segmentation threshold T_max of the search region, as shown in formula (5), and record the search matrix s(x, y) [its defining expression is reproduced as an image in the original publication]. Compute the mean distance of the edge points,
μ_d = Σ_{x_edge} Σ_{y_edge} sqrt((x - x_edge)² + (y - y_edge)²) / total,
and record it in the array array_dis.
μ₁ = Σ_{i=0}^{T} i · hispro_i
d₁ = Σ_{i=0}^{T} (i - μ₁)² · hispro_i / pros₁
pros₂ = 1 - pros₁
μ₂ = μ - μ₁
d₂ = Σ_{i=T+1}^{N} (i - μ₂)² · hispro_i / pros₂
d_w = pros₁ · d₁ + pros₂ · d₂
d_b = pros₁ · pros₂ · (μ₂ - μ₁)²
θ = d_b / d_w    Formula (5)
where T ∈ (1, N-1); the T at which θ attains its maximum θ_max is the desired segmentation threshold T_max, and hispro is the probability distribution of the gray-level image template region.
Predict the next-frame target center position (x, y) from the current-frame iris motion law, as shown in formula (6), and record it:
x_off = Σ_x Σ_y x · s(x, y) / total
y_off = Σ_x Σ_y y · s(x, y) / total
x = x + x_off - Model_w / 2
y = y + y_off - Model_h / 2    Formula (6)
where Model_w and Model_h are respectively the template width and height.
Step 12: Judge whether deformation has occurred
Considering situations such as eye closure, judge whether the iris has changed according to the self-defined target deformation criterion, i.e., whether the template needs updating. The criterion is whether the change between the histogram statistics of two frames exceeds 3 times the edge-point standard deviation status_edge, together with the predicted mean distance status_dis from the edge points to the center point; status_edge and status_dis are computed as in formula (7):
μ = Σ_{i=0}^{index} array_edge[i] / index
status_edge = sqrt( Σ_{i=0}^{index} (array_edge[i] - μ)² / index )
status_dis = array_dis[num-1] + (array_dis[num-1] - array_dis[0]) / (num-1)    Formula (7)
If a change has occurred, go to step 13; otherwise go to step 14.
Step 13: Update the predicted target center
When deformation occurs, adjust the predicted target center: predict the center from the motion law of the previous LISTMAX frames, as in formula (8).
Formula (8) [reproduced as an image in the original publication] predicts the center from the recorded tracking coordinates, where x_pd and y_pd are respectively the x and y components of the recorded tracking coordinate matrix.
After updating the predicted target center, go to step 14.
Step 14: Iterate within the search region until the target is matched
The specific algorithm is given by formulas (9) and (10):
p_te(b_in) = C_e Σ_{x_i ∈ w} k(||x_i||²) M_{x_i} δ(α_{x_i} - b_in)
p_ce(b_in) = C_c Σ_{x_i ∈ w} k(||(y - x_i)/h||²) M_{x_i} δ(α_{x_i} - b_in)    Formula (9)
ρ₁ = ρ(p_th, p_ch) = Σ_{l=1}^{16} sqrt(p_th^l · p_ch^l)
ρ₂ = ρ(p_te, p_ce) = Σ_{m=1}^{16} sqrt(p_te^m · p_ce^m)    Formula (10)
The final similarity measure coefficient is ρ_final = ε·ρ₁ + β·ρ₂, where ε + β = 1 and ε, β ∈ [0, 1]. The iteration stops when the convergence condition [reproduced as an image in the original publication] is reached. Here C is the normalization constant, δ is the Kronecker delta function, and k(x) is the Epanechnikov kernel function.
Step 15: Judge whether the video is fully processed; if so, the system ends, otherwise go to step 11 and continue eye movement tracking.

Claims (7)

1. An eye movement tracking method based on accurate iris positioning, characterized in that: from a video, the iris is accurately located, a tracking template based on the combined color and edge features of the iris region is built, and eye movement tracking is completed according to a self-defined criterion.
2. The eye movement tracking method based on accurate iris positioning according to claim 1, characterized in that it comprises the following two major steps:
(1) Accurately locate the iris region in a background video or a real-time video provided by the user;
(2) Complete eye movement tracking based on the accurately located iris.
3. The eye movement tracking method based on accurate iris positioning according to claim 2, characterized in that step (1), accurately locating the iris region in a background video or a real-time video provided by the user, further comprises the following flow:
1) Load a pre-recorded eye movement video and process it frame by frame;
2) Analyze each frame, load the face detector based on the AdaBoost algorithm, and locate the face region;
3) Within the extracted face region, load the eye detector and coarsely locate the eye rectangle;
4) Within the extracted eye region, accurately locate the iris by noise elimination, smoothing, and similar operations.
4. The eye movement tracking method based on accurate iris positioning according to claim 3, characterized in that flow 4), accurately locating the iris within the extracted eye region by noise elimination, smoothing, and similar operations, comprises the following concrete steps:
① Binarize the extracted eye region;
② Introduce the SUSAN operator to eliminate most interference from eyeball-like dark blobs such as the eyebrows;
③ Scan and analyze the processed binary image to obtain its connected components, and eliminate small disconnected blobs, further suppressing noise;
④ Smooth the resulting iris region with a median filter to accurately locate the iris.
5. The eye movement tracking method based on accurate iris positioning according to claim 2, characterized in that step (2), completing eye movement tracking based on the accurately located iris, further comprises the following flow:
1) Initialize the tracking target center and build the combined color and edge feature template of the iris;
2) Predict the target center position in the next frame and complete eye movement tracking by the self-defined edge feature criterion.
6. The eye movement tracking method based on accurate iris positioning according to claim 5, characterized in that flow 1), initializing the tracking target center and building the combined color and edge feature template of the iris, comprises the following concrete steps:
① Initialize the iris center (x, y) and adjust the tracking bounding box; the template size and target center position are given by formulas (1), (2), and (3).
Formula (1) [reproduced as an image in the original publication] determines the bounds x_oright, x_oleft, y_oup, y_odown of the target area, where white_num and black_num are the white and black pixel counts of the binary-image target area, obtained by scanning the binary image array with the color judgment criterion [also reproduced as an image].
The template size and target center coordinates are adjusted as:
Model_h = y_nup - y_ndown
Model_w = x_nright - x_nleft    Formula (2)
x = x + (x_nright + x_nleft) / 2
y = y + (y_ndown + y_nup) / 2    Formula (3)
② Unlike the traditional CamShift algorithm, which does not account for noise, the present invention considers the influence of noise when building the color feature template. An S (saturation) component variation curve of the image in HSV space is built; analysis shows that noise has relatively low saturation, so a relation between the saturation lower bound s_min and the search window saturation s is established, as shown in formula (4):
η = s_{k-1} / s_k,  s_min^{k+1} = s_min^k · η    Formula (4)
③ Compute the gray-level histogram distribution hisArray of the search region (quantized into N levels) and the gray probability distribution hisPro; calculate the gray mean μ of the search region; record the histogram statistics globalHis; and build the combined color and edge feature template of the iris.
7. The eye movement tracking method based on accurate iris positioning according to claim 5, characterized in that flow 2), predicting the target center position in the next frame and completing eye movement tracking by the self-defined edge feature criterion, comprises the following concrete steps:
① Compute the maximum between-class variance segmentation threshold T_max of the search region, as shown in formula (5), and record the search matrix s(x, y) [its defining expression is reproduced as an image in the original publication]. Compute the mean distance of the edge points,
μ_d = Σ_{x_edge} Σ_{y_edge} sqrt((x - x_edge)² + (y - y_edge)²) / total,
and record it in the array array_dis.
μ₁ = Σ_{i=0}^{T} i · hispro_i
d₁ = Σ_{i=0}^{T} (i - μ₁)² · hispro_i / pros₁
pros₂ = 1 - pros₁
μ₂ = μ - μ₁
d₂ = Σ_{i=T+1}^{N} (i - μ₂)² · hispro_i / pros₂
d_w = pros₁ · d₁ + pros₂ · d₂
d_b = pros₁ · pros₂ · (μ₂ - μ₁)²
θ = d_b / d_w    Formula (5)
where T ∈ (1, N-1); the T at which θ attains its maximum θ_max is the desired segmentation threshold T_max, and hispro is the probability distribution of the gray-level image template region.
② Predict the tracking target center position (x, y), as shown in formula (6), and record it:
x_off = Σ_x Σ_y x · s(x, y) / total
y_off = Σ_x Σ_y y · s(x, y) / total
x = x + x_off - Model_w / 2
y = y + y_off - Model_h / 2    Formula (6)
where Model_w and Model_h are respectively the template width and height.
③ Judge whether the iris has deformed, i.e., whether the template needs updating. The criterion is whether the change between the histogram statistics of two frames exceeds 3 times the edge-point standard deviation status_edge, together with the predicted mean distance status_dis from the edge points to the center point; status_edge and status_dis are computed as in formula (7). When deformation occurs, the target center position is predicted from the motion law of the previous LISTMAX frames, as in formula (8).
μ = Σ_{i=0}^{index} array_edge[i] / index
status_edge = sqrt( Σ_{i=0}^{index} (array_edge[i] - μ)² / index )
status_dis = array_dis[num-1] + (array_dis[num-1] - array_dis[0]) / (num-1)    Formula (7)
Formula (8) [reproduced as an image in the original publication] predicts the center from the recorded tracking coordinates, where x_pd and y_pd are respectively the x and y components of the recorded tracking coordinate matrix.
④ Iterate within the search region until the tracking target is matched, as shown in formulas (9) and (10):
p_te(b_in) = C_e Σ_{x_i ∈ w} k(||x_i||²) M_{x_i} δ(α_{x_i} - b_in)
p_ce(b_in) = C_c Σ_{x_i ∈ w} k(||(y - x_i)/h||²) M_{x_i} δ(α_{x_i} - b_in)    Formula (9)
ρ₁ = ρ(p_th, p_ch) = Σ_{l=1}^{16} sqrt(p_th^l · p_ch^l)
ρ₂ = ρ(p_te, p_ce) = Σ_{m=1}^{16} sqrt(p_te^m · p_ce^m)    Formula (10)
The final similarity measure coefficient is ρ_final = ε·ρ₁ + β·ρ₂, where ε + β = 1 and ε, β ∈ [0, 1]. The iteration stops when the convergence condition [reproduced as an image in the original publication] is reached. Here C is the normalization constant, δ is the Kronecker delta function, and k(x) is the Epanechnikov kernel function.
Repeat steps ① to ④ until the video is fully processed or the user interrupts eye movement tracking.
CN2013102431642A 2013-06-19 2013-06-19 Eye movement tracking method on the basis of accurate iris positioning Pending CN103279751A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2013102431642A CN103279751A (en) 2013-06-19 2013-06-19 Eye movement tracking method on the basis of accurate iris positioning

Publications (1)

Publication Number Publication Date
CN103279751A true CN103279751A (en) 2013-09-04

Family

ID=49062266

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2013102431642A Pending CN103279751A (en) 2013-06-19 2013-06-19 Eye movement tracking method on the basis of accurate iris positioning

Country Status (1)

Country Link
CN (1) CN103279751A (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104166996A (en) * 2014-08-06 2014-11-26 北京航空航天大学 Human eye tracking method based on edge and color double-feature space column diagram
CN104463216A (en) * 2014-12-15 2015-03-25 北京大学 Eye movement pattern data automatic acquisition method based on computer vision
CN104463921A (en) * 2015-01-09 2015-03-25 厦门美图之家科技有限公司 Image processing method for positioning center of eyeball
CN104463080A (en) * 2013-09-16 2015-03-25 展讯通信(天津)有限公司 Detection method of human eye state
CN104463081A (en) * 2013-09-16 2015-03-25 展讯通信(天津)有限公司 Detection method of human eye state
WO2015172514A1 (en) * 2014-05-16 2015-11-19 北京天诚盛业科技有限公司 Image acquisition device and method
CN105141938A (en) * 2015-08-18 2015-12-09 深圳先进技术研究院 Sight positioning device
WO2016034021A1 (en) * 2014-09-02 2016-03-10 Hong Kong Baptist University Method and apparatus for eye gaze tracking
WO2017080399A1 (en) * 2015-11-12 2017-05-18 阿里巴巴集团控股有限公司 Method and device for tracking location of human face, and electronic equipment
CN106774893A (en) * 2016-12-15 2017-05-31 飞狐信息技术(天津)有限公司 A kind of virtual reality exchange method and virtual reality device
CN109298782A (en) * 2018-08-31 2019-02-01 阿里巴巴集团控股有限公司 Eye movement exchange method, device and computer readable storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070036397A1 (en) * 2005-01-26 2007-02-15 Honeywell International Inc. A distance iris recognition
CN101246544A (en) * 2008-01-24 2008-08-20 电子科技大学中山学院 Iris locating method based on boundary point search and SUSAN edge detection


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
徐洪飚 (Xu Hongbiao): "A fast non-intrusive eye movement tracking method" [一种快速的非侵入式眼动跟踪方法], 《计算机系统应用》 (Computer Systems & Applications), vol. 21, no. 1, 31 December 2012 (2012-12-31), pages 172 - 175 *

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104463080A (en) * 2013-09-16 2015-03-25 展讯通信(天津)有限公司 Detection method of human eye state
CN104463081A (en) * 2013-09-16 2015-03-25 展讯通信(天津)有限公司 Detection method of human eye state
WO2015172514A1 (en) * 2014-05-16 2015-11-19 北京天诚盛业科技有限公司 Image acquisition device and method
CN104166996A (en) * 2014-08-06 2014-11-26 北京航空航天大学 Human eye tracking method based on an edge and color dual-feature spatial histogram
WO2016034021A1 (en) * 2014-09-02 2016-03-10 Hong Kong Baptist University Method and apparatus for eye gaze tracking
CN104463216B (en) * 2014-12-15 2017-07-28 北京大学 Eye movement pattern data automatic acquisition method based on computer vision
CN104463216A (en) * 2014-12-15 2015-03-25 北京大学 Eye movement pattern data automatic acquisition method based on computer vision
CN104463921A (en) * 2015-01-09 2015-03-25 厦门美图之家科技有限公司 Image processing method for positioning center of eyeball
CN105141938A (en) * 2015-08-18 2015-12-09 深圳先进技术研究院 Gaze positioning device
WO2017080399A1 (en) * 2015-11-12 2017-05-18 阿里巴巴集团控股有限公司 Face location tracking method, apparatus, and electronic device
US10410046B2 (en) 2015-11-12 2019-09-10 Alibaba Group Holding Limited Face location tracking method, apparatus, and electronic device
US10713472B2 (en) 2015-11-12 2020-07-14 Alibaba Group Holding Limited Face location tracking method, apparatus, and electronic device
US11003893B2 (en) 2015-11-12 2021-05-11 Advanced New Technologies Co., Ltd. Face location tracking method, apparatus, and electronic device
US11423695B2 (en) 2015-11-12 2022-08-23 Advanced New Technologies Co., Ltd. Face location tracking method, apparatus, and electronic device
CN106774893A (en) * 2016-12-15 2017-05-31 飞狐信息技术(天津)有限公司 Virtual reality interaction method and virtual reality device
CN106774893B (en) * 2016-12-15 2019-10-18 飞狐信息技术(天津)有限公司 Virtual reality interaction method and virtual reality device
CN109298782A (en) * 2018-08-31 2019-02-01 阿里巴巴集团控股有限公司 Eye movement interaction method, device, and computer-readable storage medium
CN109298782B (en) * 2018-08-31 2022-02-18 创新先进技术有限公司 Eye movement interaction method, device, and computer-readable storage medium

Similar Documents

Publication Publication Date Title
CN103279751A (en) Eye movement tracking method on the basis of accurate iris positioning
CN102324025B (en) Human face detection and tracking method based on Gaussian skin color model and feature analysis
KR101434205B1 (en) Systems and methods for object detection and classification with multiple threshold adaptive boosting
CN110221699B (en) Eye movement behavior identification method of front-facing camera video source
CN106899968A (en) An active non-contact identity authentication method based on WiFi channel state information
CN104063722A (en) Safety helmet identification method integrating HOG human body target detection and SVM classifier
CN101404086A (en) Target tracking method and device based on video
CN104318202A (en) Method and system for recognizing facial feature points through face photograph
CN102663454B (en) Method and device for evaluating character writing standard degree
CN110728185B (en) Detection method for determining whether a driver is engaged in handheld mobile phone conversation
CN104281839A (en) Body posture identification method and device
CN108898621B (en) Correlation filter tracking method based on instance-aware target proposal windows
CN105809713A (en) Object tracking method based on an online Fisher discriminant mechanism for enhanced feature selection
CN104036528A (en) Real-time distribution field target tracking method based on global search
CN116386120A (en) Noninductive monitoring management system
CN102509308A (en) Motion segmentation method based on mixtures-of-dynamic-textures-based spatiotemporal saliency detection
CN105701486A (en) Method for realizing human face information analysis and extraction in video camera
CN104200226A (en) Particle filtering target tracking method based on machine learning
Vrânceanu et al. NLP EAC recognition by component separation in the eye region
Singh et al. Implementation and evaluation of DWT and MFCC based ISL gesture recognition
CN107977622A (en) Eyes detection method based on pupil feature
Wu et al. NIR-based gaze tracking with fast pupil ellipse fitting for real-time wearable eye trackers
CN108257148A (en) Target proposal window generation method for specific objects and its application in target tracking
CN105447440B (en) Real-time iris image evaluation method and device
Zhen-Yan: Chinese character recognition method based on image processing and hidden Markov models

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 2013-09-04