CN103366381A - Sight line tracking correcting method based on space position - Google Patents
Sight line tracking correcting method based on space position
- Publication number
- CN103366381A CN103366381A CN2013103398035A CN201310339803A CN103366381A CN 103366381 A CN103366381 A CN 103366381A CN 2013103398035 A CN2013103398035 A CN 2013103398035A CN 201310339803 A CN201310339803 A CN 201310339803A CN 103366381 A CN103366381 A CN 103366381A
- Authority
- CN
- China
- Prior art keywords
- screen
- gaze point
- coordinate
- pupil center
- error
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Eye Examination Apparatus (AREA)
Abstract
The invention provides a sight-line tracking correction method based on spatial position. The gaze characteristics of the human eye differ across regions of a screen, and the relation between points on the whole screen and the gaze point cannot be described accurately by a single simple mapping. The estimated gaze point is therefore corrected according to the characteristic deviation angle between the optical axis and the visual axis when the eye fixates different regions. In the method, the screen is divided into several regions; two cameras are used to compute the region onto which the pupil center projects perpendicularly, and the gaze-point estimation error is computed; the relation between the estimation error and the corresponding estimated gaze position is fitted, and the gaze point is corrected according to this relation, so as to determine the accurate position of the fixated point.
Description
Technical field
The present invention relates to a gaze tracking correction method based on the spatial position of the eye obtained with two cameras and two light sources, and belongs to the technical field of video and multimedia signal processing.
Background technology
With the rapid development of computer science and industry and the sharp increase in the number of computer users, intelligent human-computer interaction is receiving more and more attention. The eyes are an important channel through which humans obtain external information, and gaze tracking systems are showing huge potential in intelligent human-computer interaction; they have long been a major focus of scientific research. They are widely used in fields such as commercial advertisement testing, medical diagnosis, developmental psychology and virtual reality.
As an intelligent aid, a gaze tracking system's precision is an important measure of its quality and directly affects its practicality. A typical gaze tracking system can be roughly divided into five stages: eye localization, pupil region extraction, computation of the corneal reflection point coordinates, pupil center localization, and gaze-point estimation and correction. Improving the precision of any one stage improves the precision of the whole system to some extent; current correction methods all improve system accuracy through a calibration procedure.
Existing gaze tracking methods mainly obtain the gaze direction or gaze point by computing the optical axis, whereas the true gaze direction or gaze point is determined by the visual axis, so the gaze error must be determined through a calibration procedure. Calibration, however, requires every user to perform a series of training tests before use: each user must fixate preset test points so that the tracker parameters can be determined. The process is cumbersome and the user experience is poor. Finding a gaze tracking method that requires no calibration is therefore of great significance for the popularization and development of gaze tracking technology.
According to the structural and kinematic features of the human eye, the visual axis is the line connecting the corneal center and the fovea of the macula, while the optical axis is the line connecting the corneal center and the pupil center; in general there is a deviation angle between the two. Research has pointed out that the deviation angle between the visual axis and the optical axis varies with the gaze direction in a regular way.
Summary of the invention
To eliminate the cumbersome calibration procedure and make gaze tracking systems more natural to use, the present invention proposes a gaze tracking correction method based on spatial position. The method hands the task of eliminating error and improving precision, which originally required a manual calibration procedure with user participation, over to an automatic correction procedure computed by the computer. When using the gaze tracking system the user no longer needs to calibrate it and can directly control it by gaze, which saves time and greatly improves the user experience.
In the method for the present invention, the automatic calibration of computing machine is based on that ready-made matched curve carries out.And the formation of matched curve is in the fixing situation of hardware, the people of some is tested obtaining in advance.The angle that the present invention just is being based between human eye's visual axis and the optical axis changes along with the change of direction of gaze, and have these characteristics of certain regularity, mode by curve obtains the angle of the optical axis and optical axis and the relation between the blinkpunkt estimated position, recycle this relation error compensation is carried out in the blinkpunkt position of estimating, can greatly improve the precision of estimation.Because matched curve is only relevant with hardware, and irrelevant with particular user, so trimming process does not have fully in the feeling situation the user and is automatically finished by computing machine.
To obtain the highest possible error-compensation accuracy at a reasonable curve complexity, the present invention divides the screen into several regions and uses points at fixed positions in each region as test points.
The technical solution used in the present invention is as follows:
A gaze tracking correction method based on spatial position, characterized in that the method is divided into a correction data generation phase, in which gaze-point position estimation errors are related to the spatial position of the eye, and a correction phase, in which the generated correction data are applied during actual measurement. The concrete steps are:
(I) Correction data generation phase based on the spatial position of the eye
(1) Divide the computer screen into several regions and specify several location points with known coordinates in each region;
(2) Select several subjects and have each subject fixate the location points specified in step (1); estimate the gaze-point position coordinates with a gaze-point estimation algorithm;
(3) Compute the error between the estimated gaze-point position coordinates and the true coordinates of the specified location points, taking it as the gaze-point estimation error;
(4) Use the two cameras to compute the spatial coordinates of the pupil center, then determine the coordinates of the perpendicular projection of the pupil center onto the plane of the computer screen;
(5) Have each of the several subjects fixate points at different positions on the screen, recording for each fixation the gaze-point estimation error and the coordinates of the pupil-center projection on the computer screen;
(6) Using a curve fitting algorithm, with the perpendicular projection of the pupil center on the screen as reference point, fit the relation between the estimation error of each screen subregion and the gaze-point position, thereby generating the correction data;
(II) Correction phase applying the generated correction data during actual measurement
(1) During actual measurement with the gaze tracking system, estimate the position coordinates of the user's gaze point with the gaze-point estimation algorithm;
(2) Use the two cameras to compute the spatial coordinates of the user's pupil center in real time and determine the coordinates of its projection on the screen;
(3) From the user's gaze-point position coordinates and the pupil-center projection coordinates on the screen, determine the screen region containing the gaze point with the pupil-center projection as reference point; then, according to the relation between the estimation error of each screen subregion and the gaze-point position fitted in step (6) of stage (I), correct the estimated user gaze-point coordinates in real time to obtain more accurate user gaze-point coordinates. The correction data can be organized as a lookup table keyed by region offset, as sketched below.
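The correction data generated in stage (I) amount to a lookup table from the gaze-point region, expressed relative to the pupil-center projection region, to a pair of deviation angles. A minimal sketch in Python of such a table, with hypothetical names throughout; the patent prescribes only the keying by region offset, not this particular structure:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DeviationAngles:
    horizontal_deg: float  # visual-axis / optical-axis deviation, horizontal
    vertical_deg: float    # visual-axis / optical-axis deviation, vertical

# Key: gaze-point region relative to the pupil-center projection region,
# i.e. the offset (i - l, j - k) used in stage (II), step (3).
CorrectionTable = dict[tuple[int, int], DeviationAngles]

def lookup_correction(table: CorrectionTable,
                      gaze_region: tuple[int, int],
                      pupil_region: tuple[int, int]) -> DeviationAngles:
    """Look up the deviation angles for the region offset A(i-l)(j-k)."""
    offset = (gaze_region[0] - pupil_region[0],
              gaze_region[1] - pupil_region[1])
    return table[offset]
```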
In step (1) of stage (I), the screen is divided into m*n regions: the horizontal direction is equally divided into m regions and the vertical direction into n regions, where A11, ..., A1n, Am1, ..., Amn denote the regions at the corresponding positions; several test points with known coordinates are set in each region. A sketch of the region-index computation is given below.
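For illustration, a minimal sketch (hypothetical function name; assumes screen coordinates measured from the lower-left corner, matching the world coordinate system used later) of mapping a screen coordinate to its region indices:

```python
def region_index(x: float, y: float,
                 screen_w: float, screen_h: float,
                 m: int, n: int) -> tuple[int, int]:
    """Map a screen coordinate (x, y) to 1-based region indices for a
    screen divided into m equal horizontal and n equal vertical regions."""
    i = min(int(x / (screen_w / m)) + 1, m)  # horizontal region, 1..m
    j = min(int(y / (screen_h / n)) + 1, n)  # vertical region, 1..n
    return i, j
```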
In step (2) of stage (I), several testers are tested: each tester fixates the test points on the screen in turn, dwelling on each point for a fixed time, and the gaze-point estimation algorithm computes the gaze-point position coordinates.
In step (3) of stage (I): the true gaze point is determined by the visual axis, whereas current methods obtain the gaze point from the optical axis, so the angular error computed in this step can be regarded as the angular deviation between the visual axis and the optical axis. The concrete implementation steps are:
a. First compare the gaze-point coordinates estimated in step (2) of stage (I) with the true coordinates, and compute the distance error for the test points of each screen region;
b. Using the distance error computed in step a and the current distance between the pupil center and the screen center, convert the distance error into an angular error.
In step (4) of stage (I), the concrete implementation steps are:
a. First determine the pupil-center coordinates using the stereo imaging principle of the two cameras;
b. According to the positional relationship between the cameras and the screen, convert the pupil coordinates from the camera coordinate system into the world coordinate system.
In step a, the two cameras are placed at the center of the lower edge of the screen, facing it, and the stereo imaging principle is used to compute the coordinates of the pupil center in the camera coordinate system.
In step b, the camera coordinates of the pupil computed in step a are converted into coordinates in a world coordinate system whose origin is the lower-left corner of the screen, as sketched below.
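A minimal sketch of the camera-to-world conversion. The rotation R and translation t would in practice come from extrinsic calibration; the aligned-axes special case below is an assumption for illustration, using only what the text states (camera at the bottom-edge center, world origin at the lower-left corner):

```python
import numpy as np

def camera_to_world(p_cam: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Rigid transform of a point from camera coordinates to world coordinates."""
    return R @ p_cam + t

# Illustrative special case: camera axes assumed aligned with the world axes,
# camera origin at the bottom-edge center of a screen of width W, world origin
# at the screen's lower-left corner.
W = 0.34                                   # screen width in meters (17-inch display)
R_aligned = np.eye(3)
t_aligned = np.array([W / 2.0, 0.0, 0.0])

p_cam = np.array([0.01, 0.12, 0.55])       # pupil center in camera coordinates (m)
p_world = camera_to_world(p_cam, R_aligned, t_aligned)
```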
In step (6) of stage (I), the concrete implementation steps are:
a. From the gaze-point position coordinates computed in step (2) of stage (I) and the true coordinates of the test points, determine which screen subregion the gaze point lies in;
b. From the pupil-center projection coordinates on the screen of step (4) of stage (I), determine which screen subregion the pupil-center projection lies in;
c. With the region containing the pupil center as reference point, fit by curve fitting the relation between the estimation error (in angular units) of each screen subregion and the gaze-point position, generating the correction data.
In step (3) of stage (II), the concrete implementation steps are:
a. From the gaze-point position coordinates computed in steps (1) and (2) of stage (II) and the pupil projection coordinates on the screen, determine the subregions containing each; supposing these subregions are Aij and Alk respectively, then with the pupil-center subregion as reference point the gaze point lies at position A(i-l)(j-k);
b. Combine with the correction data obtained in stage (I) to obtain the corresponding correction error, i.e. the deviation angle values for the horizontal and vertical directions;
c. Correct the estimated gaze-point coordinates with the deviation angle values obtained in step b.
Description of drawings
Fig. 1 is a schematic diagram of the gaze tracking system hardware;
Fig. 2 is a block diagram of the gaze tracking system;
Fig. 3 is a flow chart of the steps of the method of the invention;
Fig. 4 is a schematic diagram of the deviation angle between the visual axis and the optical axis, where (a) shows the horizontal angle between the visual axis and the optical axis for the left and right eyes and (b) shows the vertical angle;
Fig. 5 is a schematic diagram of the screen region division and test point distribution.
Embodiment
The method of this invention is based on a gaze tracking system with two cameras, so that three-dimensional coordinates can be computed. The hardware of the present embodiment comprises a 2.99 GHz dual-core Dell personal computer; the display is 17 inches (34 x 27 cm); two near-infrared light sources are mounted at the two lower corners of the display to produce reflection spots on the cornea; and two grayscale cameras with a resolution of 680 x 480 are mounted below the display to capture video during the experiment. The subject sits in front of the screen; the head can move freely within the field of view of the two cameras.
Fig. 1 shows the hardware diagram of the system, Fig. 2 the gaze tracking system block diagram, and Fig. 3 the implementation flow of this invention. Following that flow, the concrete implementation steps are as follows:
(I) Correction data generation phase based on the spatial position of the eye
(1) Divide the screen into 9*10 regions, the horizontal direction evenly divided into 10 regions and the vertical direction into 9 (denoted A11, ..., A1,10, A91, ..., A9,10 respectively); 2 test points are evenly placed in each region, with the division and test point placement shown in Fig. 5;
(2) Have 10 testers tested: each subject fixates the test points at different positions on the screen in turn, dwelling 3-5 seconds on each point, and the corresponding gaze-point estimation error and pupil-center projection coordinates on the screen are recorded;
(3) Compute the error between the estimated gaze-point position coordinates and the true coordinates of the specified location points as the gaze-point estimation error. This is implemented as follows (a conversion sketch follows step c):
a. Obtain the image coordinates of the pupil center by image processing and ellipse fitting, and estimate the gaze-point position coordinates with the gaze-point estimation algorithm;
b. Compare the estimated gaze-point coordinates with the true coordinates and compute the distance error for the test points of each screen region;
c. Convert the computed distance error into an angular error using the current distance between the pupil center and the screen center.
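A minimal sketch of the distance-to-angle conversion in step c, with hypothetical names; it assumes the simple geometry of an on-screen error span viewed from the pupil at the stated distance:

```python
import math

def distance_to_angular_error(distance_error_cm: float,
                              pupil_to_screen_cm: float) -> float:
    """Convert an on-screen distance error into an angular error (degrees),
    given the distance from the pupil center to the screen center."""
    return math.degrees(math.atan2(distance_error_cm, pupil_to_screen_cm))

# Example: a 5 cm on-screen error viewed from 60 cm is roughly 4.8 degrees.
angle_deg = distance_to_angular_error(5.0, 60.0)
```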
(4) Compute the spatial coordinates of the pupil center and determine the region of its projection on the plane of the computer screen. This is implemented as follows (a triangulation sketch follows step b):
a. Taking the pupil centers as matched points in the two images, determine the pupil-center coordinates using the stereo imaging principle of the two cameras;
b. From the pupil-center camera coordinates obtained in step a and the positional relationship between the cameras and the screen (the two cameras face the screen from the center of its lower edge), convert the pupil coordinates from the camera coordinate system into the world coordinate system.
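A minimal sketch of stereo triangulation by linear least squares (the direct linear transform), assuming calibrated 3x4 projection matrices P1 and P2 for the two cameras; the patent invokes the stereo imaging principle without prescribing this particular formulation:

```python
import numpy as np

def triangulate(p1: np.ndarray, p2: np.ndarray,
                P1: np.ndarray, P2: np.ndarray) -> np.ndarray:
    """Triangulate a 3D point from its pixel coordinates p1, p2 in two
    calibrated cameras with 3x4 projection matrices P1, P2 (DLT method)."""
    A = np.vstack([
        p1[0] * P1[2] - P1[0],
        p1[1] * P1[2] - P1[1],
        p2[0] * P2[2] - P2[0],
        p2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)     # least-squares solution of A X = 0
    X = Vt[-1]
    return X[:3] / X[3]             # dehomogenize to (x, y, z)
```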
(5) Have each of the 10 subjects fixate points at different positions on the screen, recording for each fixation the gaze-point estimation error and the pupil-center projection coordinates on the computer screen.
(6) Using a curve fitting algorithm, with the perpendicular projection of the pupil center on the screen as reference point, fit the relation between the estimation error of each screen subregion and the gaze-point position, thereby generating the correction data, as shown in Fig. 4; in Fig. 4 (a) and (b) the numerals on the abscissa denote the screen regions in the horizontal and vertical directions respectively. This is implemented as follows (a fitting sketch follows step c):
a. From the gaze-point position coordinates computed in step (3) and the true coordinates of the test points, determine which screen region the gaze point lies in;
b. Project the pupil-center spatial coordinates of step (4) perpendicularly onto the plane of the screen and determine which screen region the pupil lies in;
c. With the region containing the pupil center as reference point (abscissa position 0 in Fig. 4), fit by curve fitting the relation between the estimation error (in angular units) of each screen subregion and the gaze-point position, generating the correction data.
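A minimal sketch of the fitting step, assuming the measurements have been reduced to pairs of (region offset relative to the pupil-center region, mean angular error) and that a low-order polynomial is an adequate model; the data values below are hypothetical, and the patent fixes neither the curve family nor its order:

```python
import numpy as np

# Hypothetical pooled data: horizontal region offset vs. mean deviation
# angle (degrees), averaged over the 10 subjects.
offsets = np.array([-4, -3, -2, -1, 0, 1, 2, 3, 4], dtype=float)
mean_err = np.array([-2.1, -1.6, -1.0, -0.4, 0.0, 0.5, 1.1, 1.7, 2.3])

coeffs = np.polyfit(offsets, mean_err, deg=3)   # cubic least-squares fit
error_curve = np.poly1d(coeffs)

# The correction data store the fitted deviation angle per region offset.
horizontal_correction = {int(k): float(error_curve(k)) for k in offsets}
```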
Testing the 10 subjects gives the results shown in Fig. 4, where Fig. 4(a) shows the horizontal angle between the visual axis and the optical axis for the left and right eyes and Fig. 4(b) shows the vertical angle. As Fig. 4 shows, the angle between the visual axis and the optical axis changes with gaze direction, described with the perpendicular projection of the pupil center on the screen as reference point. In the horizontal direction, when the right eye looks to the right the deviation angle grows steadily, the visual axis deflecting to the right of the optical axis by up to 5°; when it looks to the left the deviation angle decreases gradually until the visual axis and the optical axis coincide; looking further left, the visual axis deflects to the left of the optical axis and the deviation angle, after reaching 1°, diminishes again to 0°. Likewise, when the left eye looks to the left the deviation angle grows steadily, the visual axis deflecting to the left of the optical axis by up to 5°; when it looks to the right the deviation angle decreases gradually until the two axes coincide; looking further right, the visual axis deflects to the right of the optical axis and the deviation angle, after reaching 1°, diminishes again to 0°. In the vertical direction the deviation behaves identically for both eyes: in the regions above the reference point the deviation angle grows with height, reaching 1°, the visual axis deflecting upward relative to the optical axis; in the regions below it the deviation angle grows with depth, the visual axis deflecting downward.
From these test results the correction data are obtained; they reflect the relation between the estimation error of each screen subregion and the gaze-point position. During actual measurement, once the estimated gaze-point position and the screen partition corresponding to the eye have been obtained, the corresponding correction error can be looked up and used directly to compensate the gaze-point position estimate, yielding a more accurate estimate.
(II) Correction phase applying the generated correction data during actual measurement
(1) Have the tester fixate test points at different positions on the screen, and estimate the gaze-point position coordinates with the gaze-point estimation algorithm;
(2) Use the two cameras to compute the current spatial coordinates of the pupil center and determine the coordinates of its perpendicular projection on the screen;
(3) From the gaze-point position coordinates of step (1) and the pupil-center coordinates of step (2), with the perpendicular projection of the pupil center on the screen as reference point, correct the gaze point in real time according to the fitted relation between gaze-point position and estimation error, obtaining more accurate gaze-point coordinates:
a. From the computed gaze-point position coordinates and the pupil projection coordinates on the screen, determine the subregions containing each; supposing these subregions are Aij and Alk respectively, then with the pupil-center subregion as reference point the gaze point lies at position A(i-l)(j-k);
b. Combine with the correction data obtained in stage (I) and Fig. 4 to obtain the corresponding correction error: taking abscissa 0 as reference, the horizontal correction error is the deviation angle value of region (i-l) and the vertical correction error is the deviation angle value of region (j-k);
c. Correct the estimated gaze-point coordinates with the deviation angle values obtained in step b. An end-to-end sketch of this runtime correction is given below.
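Putting the pieces together, a minimal end-to-end sketch of the runtime correction. All names are hypothetical, region_index is the helper sketched for stage (I) step (1), and the sign convention for applying the compensation is an assumption; the patent states only that the estimate is corrected with the deviation angles:

```python
import math

def region_index(x, y, screen_w, screen_h, m, n):
    """1-based region indices, as sketched for stage (I), step (1)."""
    return (min(int(x / (screen_w / m)) + 1, m),
            min(int(y / (screen_h / n)) + 1, n))

def correct_gaze_point(gaze_xy, pupil_proj_xy, pupil_to_screen,
                       h_table, v_table, screen_w, screen_h, m, n):
    """Correct an estimated gaze point using per-region-offset deviation angles.

    h_table / v_table map the region offsets (i - l) resp. (j - k) to the
    fitted horizontal / vertical deviation angles in degrees."""
    i, j = region_index(*gaze_xy, screen_w, screen_h, m, n)
    l, k = region_index(*pupil_proj_xy, screen_w, screen_h, m, n)
    h_deg = h_table[i - l]          # horizontal deviation for offset (i - l)
    v_deg = v_table[j - k]          # vertical deviation for offset (j - k)
    # Convert the angular corrections back to on-screen distances (same
    # units as the screen size) and compensate the estimate.
    dx = pupil_to_screen * math.tan(math.radians(h_deg))
    dy = pupil_to_screen * math.tan(math.radians(v_deg))
    return gaze_xy[0] - dx, gaze_xy[1] - dy
```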
Claims (9)
1. A gaze tracking correction method based on spatial position, characterized in that: the method is divided into a correction data generation phase, in which gaze-point position estimation errors are related to the spatial position of the eye, and a correction phase, in which the generated correction data are applied during actual measurement; the concrete steps are:
(I) Correction data generation phase based on the spatial position of the eye
(1) Divide the computer screen into several regions and specify several location points with known coordinates in each region;
(2) Select several subjects and have each subject fixate the location points specified in step (1); estimate the gaze-point position coordinates with a gaze-point estimation algorithm;
(3) Compute the error between the estimated gaze-point position coordinates and the true coordinates of the specified location points, taking it as the gaze-point estimation error;
(4) Use the two cameras to compute the spatial coordinates of the pupil center, then determine the coordinates of the perpendicular projection of the pupil center onto the plane of the computer screen;
(5) Have each of the several subjects fixate points at different positions on the screen, recording for each fixation the gaze-point estimation error and the coordinates of the pupil-center projection on the computer screen;
(6) Using a curve fitting algorithm, with the perpendicular projection of the pupil center on the screen as reference point, fit the relation between the estimation error of each screen subregion and the gaze-point position, thereby generating the correction data;
(II) Correction phase applying the generated correction data during actual measurement
(1) During actual measurement with the gaze tracking system, estimate the position coordinates of the user's gaze point with the gaze-point estimation algorithm;
(2) Use the two cameras to compute the spatial coordinates of the user's pupil center in real time and determine the coordinates of its projection on the screen;
(3) From the user's gaze-point position coordinates and the pupil-center projection coordinates on the screen, determine the screen region containing the gaze point with the pupil-center projection as reference point; then, according to the relation between the estimation error of each screen subregion and the gaze-point position fitted in step (6) of stage (I), correct the estimated user gaze-point coordinates in real time to obtain more accurate user gaze-point coordinates.
2. The gaze tracking correction method based on spatial position according to claim 1, characterized in that: in step (1) of stage (I), the screen is divided into m*n regions, the horizontal direction equally divided into m regions and the vertical direction into n regions, where A11, ..., A1n, Am1, ..., Amn denote the regions at the corresponding positions, and several test points with known coordinates are set in each region.
3. The gaze tracking correction method based on spatial position according to claim 1, characterized in that: in step (2) of stage (I), several testers are tested; each tester fixates the test points on the screen in turn, dwelling on each point for a fixed time, and the gaze-point estimation algorithm computes the gaze-point position coordinates.
4. The gaze tracking correction method based on spatial position according to claim 1, characterized in that: in step (3) of stage (I), the true gaze point is determined by the visual axis whereas current gaze-point estimation methods obtain it from the optical axis, so the angular error computed in this step can be regarded as the angular deviation between the visual axis and the optical axis; the concrete implementation steps are:
a. First compare the gaze-point coordinates estimated in step (2) of stage (I) with the true coordinates, and compute the distance error for the test points of each screen region;
b. Using the distance error computed in step a and the current distance between the pupil center and the screen center, convert the distance error into an angular error.
5. The gaze tracking correction method based on spatial position according to claim 1, characterized in that: in step (4) of stage (I), the concrete implementation steps are:
a. First determine the pupil-center coordinates using the stereo imaging principle of the two cameras;
b. According to the positional relationship between the cameras and the screen, convert the pupil coordinates from the camera coordinate system into the world coordinate system.
6. The gaze tracking correction method based on spatial position according to claim 5, characterized in that: in step a, the two cameras are placed at the center of the lower edge of the screen, facing it, and the stereo imaging principle is used to compute the coordinates of the pupil center in the camera coordinate system.
7. The gaze tracking correction method based on spatial position according to claim 5, characterized in that: in step b, the camera coordinates of the pupil computed in step a are converted into coordinates in a world coordinate system whose origin is the lower-left corner of the screen.
8. The gaze tracking correction method based on spatial position according to claim 1, characterized in that: in step (6) of stage (I), the concrete implementation steps are:
a. From the gaze-point position coordinates computed in step (2) of stage (I) and the true coordinates of the test points, determine which screen subregion the gaze point lies in;
b. From the pupil-center projection coordinates on the screen of step (4) of stage (I), determine which screen subregion the pupil-center projection lies in;
c. With the region containing the pupil center as reference point, fit by curve fitting the relation between the estimation error (in angular units) of each screen subregion and the gaze-point position, generating the correction data.
9. The gaze tracking correction method based on spatial position according to claim 1, characterized in that: in step (3) of stage (II), the concrete implementation steps are:
a. From the gaze-point position coordinates computed in steps (1) and (2) of stage (II) and the pupil projection coordinates on the screen, determine the subregions containing each; supposing these subregions are Aij and Alk respectively, then with the pupil-center subregion as reference point the gaze point lies at position A(i-l)(j-k);
b. Combine with the correction data obtained in stage (I) to obtain the corresponding correction error, i.e. the deviation angle values for the horizontal and vertical directions;
c. Correct the estimated gaze-point coordinates with the deviation angle values obtained in step b.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2013103398035A CN103366381A (en) | 2013-08-06 | 2013-08-06 | Sight line tracking correcting method based on space position |
Publications (1)
Publication Number | Publication Date |
---|---|
CN103366381A | 2013-10-23 |
Family
ID=49367649
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN2013103398035A Pending CN103366381A (en) | 2013-08-06 | 2013-08-06 | Sight line tracking correcting method based on space position |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN103366381A (en) |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110069277A1 (en) * | 2009-04-01 | 2011-03-24 | Tobii Technology Ab | Visual display with illuminators for gaze tracking |
CN102802502A (en) * | 2010-03-22 | 2012-11-28 | 皇家飞利浦电子股份有限公司 | System and method for tracking the point of gaze of an observer |
CN102176191A (en) * | 2011-03-23 | 2011-09-07 | 山东大学 | Television control method based on sight-tracking |
Non-Patent Citations (2)
Title |
---|
SHENG-WEN et al.: "A Calibration-Free Gaze Tracking Technique", Proceedings 15th International Conference on Pattern Recognition |
GONG Xiufeng et al.: "Gaze point estimation for gaze tracking based on marker point detection", Computer Engineering |
Cited By (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104644120A (en) * | 2013-11-15 | 2015-05-27 | 现代自动车株式会社 | Gaze detecting apparatus and method |
CN106662917A (en) * | 2014-04-11 | 2017-05-10 | 眼球控制技术有限公司 | Systems and methods of eye tracking calibration |
CN105278659A (en) * | 2014-06-18 | 2016-01-27 | 中国电信股份有限公司 | Target positioning method and device based on visual line tracking technology |
CN105278659B (en) * | 2014-06-18 | 2018-09-14 | 中国电信股份有限公司 | Object localization method based on Eye Tracking Technique and device |
CN105260008B (en) * | 2014-07-15 | 2018-10-12 | 华为技术有限公司 | A kind of method and device of position location |
US10168773B2 (en) | 2014-07-15 | 2019-01-01 | Huawei Technologies Co., Ltd. | Position locating method and apparatus |
CN105260008A (en) * | 2014-07-15 | 2016-01-20 | 华为技术有限公司 | Position locating method and device |
CN107003521A (en) * | 2014-09-22 | 2017-08-01 | 脸谱公司 | The display visibility assembled based on eyes |
CN107003521B (en) * | 2014-09-22 | 2020-06-05 | 脸谱科技有限责任公司 | Display visibility based on eye convergence |
US11263794B2 (en) | 2015-01-21 | 2022-03-01 | Chengdu Idealsee Technology Co., Ltd. | Binocular see-through AR head-mounted display device and information displaying method thereof |
WO2016115870A1 (en) * | 2015-01-21 | 2016-07-28 | 成都理想境界科技有限公司 | Binocular ar head-mounted display device and information displaying method therefor |
CN104915013B (en) * | 2015-07-03 | 2018-05-11 | 山东管理学院 | A kind of eye tracking calibrating method based on usage history |
CN104915013A (en) * | 2015-07-03 | 2015-09-16 | 孙建德 | Eye tracking and calibrating method based on usage history |
CN105078404A (en) * | 2015-09-02 | 2015-11-25 | 北京津发科技股份有限公司 | Fully automatic eye movement tracking distance measuring calibration instrument based on laser algorithm and use method of calibration instrument |
CN106124169A (en) * | 2016-06-29 | 2016-11-16 | 南京睿悦信息技术有限公司 | A kind of VR helmet equipment angle of visual field measuring method |
CN107392120B (en) * | 2017-07-06 | 2020-04-14 | 电子科技大学 | Attention intelligent supervision method based on sight line estimation |
CN107392120A (en) * | 2017-07-06 | 2017-11-24 | 电子科技大学 | A kind of notice intelligence direct method based on sight estimation |
CN107770561A (en) * | 2017-10-30 | 2018-03-06 | 河海大学 | A kind of multiresolution virtual reality device screen content encryption algorithm using eye-tracking data |
CN109903231A (en) * | 2017-12-11 | 2019-06-18 | 上海聚虹光电科技有限公司 | The antidote of VR or AR equipment iris image |
CN108462868A (en) * | 2018-02-12 | 2018-08-28 | 叠境数字科技(上海)有限公司 | The prediction technique of user's fixation point in 360 degree of panorama VR videos |
CN108833880A (en) * | 2018-04-26 | 2018-11-16 | 北京大学 | Using across user behavior pattern carry out view prediction and realize that virtual reality video optimizes the method and apparatus transmitted |
CN109213323A (en) * | 2018-08-28 | 2019-01-15 | 北京航空航天大学青岛研究院 | A method of screen Attitude estimation is realized based on eye movement interaction technique |
CN109976514A (en) * | 2019-03-01 | 2019-07-05 | 四川大学 | Eye movement data bearing calibration based on eyeball error model |
CN109976514B (en) * | 2019-03-01 | 2021-09-03 | 四川大学 | Eyeball error model-based eye movement data correction method |
CN112132755A (en) * | 2019-06-25 | 2020-12-25 | 京东方科技集团股份有限公司 | Method, device and system for correcting and calibrating pupil position and computer readable medium |
CN110263745B (en) * | 2019-06-26 | 2021-09-07 | 京东方科技集团股份有限公司 | Method and device for positioning pupils of human eyes |
CN110263745A (en) * | 2019-06-26 | 2019-09-20 | 京东方科技集团股份有限公司 | A kind of method and device of pupil of human positioning |
CN110537897A (en) * | 2019-09-10 | 2019-12-06 | 北京未动科技有限公司 | Sight tracking method and device, computer readable storage medium and electronic equipment |
CN110908511A (en) * | 2019-11-08 | 2020-03-24 | Oppo广东移动通信有限公司 | Method for triggering recalibration and related device |
CN110908511B (en) * | 2019-11-08 | 2022-03-15 | Oppo广东移动通信有限公司 | Method for triggering recalibration and related device |
CN113361459A (en) * | 2021-06-29 | 2021-09-07 | 平安普惠企业管理有限公司 | Advertisement display method, device and equipment based on fixation point identification and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103366381A (en) | Sight line tracking correcting method based on space position | |
CN108968907B (en) | The bearing calibration of eye movement data and device | |
CN109558012B (en) | Eyeball tracking method and device | |
CN104113680B (en) | Gaze tracking system and method | |
CN102043952B (en) | Eye-gaze tracking method based on double light sources | |
CN108038884B (en) | Calibration method, calibration device, storage medium and processor | |
CN108235778B (en) | Calibration method and device based on cloud computing, electronic equipment and computer program product | |
WO2016115874A1 (en) | Binocular ar head-mounted device capable of automatically adjusting depth of field and depth of field adjusting method | |
CN107330976B (en) | Human head three-dimensional modeling device and use method | |
CN104951084A (en) | Eye-tracking method and device | |
CN109976514B (en) | Eyeball error model-based eye movement data correction method | |
US20220207919A1 (en) | Methods, devices and systems for determining eye parameters | |
US10620454B2 (en) | System and method of obtaining fit and fabrication measurements for eyeglasses using simultaneous localization and mapping of camera images | |
JP2012524293A5 (en) | ||
CN105278659B (en) | Object localization method based on Eye Tracking Technique and device | |
CN107633240A (en) | Eye-controlling focus method and apparatus, intelligent glasses | |
WO2019204171A1 (en) | Adjusting gaze point based on determined offset adjustment | |
US10898073B2 (en) | Optoelectronic binocular instrument for the correction of presbyopia and method for the binocular correction of presbyopia | |
KR20210128467A (en) | Devices and methods for evaluating the performance of visual equipment on visual tasks | |
CN113940812B (en) | Cornea center positioning method for excimer laser cornea refractive surgery | |
Lu et al. | Neural 3D gaze: 3D pupil localization and gaze tracking based on anatomical eye model and neural refraction correction | |
US20240210730A1 (en) | Method for simulating optical products | |
CN109213323A (en) | A method of screen Attitude estimation is realized based on eye movement interaction technique | |
CN110414302A (en) | Non-contact pupil distance measuring method and system | |
US20200201070A1 (en) | System and method of utilizing computer-aided optics |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C02 | Deemed withdrawal of patent application after publication (patent law 2001) | ||
WD01 | Invention patent application deemed withdrawn after publication |
Application publication date: 20131023 |