CN111625090A - Comprehensive testing platform for large-range eye movement tracking and sight line estimation algorithm - Google Patents


Info

Publication number
CN111625090A
Authority
CN
China
Prior art keywords
setting
display screen
test
eye movement
sight line
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010400457.7A
Other languages
Chinese (zh)
Inventor
刘天键
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Minjiang University
Original Assignee
Minjiang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Minjiang University filed Critical Minjiang University
Priority to CN202010400457.7A priority Critical patent/CN111625090A/en
Publication of CN111625090A publication Critical patent/CN111625090A/en
Pending legal-status Critical Current


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 Eye tracking input arrangements

Abstract

The invention discloses a comprehensive test platform for large-range eye movement tracking and sight line estimation algorithms, and particularly relates to the field of performance testing of eye movement tracking systems. According to the invention, the stability of a system under test is examined through a standard test setting, a special position test setting, a bracket test setting, an environment setting, a shielding setting, a robustness setting, a stability setting and a rapidity setting, together with coordinate space transformation and multi-resolution analysis, so that the spatial stability, scale stability and temporal stability of the system can be comprehensively examined; the overall test effect of the invention is therefore better.

Description

Comprehensive testing platform for large-range eye movement tracking and sight line estimation algorithm
Technical Field
The invention relates to the technical field of performance testing of an eye movement tracking system, in particular to a comprehensive testing platform for a large-range eye movement tracking and sight line estimation algorithm.
Background
With the rapid development of computer software and hardware, people's lives have become faster and more convenient, and the ways in which computers are used in complex and changing environments are continuously being improved. Traditional mouse- and keyboard-based human-computer interaction increasingly shows its limitations, and human-computer interaction based on hover-touch (suspension touch) technology is gradually being favoured for its flexibility, intelligence, convenience and naturalness.
An eye movement tracking and sight line estimation system collects facial images of a test subject and predicts the position of the fixation point on a screen by estimating the motion state of the eyeballs relative to the head. In the prior art there are many methods for tracking the line of sight, such as eye trackers using infrared illumination and eye tracking systems based on a visible-light environment. Different eye tracking devices and systems follow different technical routes, use different methods, and differ in their working range and tracking indexes. Testing different eye tracking systems provides a basis for improving system performance and points the way towards improved systems, while also facilitating the comparison of different methods on the same platform. A platform capable of comprehensively testing the effect of eye movement tracking and sight line estimation is therefore particularly important. Existing platforms do not test systems comprehensively enough, so the test effect is not ideal; for this reason, a comprehensive test platform for large-range eye movement tracking and sight line estimation algorithms, capable of testing the comprehensive performance indexes of an eye movement tracking system over a large range, is urgently needed.
Disclosure of Invention
To overcome the above drawbacks of the prior art, embodiments of the present invention provide a comprehensive test platform for large-range eye movement tracking and sight line estimation algorithms. The stability and reliability of the system when the head moves freely are examined through a standard test setting; the influence of position disturbance on system stability and precision is examined through a special position test setting; the accuracy and reliability of the system when the head is fixed are examined through a bracket test setting; the response of the system to environmental changes is examined through an environment setting and a shielding setting; the dynamic and static performance indexes of the system are examined through a robustness setting, a stability setting and a rapidity setting; and the stability of the system is examined through coordinate space transformation and multi-resolution analysis. The spatial stability, scale stability and temporal stability of the system can thus be comprehensively examined, and the overall test effect of the invention is better.
In order to achieve this purpose, the invention provides the following technical scheme: a comprehensive test platform for large-range eye movement tracking and sight line estimation algorithms comprises a console and a display screen. The CPU performance of the console is not lower than a Pentium IV at 2.8 GHz and the memory of the console is not less than 8 GB. A remote eye tracker is arranged at the bottom of the display screen, a first camera placement position is arranged at the centre of the top of the display screen, a second camera placement position is arranged on the left side of the top of the display screen at 19.5 cm from the central axis of the display screen, and further placement positions are distributed similarly around the display screen. A head fixing support, used in test scenes requiring the head to be kept still, is arranged 80 cm in front of the display screen, and a chair is arranged behind the head fixing support;
taking point E as the eye key point, point C as the centre point of the display screen, x as the target coordinate and x′ as the estimated coordinate, the distance error is expressed as:
e=|x′-x|
the angle error is expressed as:
e=|arctan(DxC/DEC)-arctan(Dx′C/DEC)|.
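A minimal sketch, in Python, of how these two error measures could be computed is given below; treating the target and estimated gaze points as two-dimensional screen-plane coordinates (in millimetres, with the screen centre C as origin) and the eye-to-screen-centre distance DEC are illustrative assumptions, not part of the claimed platform.

```python
import math

def distance_error(x_target, x_est):
    """Distance error e = |x' - x| between the target and estimated gaze points (mm)."""
    return math.dist(x_target, x_est)

def angle_error(x_target, x_est, d_ec_mm):
    """Angle error e = |arctan(DxC/DEC) - arctan(Dx'C/DEC)|, where C is the screen
    centre (taken here as the origin of the screen-plane coordinates) and DEC is the
    eye-to-screen-centre distance in millimetres (illustrative assumption)."""
    d_xc = math.hypot(*x_target)   # distance from the target point x to C
    d_xpc = math.hypot(*x_est)     # distance from the estimated point x' to C
    return abs(math.atan(d_xc / d_ec_mm) - math.atan(d_xpc / d_ec_mm))

# Example: target at (50, 0) mm, estimate at (58, 6) mm, eye 800 mm from the screen centre
print(distance_error((50, 0), (58, 6)))                  # 10.0 mm
print(math.degrees(angle_error((50, 0), (58, 6), 800)))  # about 0.59 degrees
```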
in a preferred embodiment, a standard test setting, a special position test setting, a bracket test setting, an environment setting, a shielding setting, a robustness setting, a stability setting and a rapidity setting are further included.
In a preferred embodiment, the standard test setting is specifically that the remote eye tracker is used as the landmark (reference) for the viewpoint, the head fixing support is removed, the test subject is 80 cm from the display screen, and the display screen shows a web page being browsed.
In a preferred embodiment, the special position test setting is specifically that cameras are arranged at the camera placement positions around the display screen; when the camera position is changed, the relative angle between the face and the display screen changes, and tracking over a large range is achieved through coordinate transformation; the other settings are the same as the standard test setting.
In a preferred embodiment, the bracket test setting is specifically a gaze tracking test with the head fixed, with the head fixing support placed back at 80 cm from the display screen; the other settings are the same as the standard test setting.
In a preferred embodiment, the environment setting is specifically that the lighting brightness of the experimental environment is changed.
In a preferred embodiment, the shielding setting is to partially shield the face or the eyes with a shield, and the robustness setting is to shield the left and right eyes respectively, close the eyes, and move the head out of the camera's field of view and then back in.
In a preferred embodiment, the stability setting is specifically that the face moves over a large range in the clockwise and anticlockwise directions, and the rapidity setting is specifically that bright spots are randomly displayed on the display screen.
The invention has the technical effects and advantages that:
the system is characterized in that the system stability and reliability of the system when the head moves freely are inspected through standard test setting, the influence of position disturbance on the system stability and precision is inspected through special position test setting, the system accuracy and reliability when the head is fixed and fixed are inspected through bracket test setting, the response of the system to environment change is inspected through environment setting and shielding setting, the dynamic and static performance indexes of the system are inspected through robustness setting, stability setting and rapidity setting, the system stability is inspected through coordinate transformation and multiresolution analysis, a point E is taken as the key point of eyes in figure 1, a point C is taken as the central point of a display screen 2, a point X is taken as a target coordinate, a point X 'is taken as an estimated coordinate, and a distance error E is taken as x' -x and an angle error E is taken as arctan (D)xC/DEC)-arctan(Dx′C/DEC) The accuracy of the system is defined, so that the spatial stability, the scale stability and the time stability of the system can be comprehensively inspected, and the testing effect of the invention is better as a whole.
Drawings
Fig. 1 is a schematic view of the overall structure of the present invention.
Fig. 2 is a schematic diagram of the coordinate transformation of the present invention, where x = d × S + P, with x the abscissa of the bright spot in the camera coordinate system (in millimetres), d the abscissa of the bright spot in the device coordinate system (in points), S the scale factor (in millimetres per point), and P the position of the origin of the device coordinates relative to the camera coordinate system (in millimetres); a minimal sketch of this transformation is given after the list of reference signs below.
Fig. 3 is a schematic diagram of the transformation from the different camera placement positions to the camera coordinate system of the present invention; note that different camera positions simulate different positions of the face relative to the display screen.
Reference signs: 1 console, 2 display screen, 3 remote eye tracker, 4 first camera placement position, 5 second camera placement position, 6 head fixing support, 7 chair.
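Below is a minimal sketch of the device-to-camera coordinate transformation x = d × S + P described for Fig. 2; the numerical scale factor and origin offset used here are illustrative assumptions only, not values taken from the invention.

```python
def device_to_camera(d_points, scale_mm_per_point, origin_offset_mm):
    """Map the bright-spot abscissa d (device coordinates, in points) into the camera
    coordinate system (in millimetres) using x = d * S + P, as in Fig. 2."""
    return d_points * scale_mm_per_point + origin_offset_mm

# Illustrative assumption: 0.25 mm per point, device origin 120 mm from the camera origin
x_mm = device_to_camera(d_points=640, scale_mm_per_point=0.25, origin_offset_mm=120.0)
print(x_mm)  # 280.0 mm in the camera coordinate system
```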
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The comprehensive test platform for large-range eye movement tracking and sight line estimation algorithms comprises a console 1 and a display screen 2. The CPU performance of the console 1 is not lower than a Pentium IV at 2.8 GHz and the memory of the console 1 is not less than 8 GB. A remote eye tracker 3 is arranged at the bottom of the display screen 2, a first camera placement position 4 is arranged at the centre of the top of the display screen 2, and a second camera placement position 5 is arranged on the left side of the top of the display screen 2 at 19.5 cm from the central axis of the display screen 2. A head fixing support 6, used in test scenes requiring the head to be kept still, is arranged 80 cm in front of the display screen 2, and a chair 7 is arranged behind the head fixing support 6;
taking point E as the eye key point, point C as the centre point of the display screen 2, x as the target coordinate and x′ as the estimated coordinate, the distance error is expressed as:
e=|x′-x|
the angle error is expressed as:
e=|arctan(DxC/DEC)-arctan(Dx′C/DEC)|;
the method also comprises standard test setting, special position test setting, bracket test setting, environment setting, shielding setting, robustness setting, stability setting and rapidity setting;
the standard test setting is specifically that the remote eye tracker 3 is used as the landmark (reference) for the viewpoint, the head fixing support 6 is removed, the test subject is 80 cm from the display screen 2, and the display screen 2 shows a web page being browsed;
the special position test setting is specifically that the cameras are arranged at the camera placing positions around the display screen 2, when the positions of the cameras are changed, the relative angles of the human face and the display screen are changed, tracking is achieved in a large range through coordinate transformation, and other settings are the same as standard test settings;
the support test setting is specifically that when the head fixing support 6 is placed back to a position 280cm away from the display screen, the sight tracking when the head is fixed is tested, and other settings are the same as the standard test setting;
the environment setting is specifically that the lamplight brightness of an experiment link is changed;
the shielding setting is to partially shield the face by using a mask, the robustness setting is to partially shield the eyes by using the mask, and the robustness setting is to respectively shield the left eye and the right eye, close the eyes and move the rotating head into the visual field after leaving the visual field of the camera;
the stability setting specifically is that the face moves on a large scale in clockwise and anticlockwise directions, the rapidity setting specifically is that bright spots are randomly displayed on the display screen 2.
The implementation is specifically as follows: when the invention is used to test a system, the tester sits on the chair 7 facing the display screen 2, relaxes the body and looks at the display screen 2. The adjustable bracket of the display screen 2 is raised or lowered so that the face of the test subject is aligned with the centre of the display screen 2. A camera is arranged at the first camera placement position 4 at the centre of the top of the display screen 2; to change the camera position, cameras are arranged around the display screen 2. The remote eye tracker 3 is arranged below the display screen 2 and used for comparison tests, and the head fixing support 6 is arranged 80 cm in front of the display screen 2 to fix the head posture. Depending on the experiment, the position of the chair 7 can be adjusted to change the distance between the test subject and the display screen 2. The standard test setting is used to examine the stability and accuracy of the system when the head moves freely; the special position test setting is used to examine the influence of position disturbance on system stability and precision; the bracket test setting is used to examine the accuracy and reliability of the system when the head is fixed; the response of the system to environmental changes is examined by changing the lighting brightness of the experimental environment (0.1-10 lux) in the environment setting and by partially shielding the eyes (10%-60%) with a shield in the shielding setting; the dynamic and static performance indexes of the system are examined through the robustness setting, the stability setting and the rapidity setting in which bright spots are randomly displayed 5-20 times per second; and the stability of the system is examined by browsing different web pages. With point E taken as the eye key point in Fig. 1, point C as the centre point of the display screen 2, x as the target coordinate and x′ as the estimated coordinate, the accuracy of the system is defined by the distance error e=|x′-x| and the angle error e=|arctan(DxC/DEC)-arctan(Dx′C/DEC)|. The images collected by the cameras arranged at the several positions of Fig. 2 are converted from device coordinates to camera coordinates through the coordinate transformation; the unified camera coordinate system of Fig. 3 expands the field of view of the cameras and allows data to be collected for different orientations of the face relative to the cameras, expanding the range and accuracy of sight line estimation. At the same time, multi-resolution analysis is adopted to examine the reliability of the system on images of different scales. The spatial stability, scale stability and temporal stability of the system can thus be comprehensively examined, and the overall test effect of the invention is better.
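The multi-resolution analysis mentioned above can be pictured as running the same gaze estimator on an image pyramid; the sketch below assumes that OpenCV is available and that the gaze-estimation callable is supplied by the system under test, so it is illustrative only.

```python
import cv2  # assumption: OpenCV is available for image downsampling

def multiresolution_check(face_image, estimate_gaze, levels=3):
    """Run the supplied gaze estimator on progressively downsampled copies of a face image
    and collect the estimates, so the scale stability of the system can be compared."""
    estimates = []
    image = face_image
    for level in range(levels):
        estimates.append((level, estimate_gaze(image)))
        image = cv2.pyrDown(image)  # halve the resolution for the next pyramid level
    return estimates
```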
The working principle of the invention is as follows:
referring to the attached figure 1 of the specification, when the system is tested by using the invention, a tester sits on the armchair 7 facing the display screen 2, relaxes the body, looks at the display screen 2, the lifting adjustment is carried out through the adjusting bracket of the display screen 2, so that the face of the test object is aligned with the central position of the display screen 2, according to different experiments, the position of the arm-chair 7 can be adjusted to change the distance between a test object and the display screen 2, the standard test setting is used for inspecting the stability and reliability of the system when the head freely moves in a large range, the special position test setting is used for inspecting the influence of position disturbance on the stability and precision of the system, the support test setting is used for inspecting the accuracy and reliability of the system when the head is fixed, and the light brightness (0.1-101ux) of an experiment link in the environment setting and the shielding part in the shielding setting are shielded by a shielding cover.Eyes (10% -60%) are used for inspecting the response of the system to environmental changes, robustness setting, stability setting and rapidity setting of randomly displaying bright points 5-20 times per second are used for inspecting the dynamic and static performance indexes of the system, the reliability of the system is inspected by browsing common web pages and comparing the web pages with landmark data of a remote eye tracker 3, in the graph 1, a point E is taken as the key point of the eyes, a point C is the central point of a display screen 2, x is target coordinates, x 'is estimated coordinates, and distance error E ═ x' -x ═ and angle error E ═ arctan (D is adoptedxC/DEC)-arctan(Dx′C/DEC) The accuracy of the system is defined, so that the spatial stability, the scale stability and the time stability of the system can be comprehensively inspected, and the testing effect of the invention is better as a whole.
Finally, it should be noted that: first, in the description of the present application, unless otherwise specified and limited, the terms "mounted", "connected" and "coupled" should be understood broadly and may denote a mechanical connection, an electrical connection or communication between two elements, and may be a direct connection; "upper", "lower", "left" and "right" are used only to indicate a relative positional relationship, and when the absolute position of the described object changes, the relative positional relationship may change accordingly;
secondly, in the drawings of the disclosed embodiments of the invention, only the structures related to the disclosed embodiments are shown, and other structures may follow common designs; the same embodiment and different embodiments of the invention may be combined with each other where there is no conflict;
and finally: the above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that are within the spirit and principle of the present invention are intended to be included in the scope of the present invention.

Claims (8)

1. A comprehensive test platform for a large-range eye movement tracking and sight line estimation algorithm, characterized in that: the platform comprises a console (1) and a display screen (2), the CPU performance of the console (1) is not lower than a Pentium IV at 2.8 GHz, the memory of the console (1) is not less than 8 GB, a remote eye tracker (3) is arranged at the bottom of the display screen (2), a first camera placement position (4) is arranged at the centre of the top of the display screen (2), a second camera placement position (5) is arranged on the left side of the top of the display screen (2) at 19.5 cm from the central axis of the display screen (2), further placement positions are distributed similarly around the display screen, a head fixing support (6) used in test scenes requiring the head to be kept still is arranged 80 cm in front of the display screen (2), and a chair (7) is arranged behind the head fixing support (6);
taking point E as the eye key point, point C as the centre point of the display screen (2), x as the target coordinate and x′ as the estimated coordinate, the distance error is expressed as:
e=|x′-x|
the angle error is expressed as:
e=|arctan(DxC/DEC)-arctan(Dx′C/DEC)|.
2. The comprehensive test platform for a large-range eye movement tracking and sight line estimation algorithm according to claim 1, characterized in that: the platform further comprises a standard test setting, a special position test setting, a bracket test setting, an environment setting, a shielding setting, a robustness setting, a stability setting and a rapidity setting.
3. The comprehensive test platform for a large-range eye movement tracking and sight line estimation algorithm according to claim 2, characterized in that: the standard test setting is specifically that the remote eye tracker (3) is used as the landmark (reference) for the viewpoint, the head fixing support (6) is removed, the test subject is 80 cm from the display screen (2), and the display screen (2) shows a web page being browsed.
4. The comprehensive test platform for a large-range eye movement tracking and sight line estimation algorithm according to claim 2, characterized in that: the special position test setting is specifically that cameras are arranged at the camera placement positions around the display screen (2); when the camera position is changed, the relative angle between the face and the display screen changes, and tracking over a large range is achieved through coordinate transformation; the other settings are the same as the standard test setting.
5. The comprehensive test platform for a large-range eye movement tracking and sight line estimation algorithm according to claim 2, characterized in that: the bracket test setting is specifically that the head fixing support (6) is placed back at 80 cm from the display screen (2), gaze tracking with the head fixed is tested, and the other settings are the same as the standard test setting.
6. The comprehensive test platform for a large-range eye movement tracking and sight line estimation algorithm according to claim 2, characterized in that: the environment setting is specifically that the lighting brightness of the experimental environment is changed.
7. The comprehensive test platform for a large-range eye movement tracking and sight line estimation algorithm according to claim 2, characterized in that: the shielding setting is to partially shield the eyes with a shield, and the robustness setting is to shield the left and right eyes respectively, close the eyes, and move the head out of the camera's field of view and then back in.
8. The comprehensive test platform for a large-range eye movement tracking and sight line estimation algorithm according to claim 2, characterized in that: the stability setting is specifically that the face moves over a large range in the clockwise and anticlockwise directions, and the rapidity setting is specifically that bright spots are randomly displayed on the display screen (2).
CN202010400457.7A 2020-05-13 2020-05-13 Comprehensive testing platform for large-range eye movement tracking and sight line estimation algorithm Pending CN111625090A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010400457.7A CN111625090A (en) 2020-05-13 2020-05-13 Comprehensive testing platform for large-range eye movement tracking and sight line estimation algorithm

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010400457.7A CN111625090A (en) 2020-05-13 2020-05-13 Comprehensive testing platform for large-range eye movement tracking and sight line estimation algorithm

Publications (1)

Publication Number Publication Date
CN111625090A true CN111625090A (en) 2020-09-04

Family

ID=72271818

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010400457.7A Pending CN111625090A (en) 2020-05-13 2020-05-13 Comprehensive testing platform for large-range eye movement tracking and sight line estimation algorithm

Country Status (1)

Country Link
CN (1) CN111625090A (en)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070263923A1 (en) * 2004-04-27 2007-11-15 Gienko Gennady A Method for Stereoscopic Measuring Image Points and Device for Carrying Out Said Method
CN201307266Y (en) * 2008-06-25 2009-09-09 韩旭 Binocular sightline tracking device
CN101576771A (en) * 2009-03-24 2009-11-11 山东大学 Scaling method for eye tracker based on nonuniform sample interpolation
CN102547123A (en) * 2012-01-05 2012-07-04 天津师范大学 Self-adapting sightline tracking system and method based on face recognition technology
US20190274543A1 (en) * 2016-05-13 2019-09-12 Erasmus University Medical Center Rotterdam Method for measuring a subject's eye movement and scleral contact lens
CN108230363A (en) * 2017-12-29 2018-06-29 李文清 A kind of human-computer interaction device for target following
CN108992035A (en) * 2018-06-08 2018-12-14 云南大学 The compensation method of blinkpunkt positional shift in a kind of tracking of eye movement
CN109976514A (en) * 2019-03-01 2019-07-05 四川大学 Eye movement data bearing calibration based on eyeball error model

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
赵新灿等: "眼动仪与视线跟踪技术综述" [Zhao Xincan et al.: "A survey of eye trackers and gaze tracking technology"], 《计算机工程与应用》 (Computer Engineering and Applications) *

Similar Documents

Publication Publication Date Title
CN103443742B (en) For staring the system and method with gesture interface
CN109804220B (en) Apparatus and method for tracking head movement
CN103713737B (en) Virtual keyboard system used for Google glasses
AU2013224653B2 (en) Virtual reality display system
Yan et al. Eyes-free target acquisition in interaction space around the body for virtual reality
US20150316981A1 (en) Gaze calibration
CN107743605A (en) For determining the capacitance sensor in eye gaze direction
CN1714388A (en) Body-centric virtual interactive apparatus and method
US10275933B2 (en) Method and apparatus for rendering object for multiple 3D displays
JP6110893B2 (en) Virtual space location designation method, program, recording medium recording program, and apparatus
KR20160060582A (en) Device and method for processing visual data, and related computer program product
JP5952931B1 (en) Computer program
CN104679222A (en) Medical office system based on human-computer interaction, medical information sharing system and method
CN111625090A (en) Comprehensive testing platform for large-range eye movement tracking and sight line estimation algorithm
Yamamoto et al. Development of eye-tracking pen display based on stereo bright pupil technique
Lim et al. Development of gaze tracking interface for controlling 3D contents
Narcizo et al. Remote eye tracking systems: technologies and applications
Maciejewski et al. Testing the SteamVR trackers operation correctness with the OptiTrack system
JP2016181267A (en) Computer program
Lin et al. A novel device for head gesture measurement system in combination with eye-controlled human–machine interface
Miyoshi et al. Input device using eye tracker in human-computer interaction
Boczon State of the art: eye tracking technology and applications
Lee et al. A new eye tracking method as a smartphone interface
Wang et al. Low-cost eye-tracking glasses with real-time head rotation compensation
Le et al. A Practical Method to Eye-tracking on the Phone: Toolkit, Accuracy and Precision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20200904