CN111568367B - Method for identifying and quantifying eye jump invasion - Google Patents

Method for identifying and quantifying eye jump invasion

Info

Publication number
CN111568367B
CN111568367B CN202010404817.0A
Authority
CN
China
Prior art keywords
eye
jump
data
eye movement
invasion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010404817.0A
Other languages
Chinese (zh)
Other versions
CN111568367A (en
Inventor
靳慧斌
楚明健
常银霞
刘海波
冯朝辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Civil Aviation University of China
Original Assignee
Civil Aviation University of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Civil Aviation University of China filed Critical Civil Aviation University of China
Priority to CN202010404817.0A priority Critical patent/CN111568367B/en
Publication of CN111568367A publication Critical patent/CN111568367A/en
Application granted granted Critical
Publication of CN111568367B publication Critical patent/CN111568367B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/11Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for measuring interpupillary distance or diameter of pupils
    • A61B3/112Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for measuring interpupillary distance or diameter of pupils for measuring diameter of pupils

Abstract

The invention discloses a method for identifying and quantifying eye jump invasion, which comprises the following steps: extracting original eye movement data; preprocessing the extracted data; performing eye jumping eye movement detection; identifying gaze eye movement; identifying an eye jump intrusion based on the gaze baseline; data cleaning is carried out; and quantifying the eye jump invasion. The invention provides a method for identifying and quantifying the eye jump invasion based on the current eye movement research technology, which can stably, accurately and effectively identify and quantify the eye jump invasion.

Description

Method for identifying and quantifying eye jump invasion
Technical Field
The invention relates to the technical field of eye movement, in particular to a method for identifying and quantifying eye jump invasion.
Background
In recent years, with the development of artificial intelligence, eye movement technology has become a popular direction and leading-edge technology of current research. The eyes have two distinct movements: gazing and eye jumps. An eye jump invasion is a special eye-jump-type deviation in the horizontal direction that occurs only while the eyes are gazing. Studies have shown that the eye jump invasion index is related to the workload of the operator: as the operator's workload increases, the eye jump invasion index increases.
Current research on eye movement technology focuses mainly on recognizing gazing behavior and eye jump behavior; other eye movement behaviors (e.g., blinks) are recognized relatively rarely. Although existing eye movement behavior recognition algorithms can recognize gazing and eye jump behaviors to a certain extent, they lack accuracy and can rarely quantify eye jump behavior.
Disclosure of Invention
The invention aims to provide a method for identifying and quantifying the eye jump invasion, which solves the problems in the prior art and can accurately quantify the eye jump behavior.
In order to achieve the above object, the present invention provides the following solutions: the invention provides a method for identifying and quantifying eye jump invasion, which comprises the following steps:
s1, extracting original eye movement data;
s2, preprocessing the data extracted in the step S1;
s3, detecting eye jumping eye movement;
s4, recognizing fixation eye movement;
s5, identifying eye jump invasion based on the gazing base line;
s6, cleaning the data obtained in the step S5;
s7, quantifying eye jump invasion.
Preferably, the raw eye movement data in step S1 is obtained by the head-mounted eye movement device and saved as a file in the TXT format.
Preferably, the data preprocessing in step S2 replaces the data points with confidence lower than 0.8, the data points not on the screen, and the missing data points with the median of the ten data points preceding them.
Preferably, the detection of eye-jumped eye movement in step S3 employs an EK algorithm.
Preferably, the eye-jumping eye movement in step S3 includes regular eye-jumping and eye-jumping eye movement deviation.
Preferably, in step S4, the eye-jump-type eye movement deviations are eliminated, leaving the regular eye jumps; the complement of the regular eye jumps is the gazing eye movement.
Preferably, in step S5, two moving median windows are used to identify the eye jump deviation in each segment of gazing eye movement process, so as to obtain the eye jump invasion.
Preferably, the data cleaning in step S6 includes: conventional eye jump residue, unreasonable gaze procedure, and invalid data.
Preferably, in step S7, the data obtained in step S6 are classified into eye jump invasion values and smaller eye movement deviation values; the average of each class is calculated, giving the average eye jump invasion value and the average smaller eye movement deviation value, which completes the quantification of the eye jump invasion.
The invention discloses the following technical effects:
compared with the prior art, the method has the advantages of reliability, stability and accuracy, and the method can stably and accurately identify the eye jump invasion, further quantify the eye jump invasion and facilitate the research on the influence of the eye jump invasion.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions of the prior art, the drawings that are needed in the embodiments will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of the method of the present invention;
fig. 2 is a simplified schematic diagram of an eye-tracker spatial coordinate system and a standardized spatial coordinate system.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
In order that the above-recited objects, features and advantages of the present invention will become more readily apparent, a more particular description of the invention will be rendered by reference to the appended drawings and appended detailed description.
Referring to fig. 1, the present invention provides a method of identifying and quantifying eye-jump intrusions, comprising the steps of:
first, the original eye movement data of the subject is obtained by the head-mounted eye movement device, and referring to fig. 2, since the head-mounted eye movement device is adopted in this embodiment, the origin of the coordinate system of the eye movement device is fixed relative to the head, wherein the right hand direction to be tested is the positive X-axis direction of the coordinate system, the positive Y-axis direction is upward, and the positive Z-axis direction is forward. The face of the tested person is opposite to the center of the screen, and the normal line of the face is perpendicular to the screen. The visual axis is a straight line connecting the position on the screen seen by the eyes and the pupil center, and the point F is the intersection point of the visual axis and the screen, namely the point on the screen observed at the moment. The origin of the standard space coordinate system is the origin of the eye-tracker coordinate system, and the positive direction of the standard space coordinate system is the same as the eye-tracker space coordinate system. The three columns Gaze_NORMA0_x, gaze_NORMA0_y and Gaze_NORMA0_z in the raw data samples are the X-axis, Y-axis and Z-axis coordinates, respectively, of the visual axis in a standardized spatial coordinate system, and the three coordinate axes in the standardized spatial coordinate system range from [ -1,1]. diameter_3d is the pupil diameter detected by the eye tracker, and when the detection area of the pupil is incomplete, the system automatically processes into a circle, giving a diameter and confidence, and when the confidence is greater than or equal to 0.8, the data is credible. 
x_scaled and y_scaled are the screen coordinates, in pixels, of the position observed by the eyes at the eye tracker sampling moment. The resolution of the computer used in this embodiment is 1366x768, so x_scaled ranges over [0, 1366] and y_scaled over [0, 768]. on_srf is a sequence of logical values used to determine whether the eyes are looking within the screen at the sampling moment: 1 means the subject is looking at some place on the screen, and 0 means the subject is looking outside the screen. The raw data are finally saved as a file in TXT format. The types of eye movement data are shown in table 1:
TABLE 1
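A minimal sketch of loading such an export follows. The exact file layout of table 1 is not reproduced in this text, so a tab-separated file with a header row is an assumption here; the column names and documented value ranges are taken from the description above.

```python
import csv

def load_raw(path):
    """Load the TXT export described above and sanity-check the documented
    ranges. Tab-separated values with a header row are an assumption."""
    rows = []
    with open(path, newline="") as f:
        for rec in csv.DictReader(f, delimiter="\t"):
            r = {k: float(v) for k, v in rec.items()}
            # Normalized gaze vector components lie in [-1, 1]
            assert -1.0 <= r["gaze_normal0_x"] <= 1.0
            # Screen coordinates lie within the 1366x768 display
            assert 0.0 <= r["x_scaled"] <= 1366 and 0.0 <= r["y_scaled"] <= 768
            rows.append(r)
    return rows
```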
Secondly, the acquired original eye movement data are preprocessed: points with confidence lower than 0.8, points not on the screen, and missing data are replaced with the median of the ten data points preceding them.
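The preprocessing rule above can be sketched as follows, assuming the samples arrive as NumPy arrays; the function and parameter names are illustrative, not the patent's own:

```python
import numpy as np

def preprocess(x, confidence, on_srf, conf_threshold=0.8, window=10):
    """Replace low-confidence, off-screen, and missing samples with the
    median of the `window` samples preceding each invalid point."""
    x = np.asarray(x, dtype=float).copy()
    confidence = np.asarray(confidence, dtype=float)
    on_srf = np.asarray(on_srf, dtype=float)
    bad = (confidence < conf_threshold) | (on_srf == 0) | np.isnan(x)
    for i in np.flatnonzero(bad):
        lo = max(0, i - window)
        if i > lo:  # need at least one preceding sample to take a median of
            x[i] = np.median(x[lo:i])
    return x
```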
Thirdly, an EK algorithm is adopted to detect the eye-jump-type eye movements. The EK algorithm is essentially a velocity-threshold algorithm. The velocity v_n of each data point is obtained from formula (1):

v_n = (x_{n+2} + x_{n+1} - x_{n-1} - x_{n-2}) / (6Δt)    (1)

where x_{n+2} is the horizontal angle of the second data point after data point n;
x_{n+1} is the horizontal angle of the first data point after data point n;
x_{n-1} is the horizontal angle of the first data point before data point n;
x_{n-2} is the horizontal angle of the second data point before data point n;
Δt is the sampling time interval.
For eye-jump detection in the two-dimensional velocity space, the horizontal velocity threshold σ_x and the vertical velocity threshold σ_y are obtained from formula (2) and formula (3), respectively:

σ_x = sqrt(<v_x²> - <v_x>²)    (2)
σ_y = sqrt(<v_y²> - <v_y>²)    (3)

where the symbol <·> represents the median. The detection thresholds η_x and η_y are multiples of these median-based estimates:

η_x = λσ_x
η_y = λσ_y

where λ is a parameter that needs to be set appropriately for eye jump detection.
A candidate eye-jump-type eye movement t(k) finally detected satisfies formula (4):

(v_{x,k} / η_x)² + (v_{y,k} / η_y)² > 1    (4)

where k denotes the kth data point.
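The detection steps of formulas (1)–(4) can be sketched as below. This is an illustrative implementation of the EK-style adaptive velocity threshold, not the patent's exact code; the value of λ and the sampling interval are example choices.

```python
import numpy as np

def ek_saccade_detection(x, y, dt, lam=6.0):
    """Adaptive velocity-threshold detection of eye-jump-type samples.
    x, y: gaze angles (deg); dt: sampling interval (s); lam: threshold multiple.
    Returns a boolean mask over the n-4 interior points."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    # Five-point velocity estimate, formula (1)
    vx = (x[4:] + x[3:-1] - x[1:-3] - x[:-4]) / (6.0 * dt)
    vy = (y[4:] + y[3:-1] - y[1:-3] - y[:-4]) / (6.0 * dt)
    # Median-based noise estimates, formulas (2) and (3)
    sx = np.sqrt(np.median(vx**2) - np.median(vx) ** 2)
    sy = np.sqrt(np.median(vy**2) - np.median(vy) ** 2)
    # Detection thresholds as multiples of the noise level
    ex, ey = lam * sx, lam * sy
    # Formula (4): samples outside the threshold ellipse are eye-jump samples
    return (vx / ex) ** 2 + (vy / ey) ** 2 > 1.0
```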
In this step, the eye-jump-type eye movements are identified, and the peak eye jump velocity and the maximum eye jump amplitude of each eye jump process are calculated. As the maximum amplitude increases, the peak velocity also increases; plotting the maximum amplitude on the horizontal axis against the peak velocity on the vertical axis reveals a linear trend. The accuracy of the algorithm can therefore be characterized by the main-sequence relation between peak eye jump velocity and maximum eye jump amplitude.
Fourth, the eye-jump-type eye movements are identified; they include regular eye jumps and eye-jump-type eye movement deviations. Eliminating the eye-jump-type deviations yields the regular eye jumps, and the complement of the regular eye jumps is all the gazing eye movement.
Fifth, the eye jump invasions are identified based on the gazing baseline, i.e., the eye-jump-type eye movement deviation data are acquired. Two moving median windows are used to identify the eye-jump-type deviations during each gazing process; these deviations are the eye jump invasions. The two windows are a large moving median window and a small moving median window. The large window, 2000 ms long, takes the median of the data from the 1000 ms before and the 1000 ms after the current data point; however, the window is not continuous: the data in the middle 500 ms do not participate in computing the gazing baseline. The large window is used to determine the basic horizontal gazing position, i.e., the gazing baseline, which represents the trend of a stable line of sight. The small moving median window is 50 ms long and is used to detect real-time horizontal eye position deviations.
An important practical feature of the eye movement deviation quantification algorithm is that it quantifies the eye-jump-type deviations relative to the gazing baseline position. The algorithm does not require the accurate position of the visual target in the physical world, because it automatically detects the gazing process, estimates the gaze point position, and computes the eye-jump-type deviation from that estimate. In this embodiment, the purpose of using two moving median windows of different lengths is to highlight the difference between the basic gazing position represented by the large window and the eye movement deviation position represented by the small window.
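The two-window scheme above can be sketched as follows. The window lengths follow the text (2000 ms baseline with the central 500 ms excluded, 50 ms real-time window); the function name and the per-sample loop are illustrative rather than the patent's implementation.

```python
import numpy as np

def gaze_deviation(x, fs, large_ms=2000, gap_ms=500, small_ms=50):
    """Deviation of a short median window from a gapped long-window baseline.
    x: horizontal gaze position (deg); fs: sampling rate (Hz)."""
    x = np.asarray(x, float)
    half = int(round((large_ms / 2) / 1000 * fs))   # 1000 ms on each side
    gap = int(round((gap_ms / 2) / 1000 * fs))      # 250 ms excluded each side
    small = max(1, int(round(small_ms / 1000 * fs)))
    n = len(x)
    dev = np.zeros(n)
    for i in range(n):
        # Large window: data 1000 ms before and after, minus the middle 500 ms
        left = x[max(0, i - half): max(0, i - gap)]
        right = x[min(n, i + gap): min(n, i + half)]
        ctx = np.concatenate([left, right])
        if ctx.size == 0:
            continue
        baseline = np.median(ctx)                   # gazing baseline
        lo = max(0, i - small // 2)
        current = np.median(x[lo: lo + small])      # real-time eye position
        dev[i] = current - baseline
    return dev
```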
Sixth, the acquired eye-jump-type eye movement deviation data are cleaned. The following data are cleaned: regular eye jump residue, unreasonable gazing processes, and invalid data, so that the final quantified data contain only the eye movement deviations of the gazing process.
Deviation values corresponding to regular eye jump residue: for each detected regular eye jump, a start time and an end time are marked, but the actual start may fall outside the marked period because of the fixed sampling interval of the eye tracker, and this cannot be corrected exactly. The residual deviation values are large, but they are not caused by gazing eye movement deviation, so they must be cleaned. For each detected regular eye jump, the deviation values corresponding to the eye jump itself, plus one data point before and one after it, are set to 0.
Deviation values corresponding to unreasonable gazing processes: a gazing process shorter than 1000 ms is called an unreasonable gazing process, because the large moving median window cannot obtain enough data from it. The corresponding deviation values therefore need to be cleared by setting them to 0.
Invalid data: the missing data that were replaced during preprocessing. Their deviation values are produced by the replacement process rather than by gazing eye movement deviation, so they are cleaned by setting the corresponding deviation values to 0.
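The three cleaning rules above can be sketched as one pass over the deviation trace. The input masks and the dictionary of fixation durations are illustrative data structures, not the patent's own.

```python
import numpy as np

def clean_deviations(dev, saccade_mask, fix_labels, fix_dur_s,
                     invalid_mask, min_fix_s=1.0):
    """Zero out deviation samples that do not reflect genuine gazing drift.
    dev: deviation per sample; saccade_mask: regular-eye-jump samples;
    fix_labels: fixation id per sample; fix_dur_s: {id: duration in s};
    invalid_mask: samples filled in during preprocessing."""
    dev = np.asarray(dev, float).copy()
    sac = np.asarray(saccade_mask, bool)
    # 1) regular eye jump residue: the jump span plus one sample on each side
    grown = np.zeros_like(sac)
    for i in np.flatnonzero(sac):
        grown[max(0, i - 1): i + 2] = True
    dev[grown] = 0.0
    # 2) gazing processes shorter than 1000 ms cannot feed the large window
    fix_labels = np.asarray(fix_labels)
    for label, dur in fix_dur_s.items():
        if dur < min_fix_s:
            dev[fix_labels == label] = 0.0
    # 3) samples that were replaced during preprocessing
    dev[np.asarray(invalid_mask, bool)] = 0.0
    return dev
```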
Seventh, the cleaned eye-jump-type eye movement deviation data are classified according to the specified standard into two classes: deviations with an amplitude of 0.4–4.1 degrees are called eye jump invasions, and deviations with an amplitude below 0.4 degrees are called smaller eye movement deviations, or micro eye jumps. The average of each class is then calculated. The average eye jump invasion value is the sum of the deviation values of the eye jump invasion data points divided by the corresponding time, in deg/s. After processing the data of each experiment, the eye movement deviation quantification algorithm outputs two values: the average eye jump invasion value and the average smaller eye movement deviation value, which completes the quantification of the eye jump invasion.
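The classification and averaging step can be sketched as below. The 0.4–4.1 degree band follows the text; treating the "corresponding time" as (number of invasion samples x sampling interval) is an assumption about how the sum is normalized.

```python
import numpy as np

def quantify_intrusions(dev, dt, lo=0.4, hi=4.1):
    """Split cleaned deviations into eye jump invasions (lo..hi deg) and
    smaller deviations / micro eye jumps (< lo deg); return the average
    invasion value (sum over corresponding time, deg/s) and the average
    smaller deviation (deg)."""
    dev = np.abs(np.asarray(dev, float))
    intr = dev[(dev >= lo) & (dev <= hi)]
    small = dev[(dev > 0) & (dev < lo)]
    mean_intrusion = intr.sum() / (len(intr) * dt) if len(intr) else 0.0
    mean_small = small.mean() if len(small) else 0.0
    return mean_intrusion, mean_small
```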
The accuracy of the identification method can be judged from the main-sequence relation between the identified peak eye jump velocity and the maximum amplitude. The invention adopts the EK algorithm; compared with a fixed velocity-threshold algorithm, the velocity threshold of the EK algorithm is obtained adaptively rather than set manually, so the method is more reliable. Because the threshold is adaptive, the method suits different people and different data sampling schemes, and its stability is good.
In the description of the present invention, it should be understood that the terms "longitudinal," "transverse," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like indicate or are based on the orientation or positional relationship shown in the drawings, merely to facilitate description of the present invention, and do not indicate or imply that the devices or elements referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus should not be construed as limiting the present invention.
The above embodiments are only illustrative of the preferred embodiments of the present invention and are not intended to limit the scope of the present invention, and various modifications and improvements made by those skilled in the art to the technical solutions of the present invention should fall within the protection scope defined by the claims of the present invention without departing from the design spirit of the present invention.

Claims (8)

1. A method of identifying and quantifying eye-hop intrusion, comprising: the method comprises the following steps:
s1, extracting original eye movement data;
s2, preprocessing the data extracted in the step S1, wherein the preprocessing comprises the following steps: replacing the point with the confidence coefficient lower than 0.8, the point which is not on the screen and the missing data with the median value of the first ten data of the point;
s3, detecting eye jumping eye movement to obtain eye jumping eye movement;
s4, based on the eye jump eye movement, identifying fixation eye movement;
s5, identifying eye jump invasion to the gazing eye movement based on a gazing base line, namely acquiring eye jump type eye movement deviation data;
s6, cleaning the data obtained in the step S5;
s7, quantifying eye jump invasion, classifying the data obtained in the step 6 to obtain all eye jump invasion values and smaller eye movement deviation values, wherein the eye movement deviation with the deviation amplitude of 0.4-4.1 degrees is called eye jump invasion, the eye movement deviation with the deviation amplitude of less than 0.4 degrees is called smaller eye movement deviation or micro eye jump, calculating the average value of the eye jump invasion values and the smaller eye movement deviation values, wherein the average eye jump invasion value is obtained by dividing the sum of deviation values of eye jump invasion data points by corresponding time, the unit is deg/S, and the eye movement deviation quantification algorithm can output two values after processing the data of each experiment: and (5) averaging the eye jump invasion values, and averaging the smaller eye movement deviation values to finish the eye jump invasion quantification.
2. The method of identifying and quantifying an eye-jump intrusion according to claim 1, wherein: the original eye movement data in step S1 is obtained by the head-mounted eye movement device and stored as a TXT format file.
3. The method of identifying and quantifying an eye-jump intrusion according to claim 1, wherein: the data preprocessing in step S2 is to replace the data points with confidence below the predetermined threshold, the data points not on the screen and the missing data points.
4. The method of identifying and quantifying an eye-jump intrusion according to claim 1, wherein: in the step S3, an EK algorithm is used for detecting the eye jump type eye movement.
5. The method of identifying and quantifying an eye-jump intrusion according to claim 1, wherein: the eye-jumping eye movements in the step S3 include regular eye-jumping and eye-jumping eye movement deviation.
6. The method of identifying and quantifying an eye-jump intrusion according to claim 1, wherein: and in the step S4, the eye jump deviation is eliminated, so that the normal eye jump is obtained, and the complement of the normal eye jump is the gazing eye movement.
7. The method of identifying and quantifying an eye-jump intrusion according to claim 1, wherein: in the step S5, two moving median windows are utilized to identify the eye jump type eye movement deviation in each section of gazing eye movement process, and then the eye jump invasion is obtained.
8. The method of identifying and quantifying an eye-jump intrusion according to claim 1, wherein: the step S6 of data cleaning includes: conventional eye jump residue, unreasonable gaze procedure, and invalid data.
CN202010404817.0A 2020-05-14 2020-05-14 Method for identifying and quantifying eye jump invasion Active CN111568367B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010404817.0A CN111568367B (en) 2020-05-14 2020-05-14 Method for identifying and quantifying eye jump invasion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010404817.0A CN111568367B (en) 2020-05-14 2020-05-14 Method for identifying and quantifying eye jump invasion

Publications (2)

Publication Number Publication Date
CN111568367A CN111568367A (en) 2020-08-25
CN111568367B true CN111568367B (en) 2023-07-21

Family

ID=72121028

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010404817.0A Active CN111568367B (en) 2020-05-14 2020-05-14 Method for identifying and quantifying eye jump invasion

Country Status (1)

Country Link
CN (1) CN111568367B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08279247A (en) * 1995-04-05 1996-10-22 Fujitsu Ltd Method and device for processing reproduced signal as well as disk device
CN103713728A (en) * 2014-01-14 2014-04-09 东南大学 Method for detecting usability of complex system human-machine interface
CN104898823A (en) * 2014-03-04 2015-09-09 中国电信股份有限公司 Method and device for controlling sighting mark motion
WO2016112690A1 (en) * 2015-01-14 2016-07-21 北京工业大学 Eye movement data based online user state recognition method and device
CN109124657A (en) * 2013-03-11 2019-01-04 亚特兰大儿童医疗保健公司 For recognizing and the system and method for the detection of development condition
CN109199411A (en) * 2018-09-28 2019-01-15 南京工程学院 Case insider's recognition methods based on Model Fusion
CN109199412A (en) * 2018-09-28 2019-01-15 南京工程学院 Abnormal emotion recognition methods based on eye movement data analysis
CN109933193A (en) * 2019-03-01 2019-06-25 北京体育大学 Intelligent auxiliary training system based on sportsman's eye movement information real-time capture
CN109925678A (en) * 2019-03-01 2019-06-25 北京七鑫易维信息技术有限公司 A kind of training method based on eye movement tracer technique, training device and equipment

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08279247A (en) * 1995-04-05 1996-10-22 Fujitsu Ltd Method and device for processing reproduced signal as well as disk device
CN109124657A (en) * 2013-03-11 2019-01-04 亚特兰大儿童医疗保健公司 For recognizing and the system and method for the detection of development condition
CN103713728A (en) * 2014-01-14 2014-04-09 东南大学 Method for detecting usability of complex system human-machine interface
CN104898823A (en) * 2014-03-04 2015-09-09 中国电信股份有限公司 Method and device for controlling sighting mark motion
WO2016112690A1 (en) * 2015-01-14 2016-07-21 北京工业大学 Eye movement data based online user state recognition method and device
CN109199411A (en) * 2018-09-28 2019-01-15 南京工程学院 Case insider's recognition methods based on Model Fusion
CN109199412A (en) * 2018-09-28 2019-01-15 南京工程学院 Abnormal emotion recognition methods based on eye movement data analysis
CN109933193A (en) * 2019-03-01 2019-06-25 北京体育大学 Intelligent auxiliary training system based on sportsman's eye movement information real-time capture
CN109925678A (en) * 2019-03-01 2019-06-25 北京七鑫易维信息技术有限公司 A kind of training method based on eye movement tracer technique, training device and equipment

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
郑秀娟. Research progress on eye movement studies of driving safety based on visual characteristics. Technology and Innovation Management, 2018, 39(39): 50-59. *
靳慧斌. Visual search characteristics in security screening based on eye movement data analysis. Industrial Engineering Journal, 17(17): 43-48. *

Also Published As

Publication number Publication date
CN111568367A (en) 2020-08-25

Similar Documents

Publication Publication Date Title
CN111611905B (en) Visible light and infrared fused target identification method
US7742621B2 (en) Dynamic eye tracking system
CN107292252A (en) A kind of personal identification method of autonomous learning
WO2019136641A1 (en) Information processing method and apparatus, cloud processing device and computer program product
CN112464793A (en) Method, system and storage medium for detecting cheating behaviors in online examination
CN103164315A (en) Computer using time prompting method and system based on intelligent video analysis
CN111616718A (en) Method and system for detecting fatigue state of driver based on attitude characteristics
CN114005167A (en) Remote sight estimation method and device based on human skeleton key points
CN111914665A (en) Face shielding detection method, device, equipment and storage medium
CN106127814A (en) A kind of wisdom gold eyeball identification gathering of people is fought alarm method and device
CN111985328A (en) Unsafe driving behavior detection and early warning method based on facial feature analysis
CN112733772A (en) Real-time cognitive load and fatigue degree detection method and system in storage sorting task
CN110097012B (en) Fatigue detection method for monitoring eye movement parameters based on N-range image processing algorithm
CN113989788A (en) Fatigue detection method based on deep learning and multi-index fusion
CN112926522B (en) Behavior recognition method based on skeleton gesture and space-time diagram convolution network
CN111568367B (en) Method for identifying and quantifying eye jump invasion
CN114120188A (en) Multi-pedestrian tracking method based on joint global and local features
CN105335695A (en) Glasses detection based eye positioning method
CN110909687B (en) Action feature validity determination method, computer storage medium, and electronic device
Gao et al. Non-invasive eye tracking technology based on corneal reflex
CN113534945A (en) Method, device and equipment for determining eyeball tracking calibration coefficient and storage medium
CN109993118B (en) Action recognition method and system
CN114468977B (en) Ophthalmologic vision examination data collection and analysis method, system and computer storage medium
CN115393830A (en) Fatigue driving detection method based on deep learning and facial features
CN108363974B (en) Foot root detection method for computer vision gait recognition for social security management

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant