CN117649157B - Instrument discrimination capability assessment method based on sight tracking - Google Patents

Instrument discrimination capability assessment method based on sight tracking

Info

Publication number
CN117649157B
CN117649157B (Application CN202410126043.8A)
Authority
CN
China
Prior art keywords
normal distribution
test question
tester
test
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202410126043.8A
Other languages
Chinese (zh)
Other versions
CN117649157A (en)
Inventor
王秀超
刘旭峰
武圣君
王卉
郭亚宁
尉怀怀
杨天奇
殷梦馨
邹明萱
何育清
王灿灿
王雪峰
徐翔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Air Force Medical University of PLA
Original Assignee
Air Force Medical University of PLA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Air Force Medical University of PLA filed Critical Air Force Medical University of PLA
Priority to CN202410126043.8A priority Critical patent/CN117649157B/en
Publication of CN117649157A publication Critical patent/CN117649157A/en
Application granted granted Critical
Publication of CN117649157B publication Critical patent/CN117649157B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The application relates to the field of gaze tracking, and in particular to an instrument discrimination capability assessment method based on gaze tracking, comprising the following steps: acquiring the gaze-point coordinates for each test question; taking the region of the correct option as the center of a normal distribution over the option region to generate a normal distribution curve, and normalizing that curve to obtain a normal distribution function; substituting the gaze-point coordinates into the normal distribution function to obtain the accuracy of each gaze point; calculating the performance value of the tester, and traversing each test question to obtain ternary data for all test questions in one round of testing, the ternary data comprising the flight instrument parameter, performance value, and scoring result of each test question; forming tuples from the normalized performance values and scoring results, and classifying the tuples into several evaluation categories; and calculating the mean performance value within an evaluation category and taking that mean as the evaluation result value. The method improves the accuracy with which a pilot's discrimination capability is assessed.

Description

Instrument discrimination capability assessment method based on sight tracking
Technical Field
The application relates to the field of gaze tracking, in particular to an instrument discrimination capability assessment method based on gaze tracking.
Background
Flight attitude refers to the state of an aircraft's three axes in the air relative to a reference plane or fixed coordinate system; the attitude may be pitch, roll, yaw, etc. Pitch is the rotation of the aircraft up or down in the vertical plane, and the pitch angle is the fore-aft inclination of the fuselage. Roll is rotation about the longitudinal axis, and the roll angle is the side-to-side inclination of the fuselage. Yaw is rotation about the vertical axis, and the yaw angle is the deviation of the fuselage in the left-right direction. Through the various parameters and indicators on the flight instruments, the pilot obtains important information about the flight attitude, such as the aircraft's altitude, speed, and inclination.
During flight, the pilot must judge the aircraft's flight attitude from the flight instruments. If the pilot cannot accurately grasp the flight attitude through the instruments, the attitude will be misjudged, and the resulting misoperation can easily create flight risk. Pilots therefore take aircraft attitude determination tests, and the accuracy of their judgments is examined by evaluating their instrument discrimination capability.
Existing instrument-panel discrimination capability assessment methods usually take only the pilot's accuracy as the evaluation result. Since an aircraft usually flies at high speed, discrimination capability should also be assessed on judgment speed, so that targeted training can be arranged for whatever the current pilot performs poorly on.
Disclosure of Invention
In order to incorporate judgment speed into the assessment of a pilot's aircraft-attitude judgment capability, and thereby improve the accuracy of pilot capability evaluation, the application provides an instrument discrimination capability assessment method based on gaze tracking, adopting the following technical scheme:
the instrument discrimination capability assessment method based on the sight tracking comprises the following steps: acquiring position data of a gaze point of a tester, and generating gaze point coordinates of each test question, wherein each test question is a selection question, and options of each test question are distributed in the same row; taking the region with the correct option as a normal distribution center, carrying out normal distribution on the option region according to a preset normal distribution model to generate a normal distribution curve, and carrying out normalization processing on the normal distribution curve to obtain a normal distribution function; substituting the gaze point coordinates into a normal distribution function to obtain the correct rate of the gaze point; calculating the performance value of the tester, wherein the calculation formula is as follows:wherein->For representing value, ->Is a step function of 0-1->Is->The correct rate of the gaze point of the question, +.>Is a preset super parameter->Indicating that the pilot is currently doing the first/>Time spent in each test question +.>Is->Mean value of->Is->Is a variance of (2); traversing each test question to obtain ternary data of all the test questions in one round of test, wherein the ternary data comprises flight instrument parameters, a representation value and a grading result of each test question, and in the grading result, a correct answer mark is 1, and a wrong answer mark is 0; normalizing the performance values in the ternary data, forming a binary group by the normalized performance values and the grading result of each test question, and classifying the binary group according to a classification algorithm to obtain a plurality of evaluation categories; and calculating the average value of the representation values in the evaluation category, and taking the average value of the representation values as the evaluation result value of the current tester on the 
judging capability of the instrument.
Optionally, the method for obtaining the position data of the gaze point of the tester and generating the gaze point coordinate of each test question includes the steps of: acquiring a face image of a tester when the tester participates in the test, and acquiring the fixation point coordinates of the tester on a test question interface according to a sight tracking method; and generating a fixation point coordinate sequence of each test question according to the fixation point coordinates of the test question interface.
Optionally, the option region is distributed according to a preset normal distribution model to generate a normal distribution curve, and the normal distribution curve is normalized to obtain a normal distribution function; the normalization method is: multiply the normal distribution curve by the reciprocal of its peak value.
Optionally, the classification algorithm is an ISODATA algorithm.
Optionally, the condition for stopping the iteration of the ISODATA algorithm is: check whether each evaluation category contains only a single, uniform scoring result (all correct or all wrong); if so, stop the iteration, otherwise continue.
The application has the following technical effects:
1. The gaze-point coordinates are acquired by gaze tracking; the region of the correct option is taken as the center of a normal distribution to generate a normal distribution function; the gaze-point coordinates are substituted into this function to compute the gaze-point accuracy and then the tester's performance value. The pilot's aircraft-attitude judgment capability is thus judged from both the judgment speed and the correctness of the answers during testing.
2. A step function is introduced into the calculation of the performance value, reducing the interference that the different orders in which testers read the options would otherwise cause to the accuracy of the performance value.
3. When a tester's evaluation result values on test questions involving a certain flight instrument parameter are lower than those on test questions involving other parameters, a set of test questions involving that parameter is generated and pushed to the tester, enabling targeted training on the current tester's weaknesses.
Drawings
The above, as well as additional purposes, features, and advantages of exemplary embodiments of the present application will become readily apparent from the following detailed description when read in conjunction with the accompanying drawings. In the drawings, several embodiments of the present application are shown by way of example and not by way of limitation, and identical or corresponding reference numerals indicate identical or corresponding parts.
Fig. 1 is a flowchart of the gaze-tracking-based instrument discrimination capability assessment method of the present application.
Fig. 2 is a flowchart of step S1 in the gaze-tracking-based instrument discrimination capability assessment method of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all, of the embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
It should be understood that when the terms "first," "second," and the like are used in the claims, specification, and drawings of this application, they are used merely for distinguishing between different objects and not for describing a particular sequential order. The terms "comprises" and "comprising," when used in the specification and claims of this application, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The embodiment of the application discloses an instrument discrimination capability assessment method based on line-of-sight tracking, referring to fig. 1, comprising steps S1-S7, specifically as follows:
s1: and acquiring position data of the gaze point of the tester, and generating gaze point coordinates of each test question, wherein each test question is a selection question, and the options of each test question are distributed in the same row. Referring to fig. 2, the method includes steps S10 to S11, specifically as follows:
s10: and acquiring a face image of the tester when the tester participates in the test, and acquiring the fixation point coordinates of the tester on the test question interface according to the sight tracking method.
The tester is the pilot. The test tool is a computer: a camera is mounted on the display screen that presents the questions, the tester's face faces the camera while reading the questions, and the camera captures the tester's facial image.
The gaze tracking method works as follows: the camera captures the tester's facial image; the captured image is processed to locate the eyes, and a two-dimensional coordinate system of the eye positions is established. This eye-position coordinate system can be mapped to a three-dimensional real-space coordinate system, which in turn corresponds to the two-dimensional coordinate system of the gaze point on the display screen, so the gaze-point coordinates on the screen can be obtained from the eye positions. This part is prior art and is not described further here.
S11: and generating a fixation point coordinate sequence of each test question according to the fixation point coordinates of the test question interface.
After the gaze-point coordinates on the test question interface are obtained, they are sorted in time order to obtain the gaze-point coordinate sequence. In this sequence, if the tester's gaze dwells longer on the correct answer and the tester answers the question correctly, the current tester's judgment of the flight attitude is shown to be more accurate.
If the gaze sweeps past the correct answer but dwells longer on wrong options, or the question takes longer, then a correct answer shows that the current tester judged the flight attitude correctly but hesitated; the longer the hesitation, the worse the pilot's judgment of the flight attitude.
If the tester answers incorrectly, the current tester's judgment of the flight attitude is extremely poor; were such an error to occur during a pilot's flight mission, it could cause serious losses.
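The time-ordered gaze-point coordinate sequence of step S11 can be sketched as follows; this is a minimal illustration, and the function name and the (timestamp, x, y) sample format are assumptions, not part of the patent:

```python
def gaze_sequence(samples):
    """Order raw gaze samples by timestamp into a per-question sequence.

    `samples` is a list of (timestamp, x, y) tuples collected while one
    test question is on screen; the result is the time-ordered list of
    (x, y) gaze-point coordinates described in step S11.
    """
    return [(x, y) for _, x, y in sorted(samples, key=lambda s: s[0])]
```

Dwell time on an option region can then be estimated from how many consecutive samples in this sequence fall inside that region.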
S2: and taking the region with the correct option as a normal distribution center, carrying out normal distribution on the option region according to a preset normal distribution model to generate a normal distribution curve, and carrying out normalization processing on the normal distribution curve to obtain a normal distribution function.
Since the options of a test question are arranged in the same row, taking the center coordinate of the correct option's region as the center of the normal distribution yields a two-dimensional normal distribution curve in three-dimensional space: two of the dimensions are those of the abscissa and ordinate of the gaze point in the test-question interface coordinates, and the third is the dimension of the normal distribution value.
The normal distribution model is a Gaussian model, with the region of the correct option as the distribution center. Because the test-question interface is two-dimensional, the distribution is a two-dimensional normal distribution, and the variance in each of the two dimensions is 1.
The normal distribution curve is normalized to obtain the normal distribution function; the normalization method is: multiply the normal distribution curve by the reciprocal of its peak value.
S3: substituting the gaze point coordinates into a normal distribution function to obtain the correct rate of the gaze point.
The correct rate of the gaze point is a normal distribution value corresponding to the gaze point coordinates.
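Steps S2 and S3 — a two-dimensional Gaussian centered on the correct option, normalized so that its peak equals 1, then evaluated at a gaze point — can be sketched as follows; the function name and signature are illustrative, while the unit variances follow the description above:

```python
import math

def gaze_accuracy(gx, gy, cx, cy, sigma_x=1.0, sigma_y=1.0):
    """Peak-normalized 2-D Gaussian centered on the correct option.

    Multiplying by the reciprocal of the peak is equivalent to dropping
    the usual 1/(2*pi*sigma_x*sigma_y) normalizing constant, so the
    value at the center (cx, cy) is exactly 1 and decays toward 0 as
    the gaze point (gx, gy) moves away.
    """
    return math.exp(-((gx - cx) ** 2 / (2 * sigma_x ** 2)
                      + (gy - cy) ** 2 / (2 * sigma_y ** 2)))
```

A gaze point exactly on the correct option's center thus scores 1.0, and gaze points on neighboring options score progressively less.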
S4: the representative value of the tester is calculated.
The expression value is calculated as follows:wherein->For representing value, ->The larger the value of (2) is, the more the current tester is doing +>The better the effect of making the questions.
For a step function of 0-1, before the gaze point does not reach the center point of the correct options area,the value of (2) is 0, and the interference caused by different orders of the options read by the testers on the accuracy of the representation values is reduced.
Is->The correct rate of the gaze point of the question, +.>Is a preset super parameter for adjusting the error of the gaze point, +.>The value range of (2) is +.>Exemplary, the present application->、/>Or->
Indicating that the current pilot does->Time spent in each test question +.>In seconds, (-)>The bigger the pilot is, the more hesitation is indicated, and the worse the effect of doing the questions is.
Is->Mean value of->The larger the value of (c) indicates that the gaze result is biased toward the correct answer, the higher the performance value. />Is->Variance of (2),/>The smaller the value of (c) is, the lower the degree of hesitation of the process of gazing is.
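The patent's formula itself is an image not reproduced in this text, so the following is only a hypothetical sketch that combines the described ingredients — the 0-1 step factor, the mean and variance of the gaze-point accuracies, the hyper-parameter, and the time spent — in one plausible way; the actual combination in the patent may differ:

```python
def performance_value(accuracies, reached_center, t, alpha=0.5):
    """Hypothetical performance value combining the described factors.

    accuracies     : gaze-point accuracies p_1..p_n for one question
    reached_center : True once the gaze has reached the correct-option
                     center (the 0-1 step function gates the score)
    t              : time spent on the question, in seconds
    alpha          : preset hyper-parameter (assumed in (0, 1])

    Larger mean accuracy raises the value; longer time and larger
    variance (hesitation) lower it.  This particular combination is an
    assumption, since the patent's formula is not reproduced in text.
    """
    n = len(accuracies)
    mean = sum(accuracies) / n
    var = sum((p - mean) ** 2 for p in accuracies) / n
    step = 1.0 if reached_center else 0.0
    return step * alpha * mean / (t * (1.0 + var))
```

Under this sketch, a fast answer with gaze focused on the correct option scores higher than a slow answer with scattered gaze, matching the qualitative behavior the description attributes to each symbol.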
S5: traversing each test question to obtain ternary data of all the test questions in one round of test, wherein the ternary data comprises flight instrument parameters, a representation value and a grading result of each test question, and in the grading result, the correct answer is marked as 1, and the incorrect answer is marked as 0.
S6: normalizing the performance values in the ternary data, forming a binary group by the normalized performance values and the grading result of each test question, and classifying the binary group according to a classification algorithm to obtain a plurality of evaluation categories.
The performance values in each round's ternary data are normalized with the Sigmoid function, reducing the poor classification results that an overly large range of performance values would cause when classifying the tuples. The Sigmoid function is an S-shaped function, usable as a neural-network activation function, that maps variables into the interval (0, 1).
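The Sigmoid normalization of the performance values can be sketched as follows (the function name is illustrative):

```python
import math

def normalize_performance(values):
    """Map raw performance values into (0, 1) with the Sigmoid function,
    so widely spread values do not dominate the later clustering."""
    return [1.0 / (1.0 + math.exp(-v)) for v in values]
```

The mapping is monotonic, so the relative ordering of testers' performance values is preserved while their spread is bounded.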
The classification algorithm is the ISODATA algorithm (Iterative Self-Organizing Data Analysis Techniques Algorithm), which divides a set of data points into several clusters of similar characteristics, automatically adjusting the number of clusters and their center positions through continuous iteration to cluster the data. The condition for stopping the iteration in the present application is: check whether each evaluation category contains only a single, uniform scoring result (all correct or all wrong); if so, stop the iteration, otherwise continue.
Exemplarily, the parameters of the ISODATA algorithm are as follows (several numeric values are rendered as images in the source and are not reproduced here): the expected number of clusters; the initially set number of clusters; the minimum number of samples per category — if a category holds fewer than 15 samples, it is removed; the maximum standard deviation of the sample features within one category; the minimum distance between the centers of two categories — if this distance is less than 5, the two categories are merged; the maximum number of category pairs that may be merged in one merge operation; and the number of iterations.
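A complete ISODATA implementation is too long to sketch here, but the application's stopping condition — every evaluation category holding a single, uniform scoring result — can be illustrated as follows; the data layout (each category as a list of (performance, score) pairs) and the function name are assumptions:

```python
def categories_are_pure(categories):
    """Return True when, in every evaluation category, the scoring
    results of the contained (performance, score) pairs are uniform —
    all 1 (correct) or all 0 (wrong) — which is the condition under
    which the ISODATA iteration stops in this application."""
    return all(len({score for _, score in cat}) <= 1 for cat in categories)
```

In an ISODATA loop, this check would run after each split/merge pass; iteration continues until it returns True or the iteration limit is reached.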
S7: and calculating the average value of the representation values in the evaluation category, and taking the average value of the representation values as the evaluation result value of the current tester on the judging capability of the instrument.
Since the pilot has serious consequences caused by the error in the judgment of the flight attitude when flying the pilot, the error in the judgment of the flight attitude is not allowed, and if the scoring result of 0 in the evaluation categories does not exist, the category with the lowest average value of the representation values when the scoring result of a single test question is 1 in all the evaluation categories is obtained as the evaluation result value.
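The selection of the evaluation result value described above can be sketched as follows, again assuming each category is a list of (normalized performance value, scoring result) pairs — a data-layout assumption, not the patent's own representation:

```python
def evaluation_result(categories):
    """Among the evaluation categories whose scoring results are all 1
    (every answer correct), return the lowest mean performance value,
    which the method takes as the conservative evaluation result.

    Assumes at least one all-correct category exists, per the rule
    that misjudgment (score 0) is not allowed.
    """
    means = [sum(p for p, _ in cat) / len(cat)
             for cat in categories
             if all(score == 1 for _, score in cat)]
    return min(means)
```

Taking the minimum over the all-correct categories yields a pessimistic estimate: the tester is rated by the weakest cluster of questions that were still answered correctly.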
In the ternary data, each flight instrument parameter corresponds to one performance value and one scoring result. When a tester's evaluation result values on test questions involving one flight instrument parameter are lower than those on questions involving other parameters, a set of test questions involving that parameter is generated and pushed to the tester.
While various embodiments of the present application have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Many modifications, changes, and substitutions will occur to those skilled in the art without departing from the spirit and scope of the application. It should be understood that various alternatives to the embodiments described herein may be employed in practicing the application.
The foregoing are all preferred embodiments of the present application, and are not intended to limit the scope of the present application in any way, therefore: all equivalent changes in structure, shape and principle of this application should be covered in the protection scope of this application.

Claims (5)

1. The instrument discrimination capability assessment method based on the sight tracking is characterized by comprising the following steps:
acquiring position data of a tester's gaze point, and generating gaze-point coordinates for each test question, wherein each test question is a multiple-choice question whose options are arranged in the same row;
taking the region with the correct option as a normal distribution center, carrying out normal distribution on the option region according to a preset normal distribution model to generate a normal distribution curve, and carrying out normalization processing on the normal distribution curve to obtain a normal distribution function;
substituting the gaze point coordinates into a normal distribution function to obtain the correct rate of the gaze point;
calculating the performance value of the tester according to a formula (rendered as an image in the source and not reproduced here),
whose symbols denote: the performance value; a 0-1 step function; the accuracy of the i-th gaze point of the question; a preset hyper-parameter; the time the current tester spends on the test question; and the mean and variance of the gaze-point accuracies;
traversing each test question to obtain ternary data of all the test questions in one round of testing, wherein the ternary data comprise the flight instrument parameter, performance value, and scoring result of each test question, and in the scoring result a correct answer is marked 1 and a wrong answer 0;
normalizing the performance values in the ternary data, forming a tuple from each test question's normalized performance value and scoring result, and classifying the tuples according to a classification algorithm to obtain a plurality of evaluation categories;
and calculating the average of the performance values in the evaluation category, and taking that average as the current tester's evaluation result value for the discrimination capability of the instrument.
2. The vision-tracking-based instrument discrimination capability evaluating method according to claim 1, wherein acquiring position data of a gaze point of a tester, generating gaze point coordinates of each test question, includes the steps of:
acquiring a face image of a tester when the tester participates in the test, and acquiring the fixation point coordinates of the tester on a test question interface according to a sight tracking method;
and generating a fixation point coordinate sequence of each test question according to the fixation point coordinates of the test question interface.
3. The method for evaluating the discrimination capability of the instrument based on the line-of-sight tracking according to claim 1, wherein the option area is distributed according to a preset normal distribution model to generate a normal distribution curve, and the normal distribution curve is normalized to obtain a normal distribution function; the normalization method comprises: multiplying the normal distribution curve by the reciprocal of its peak value.
4. A method for evaluating discrimination capability of a sight-line tracking-based meter according to any one of claims 1 to 3, wherein the classification algorithm is an ISODATA algorithm.
5. The gaze tracking-based meter discriminant ability assessment method of claim 4, wherein the condition for stopping the iteration of the ISODATA algorithm is: checking whether each evaluation category contains only a single, uniform scoring result (all correct or all wrong); if so, stopping the iteration, otherwise continuing.
CN202410126043.8A 2024-01-30 2024-01-30 Instrument discrimination capability assessment method based on sight tracking Active CN117649157B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410126043.8A CN117649157B (en) 2024-01-30 2024-01-30 Instrument discrimination capability assessment method based on sight tracking

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410126043.8A CN117649157B (en) 2024-01-30 2024-01-30 Instrument discrimination capability assessment method based on sight tracking

Publications (2)

Publication Number Publication Date
CN117649157A CN117649157A (en) 2024-03-05
CN117649157B true CN117649157B (en) 2024-03-29

Family

ID=90048185

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410126043.8A Active CN117649157B (en) 2024-01-30 2024-01-30 Instrument discrimination capability assessment method based on sight tracking

Country Status (1)

Country Link
CN (1) CN117649157B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014081913A (en) * 2012-09-27 2014-05-08 Dainippon Printing Co Ltd Questionnaire analysis device, questionnaire analysis system, questionnaire analysis method and program
CN107992896A (en) * 2017-12-13 2018-05-04 东南大学 A kind of scientific concept evaluating method that tracer technique is moved based on eye
JP2021096325A (en) * 2019-12-16 2021-06-24 日本電気株式会社 Management device, terminal, management system, and method for management
CN115222427A (en) * 2022-07-29 2022-10-21 平安科技(深圳)有限公司 Artificial intelligence-based fraud risk identification method and related equipment
CN115607153A (en) * 2022-09-06 2023-01-17 北京工业大学 Psychological scale answer quality evaluation system and method based on eye movement tracking

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014081913A (en) * 2012-09-27 2014-05-08 Dainippon Printing Co Ltd Questionnaire analysis device, questionnaire analysis system, questionnaire analysis method and program
CN107992896A (en) * 2017-12-13 2018-05-04 东南大学 A kind of scientific concept evaluating method that tracer technique is moved based on eye
JP2021096325A (en) * 2019-12-16 2021-06-24 日本電気株式会社 Management device, terminal, management system, and method for management
CN115222427A (en) * 2022-07-29 2022-10-21 平安科技(深圳)有限公司 Artificial intelligence-based fraud risk identification method and related equipment
CN115607153A (en) * 2022-09-06 2023-01-17 北京工业大学 Psychological scale answer quality evaluation system and method based on eye movement tracking

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LINDNER, Marlit Annalena. "Tracking the Decision-Making Process in Multiple-Choice Assessment: Evidence from Eye Movements." Applied Cognitive Psychology, 2014-08-07, vol. 28, pp. 738-752. *

Also Published As

Publication number Publication date
CN117649157A (en) 2024-03-05

Similar Documents

Publication Publication Date Title
Kang et al. An eye movement analysis algorithm for a multielement target tracking task: Maximum transition-based agglomerative hierarchical clustering
CN112818988B (en) Automatic identification reading method and system for pointer instrument
US11822007B2 (en) System and method for identification of an airborne object
JP6740033B2 (en) Information processing device, measurement system, information processing method, and program
JP2003065812A (en) System for forming numeral data based on image information in measuring instrument display
CN106123819B (en) A kind of ' s focus of attention measurement method
CN107525652A (en) Lens distortion method of testing, apparatus and system
CN112215120A (en) Method and device for determining visual search area and driving simulator
CN117649157B (en) Instrument discrimination capability assessment method based on sight tracking
CN112836581B (en) Sensitive fault feature extraction method and device based on correlation analysis
EP3074844B1 (en) Estimating gaze from un-calibrated eye measurement points
CN111898552B (en) Method and device for distinguishing person attention target object and computer equipment
CN116137012B (en) Online teaching quality supervision and management system based on Internet education
CN112115896A (en) Instrument panel pointer reading prediction method and device, computer equipment and storage medium
WO2022121694A1 (en) Brake disc wear degree measuring method, apparatus and device
CN111985826B (en) Visual quality grading method and system for multi-index industrial products
CN109033957A (en) A kind of gaze estimation method based on quadratic polynomial
CN115861977A (en) Evaluation method for simulated driving posture and simulated driving device
CN111507555B (en) Human body state detection method, classroom teaching quality evaluation method and related device
CN112556655A (en) Forestry fire prevention monocular positioning method and system
Bentley et al. The morphological classification of red cells using an image analysing computer
Xue et al. A computational personality traits analysis based on facial geometric features
Kim et al. Effects of visual complexity levels and information decluttering methods for cockpit displays on human search performance
JP5984380B2 (en) Target direction estimation device
Tang et al. Segmentation of cervical cell nucleus using Intersecting Cortical Model optimized by Particle Swarm Optimization

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant