CN113655883B - Human-computer interface eye movement interaction mode ergonomics experimental analysis system and method - Google Patents


Info

Publication number
CN113655883B
CN113655883B
Authority
CN
China
Prior art keywords
task
instruction
completion rate
eye movement
instruction input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110942597.1A
Other languages
Chinese (zh)
Other versions
CN113655883A (en)
Inventor
高岚岚
刘怡静
刘然
乐剑
彭超
黄婧
周颖伟
李宁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Research Institute of War of PLA Academy of Military Science
Original Assignee
Research Institute of War of PLA Academy of Military Science
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Research Institute of War of PLA Academy of Military Science filed Critical Research Institute of War of PLA Academy of Military Science
Priority to CN202110942597.1A
Publication of CN113655883A
Application granted
Publication of CN113655883B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/30Monitoring
    • G06F11/34Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F11/3438Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment monitoring of user actions

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to an ergonomic experimental analysis system and method for a human-computer interface eye-movement interaction mode. From the operation tasks of a human-computer interface, the operations that can adopt eye-movement interaction are extracted; an instruction type is configured for each such operation, several instruction input modes are configured for each instruction type, and test tasks are formed by combining different instruction input modes. The test tasks are executed one by one, and the operation time and task completion rate of each test task are calculated. The configuration corresponding to the test task with the highest mean task completion rate is selected as the human-eye interaction configuration of the task; if several configurations share the same mean completion rate, the one with the shortest operation time is selected. Through experimental comparison and analysis, the eye-movement interaction mode is optimized, making the ergonomic experimental analysis process standardized and feasible.

Description

Human-computer interface eye movement interaction mode ergonomics experimental analysis system and method
Technical Field
The invention relates to the technical field of novel human-computer interaction, and in particular to an ergonomic experimental analysis system and method for a human-computer interface eye-movement interaction mode.
Background
Eye-movement interaction is one of the most natural human interaction modes. When both hands are occupied, traditional gesture interaction cannot meet the requirements of some situations, and eye-movement interaction becomes especially important. By detecting indicators such as fixation duration, fixation position, and gaze trajectory, the interaction between an operator and a device or system can be completed.
Existing eye-movement interaction is still at the concept stage, and it remains unclear which task types each eye-movement interaction mode suits, so analysis of eye-movement interaction modes and dedicated research targeted at task types are lacking.
In the natural state, people often produce unintentional saccades or blinks. These become a problem in eye-movement interaction: they trigger interface changes that are not the user's real intention, leading to misoperation.
Improper eye-movement interaction may also cause operator discomfort, such as eye strain.
On the other hand, operation timeliness also matters: the prior art contains no related research, and the problem of operation delay introduced by eye-movement interaction remains unsolved.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides an ergonomic experimental analysis method for a human-computer interface eye-movement interaction mode. Through objective data analysis, it compares the suitability of the blink mode and the gaze mode for eye-movement interaction tasks, provides scientific support for the selection of eye-movement interaction, and recommends interaction modes for different task types based on comparisons of operation time and task completion rate.
In order to achieve the above object, the present invention provides an ergonomic experimental analysis system of human-computer interface eye movement interaction mode, comprising: a configuration unit, an eye movement device, an operation control unit and an analysis unit;
the configuration unit is used for configuring input modes of various types of operation instructions;
the eye movement equipment is used for collecting eye movements of an operator and outputting the eye movements to the operation control unit;
the operation control unit outputs a corresponding operation instruction based on the eye action of the operator;
the analysis unit receives the operation instruction sent by the operation control unit, executes the operation instruction, completes the test task, and calculates the operation time and task completion rate of the test task.
Further, the types of the operation instruction include: clicking a button, selecting a target and performing information input operation;
the input modes of various types of operation instructions can be set to blink k times and watch for n seconds, a test task is formed according to the configured instruction input modes, and k can be set to be 2, 3, 4 or 5,n and can be set to be 2-5 seconds.
Further, the operation control unit acquires a corresponding operation instruction according to the eye action, calculates the duration of the operation instruction, and sends the duration to the analysis unit;
the analysis unit compares the operation instruction with a corresponding instruction in the execution task and judges the operation completion condition;
the analysis unit is used for counting the time length average value and the task completion rate average value of each instruction input mode based on the time length and the operation completion condition of the operation instruction and storing the time length average value and the task completion rate average value in an operation library; and the analysis unit predicts the task completion rate and the operation time of the test task according to the completion rate average value and the time length average value corresponding to various instruction input modes in the test task and provides the task completion rate and the operation time for configuration personnel.
Further, the analysis unit calculates a duration average value and a completion rate average value corresponding to each instruction input mode, and provides the duration average value and the completion rate average value to a configurator as a reference for selecting the instruction input mode.
Further, a playback unit is also provided;
the operation library stores a plurality of eye movement equipment input data corresponding to each instruction input mode; storing according to the instruction input mode and the operation target position;
and the playback unit searches the eye movement equipment input data corresponding to the operation instruction from the operation library according to the instruction input mode and the position of the operation target for each operation instruction in the test task, inputs the eye movement equipment input data corresponding to the operation instruction into the operation control unit, and reviews the eye movement behavior and the operation behavior of the operator.
In another aspect, a method for performing ergonomic experimental analysis by using the ergonomic experimental analysis system of the human-computer interface eye movement interaction mode is provided, which includes:
extracting the operation which can adopt eye movement interaction by the operation task of the human-computer interface;
configuring instruction types for each operation capable of adopting eye movement interaction, configuring a plurality of instruction input modes for each instruction type, and forming a test task formed by combining a plurality of different instruction input modes;
executing the test tasks one by one, and calculating the operation time and the task completion rate for completing each test task; selecting a configuration mode corresponding to the test task with the highest task completion rate mean value as a human eye interaction configuration mode of the task; and if the same completion rate mean value exists, selecting the configuration mode with the shortest operation time as the human eye interaction configuration mode of the task.
Furthermore, each instruction type is configured with a plurality of instruction input modes, including: acquiring time length average values corresponding to various instruction input modes by an ergonomic experimental analysis system; and selecting three instruction input modes with the shortest average time length for each instruction type according to the average time lengths corresponding to the various instruction input modes.
Furthermore, each instruction type is configured with a plurality of instruction input modes, including: estimating the task completion rate and the operation time of the test task according to the task completion rate mean value and the time length mean value corresponding to various instruction input modes in the test task, and providing the estimated task completion rate and operation time for configuration personnel; if the estimated task completion rate is lower than the required completion rate or the estimated operation time exceeds the required time, the configuration personnel reconfigures the instruction input mode.
Further, executing the test tasks one by one includes selecting several operators to each execute the test tasks one by one; the mean operation time of the same test task across the operators is taken as the task's operation time, and the mean task completion rate as the task's completion rate.
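The averaging over operators described above can be sketched as follows, assuming a hypothetical data layout of one (operation time, completion rate) pair per operator:

```python
def aggregate_over_operators(results):
    """Average per-operator results for one test task: the mean operation
    time and mean completion rate across operators serve as the task's
    operation time and task completion rate."""
    times = [t for t, _ in results]
    rates = [r for _, r in results]
    return sum(times) / len(times), sum(rates) / len(rates)

# Three operators executed the same test task (illustrative numbers):
mean_time, mean_rate = aggregate_over_operators([(12.0, 1.0), (14.0, 0.8), (10.0, 0.9)])
# mean_time == 12.0, mean_rate ≈ 0.9
```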
Further, for each instruction in the test task, searching corresponding eye movement equipment input data by the operation library according to the instruction input mode and the position of the operation target, inputting the data into the operation control unit, and playing back the eye movement behavior and the operation behavior of the operator.
The technical scheme of the invention has the following beneficial technical effects:
(1) The invention provides an ergonomic experimental analysis system and method of a human-computer interface eye movement interaction mode, which optimizes the eye movement interaction mode through experimental comparison analysis, so that the ergonomic experimental analysis process is standard and feasible.
(2) The ergonomic experimental analysis system of the human-computer interface eye movement interaction mode can realize the configuration and research of each instruction input mode, is convenient to select the optimal instruction input mode and optimizes the execution flow of the actual operation task; and an evaluation result is given based on the whole task, so that the reasonability of task configuration is improved conveniently.
(3) The human-computer interface eye movement interaction type ergonomics experiment analysis system can realize data playback, and improves the convenience of experiments.
Drawings
FIG. 1 is a schematic diagram of the components of an ergonomic experimental analysis system;
FIG. 2 is an analysis flow chart of an ergonomic experiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail with reference to the accompanying drawings in combination with the embodiments. It should be understood that the description is intended to be exemplary only, and is not intended to limit the scope of the present invention. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present invention.
As shown in Fig. 1, the invention provides an ergonomic experimental analysis system for the human-computer interface eye-movement interaction mode, comprising a configuration unit, an eye-movement device, an operation control unit, an analysis unit, and a display unit.
And the configuration unit is used for configuring the input modes of various types of operation instructions.
Furthermore, the input mode of each type of operation instruction can be configured as blink k times or gaze for n seconds, and a test task is formed from the configured instruction input modes; k can be configured as 2, 3, 4, or 5, and n can be configured as 2–5 seconds.
The eye-movement device collects the operator's eye state and outputs it to the operation control unit.
Further, the eye-movement device is a mature commercial product with a sampling rate of no less than 60 Hz. The subject is supported to operate in a sitting posture, with a viewing distance of no more than 600 mm.
The operation control unit outputs a corresponding operation instruction based on the operator's eye action. Further, the operation control unit obtains the corresponding operation instruction from the eye movement, calculates the input duration of the operation instruction, and sends the eye-movement data from the eye tracker, the input duration, and the corresponding operation instruction to the analysis unit.
The types of instructions provided in one embodiment include: clicking a button, selecting a target, and information entry operations.
The analysis unit receives the operation instruction sent by the operation control unit, executes it, completes the test task, and calculates the operation time of the test task. It also tallies the mean input duration of each instruction input mode, based on the input durations of the operation instructions, and stores the means in an operation library. The analysis unit predicts the operation time of a test task from the mean durations of the instruction input modes the task uses, and provides the prediction to the configurator. The configurator can thus know in advance the instruction input modes of a test task and its expected operation time; if the operation time far exceeds the total allowed task execution time, the configurator can adjust the instruction input modes in time. For example, a 3-second gaze takes longer than 2 blinks, so to reduce the operation time the 3-second gaze may be changed to 2 blinks.
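The operation-time prediction described above can be sketched by summing the stored mean input durations of the modes a task uses; the duration values below are invented purely for illustration:

```python
# Hypothetical per-mode mean input durations (seconds) from the operation library.
mean_duration = {
    ("blink", 2): 1.4,
    ("blink", 3): 2.1,
    ("gaze", 3): 3.0,
}

def estimate_operation_time(task_modes):
    """Predict a test task's operation time as the sum of the mean input
    durations of the instruction input modes it uses."""
    return sum(mean_duration[m] for m in task_modes)

# A task using 2 blinks, a 3-second gaze, then 2 blinks again:
t = estimate_operation_time([("blink", 2), ("gaze", 3), ("blink", 2)])
# t ≈ 5.8 s; swapping the 3-second gaze for 2 blinks would cut it to ≈ 4.2 s.
```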
The task completion rate is one of the indexes for comparing the operation efficiency of the interaction modes. It is obtained from the operation completion records and calculated as in formula (1):

$$\omega = \frac{1}{n}\sum_{j=1}^{n} k_j \qquad (1)$$

where ω is the task completion rate, n is the number of stimuli presented while the test task is executed, and k_j is the completion record of the j-th operation, collected by the analysis unit: recorded as 1 if completed and 0 if not.
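Formula (1) can be computed directly from the per-stimulus completion records; the function name is illustrative:

```python
def task_completion_rate(completions):
    """ω = (1/n) · Σ k_j: `completions` holds one entry per stimulus
    presented during the test task, 1 if the corresponding operation
    was completed and 0 if not."""
    return sum(completions) / len(completions)

omega = task_completion_rate([1, 1, 0, 1])  # 3 of 4 stimuli completed
# omega == 0.75
```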
Let the operation times of two test tasks be t₁ and t₂, and their task completion rates ω₁ and ω₂. Then:

when ω₁ > ω₂, the eye-movement interaction mode of the first test task is selected;
when ω₁ < ω₂, the eye-movement interaction mode of the second test task is selected;
when ω₁ = ω₂ and t₁ > t₂, the eye-movement interaction mode of the second test task is selected;
when ω₁ = ω₂ and t₁ < t₂, the eye-movement interaction mode of the first test task is selected.
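The selection rule can be sketched as a small comparison function (hypothetical names; it returns which of the two test tasks' interaction modes to select):

```python
def choose_task(t1, omega1, t2, omega2):
    """Higher completion rate wins; on a tie, the shorter operation
    time wins. Returns 1 or 2."""
    if omega1 != omega2:
        return 1 if omega1 > omega2 else 2
    return 1 if t1 < t2 else 2

first = choose_task(10.0, 0.9, 12.0, 0.8)   # 1: higher completion rate
second = choose_task(12.0, 0.9, 10.0, 0.9)  # 2: tie broken by shorter time
```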
The analysis unit compares the operation instruction with the corresponding instruction in the task being executed and judges the operation completion condition: a completed operation is recorded as 1, an uncompleted one as 0. Based on the completion records of the operation instructions, the task completion rate ω is calculated, and the mean task completion rate of each instruction input mode is tallied. The analysis unit can estimate the task completion rate of a test task from the mean completion rates of the instruction input modes used in it and provide the estimate to the configurator. For example, the estimated task completion rate ω_s of a test task is:
$$\omega_s = \frac{1}{m}\sum_{i=1}^{m} \omega_i$$

where ω_i is the mean completion rate of the instruction input mode used in the i-th operation of the test task, and m is the total number of operation steps in the test task.
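The estimate ω_s can likewise be sketched as the mean of the stored per-mode completion-rate means (values invented for illustration):

```python
def estimate_task_completion_rate(step_mode_means):
    """ω_s = (1/m) · Σ ω_i: average the mean completion rates of the
    instruction input mode used at each of the m operation steps."""
    return sum(step_mode_means) / len(step_mode_means)

# Mean completion rates of the modes used at a task's three steps:
omega_s = estimate_task_completion_rate([0.95, 0.90, 0.85])
# omega_s ≈ 0.9
```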
The configurator can thus know in advance the instruction input modes of a test task and its expected task completion rate; if the requirement is not met, instruction input modes with higher mean completion rates can be selected to adjust the configuration scheme.
Further, the analysis unit calculates and ranks the mean duration corresponding to each instruction input mode, and provides the ranking to the configurator as a reference for selecting instruction input modes. In one embodiment, the eye-movement interaction mode for clicking a button is 2 blinks or a 3-second gaze, that for selecting a target is 2 blinks or a 3-second gaze, and that for information entry is 3 blinks or a 3-second gaze.
Further, the analysis unit calculates task completion rate mean values corresponding to various instruction input modes, sorts the task completion rate mean values, and provides the task completion rate mean values for configuration personnel as references for selecting the instruction input modes.
The display unit displays the operation interface, presenting the buttons, selection targets, and information-entry interface available to the operator. The button simulates the 'Start' button of the Word software interface; target selection simulates choosing one icon from several; information entry simulates a numeric input box.
Further, the operation control unit sends the eye movement data sent by the eye tracker to the analysis unit, and the analysis unit stores the eye movement data into an operation library.
The operation library stores a plurality of eye movement equipment input data corresponding to each instruction input mode; and storing according to the instruction input mode and the operation target position.
A playback unit is also provided. For each operation instruction in the test task, according to the instruction input mode and the position of the operation target, it retrieves the corresponding eye-movement device input data from the operation library and feeds them into the operation control unit, so that the operator's eye-movement and operation behaviour can be reviewed and further analyzed.
On the other hand, based on the above ergonomic experimental analysis system for the human-computer interface eye-movement interaction mode, the invention provides an ergonomic experimental analysis method comprising the following steps:
(1) And extracting the operation capable of adopting eye movement interaction by the operation task of the human-computer interface.
A cognitive walkthrough of the operation task is used to extract the operations that can adopt eye-movement interaction.
(2) And configuring instruction types for each operation capable of adopting eye movement interaction, configuring a plurality of instruction input modes for each instruction type, and forming a test task instruction input mode formed by combining a plurality of different instruction input modes.
And configuring the instruction types for the operation of each eye movement interaction, wherein the instruction types comprise clicking a button, selecting a target and inputting information.
The instruction input mode of each instruction type can be configured as blink k times or gaze for n seconds, and test tasks are formed from the configured instruction input modes; k can be configured as 2, 3, 4, or 5, and n can be configured as 2–5 seconds.
Multiple test tasks are formed with the instruction input mode as the variable. For example, in task one, the instruction input mode for clicking a button, selecting a target, and entering information is blinking twice; in task two, it is blinking three times for all three; in task three, it is a 3-second gaze for all three; in task four, it is a 2.5-second gaze for all three … The input modes for clicking a button, selecting a target, and entering information may also differ from one another and be combined according to the configuration to form various test tasks.
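The combination of instruction input modes into candidate test tasks can be sketched with a Cartesian product; gaze durations are discretized to whole seconds here purely for illustration, and all names are hypothetical:

```python
from itertools import product

# Configurable input modes: blink k times (k = 2..5) or gaze n seconds (n = 2..5).
modes = [("blink", k) for k in (2, 3, 4, 5)] + [("gaze", n) for n in (2, 3, 4, 5)]
types = ["click_button", "select_target", "enter_information"]

# One candidate test task = one input mode assigned to each instruction type.
test_tasks = [dict(zip(types, combo)) for combo in product(modes, repeat=len(types))]
# 8 modes for each of 3 instruction types -> 8**3 == 512 candidate test tasks
```

This is why the pre-screening in the next step matters: the configurator keeps only a handful of promising combinations rather than testing the full space.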
Configuring several instruction input modes for each instruction type includes: for each instruction type, selecting the three instruction input modes with the shortest mean duration or the highest mean task completion rate, based on the mean task completion rate and mean operation time of each instruction input mode. The configurator can thus complete the configuration quickly and purposefully based on the information provided by the analysis unit, improving configuration efficiency. A well-targeted configuration scheme also reduces the complexity of the subsequent test tasks, so that testing can be completed quickly.
Configuring several instruction input modes for each instruction type can further include: estimating the operation time of a test task from the mean durations of the instruction input modes it uses and providing the estimate to the configurator; if the estimated operation time exceeds the required time, the configurator reconfigures the instruction input modes. The task completion rate of the test task is estimated likewise; if it is lower than the required completion rate, the configurator reconfigures the instruction input modes.
For example, several tasks with the longest estimated completion time may be deleted from the plurality of test tasks, or only the task with the shortest estimated completion time may be retained. Test tasks with task completion rates not meeting requirements can also be eliminated.
(3) Executing the test tasks one by one, and calculating the operation time and the task completion rate for completing each test task; selecting a configuration mode corresponding to the test task with the highest task completion rate mean value as a human eye interaction configuration mode of the task; and if the same completion rate mean value exists, selecting the configuration mode with the shortest operation time mean value as the human eye interaction configuration mode of the task.
After each instruction is executed, the eye-movement device data corresponding to its instruction input mode are stored in the operation library, and the analysis unit recalculates the mean instruction operation duration of that input mode.
Several operators can execute the test tasks one by one, as follows.
The experiment should be carried out in a professional laboratory. The experimental platform is a desktop computer integrated with an eye-movement device; the device is a mature commercial product with a sampling rate of no less than 60 Hz. The platform supports the subject operating in a sitting posture, with a viewing distance of no more than 600 mm.
A dedicated person introduces the experimental background and requirements so that the subject understands the experimental materials; the subject first reads them and practices.
The selected subjects should have naked-eye vision of 1.0 and no color blindness or color weakness. Before the formal experiment begins, the subjects are trained on the eye-movement control mode of each test task; only after mastering the eye-movement interaction method on the experimental platform through training do they participate in the formal experiment.
During each experimental stimulus, the eye-movement device automatically records the subject's data, including fixation positions, saccade trajectories, operation time, and operation completion (0 incomplete, 1 complete).
While executing a test task, for each instruction in the task, the corresponding eye-movement device input data are retrieved from the operation library according to the instruction input mode and the position of the operation target, and fed into the operation control unit, replaying the operator's eye-movement and operation behaviour.
Further, if no satisfactory scheme is obtained, the step (2) of reconfiguring the instruction input mode can be returned.
In summary, the invention relates to an ergonomic experimental analysis system and method for a human-computer interface eye-movement interaction mode. From the operation tasks of the human-computer interface, the operations that can adopt eye-movement interaction are extracted; an instruction type is configured for each such operation, several instruction input modes are configured for each instruction type, and test tasks are formed by combining different instruction input modes. The test tasks are executed one by one, and the operation time and task completion rate of each are calculated. The configuration corresponding to the test task with the highest mean task completion rate is selected as the human-eye interaction configuration of the task; if several configurations share the same mean completion rate, the one with the shortest operation time is selected. Through experimental comparison and analysis, the eye-movement interaction mode is optimized, making the ergonomic experimental analysis process standardized and feasible.
It is to be understood that the above-described embodiments of the present invention are merely illustrative of or explaining the principles of the invention and are not to be construed as limiting the invention. Therefore, any modifications, equivalents, improvements and the like which are made without departing from the spirit and scope of the present invention shall be included in the protection scope of the present invention. Further, it is intended that the appended claims cover all such variations and modifications as fall within the scope and boundaries of the appended claims or the equivalents of such scope and boundaries.

Claims (5)

1. A method for performing ergonomic experimental analysis using an ergonomic experimental analysis system for a human-computer interface eye-movement interaction mode, the system comprising: a configuration unit, an eye movement device, an operation control unit, and an analysis unit;
the configuration unit is used for configuring the input modes of the various types of operation instructions;
the eye movement device is used for collecting the eye actions of an operator and outputting them to the operation control unit, which outputs a corresponding operation instruction based on the operator's eye actions;
the operation control unit acquires the corresponding operation instruction from the eye action, calculates the duration of the operation instruction, and sends the duration to the analysis unit;
the analysis unit receives the operation instruction sent by the operation control unit, executes it to complete the test task, and calculates the operation time and task completion rate of the test task by comparing the operation instruction with the corresponding instruction in the test task and judging the operation completion status;
the analysis unit also counts, based on the durations of the operation instructions and the operation completion status, the mean duration and mean task completion rate of each instruction input mode and stores them in an operation library; it estimates the task completion rate and operation time of a test task from the mean completion rates and mean durations of the instruction input modes used in the task, and provides the estimates to configuration personnel;
the analysis unit calculates the mean duration and mean completion rate corresponding to the various instruction input modes and provides them to the configurator as a reference for selecting instruction input modes;
the types of operation instructions comprise: button clicking, target selection, and information input operations; and the ergonomic experimental analysis method comprises the following steps:
extracting, from the operation tasks of the human-computer interface, the operations that can adopt eye-movement interaction;
configuring an instruction type for each operation that can adopt eye-movement interaction and a plurality of instruction input modes for each instruction type, the instruction input modes including blinking k times and fixating for n seconds, and forming, from the configured instruction input modes, test tasks that combine a plurality of different instruction input modes;
executing the test tasks one by one and calculating the operation time and task completion rate for completing each test task; selecting the configuration corresponding to the test task with the highest mean task completion rate as the eye-interaction configuration of the task; and, if several configurations share the same mean completion rate, selecting the one with the shortest operation time as the eye-interaction configuration of the task;
wherein configuring a plurality of instruction input modes for each instruction type comprises: estimating the task completion rate and operation time of the test task from the mean task completion rates and mean durations of the instruction input modes used in the task and providing the estimates to configuration personnel; and, if the estimated task completion rate is lower than the required completion rate or the estimated operation time exceeds the required time, the configuration personnel reconfigure the instruction input modes;
and wherein executing the test tasks one by one comprises selecting a plurality of operators to each execute the test tasks one by one; the mean operation time over the plurality of operators executing the same test task is taken as the operation time for completing that test task, and the mean task completion rate is taken as the task completion rate for completing that test task.
2. The method for performing ergonomic experimental analysis of claim 1, wherein configuring a plurality of instruction input modes for each instruction type comprises: acquiring, by the ergonomic experimental analysis system, the mean durations corresponding to the various instruction input modes; and selecting, for each instruction type, the three instruction input modes with the shortest mean durations according to the mean durations corresponding to the various instruction input modes.
3. The method for performing ergonomic experimental analysis of claim 1, wherein, for each instruction in the test task, the corresponding eye-movement-device input data are retrieved from the operation library according to the instruction input mode and the position of the operation target, and are input to the operation control unit to replay the operator's eye movement behavior and operation behavior.
4. The method of claim 1, wherein k is configurable to be 2, 3, 4, or 5, and n is configurable to be 2 to 5 seconds.
5. The method for performing ergonomic experimental analysis of claim 4, wherein
the ergonomic experimental analysis system for the human-computer interface eye-movement interaction mode is further provided with a playback unit;
the operation library stores a plurality of eye-movement-device input data records corresponding to each instruction input mode, stored according to the instruction input mode and the operation target position;
and the playback unit, for each operation instruction in the test task, retrieves the corresponding eye-movement-device input data from the operation library according to the instruction input mode and the position of the operation target, inputs them into the operation control unit, and replays the operator's eye movement behavior and operation behavior.
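The pre-test estimation gate in claim 1 (predict a task's completion rate and operation time from the per-mode means in the operation library, and flag the configuration for rework if it misses the requirements) can be sketched as below. This is a hedged illustration: the patent only states that the estimates are derived from the stored means, so the sum-of-durations and product-of-rates combination, the threshold defaults, and all names here are assumptions introduced for the example.

```python
import math

def estimate_task(instruction_modes, mode_stats,
                  required_rate=0.9, required_time_s=30.0):
    """Estimate a test task's completion rate and operation time.

    mode_stats maps each instruction input mode to
    (mean_duration_seconds, mean_completion_rate), as stored in the
    operation library. The sum/product combination rule and the
    threshold values are illustrative assumptions, not from the patent.
    """
    # Estimated operation time: sum of the mean durations of the modes used.
    est_time = sum(mode_stats[m][0] for m in instruction_modes)
    # Estimated completion rate: product of per-mode mean completion rates
    # (treats each instruction as an independent success/failure step).
    est_rate = math.prod(mode_stats[m][1] for m in instruction_modes)
    # Flag for reconfiguration when either requirement is missed.
    needs_reconfig = est_rate < required_rate or est_time > required_time_s
    return est_rate, est_time, needs_reconfig
```

A configurator would run this before the experiment: if `needs_reconfig` is true, a different set of instruction input modes is chosen and the estimate is recomputed.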
CN202110942597.1A 2021-08-17 2021-08-17 Human-computer interface eye movement interaction mode ergonomics experimental analysis system and method Active CN113655883B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110942597.1A CN113655883B (en) 2021-08-17 2021-08-17 Human-computer interface eye movement interaction mode ergonomics experimental analysis system and method


Publications (2)

Publication Number Publication Date
CN113655883A CN113655883A (en) 2021-11-16
CN113655883B true CN113655883B (en) 2022-10-14

Family

ID=78479917

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110942597.1A Active CN113655883B (en) 2021-08-17 2021-08-17 Human-computer interface eye movement interaction mode ergonomics experimental analysis system and method

Country Status (1)

Country Link
CN (1) CN113655883B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105204993A (en) * 2015-09-18 2015-12-30 中国航天员科研训练中心 Questionnaire test system and method based on multimodal interactions of eye movement, voice and touch screens
EP3336656A1 (en) * 2016-12-19 2018-06-20 OFFIS e.V. Model based detection of user reaction times and further effects as well as systems therefore
CN111887803A (en) * 2020-08-13 2020-11-06 上海交通大学 Multi-dimensional monitoring and evaluation system for man-machine work efficiency of aircraft cockpit
CN111949131A (en) * 2020-08-17 2020-11-17 陈涛 Eye movement interaction method, system and equipment based on eye movement tracking technology

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050047629A1 (en) * 2003-08-25 2005-03-03 International Business Machines Corporation System and method for selectively expanding or contracting a portion of a display using eye-gaze tracking
CN101108120A (en) * 2007-08-29 2008-01-23 中国人民解放军第三军医大学第一附属医院 Testing and analyzing method for eye movement
CN103713728B (en) * 2014-01-14 2016-09-21 东南大学 A kind of detection method of complication system man machine interface availability
CN106901686B (en) * 2017-02-28 2018-10-12 北京七鑫易维信息技术有限公司 Execution method, server, test lead and the system of test of eye movement task
CN108459710B (en) * 2018-02-08 2021-04-06 东南大学 Interaction device controlled by eye movement signal
CN109298782B (en) * 2018-08-31 2022-02-18 创新先进技术有限公司 Eye movement interaction method and device and computer readable storage medium
CN111124124A (en) * 2019-12-25 2020-05-08 中国航空工业集团公司沈阳飞机设计研究所 Human-computer efficacy evaluation method based on eye movement tracking technology
CN111459993B (en) * 2020-02-17 2023-06-06 平安科技(深圳)有限公司 Configuration updating method, device, equipment and storage medium based on behavior analysis

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105204993A (en) * 2015-09-18 2015-12-30 中国航天员科研训练中心 Questionnaire test system and method based on multimodal interactions of eye movement, voice and touch screens
EP3336656A1 (en) * 2016-12-19 2018-06-20 OFFIS e.V. Model based detection of user reaction times and further effects as well as systems therefore
CN111887803A (en) * 2020-08-13 2020-11-06 上海交通大学 Multi-dimensional monitoring and evaluation system for man-machine work efficiency of aircraft cockpit
CN111949131A (en) * 2020-08-17 2020-11-17 陈涛 Eye movement interaction method, system and equipment based on eye movement tracking technology

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Minho Kim; Byung Hyung Kim; Sungho Jo. "Quantitative Evaluation of a Low-Cost Noninvasive Hybrid Interface Based on EEG and Eye Movement". IEEE Transactions on Neural Systems and Rehabilitation Engineering. 2014. *
Effects of different fixation-duration parameters for triggering clicks on eye-controlled interaction performance; Li Hongting et al.; Chinese Journal of Ergonomics; 2017-04-20 (No. 02); full text *
Wang Qingmin; Yao Yongjie; Li Kehua; Shi Fenzhou; Liu Qiuhong. "Research status of pilot eye tracking and aviation ergonomics". Journal of Navy Medicine. 2016. *

Also Published As

Publication number Publication date
CN113655883A (en) 2021-11-16

Similar Documents

Publication Publication Date Title
Aaltonen et al. 101 spots, or how do users read menus?
CN109074166A (en) Change application state using neural deta
CN105867599A (en) Gesture control method and device
Liu et al. On a Real Real-Time Wearable Human Activity Recognition System.
US20220398937A1 (en) Information processing device, information processing method, and program
CN110402099B (en) Information display device, biological signal measuring system, and computer-readable recording medium
CN113974589B (en) Multi-modal behavior paradigm evaluation optimization system and cognitive ability evaluation method
CN107785066B (en) Method, device and system for modifying heartbeat type
CN104636890A (en) Measurement method for workload of air traffic controller
Uva et al. A user-centered framework for designing midair gesture interfaces
Till et al. Embodied effects of conceptual knowledge continuously perturb the hand in flight
Kunapipat et al. Sensor-assisted EMG data recording system
JP7187785B2 (en) Information display device, biological signal measurement system and program
CN113655883B (en) Human-computer interface eye movement interaction mode ergonomics experimental analysis system and method
Cabric et al. A predictive performance model for immersive interactions in mixed reality
US11138779B2 (en) Information processing apparatus, information processing method, computer-readable medium, and biological signal measurement system
Pleydell-Pearce et al. Multivariate analysis of EEG: Predicting cognition on the basis of frequency decomposition, inter-electrode correlation, coherence, cross phase and cross power
JP7135845B2 (en) Information processing device, information processing method, program, and biological signal measurement system
RU2663639C2 (en) System for determining visual perception
Gisler et al. Indicators of training success in virtual reality using head and eye movements
CN114327046B (en) Method, device and system for multi-mode human-computer interaction and intelligent state early warning
Hou et al. Applicability Study of Eye Movement Menu based on Analytic Hierarchy Process
US20240061512A1 (en) Computer-implemented method and apparatus for determining reaction time processes
Hou et al. Research on Visual Feedback Based on Natural Gesture
Biricz et al. User friendly virtual reality software development and testing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant