CN113507561A - Personalized display system design method based on eye movement data analysis - Google Patents

Personalized display system design method based on eye movement data analysis

Info

Publication number
CN113507561A
CN113507561A (application CN202110610648.0A)
Authority
CN
China
Prior art keywords
display
eye movement
movement data
pilot
display element
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110610648.0A
Other languages
Chinese (zh)
Other versions
CN113507561B (en)
Inventor
魏巍
张少卿
刘健
王亚卓
杨东岳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenyang Aircraft Design and Research Institute Aviation Industry of China AVIC
Original Assignee
Shenyang Aircraft Design and Research Institute Aviation Industry of China AVIC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenyang Aircraft Design and Research Institute Aviation Industry of China AVIC
Priority to CN202110610648.0A
Publication of CN113507561A
Application granted
Publication of CN113507561B
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/61 Control of cameras or camera modules based on recognised objects
    • H04N23/611 Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/695 Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Biomedical Technology (AREA)
  • Neurosurgery (AREA)
  • Neurology (AREA)
  • General Health & Medical Sciences (AREA)
  • Dermatology (AREA)
  • Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The application relates to computer processing technology, in particular to a personalized display system design method based on eye movement data analysis. The method comprises: step S1, obtaining an initial display screen of a human-machine display interface, the initial display screen comprising a plurality of preset display elements located at different positions; step S2, obtaining a task to be executed by the pilot that involves the human-machine display interface; step S3, collecting the pilot's eye movement data while the task is performed; and step S4, adjusting the positions of the display elements in the initial display screen based on the eye movement data to form a new display screen. The method effectively resolves the conflict between highly integrated information and pilot workload, and improves human-machine efficacy.

Description

Personalized display system design method based on eye movement data analysis
Technical Field
The application relates to computer processing technology, in particular to a personalized display system design method based on eye movement data analysis.
Background
As the integration of military information continues to deepen, ever higher demands are placed on the human-machine efficacy with which a pilot processes the information presented visually on the human-machine interface. How to integrate human-machine system design effectively, and to improve system performance fully, efficiently, and safely, has become the focus of human-machine efficacy research in military aircraft development. A display system design method based on visual cognition (eye tracking data) is personalized, non-intrusive, and effective: it can improve the human-machine display interface during the development of military equipment, reduce the pilot's operational pressure during task execution, and resolve the conflict between highly integrated information and pilot workload; it is therefore a key technology for improving human-machine efficacy.
At present, however, no effective eye-movement-data-based design method for human-machine interfaces has been put into practice in existing aircraft models or projects.
Disclosure of Invention
To solve the above problems, the present application provides a personalized display system design method based on eye movement data analysis, so as to improve human-machine efficacy.
The personalized display system design method based on eye movement data analysis mainly comprises:
step S1, obtaining an initial display screen of a human-machine display interface, wherein the initial display screen comprises a plurality of preset display elements located at different positions;
step S2, obtaining a task to be executed by the pilot that involves the human-machine display interface;
step S3, collecting the pilot's eye movement data during execution of the task;
step S4, adjusting the positions of the display elements in the initial display screen based on the eye movement data to form a new display screen.
Preferably, step S3 comprises collecting eye movement data from a plurality of pilots performing the same task.
Preferably, in step S3, the eye movement data are collected with a head-mounted eye tracking tool.
Preferably, in step S3, the eye movement data comprise an attention time sequence table and a gaze duration table: the attention time sequence table records the sequence in which the pilot's gaze jumps between different display elements during the task, and the gaze duration table records how long the pilot gazes at each display element during the task.
Preferably, in step S4, adjusting the initial display screen based on the eye movement data comprises:
adjusting the positional relationship of the display elements according to the pilot's attention time sequence over the display elements, so as to reduce the pilot's saccade angle; and
adjusting the display mode of each display element according to the pilot's gaze duration on each display element, so as to increase the recognizability of the display elements.
Preferably, increasing the recognizability of the display elements comprises:
determining the pilot's dwell time on every display element while searching for a specific display element during the task;
taking the background color of the display element with the longest dwell time as the adjustment color; and
adjusting the display mode of each display element by applying the adjustment color to it.
Preferably, the dwell time on each display element is determined from multiple trials by one pilot, or from trial data of multiple pilots, searching for a specific display element within a specified time.
Preferably, step S4 further comprises:
obtaining the pilot's identity information; and
for each pilot, adjusting the positions of the display elements in the initial display screen based on that pilot's eye movement data to form a new display screen adapted to that pilot.
The above scheme offers the following advantages: 1) During development of the human-machine display interface, the pilot's eye movement data for a specified task are collected in a test environment via eye tracking, from which an attention time sequence table and a gaze duration table are generated; the display system is then personalized, improving and optimizing human-machine interaction and avoiding maintenance and changes after delivery. 2) After development of the human-machine display interface is complete, the method can still be used to optimize the display system, and the results provide baseline reference data for subsequent model development. 3) Data obtained through eye tracking are authentic, non-intrusive, and effective; compared with questionnaires, video recording, and similar means, they improve the human-machine efficacy of the display interface objectively and quantitatively.
Drawings
Fig. 1 is a flowchart of a preferred embodiment of a method for designing a personalized display system based on eye movement data analysis according to the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the present application clearer, the technical solutions in the embodiments are described in more detail below with reference to the accompanying drawings. Throughout the drawings, the same or similar reference numerals denote the same or similar elements, or elements having the same or similar functions. The described embodiments are some, but not all, embodiments of the present application; they are exemplary, intended to explain the present application, and should not be construed as limiting it. All other embodiments obtained by a person of ordinary skill in the art from these embodiments without inventive effort fall within the protection scope of the present application. Embodiments of the present application are described in detail below with reference to the drawings.
As shown in fig. 1, the present application provides a personalized display system design method based on eye movement data analysis, which mainly comprises:
step S1, obtaining an initial display screen of a human-machine display interface, wherein the initial display screen comprises a plurality of preset display elements located at different positions;
step S2, obtaining a task to be executed by the pilot that involves the human-machine display interface;
step S3, collecting the pilot's eye movement data during execution of the task;
step S4, adjusting the positions of the display elements in the initial display screen based on the eye movement data to form a new display screen.
The eye movement data comprise an attention time sequence table and a gaze duration table: the attention time sequence table records the sequence in which the pilot's gaze jumps between different display elements during the task, and the gaze duration table records how long the pilot gazes at each display element during the task. A minimal sketch of these two tables follows.
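The patent defines the two tables only at this descriptive level; the sketch below assumes raw eye movement data arrive as timestamped fixation events already mapped to display elements. The Fixation format and all function names are illustrative, not taken from the patent.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Fixation:
    element: str    # display element the gaze landed on
    start_ms: int   # fixation onset
    end_ms: int     # fixation offset

def build_tables(fixations):
    """Derive the attention time sequence table and the gaze duration table."""
    # Attention time sequence table: the order in which the gaze jumps
    # between *different* display elements during the task.
    sequence = []
    for fix in sorted(fixations, key=lambda f: f.start_ms):
        if not sequence or sequence[-1] != fix.element:
            sequence.append(fix.element)
    # Gaze duration table: total fixation time per display element.
    durations = defaultdict(int)
    for fix in fixations:
        durations[fix.element] += fix.end_ms - fix.start_ms
    return sequence, dict(durations)
```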
Correspondingly, in step S4, adjusting the initial display screen according to the eye movement data comprises: adjusting the positional relationship of the display elements according to the pilot's attention time sequence, so as to reduce the pilot's saccade angle and improve the continuity of fixation points; and adjusting the display mode of each display element according to the pilot's gaze duration on it, so as to increase the recognizability of the display elements. One way to act on the attention time sequence table is sketched below.
Fig. 1 is a flowchart of a specific display system design method. As shown in fig. 1, the method may proceed as follows. First, eye movement data are collected with a head-mounted eye tracking tool while multiple pilots perform the same task within a specified time. Second, an execution task involving the human-machine display interface information is determined, and each tested pilot is required to perform the specified task repeatedly within the task time limit. Next, saccade amplitude and gaze time data are extracted from the repeated executions and statistically processed per group, to serve as indices for the personalized display system design. The display elements are then matched and bound with the pilot's saccade amplitude and gaze time data from the task to generate the pilot's attention time sequence table and gaze duration table. Finally, the current display interface is compared against the generated tables and optimized accordingly.
In summary: the head-mounted eye tracking tool collects fixation and saccade data from multiple pilots performing the same task within a specified time, completing human-machine efficacy data collection over the visual channel. From the element-to-element jumps and eye movement data recorded during the task, the pilot's attention time sequence table and gaze duration table are generated: the attention time sequence table represents the pilot's jumps between different display elements, and the gaze duration table represents how long the pilot gazes at a given display element at a given stage, the gaze time reflecting the degree of attention paid to that element. The display elements are adjusted according to the attention time sequence table to reduce the pilot's scanning angle, and according to the gaze duration table to fit the pilot's personal habits. Eye movement indices are computed statistically (a sketch of this step follows), so that optimizing the display system improves human-machine efficacy objectively.
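The statistical calculation itself is not pinned to a specific formula in the patent; a minimal sketch, assuming each repeated trial yields a per-element gaze duration table, averages trials (or several pilots) into a single design index:

```python
from statistics import mean

def average_durations(trials):
    """trials: list of {element: gaze_duration_ms}, one dict per trial.
    Returns the mean gaze duration per element across all trials."""
    elements = {e for trial in trials for e in trial}
    return {e: mean(trial.get(e, 0) for trial in trials) for e in elements}
```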
In some alternative embodiments, increasing the recognizability of the display elements comprises:
determining the pilot's dwell time on every display element while searching for a specific display element during the task;
taking the background color of the display element with the longest dwell time as the adjustment color; and
adjusting the display mode of each display element by applying the adjustment color to it, as in the sketch below.
In some alternative embodiments, the dwell time on each display element is determined from multiple trials by one pilot, or from trial data of multiple pilots, searching for a specific display element within a specified time.
For example, an execution task T involving the human-machine interface information is first determined, and the pilot is required to perform the specified task repeatedly within a task time limit t; the task T can be, for example, searching for a specific display element on the display interface, with t = 5 seconds. In this example, the display interface contains a plurality of display elements with different background colors, and the specific element must be found within the specified time while eye movement data are collected. The eye movement trace is then matched to the display interface shown during the task to yield the fixation proportion of each display module during task execution, and an optimization scheme for the display system's human-machine efficacy is proposed from these proportions.
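The fixation proportion in this example follows directly from the gaze duration table; a minimal sketch (treating the proportion as a percentage of total fixation time is an assumption):

```python
def fixation_proportion(durations):
    """durations: {module: total fixation ms during the task}.
    Returns {module: percentage of total fixation time}."""
    total = sum(durations.values()) or 1   # avoid division by zero
    return {m: 100.0 * d / total for m, d in durations.items()}
```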
When the pilot monitors the human-machine display interface, attention is affected by factors such as the test environment and the pilot's psychological state. To keep the collected eye movement data free of such interference, the time and environment of task execution are constrained: during eye movement data acquisition, the pilot must repeatedly complete the same specified task in the same test environment and within the same time.
Since raw eye movement data are voluminous, the acquisition index of the present application is mainly the gaze time index, for simplicity of processing. In other alternative embodiments, other eye movement indices, such as blinking, pupil-diameter variation, and AOI (area of interest) statistics, may also be collected; one such index is sketched below.
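As an illustration of one optional index, the sketch below computes simple AOI statistics: the number of fixations and the total dwell time per area of interest. Representing AOIs as axis-aligned rectangles is an assumption.

```python
def aoi_stats(fixation_points, aois):
    """fixation_points: list of (x, y, duration_ms) fixations.
    aois: {name: (x0, y0, x1, y1)} axis-aligned rectangles.
    Returns {name: (fixation_count, total_ms)}."""
    stats = {name: [0, 0] for name in aois}
    for x, y, ms in fixation_points:
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                stats[name][0] += 1
                stats[name][1] += ms
    return {name: tuple(counts) for name, counts in stats.items()}
```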
In some optional embodiments, step S4 further comprises:
obtaining the pilot's identity information; and
for each pilot, adjusting the positions of the display elements in the initial display screen based on that pilot's eye movement data to form a new display screen adapted to that pilot.
It can be understood that, for each pilot, the optimal display interface layout can be found through multiple tests and bound to that pilot's identity; once the system recognizes a pilot, it recalls the corresponding display interface according to the preset binding, improving personalization and thereby human-machine efficacy. A minimal sketch of this binding follows.
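The registry class, identifiers, and recognition step below are assumptions, since the patent specifies only that a layout is bound to a pilot's identity and recalled when that pilot is recognized:

```python
class LayoutRegistry:
    """Bind personalized display layouts to pilot identities."""

    def __init__(self, default_layout):
        self.default = default_layout
        self.by_pilot = {}              # pilot id -> personalized layout

    def bind(self, pilot_id, layout):
        self.by_pilot[pilot_id] = layout

    def screen_for(self, pilot_id):
        """Recall the pilot's personalized layout, or fall back to default."""
        return self.by_pilot.get(pilot_id, self.default)

# usage sketch:
# registry = LayoutRegistry(default_layout={"altimeter": (0, 0)})
# registry.bind("pilot_A", {"altimeter": (1, 0)})
# registry.screen_for("pilot_A")
```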
The method can effectively resolve the conflict between highly integrated information and pilot workload and improve human-machine efficacy. Vision is the user's direct channel for interacting with the interface: over 80% of information is captured by the eyes and processed in the brain, shaping the user's behavior. Eye tracking tools are now mature and reliable; a display system design method based on eye tracking is authentic, non-intrusive, and personalized, can improve human-machine efficacy objectively and quantitatively, and addresses the insufficient consideration of human-machine efficacy in current display system design.
The above description covers only specific embodiments of the present application, but the protection scope of the present application is not limited thereto; any change or substitution readily conceivable by a person skilled in the art within the technical scope disclosed herein shall fall within that scope. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (8)

1. A personalized display system design method based on eye movement data analysis, characterized by comprising:
step S1, obtaining an initial display screen of a human-machine display interface, wherein the initial display screen comprises a plurality of preset display elements located at different positions;
step S2, obtaining a task to be executed by the pilot that involves the human-machine display interface;
step S3, collecting the pilot's eye movement data during execution of the task;
step S4, adjusting the positions of the display elements in the initial display screen based on the eye movement data to form a new display screen.
2. The personalized display system design method based on eye movement data analysis according to claim 1, wherein step S3 comprises collecting eye movement data from a plurality of pilots performing the same task.
3. The personalized display system design method based on eye movement data analysis according to claim 1, wherein in step S3 the eye movement data are collected with a head-mounted eye tracking tool.
4. The personalized display system design method based on eye movement data analysis according to claim 1, wherein in step S3 the eye movement data comprise an attention time sequence table and a gaze duration table: the attention time sequence table records the sequence in which the pilot's gaze jumps between different display elements during the task, and the gaze duration table records how long the pilot gazes at each display element during the task.
5. The personalized display system design method based on eye movement data analysis according to claim 4, wherein in step S4 adjusting the initial display screen according to the eye movement data comprises:
adjusting the positional relationship of the display elements according to the pilot's attention time sequence over the display elements, so as to reduce the pilot's saccade angle; and
adjusting the display mode of each display element according to the pilot's gaze duration on each display element, so as to increase the recognizability of the display elements.
6. The personalized display system design method based on eye movement data analysis according to claim 5, wherein increasing the recognizability of the display elements comprises:
determining the pilot's dwell time on every display element while searching for a specific display element during the task;
taking the background color of the display element with the longest dwell time as the adjustment color; and
adjusting the display mode of each display element by applying the adjustment color to it.
7. The personalized display system design method based on eye movement data analysis according to claim 6, wherein the dwell time on each display element is determined from multiple trials by one pilot, or from trial data of multiple pilots, searching for a specific display element within a specified time.
8. The personalized display system design method based on eye movement data analysis according to claim 1, wherein step S4 further comprises:
obtaining the pilot's identity information; and
for each pilot, adjusting the positions of the display elements in the initial display screen based on that pilot's eye movement data to form a new display screen adapted to that pilot.
CN202110610648.0A, filed 2021-06-01: Personalized display system design method based on eye movement data analysis (Active; granted as CN113507561B)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110610648.0A CN113507561B (en) 2021-06-01 2021-06-01 Personalized display system design method based on eye movement data analysis

Publications (2)

Publication Number Publication Date
CN113507561A 2021-10-15
CN113507561B 2023-06-20

Family

ID=78008701

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110610648.0A Active CN113507561B (en) 2021-06-01 2021-06-01 Personalized display system design method based on eye movement data analysis

Country Status (1)

Country Link
CN (1) CN113507561B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150234457A1 (en) * 2012-10-15 2015-08-20 Umoove Services Ltd. System and method for content provision using gaze analysis
CN109917914A (en) * 2019-03-05 2019-06-21 河海大学常州校区 Interactive interface analysis and optimization method based on visual field position
CN110096328A (en) * 2019-05-09 2019-08-06 中国航空工业集团公司洛阳电光设备研究所 A kind of HUD interface optimization layout adaptive approach and system based on aerial mission
CN110572632A (en) * 2019-08-15 2019-12-13 中国人民解放军军事科学院国防科技创新研究院 Augmented reality display system, helmet and method based on sight tracking
CN111045519A (en) * 2019-12-11 2020-04-21 支付宝(杭州)信息技术有限公司 Human-computer interaction method, device and equipment based on eye movement tracking

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116824954A (en) * 2023-07-03 2023-09-29 中国民用航空飞行学院 Simulation machine flight training comment system and method for eye movement and flight data
CN116824954B (en) * 2023-07-03 2024-03-01 中国民用航空飞行学院 Simulation machine flight training comment system and method for eye movement and flight data

Also Published As

Publication number Publication date
CN113507561B (en) 2023-06-20

Similar Documents

Publication Publication Date Title
US10664050B2 (en) Human-computer interface using high-speed and accurate tracking of user interactions
CN109582131B (en) Asynchronous hybrid brain-computer interface method
CN105279494A (en) Human-computer interaction system, method and equipment capable of regulating user emotion
CN113467619B (en) Picture display method and device, storage medium and electronic equipment
CN1937813A (en) Automaticadjusting method and adjusting system for mobile phone picture
CN113507561B (en) Personalized display system design method based on eye movement data analysis
CN113208593A (en) Multi-modal physiological signal emotion classification method based on correlation dynamic fusion
Häggström et al. Examining the gaze behaviors of harvester operators: an eye-tracking study
Das et al. Unsupervised approach for measurement of cognitive load using EEG signals
KR101571848B1 (en) Hybrid type interface apparatus based on ElectronEncephaloGraph and Eye tracking and Control method thereof
CN109480870B (en) RSVP brain-computer interface-oriented mental load identification method
CN111124124A (en) Human-computer efficacy evaluation method based on eye movement tracking technology
CN107272880B (en) A kind of positioning of cursor, cursor control method and device
Tomar et al. Expressions recognition based on human face
CN112783314B (en) Brain-computer interface stimulation paradigm generating and detecting method, system, medium and terminal based on SSVEP
CN111222578A (en) Online processing method of motor imagery EEG signal
CN115509355A (en) MI-BCI interaction control system and method under integrated vision
CN115392287A (en) Electroencephalogram signal online self-adaptive classification method based on self-supervision learning
CN111967333B (en) Signal generation method, system, storage medium and brain-computer interface spelling device
CN112070141A (en) SSVEP asynchronous classification method fused with attention detection
Marghi et al. A parametric EEG signal model for BCIs with rapid-trial sequences
Dal Seno et al. On-line detection of P300 and error potentials in a BCI speller
RU2008106937A (en) DATA PROCESSING METHOD FOR DETERMINING VISUAL PICTURES ON A VISUAL SCENE
CN117519551B (en) Visual interaction method, device and equipment of touch display screen and storage medium
CN114327046B (en) Method, device and system for multi-mode human-computer interaction and intelligent state early warning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant