CN115331142A - Portable human-computer interaction evaluation system - Google Patents
- Publication number
- CN115331142A (application number CN202210927841.1A)
- Authority
- CN
- China
- Prior art keywords
- evaluation
- human
- data
- unit
- task
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/0205—Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
- G06V20/46—Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
- A61B5/024—Detecting, measuring or recording pulse rate or heart rate
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/08—Detecting, measuring or recording devices for evaluating the respiratory organs
- A61B5/0816—Measuring devices for examining respiratory frequency
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/163—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/40—Detecting, measuring or recording for evaluating the nervous system
- A61B5/4076—Diagnosing or monitoring particular conditions of the nervous system
- A61B5/4088—Diagnosing of monitoring cognitive diseases, e.g. Alzheimer, prion diseases or dementia
Abstract
The invention relates to the field of human-computer interaction measurement and evaluation, and in particular to a portable human-computer interaction evaluation system. The system integrates a variety of acquisition devices and can obtain the required content in different forms, such as numeric data, video files and text. It is compatible with a variety of display interfaces and hardware systems, and its non-invasive design meets human-computer interaction evaluation requirements of all kinds and for all elements. It supports several display-and-control consoles operating at the same time, can cooperatively evaluate tasks that require several consoles to work together, and takes the measurement of collaboration efficiency into account. Combined with human physiological data, it can comprehensively evaluate the ergonomic performance of a human-computer interaction interface against the evaluation indices.
Description
Technical Field
The invention relates to the field of human-computer interaction measurement and evaluation, and in particular to a portable human-computer interaction evaluation system.
Background
In the traditional approach, human-computer interaction evaluation under normal exercise and training conditions requires a large amount of equipment, and wearing it for too long degrades the operator's performance, so the sensitivity and accuracy of the evaluation cannot be guaranteed.
Equipment such as an electroencephalograph also seriously disturbs the operator's natural experience; it is too invasive, and it cannot be determined whether observed EEG fluctuations are caused by the task or by the equipment itself, so the evaluation results are not accurate enough.
Processing signals such as EEG and eye movement requires a high level of theory and practical technique, so such evaluation cannot be applied widely enough. Moreover, acquiring EEG and eye-movement signals requires computer equipment; for confidentiality reasons, computers may not be brought in without authorization, and external devices may not be plugged into the computers used for operations.
Disclosure of Invention
To address the problems in the prior art, the invention aims to provide a portable human-computer interaction evaluation system.
To achieve this purpose, the invention adopts the following technical solution.
A portable man-machine interaction evaluation system comprises an information acquisition module, an information processing module, an interaction evaluation module and an information transmission module;
the information acquisition module comprises acquisition equipment and an information storage unit;
the acquisition equipment comprises a camera, a physiological wristband, an eye tracker, a displacement sensor, a force sensor and a rotation sensor; the camera is used for collecting video information; the physiological wristband is used for collecting heart rate and respiratory rate; the eye tracker is used for collecting eye movement data; the displacement sensor is used for collecting displacement data; the force sensor is used for collecting force data; the rotation sensor is used for collecting rotation amplitude data; the information storage unit is used for storing the data collected by the acquisition equipment;
the information processing module comprises a subjective data analysis unit, a video stream analysis unit, a task data analysis unit and a physiological index analysis unit;
the subjective data analysis unit is used for calculating a task load index from the subjective data, where the subjective data is a questionnaire, such as the NASA-TLX scale, filled in by the evaluated person according to the evaluation content;
the video stream analysis unit is used for obtaining an abnormal operation index from the video information: abnormal operations are identified in the video using a video key-frame extraction technique and an improved Mask-RCNN algorithm, yielding the number of abnormal operations and thereby quantifying abnormal operating behaviour;
the task data analysis unit is used for obtaining the time utilization rate according to the ratio of the time of normal operation to the total time of task completion and obtaining the task accuracy rate according to the ratio of the number of correctly completed tasks to the total number of tasks;
the physiological index analysis unit is used for obtaining an eye movement load index according to the eye movement data;
the interaction evaluation module comprises a task frame evaluation unit, a cognitive load evaluation unit, a human-computer interface evaluation unit, an operation comfort evaluation unit and an output unit;
the task frame evaluation unit is used for carrying out quantitative evaluation on the task frame according to the task load index, the time utilization rate and the task accuracy rate by combining an improved FAHP-TOPSIS method to obtain a task frame quantitative evaluation value;
the cognitive load evaluation unit is used for quantitatively evaluating the cognitive load according to the heart rate, the respiratory rate, the abnormal operation index and the eye movement load index by combining an improved FAHP-TOPSIS method to obtain a quantitative evaluation value of the cognitive load;
the human-computer interface evaluation unit is used for quantitatively evaluating the human-computer interface with the improved FAHP-TOPSIS method according to pre-stored expert scores on the four indices of practicability, functionality, fluency and harmony, to obtain a human-computer interface quantitative evaluation value;
the operation comfort evaluation unit is used for quantitatively evaluating operation comfort with the improved FAHP-TOPSIS method according to the displacement data, force data and rotation amplitude data of the human body, to obtain an operation comfort quantitative evaluation value;
the output unit is used for obtaining a human-computer interaction numerical solution and a related index distribution diagram according to the task frame quantitative evaluation value, the cognitive load quantitative evaluation value, the human-computer interface quantitative evaluation value and the operation comfort quantitative evaluation value to complete evaluation on a human-computer interaction system;
the information transmission module is used for transmitting the data acquired by the information acquisition module to the information processing module and transmitting the index data acquired by the information processing module to the interactive evaluation module.
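The two task-data indices above are plain ratios and can be sketched directly. This is an illustrative sketch only; the function and variable names are not from the patent.

```python
# A minimal sketch of the two indices computed by the task data analysis unit:
# time utilization (normal-operation time over total task time) and task
# accuracy (correctly completed tasks over total tasks). Names are illustrative.

def time_utilization(normal_op_seconds: float, total_task_seconds: float) -> float:
    """Ratio of time spent in normal operation to total task-completion time."""
    if total_task_seconds <= 0:
        raise ValueError("total task time must be positive")
    return normal_op_seconds / total_task_seconds

def task_accuracy(correct_tasks: int, total_tasks: int) -> float:
    """Ratio of correctly completed tasks to the total number of tasks."""
    if total_tasks <= 0:
        raise ValueError("total task count must be positive")
    return correct_tasks / total_tasks

# e.g. 540 s of normal operation in a 600 s trial, 18 of 20 tasks correct:
util = time_utilization(540, 600)  # 0.9
acc = task_accuracy(18, 20)        # 0.9
```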
Furthermore, the displacement sensor, force sensor and rotation sensor are arranged on the operator's hands, upper limbs and lower back, respectively.
Compared with the prior art, the invention has the following beneficial effects: it integrates a variety of acquisition devices and can obtain the required content in different forms, such as numeric data, video files and text; it is compatible with a variety of display interfaces and hardware systems, and its non-invasive design meets human-computer interaction evaluation requirements of all kinds and for all elements; it supports several display-and-control consoles operating at the same time, can cooperatively evaluate tasks that require several consoles to work together, and takes the measurement of collaboration efficiency into account; combined with human physiological data, it can comprehensively evaluate the ergonomic performance of a human-computer interaction interface against the evaluation indices.
Drawings
The invention is described in further detail below with reference to the figures and specific embodiments.
FIG. 1 is a schematic diagram of a portable human-computer interaction evaluation system according to the present invention;
FIG. 2 is a functional block diagram of an interactive rating module of the present invention;
FIG. 3 is a flow chart of the improved Mask-RCNN algorithm of the present invention;
FIG. 4 is a flow chart of the improved FAHP-TOPSIS process of the present invention.
Detailed Description
Embodiments of the present invention will be described in detail below with reference to examples, but it will be understood by those skilled in the art that the following examples are only illustrative of the present invention and should not be construed as limiting the scope of the present invention.
Referring to fig. 1, a portable human-computer interaction evaluation system includes an information acquisition module, an information processing module, an interaction evaluation module, and an information transmission module;
the information acquisition module comprises acquisition equipment and an information storage unit;
the acquisition equipment comprises a camera, a physiological wristband, an eye tracker, a displacement sensor, a force sensor and a rotation sensor; the camera is used for collecting video information; the physiological wristband is used for collecting heart rate and respiratory rate; the eye tracker is used for collecting eye movement data; the displacement sensor is used for collecting displacement data; the force sensor is used for collecting force data; the rotation sensor is used for collecting rotation amplitude data; the information storage unit is used for storing the data collected by the acquisition equipment;
the information processing module comprises a subjective data analysis unit, a video stream analysis unit, a task data analysis unit and a physiological index analysis unit;
the subjective data analysis unit is used for calculating a task load index from the subjective data, where the subjective data is a questionnaire, such as the NASA-TLX scale, filled in by the evaluated person according to the evaluation content;
the video stream analysis unit is used for obtaining an abnormal operation index from the video information: abnormal operations are identified in the video using a video key-frame extraction technique and the improved Mask-RCNN algorithm shown in FIG. 3, yielding the number of abnormal operations and thereby quantifying abnormal operating behaviour;
the task data analysis unit is used for obtaining the time utilization rate according to the ratio of the time of normal operation to the total time of task completion and obtaining the task accuracy rate according to the ratio of the number of correctly completed tasks to the total number of tasks;
the physiological index analysis unit is used for obtaining an eye movement load index according to the eye movement data;
the interaction evaluation module comprises a task frame evaluation unit, a cognitive load evaluation unit, a human-computer interface evaluation unit, an operation comfort evaluation unit and an output unit;
referring to fig. 2 and 4, the task frame evaluation unit is configured to quantitatively evaluate the task frame according to the task load index, the time utilization rate, and the task accuracy in combination with the improved FAHP-TOPSIS method to obtain a task frame quantitative evaluation value;
referring to fig. 2 and 4, the cognitive load evaluation unit is used for quantitatively evaluating the cognitive load according to the heart rate, the respiratory rate, the abnormal operation index and the eye movement load index by combining the improved FAHP-TOPSIS method to obtain a quantitative cognitive load evaluation value;
referring to fig. 2 and 4, the human-machine interface evaluation unit is configured to quantitatively evaluate the human-machine interface with the improved FAHP-TOPSIS method according to pre-stored expert scores on the four indicators of practicality, functionality, fluency and harmony, so as to obtain a human-machine interface quantitative evaluation value;
referring to fig. 2 and 4, the operation comfort evaluation unit is configured to quantitatively evaluate the operation comfort according to the displacement data, the force data and the rotation amplitude data of the human body in combination with an improved FAHP-TOPSIS method to obtain a quantized evaluation value of the operation comfort;
referring to fig. 2, the output unit is configured to obtain a human-computer interaction numerical solution and a related index distribution diagram according to the task frame quantitative evaluation value, the cognitive load quantitative evaluation value, the human-computer interface quantitative evaluation value, and the operation comfort quantitative evaluation value, and complete evaluation of the human-computer interaction system;
the information transmission module is used for transmitting the data acquired by the information acquisition module to the information processing module and transmitting the index data acquired by the information processing module to the interactive evaluation module.
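The internals of the improved FAHP-TOPSIS method are not disclosed in the text. As a rough orientation only, the sketch below shows the classical TOPSIS ranking step that such a method builds on, with fixed illustrative weights standing in for the undisclosed fuzzy-AHP weighting; all numbers and names are assumptions, not the patent's method.

```python
# Hedged sketch of a plain TOPSIS ranking step (not the patent's "improved
# FAHP-TOPSIS"): vector-normalize the decision matrix, weight it, find the
# ideal and anti-ideal points, and score each alternative by its relative
# closeness to the ideal.

def topsis_scores(matrix, weights, benefit):
    """Closeness coefficients in [0, 1] for alternatives (rows) over criteria.

    matrix  : list of rows, one row of criterion values per alternative
    weights : criterion weights summing to 1 (illustrative, not FAHP-derived)
    benefit : per criterion, True if larger is better, False if smaller is
    """
    n_crit = len(weights)
    # Vector-normalize each column, then apply the criterion weight.
    norms = [sum(row[j] ** 2 for row in matrix) ** 0.5 for j in range(n_crit)]
    v = [[weights[j] * row[j] / norms[j] for j in range(n_crit)] for row in matrix]
    # Ideal and anti-ideal values per criterion.
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    worst = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = sum((x - p) ** 2 for x, p in zip(row, ideal)) ** 0.5
        d_neg = sum((x - w) ** 2 for x, w in zip(row, worst)) ** 0.5
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Example: two candidate interfaces scored on task load (lower is better),
# time utilization and task accuracy (higher is better).
scores = topsis_scores(
    matrix=[[55.0, 0.80, 0.90],
            [70.0, 0.60, 0.85]],
    weights=[0.3, 0.3, 0.4],
    benefit=[False, True, True],
)
```

In this example the first alternative is better on every criterion, so it coincides with the ideal point and scores 1.0 while the second scores 0.0; with mixed criteria the coefficients fall strictly between the two.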
Further, the displacement sensor, force sensor and rotation sensor are arranged on the operator's hands, upper limbs and lower back, respectively.
Furthermore, the subjective data input unit and information storage unit of the information acquisition module, together with the information processing module and the interaction evaluation module, use a computer as their carrier.
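The improved Mask-RCNN network used by the video stream analysis unit cannot be reconstructed from the text; the sketch below shows only the key-frame extraction half of that pipeline, using a simple mean-absolute-difference heuristic over grayscale frames. The frame representation and the threshold value are assumptions for illustration.

```python
# Hedged sketch of video key-frame extraction by frame differencing: keep a
# frame when its mean absolute pixel difference from the last kept frame
# exceeds a threshold. Frames are flat lists of grayscale pixel values; a
# detector (in the patent, an improved Mask-RCNN) would then run only on the
# kept frames to count abnormal operations.

def extract_keyframes(frames, threshold=10.0):
    """Return the indices of frames that differ enough from the last kept frame."""
    if not frames:
        return []
    kept = [0]  # always keep the first frame
    for i in range(1, len(frames)):
        prev = frames[kept[-1]]
        diff = sum(abs(a - b) for a, b in zip(frames[i], prev)) / len(prev)
        if diff > threshold:
            kept.append(i)
    return kept

# Three near-identical frames followed by an abrupt scene change:
frames = [[10] * 64, [11] * 64, [10] * 64, [200] * 64]
keys = extract_keyframes(frames)  # [0, 3]
```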
Although the invention has been described in detail in this specification with reference to specific and illustrative embodiments, it will be apparent to those skilled in the art that modifications and improvements can be made on the basis of the invention. Accordingly, such modifications and improvements are intended to fall within the scope of the invention as claimed.
Claims (3)
1. A portable human-computer interaction evaluation system is characterized by comprising an information acquisition module, an information processing module, an interaction evaluation module and an information transmission module;
the information acquisition module comprises acquisition equipment and an information storage unit;
the acquisition equipment comprises a camera, a physiological wristband, an eye tracker, a displacement sensor, a force sensor and a rotation sensor; the camera is used for collecting video information; the physiological wristband is used for collecting heart rate and respiratory rate; the eye tracker is used for collecting eye movement data; the displacement sensor is used for collecting displacement data; the force sensor is used for collecting force data; the rotation sensor is used for collecting rotation amplitude data; the information storage unit is used for storing the data collected by the acquisition equipment;
the information processing module comprises a subjective data analysis unit, a video stream analysis unit, a task data analysis unit and a physiological index analysis unit;
the subjective data analysis unit is used for calculating a task load index from the subjective data, the subjective data being a questionnaire filled in by the evaluated person according to the evaluation content;
the video stream analysis unit is used for obtaining an abnormal operation index according to the video information;
the task data analysis unit is used for obtaining the time utilization rate according to the ratio of the time of normal operation to the total time of task completion and obtaining the task accuracy rate according to the ratio of the number of correctly completed tasks to the total number of tasks;
the physiological index analysis unit is used for obtaining an eye movement load index according to the eye movement data;
the interaction evaluation module comprises a task frame evaluation unit, a cognitive load evaluation unit, a human-computer interface evaluation unit, an operation comfort evaluation unit and an output unit;
the task frame evaluation unit is used for carrying out quantitative evaluation on the task frame according to the task load index, the time utilization rate and the task accuracy rate by combining an improved FAHP-TOPSIS method to obtain a task frame quantitative evaluation value;
the cognitive load evaluation unit is used for quantitatively evaluating the cognitive load according to the heart rate, the respiratory rate, the abnormal operation index and the eye movement load index by combining an improved FAHP-TOPSIS method to obtain a quantitative evaluation value of the cognitive load;
the human-computer interface evaluation unit is used for quantitatively evaluating the human-computer interface with the improved FAHP-TOPSIS method according to pre-stored expert scores on the four indices of practicability, functionality, fluency and harmony, to obtain a human-computer interface quantitative evaluation value;
the operation comfort evaluation unit is used for quantitatively evaluating operation comfort with the improved FAHP-TOPSIS method according to the displacement data, force data and rotation amplitude data of the human body, to obtain an operation comfort quantitative evaluation value;
the output unit is used for obtaining a human-computer interaction numerical solution and a related index distribution diagram according to the task frame quantitative evaluation value, the cognitive load quantitative evaluation value, the human-computer interface quantitative evaluation value and the operation comfort quantitative evaluation value to complete evaluation on a human-computer interaction system;
the information transmission module is used for transmitting the data acquired by the information acquisition module to the information processing module and transmitting the index data acquired by the information processing module to the interactive evaluation module.
2. The portable human-computer interaction assessment system according to claim 1, wherein the displacement sensor, the force sensor and the rotation sensor are all disposed on the hands, upper limbs and lower back of the operator.
3. The portable human-computer interaction assessment system according to claim 1, wherein the subjective data input unit and the information storage unit of the information acquisition module, the information processing module and the interaction evaluation module are carried by a computer.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210927841.1A (published as CN115331142A) | 2022-08-03 | 2022-08-03 | Portable human-computer interaction evaluation system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115331142A (en) | 2022-11-11 |
Family
ID=83921039
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210927841.1A (published as CN115331142A, pending) | Portable human-computer interaction evaluation system | 2022-08-03 | 2022-08-03 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115331142A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116089250A (en) * | 2023-04-11 | 2023-05-09 | 苏州市世为科技有限公司 | Man-machine interaction optimization management system and management method |
CN116881678A (en) * | 2023-09-08 | 2023-10-13 | 中国标准化研究院 | Efficacy analysis system based on man-machine interaction |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||