CN111887803A - Multi-dimensional monitoring and evaluation system for man-machine work efficiency of aircraft cockpit

Multi-dimensional monitoring and evaluation system for man-machine work efficiency of aircraft cockpit

Info

Publication number
CN111887803A
CN111887803A (application CN202010813551.5A)
Authority
CN
China
Prior art keywords
unit
human
machine
diagnosis
work efficiency
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010813551.5A
Other languages
Chinese (zh)
Other versions
CN111887803B (en)
Inventor
王臻
陆燕玉
傅山
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Jiaotong University
Original Assignee
Shanghai Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Jiaotong University
Priority to CN202010813551.5A
Publication of CN111887803A
Application granted
Publication of CN111887803B
Legal status: Active

Classifications

    • A61B3/113: Apparatus for testing the eyes; objective instruments for determining or recording eye movement
    • A61B3/112: Objective instruments for measuring the diameter of pupils
    • A61B5/0205: Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A61B5/0245: Detecting, measuring or recording pulse rate or heart rate by using sensing means generating electric signals, i.e. ECG signals
    • A61B5/08: Detecting, measuring or recording devices for evaluating the respiratory organs
    • A61B5/0816: Measuring devices for examining respiratory frequency
    • B64D43/00: Arrangements or adaptations of instruments
    • B64D43/02: Arrangements or adaptations of instruments for indicating aircraft speed or stalling conditions
    • G01D21/02: Measuring two or more variables by means not covered by a single other subclass
    • G06F18/25: Pattern recognition; analysing; fusion techniques
    • G06V40/18: Recognition of eye characteristics, e.g. of the iris
    • G06V40/19: Sensors therefor
    • A61B2503/22: Evaluating a particular type of persons: motor vehicle operators, e.g. drivers, pilots, captains

Abstract

A multi-dimensional monitoring and evaluation system for the human-machine ergonomics of an aircraft cockpit comprises a human-machine state monitoring unit, a feature extraction unit, a multi-channel feature synchronization and normalization unit, an ergonomic factor description unit, a human factor diagnosis unit and a comprehensive human-machine ergonomics evaluation unit. The human-machine state monitoring unit feeds raw human-machine state measurements to the feature extraction unit; the feature extraction unit passes its feature extraction results to the ergonomic factor description unit; the multi-channel feature synchronization and normalization unit time-aligns and normalizes the outputs of all feature extraction modules and groups the features according to their formation mechanism; the ergonomic factor description unit supplies multi-dimensional ergonomic factor descriptions to the human factor diagnosis unit and to the comprehensive evaluation unit; the human factor diagnosis unit provides the user with the results of several human factor diagnosis indices; and the comprehensive evaluation unit computes a continuous, quantitative overall evaluation of human-machine ergonomics. The invention remedies the tendency of conventional human-machine ergonomics research to emphasize decision making while neglecting diagnosis, and provides an effective objective human factor analysis tool for design optimization and risk tracing.

Description

Multi-dimensional monitoring and evaluation system for man-machine work efficiency of aircraft cockpit
Technical Field
The invention relates to a technology in the field of aviation human factor safety, and in particular to a multi-dimensional monitoring and evaluation system for the human-machine ergonomics of an aircraft cockpit.
Background
Accidents caused by flight crew operational errors account for roughly 60%-80% of all civil aviation flight accidents. Evaluating cockpit human-machine ergonomics is therefore essential for optimizing cockpit human-machine interface design, examining the human factor clauses in airworthiness certification, detecting human factor risks during operation, and ensuring flight safety.
Investigation shows that current approaches to evaluating the human-machine ergonomics of aircraft cockpits suffer from several limitations. First, existing anthropometry-based evaluation methods are local and static: they can satisfy reachability requirements but can hardly assess a pilot's cognitive state and mental workload during a task. Second, common subjective methods such as NASA-TLX yield poorly consistent results, lack persuasiveness, and cannot measure continuously during a task. Third, existing evaluation methods provide no diagnostic information to guide design optimization.
Disclosure of Invention
To address these shortcomings, the invention provides a multi-dimensional monitoring and evaluation system for the human-machine ergonomics of an aircraft cockpit. Objective measurement techniques monitor the human-machine state continuously and from multiple perspectives during human-machine interaction, and mathematical-statistical methods fuse the multi-source ergonomic features, yielding an objective, comprehensive evaluation of human-machine ergonomics over the course of a task; this resolves the over-reliance of existing ergonomics evaluation on static measurement and subjective impressions. In addition, ergonomic diagnosis indices are constructed to analyze and localize the source of ergonomic problems, remedying the tendency of conventional research to emphasize decision making while neglecting diagnosis and providing an effective objective human factor analysis tool for design optimization and risk tracing.
The invention is realized by the following technical scheme:
the invention relates to a multi-dimensional monitoring and evaluating system for man-machine work efficiency of an aircraft cockpit, which comprises: the system comprises a human-machine state monitoring unit, a feature extraction unit, a multi-channel feature synchronous standardized processing unit, an ergonomic factor description unit, a human-machine diagnosis unit and a human-machine ergonomic comprehensive evaluation unit, wherein: the human-machine state monitoring unit is connected with the feature extraction unit through outputting original human-machine state measurement data, the feature extraction unit is connected with the human-machine work efficiency element description unit through outputting a feature extraction result, the multi-channel feature synchronous normalized processing unit performs time alignment processing and feature index normalized processing on output results of all modules of the feature extraction unit and groups features according to a feature formation mechanism, the human-machine work efficiency element description unit is connected with the human-machine diagnosis unit and the human-machine work efficiency comprehensive evaluation unit through outputting multi-dimensional work efficiency element description results, the human-machine diagnosis unit provides the results of a plurality of human-machine diagnosis indexes for a user, and the human-machine work efficiency comprehensive evaluation unit calculates and generates a continuous and quantitative human-machine work efficiency overall evaluation result.
The human-machine state monitoring unit comprises a mainstream head-mounted eye tracker, a chest-band cardiopulmonary activity measurement system, cockpit control recording equipment and flight parameter recording equipment, wherein: the eye tracker collects the pilot's gaze-point coordinates and pupil diameter; the cardiopulmonary activity measurement system collects the pilot's heart rate, respiratory frequency and respiratory amplitude; the cockpit control recording equipment collects position measurements of the control stick, throttle lever and pedals; and the flight parameter recording equipment collects the aircraft's latitude/longitude, altitude, airspeed and acceleration measurements.
The feature extraction unit comprises: an eye movement feature extraction module, a control feature extraction module, a physiological feature extraction module and a flight task feature extraction module.
The ergonomic factor description unit comprises: a visual activity description module, a control activity description module, a workload description module and a task performance description module.
The human factor diagnosis unit comprises: a visual perception difficulty diagnosis module, a control difficulty diagnosis module, an information significance diagnosis module and a control efficiency diagnosis module.
Technical effects
The invention, taken as a whole, solves the problem of evaluating cockpit human-machine ergonomics comprehensively, quantitatively, continuously and objectively during a flight task, and of localizing the source of ergonomic problems from the design perspective.
Compared with the prior art, the invention uses cognitive-psychology mechanisms to define the factors to be examined in human-machine ergonomics evaluation; it uses continuous objective measurement, quantitative feature extraction and multi-source feature fusion to describe and objectively evaluate the cockpit's ergonomic factors quantitatively throughout a task, addressing the field's current reliance on static measurement and subjective evaluation and its lack of continuous objective methods; and it exploits the correlations between evaluation dimensions to construct ergonomic diagnosis indices, providing targeted auxiliary information for design optimization and remedying the neglect of diagnosis in conventional ergonomics analysis.
Drawings
FIG. 1 is a schematic diagram of a multi-dimensional monitoring and evaluation system for man-machine efficiency of an aircraft cockpit;
FIG. 2 is a schematic diagram of an objective human-machine efficiency evaluation characteristic index system of an aircraft cockpit;
FIG. 3 is a multi-dimensional graphical presentation of a comprehensive evaluation result of human-machine ergonomics of an aircraft cockpit constructed in accordance with the present invention.
Detailed Description
As shown in fig. 1, the present embodiment relates to a multi-dimensional monitoring and evaluation system for the human-machine ergonomics of an aircraft cockpit, comprising: a human-machine state monitoring unit, a feature extraction unit, a multi-channel feature synchronization and normalization unit, an ergonomic factor description unit, a human factor diagnosis unit and a comprehensive human-machine ergonomics evaluation unit.
The embodiment also relates to a human-machine ergonomics monitoring and evaluation method using the system, comprising the following steps: human-machine state monitoring during the flight task, primary measurement data processing and ergonomic feature extraction, ergonomic factor calculation, diagnosis index calculation, comprehensive human-machine ergonomics evaluation, and presentation of the evaluation results.
The head-mounted eye tracker in the human-machine state monitoring unit locates the pupil in the eye-camera image and measures its size using a target detection method, computes the gaze direction, and finally outputs the pixel coordinates of the gaze point in the forward scene image together with the pupil diameter value.
The chest-band cardiopulmonary activity measurement system in the human-machine state monitoring unit records the subject's electrocardiogram through an ECG sensor, while changes in the voltage across a pressure-sensitive sensor in the chest band reflect the changes in chest contour caused by respiration.
The cockpit control recording equipment in the human-machine state monitoring unit collects the deflection angles of the control stick, throttle lever and pedals relative to their neutral positions.
The flight parameter recording equipment in the human-machine state monitoring unit acquires and outputs the aircraft's latitude/longitude, altitude, airspeed and acceleration values.
The eye movement feature extraction module in the feature extraction unit computes the gaze-point velocity from the gaze-point coordinates and sampling times: samples with gaze velocity below 30 pixels/second are recorded as fixation activity, and samples above 30 pixels/second as saccade activity. The length of each contiguous fixation sequence gives the value of the fixation-duration feature, and the number of saccade events per unit time gives the value of the saccade-frequency feature. Zero values detected in the pupil-diameter sequence are recorded as blinks, and the number of blink segments per unit time gives the value of the blink-frequency feature.
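A minimal Python sketch of this velocity-threshold classification follows; only the 30 pixels/second threshold and the zero-pupil blink rule come from the description, while the function name, the 60 Hz sampling rate and the handling of run boundaries are illustrative assumptions.

```python
import numpy as np

def eye_features(gaze_xy, pupil_diam, fs=60.0, vel_thresh=30.0):
    """Classify gaze samples into fixations/saccades and detect blinks.

    gaze_xy    : (N, 2) array of gaze-point pixel coordinates
    pupil_diam : (N,) array of pupil diameters (0 indicates a lost pupil, i.e. a blink)
    fs         : sampling frequency in Hz (assumed, not specified in the patent)
    vel_thresh : velocity threshold in pixels/second (30 px/s per the description)
    """
    # Gaze velocity between consecutive samples, in pixels/second
    vel = np.linalg.norm(np.diff(gaze_xy, axis=0), axis=1) * fs
    is_fix = vel < vel_thresh                       # fixation samples
    # Length of each contiguous fixation run -> fixation-duration feature
    fix_durations, run = [], 0
    for f in is_fix:
        if f:
            run += 1
        elif run:
            fix_durations.append(run / fs)
            run = 0
    if run:
        fix_durations.append(run / fs)
    duration_s = len(gaze_xy) / fs
    # Fixation-to-saccade transitions per unit time as the saccade-frequency feature
    saccade_rate = np.sum(np.diff(is_fix.astype(int)) == -1) / duration_s
    # Blink rate: number of contiguous zero-pupil segments per unit time
    blink_rate = np.sum(np.diff((pupil_diam == 0).astype(int)) == 1) / duration_s
    mean_fix = np.mean(fix_durations) if fix_durations else 0.0
    return mean_fix, saccade_rate, blink_rate
```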
The control feature extraction module in the feature extraction unit differentiates the time series of control stick, throttle lever and pedal angles to obtain the aileron, elevator, rudder and throttle control-rate features.
The physiological feature extraction module in the feature extraction unit detects R waves in the electrocardiogram and computes the heart-rate feature as the reciprocal of the R-R interval. The amplitude of the pressure-sensitive sensor voltage serves as the respiration-depth feature; the maxima of the chest-circumference waveform are found where the first derivative of the voltage waveform is 0 and the second derivative is negative, and the respiration-frequency feature is the reciprocal of the time between two successive maxima.
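The sketch below illustrates one way to compute these features; the simple peak-based R-wave detector and the 250 Hz sampling rate are assumptions, since the patent does not specify a particular detector.

```python
import numpy as np
from scipy.signal import find_peaks

def physiological_features(ecg, chest_voltage, fs=250.0):
    """Heart rate from R-R intervals and respiration features from a chest-band signal."""
    # R-wave detection: prominent peaks at least 0.4 s apart (assumed detector)
    r_peaks, _ = find_peaks(ecg, distance=int(0.4 * fs), prominence=np.std(ecg))
    rr = np.diff(r_peaks) / fs                  # R-R intervals in seconds
    heart_rate = 60.0 / rr                      # instantaneous heart rate, beats per minute

    # Respiration: local maxima of the chest-band voltage (first derivative 0, second < 0)
    breath_peaks, _ = find_peaks(chest_voltage)
    resp_rate = 1.0 / (np.diff(breath_peaks) / fs)   # breaths per second
    resp_depth = np.ptp(chest_voltage)               # peak-to-peak amplitude as depth proxy
    return heart_rate, resp_rate, resp_depth
```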
The flight task feature extraction module in the feature extraction unit obtains the flight-path deviation feature from the deviation of the actual latitude/longitude from the planned track, the altitude deviation feature from the difference between actual and planned altitude, the speed deviation feature from the difference between actual and planned airspeed, and the acceleration feature from the magnitude of the acceleration vector.
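A short illustrative sketch of these deviation features; the Euclidean latitude/longitude distance used here is an assumed stand-in for whatever track-deviation metric the patent intends.

```python
import numpy as np

def task_features(actual_latlon, planned_latlon, actual_alt, planned_alt,
                  actual_spd, planned_spd, accel_xyz):
    """Deviation features between flown and planned trajectories (illustrative only)."""
    path_dev  = np.linalg.norm(actual_latlon - planned_latlon, axis=1)  # track deviation
    alt_dev   = actual_alt - planned_alt                                # altitude deviation
    speed_dev = actual_spd - planned_spd                                # airspeed deviation
    accel_mag = np.linalg.norm(accel_xyz, axis=1)                       # acceleration magnitude
    return path_dev, alt_dev, speed_dev, accel_mag
```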
The multi-channel feature synchronization and normalization unit synchronizes the multi-channel feature data by searching and matching the timestamps of each feature stream, and normalizes every feature by computing its z-score to eliminate dimensional differences between features. The features are then grouped according to their formation mechanism to build an objective human-machine ergonomics evaluation feature index system, as shown in figure 2.
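A minimal sketch of the timestamp alignment and z-score step, assuming the feature streams arrive as pandas DataFrames indexed by timestamp; the 20 Hz resampling grid is inferred from the 20 Hz evaluation rate mentioned later in the description and is not stated here.

```python
import pandas as pd

def synchronize_and_normalize(channels):
    """Align feature channels on a common timestamp grid and z-score each feature.

    channels : dict of DataFrames, each indexed by a DatetimeIndex with distinct feature columns.
    """
    grid = "50ms"  # 20 Hz grid (assumed)
    aligned = [df.resample(grid).mean().interpolate() for df in channels.values()]
    merged = pd.concat(aligned, axis=1).dropna()
    # z-score normalization removes dimensional differences between features
    return (merged - merged.mean()) / merged.std(ddof=0)
```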
The visual activity description module in the ergonomic factor description unit fuses the fixation-duration, saccade-frequency and blink-frequency features by principal component analysis. The eigenvalues and eigenvectors of the matrix formed by these feature data are computed first, and the original data are projected onto the eigenvector directions to obtain the principal component scores. The eigenvalues are then normalized to give the explained-variance ratio of each principal component, and the principal components are weighted and summed by these ratios to obtain the visual activity description index.
The control activity description module in the ergonomic factor description unit fuses the aileron, elevator, rudder and throttle control-rate features by the same principal component analysis procedure, weighting the principal components by their explained-variance ratios to obtain the control activity description index.
The workload description module in the ergonomic factor description unit fuses the heart-rate, respiration-amplitude and pupil-diameter features in the same way to obtain the workload description index.
The task performance description module in the ergonomic factor description unit fuses the flight-path deviation, altitude deviation, speed deviation and acceleration features in the same way to obtain the task performance description index.
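The four description modules above share one fusion rule: project the features onto their principal components and weight each component by its explained-variance ratio. A minimal sketch under the assumption of z-scored input features follows; the exact formula is not given in the patent.

```python
import numpy as np

def pca_fuse(features):
    """Fuse z-scored features into a single description index by PCA.

    features : (N, k) array, rows are time samples, columns are features.
    Returns an (N,) index: principal-component scores weighted by explained-variance ratios.
    """
    cov = np.cov(features, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)            # ascending eigenvalue order
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    scores = (features - features.mean(axis=0)) @ eigvecs   # principal-component scores
    weights = eigvals / eigvals.sum()                 # explained-variance ratios
    return scores @ weights                           # weighted sum -> description index
```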
And a visual perception difficulty diagnosis module in the human factor diagnosis unit calculates a Pearson correlation coefficient between the visual activity description index and the workload description index in a sliding window of 30 seconds to obtain a visual perception difficulty diagnosis result.
And a control difficulty diagnosis module in the human factor diagnosis unit calculates a Pearson correlation coefficient between the control activity description index and the workload description index in a sliding window of 30 seconds to obtain a control difficulty diagnosis result.
And an information significance diagnosis module in the human factor diagnosis unit obtains an information significance diagnosis result by calculating a Pearson correlation coefficient between the visual activity description index and the task performance description index in a sliding window of 30 seconds.
And a control efficiency diagnosis module in the human factor diagnosis unit obtains a control efficiency diagnosis result by calculating a Pearson correlation coefficient between the control activity description index and the task performance description index in a sliding window of 30 seconds.
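Each diagnosis module thus reduces to a windowed Pearson correlation between two description indices. A minimal sketch follows; only the 30-second window comes from the description, while the 20 Hz index rate is an assumption.

```python
import numpy as np

def sliding_pearson(x, y, fs=20.0, window_s=30.0):
    """Pearson correlation between two description indices in a sliding window.

    x, y     : 1-D arrays of equal length (two description indices)
    fs       : index sampling rate in Hz (assumed)
    window_s : window length in seconds (30 s per the description)
    """
    win = int(window_s * fs)
    out = np.full(len(x), np.nan)
    for i in range(win, len(x) + 1):
        out[i - 1] = np.corrcoef(x[i - win:i], y[i - win:i])[0, 1]
    return out

# e.g. visual perception difficulty ~ sliding_pearson(visual_activity, workload)
#      control difficulty          ~ sliding_pearson(control_activity, workload)
```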
The comprehensive human-machine ergonomics evaluation unit fuses the visual activity, control activity, workload and task performance description results. The eigenvalues and eigenvectors of the matrix formed by these data are computed first, and the data are projected onto the eigenvector directions to obtain the principal components; the eigenvalues are then normalized to give the explained-variance ratio of each principal component, and the principal components are weighted and summed by these ratios to obtain the comprehensive human-machine ergonomics evaluation index.
To give designers, evaluators and analysts an intuitive and detailed view of the ergonomic evaluation and diagnosis results, this embodiment constructs a graphical presentation of the evaluation results (see fig. 3). The graphic has 8 directions: the horizontal and vertical directions carry the 4 ergonomic factors (visual activity, control activity, workload, task performance), and the diagonal directions carry the 4 diagnostic indices (visual perception difficulty, control difficulty, information significance, control efficiency). With this graphic, the human-machine ergonomics at any moment of the flight task can be examined in detail and at a glance; a minimal plotting sketch follows.
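The sketch below draws such an eight-direction presentation with matplotlib; the ordering of factors and diagnostic indices around the polygon is an assumption based on the horizontal/vertical versus diagonal placement described above, and may differ from the patent's actual figure.

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_ergonomics_octagon(values, labels):
    """Eight-direction presentation: four ergonomic factors and four diagnostic indices."""
    angles = np.linspace(0, 2 * np.pi, len(values), endpoint=False)
    fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
    ax.plot(np.append(angles, angles[0]), np.append(values, values[0]))
    ax.fill(np.append(angles, angles[0]), np.append(values, values[0]), alpha=0.25)
    ax.set_xticks(angles)
    ax.set_xticklabels(labels)
    plt.show()

# Assumed ordering: factors on the axes, diagnostics on the diagonals in between
labels = ["visual activity", "visual perception difficulty", "workload",
          "control difficulty", "control activity", "control efficiency",
          "task performance", "information significance"]
# example: plot_ergonomics_octagon(np.random.rand(8), labels)
```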
This embodiment is the first to define human-machine ergonomics in terms of four ergonomic factors, namely visual perception activity, control activity, workload and task performance. These factors are measured with objective measurement techniques, an objective ergonomics evaluation feature index system is built through data processing and feature extraction, and the feature indices are finally grouped and synthesized hierarchically with multivariate statistical methods, achieving an objective, comprehensive evaluation of aircraft cockpit human-machine ergonomics. A further key innovation of this embodiment is its use of the correlations between evaluation dimensions: the introduced diagnosis indices help designers and testers locate the source of ergonomic problems so that they can be improved in a targeted way, overcoming the weak diagnostic capability of existing ergonomics evaluation methods.
The system and method constructed in this embodiment were applied to real piloting tasks. Experimental data show that the embodiment distinguishes ergonomic differences caused by different task difficulties, different levels of flight experience and different cockpit designs.
Compared with the prior art, the performance improvements of this embodiment are as follows: unlike the most common subjective evaluation methods, it provides an objective, quantitative evaluation unaffected by the evaluator's subjective judgment; drawing on cognitive science, it monitors and evaluates cockpit human-machine ergonomics from multiple aspects, giving more comprehensive and effective results than existing methods; and it evaluates the task process continuously and dynamically, producing evaluation results at 20 Hz, a higher temporal resolution than existing methods, which provides more detailed information about ergonomic changes during the task.
The foregoing embodiments may be modified in many different ways by those skilled in the art without departing from the spirit and scope of the invention, which is defined by the appended claims and all changes that come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.

Claims (6)

1. A multi-dimensional monitoring and evaluation system for the human-machine ergonomics of an aircraft cockpit, characterized in that it comprises: a human-machine state monitoring unit, a feature extraction unit, a multi-channel feature synchronization and normalization unit, an ergonomic factor description unit, a human factor diagnosis unit and a comprehensive human-machine ergonomics evaluation unit, wherein: the human-machine state monitoring unit outputs raw human-machine state measurements to the feature extraction unit; the feature extraction unit outputs its feature extraction results to the ergonomic factor description unit; the multi-channel feature synchronization and normalization unit time-aligns and normalizes the outputs of all modules of the feature extraction unit and groups the features according to their formation mechanism; the ergonomic factor description unit outputs multi-dimensional ergonomic factor descriptions to the human factor diagnosis unit and the comprehensive evaluation unit; the human factor diagnosis unit provides the user with the results of several human factor diagnosis indices; and the comprehensive evaluation unit computes a continuous, quantitative overall evaluation of human-machine ergonomics.
2. The multi-dimensional monitoring and evaluation system for the human-machine ergonomics of an aircraft cockpit according to claim 1, characterized in that the human-machine state monitoring unit comprises a mainstream head-mounted eye tracker, a chest-band cardiopulmonary activity measurement system, cockpit control recording equipment and flight parameter recording equipment, wherein: the eye tracker collects the pilot's gaze-point coordinates and pupil diameter; the cardiopulmonary activity measurement system collects the pilot's heart rate, respiratory frequency and respiratory amplitude; the cockpit control recording equipment collects position measurements of the control stick, throttle lever and pedals; and the flight parameter recording equipment collects the aircraft's latitude/longitude, altitude, airspeed and acceleration measurements.
3. The multi-dimensional monitoring and evaluation system for the human-machine ergonomics of an aircraft cockpit according to claim 1, characterized in that the feature extraction unit comprises: an eye movement feature extraction module, a control feature extraction module, a physiological feature extraction module and a flight task feature extraction module.
4. The multi-dimensional monitoring and evaluation system for the human-machine ergonomics of an aircraft cockpit according to claim 1, characterized in that the ergonomic factor description unit comprises: a visual activity description module, a control activity description module, a workload description module and a task performance description module.
5. The multi-dimensional monitoring and evaluation system for the human-machine ergonomics of an aircraft cockpit according to claim 1, characterized in that the human factor diagnosis unit comprises: a visual perception difficulty diagnosis module, a control difficulty diagnosis module, an information significance diagnosis module and a control efficiency diagnosis module.
6. A human-machine ergonomics monitoring and evaluation method based on the system of any one of the preceding claims, comprising: human-machine state monitoring during a flight task, primary measurement data processing and ergonomic feature extraction, ergonomic factor calculation, diagnosis index calculation, comprehensive human-machine ergonomics evaluation calculation and evaluation result presentation;
a head-mounted eye tracker in the man-machine state monitoring unit positions the pupil position and detects the pupil size from the image through an eye camera and a target detection method, calculates the sight direction, and finally outputs the pixel coordinate of a fixation point in a foreground image and the pupil diameter value;
a chest belt type cardiopulmonary activity measuring system in the human-computer state monitoring unit measures electrocardiogram data of a tested person through an ECG sensor, and changes of the chest contour caused by respiration are reflected through changes of voltage values on a pressure-sensitive sensor in a chest belt;
a cockpit control recording device in the man-machine state monitoring unit acquires deflection angle numerical values of a flight rocker, an accelerator lever and a pedal relative to a neutral position;
a flight parameter recording device in the man-machine state monitoring unit collects and outputs longitude and latitude, altitude, airspeed and acceleration values of the airplane;
an eye movement feature extraction module in the feature extraction unit calculates the gaze-point velocity from the gaze-point coordinates and sampling times; samples with gaze velocity below 30 pixels/second are recorded as fixation activity and samples above 30 pixels/second as saccade activity; the length of each fixation sequence gives the value of the fixation-duration feature, and the number of saccade events per unit time gives the value of the saccade-frequency feature; zero values detected in the pupil-diameter sequence are recorded as blinks, and the number of blink segments per unit time gives the value of the blink-frequency feature;
a control feature extraction module in the feature extraction unit performs differential processing on angles of a flight rocker, an accelerator lever and a pedal in a time sequence to obtain values of characteristics of an aileron control speed, an elevator control speed, a rudder control speed and an accelerator control speed;
a physiological characteristic extraction module in the characteristic extraction unit detects R waves from electrocardiogram data, and the value of the heart rate characteristic is calculated through the reciprocal of an R-R interval; the amplitude of the voltage value on the pressure-sensitive sensor is used as the value of the respiration depth characteristic, the maximum value in the chest circumference waveform is obtained by detecting the position on the pressure-sensitive sensor where the first derivative is 0 and the second derivative is less than 0, and the respiration frequency characteristic is obtained by calculating the reciprocal of the interval time between two maximum values;
a flight task feature extraction module in the feature extraction unit obtains a value of flight path deviation features by calculating the deviation between actual flight path longitude and latitude and a planned flight path, obtains a value of altitude deviation features by calculating the difference between actual flight altitude and planned flight altitude, obtains a value of speed deviation features by calculating the difference between actual airspeed and planned airspeed, and obtains a value of acceleration features by calculating an acceleration vector model value;
the multichannel feature synchronization normalization processing unit realizes multichannel feature data synchronization by searching and matching timestamps of various feature data, and performs normalization processing on each feature by calculating z-score to eliminate dimension difference between the features; grouping the characteristics according to the mechanism of the characteristics, and constructing a human-machine work efficiency objective evaluation characteristic index system;
a visual activity description module in the work efficiency element description unit fuses the fixation time, the saccade frequency and the blink frequency characteristics by using a principal component analysis method; firstly, calculating an eigenvalue and an eigenvector of a matrix formed by the above characteristic data, and projecting the original data to the direction of the eigenvector to obtain each principal component data; then, normalizing the characteristic values to represent the variance interpretation rates of the principal components, and weighting and summing the principal components by using the variance interpretation rates to obtain a result of the visual activity description index;
a control activity description module in the work efficiency element description unit fuses characteristics of aileron control rate, elevator control rate, rudder control rate and accelerator control rate by using a principal component analysis method; firstly, calculating an eigenvalue and an eigenvector of a matrix formed by the above characteristic data, and projecting the original data to the direction of the eigenvector to obtain each principal component data; then, normalization processing is carried out on the characteristic values, the characteristic values are used for representing the variance interpretation rate of the principal component, and the weight and summation are carried out on each principal component by utilizing the variance interpretation rate to obtain a result of controlling the activity description index;
a workload description module in the work efficiency element description unit fuses heart rate, respiratory amplitude and pupil diameter characteristics by using a principal component analysis method; firstly, calculating an eigenvalue and an eigenvector of a matrix formed by the above characteristic data, and projecting the original data to the direction of the eigenvector to obtain each principal component data; then, normalization processing is carried out on the characteristic values, the characteristic values are used for representing the variance interpretation rate of the principal component, and the weight and summation are carried out on each principal component by utilizing the variance interpretation rate to obtain the result of the workload description index;
a task performance description module in the work efficiency element description unit fuses flight path deviation, altitude deviation, speed deviation and acceleration characteristics by using a principal component analysis method; firstly, calculating an eigenvalue and an eigenvector of a matrix formed by the above characteristic data, and projecting the original data to the direction of the eigenvector to obtain each principal component data; then, normalization processing is carried out on the characteristic values, the characteristic values are used for representing the variance interpretation rate of the principal component, and weighting and summing are carried out on each principal component by utilizing the variance interpretation rate to obtain a result of the task performance description index;
a visual perception difficulty diagnosis module in the human factor diagnosis unit calculates a Pearson correlation coefficient between a visual activity description index and a workload description index in a sliding window of 30 seconds to obtain a visual perception difficulty diagnosis result;
a control difficulty diagnosis module in the human factor diagnosis unit calculates a Pearson correlation coefficient between the control activity description index and the workload description index in a sliding window of 30 seconds to obtain a control difficulty diagnosis result;
an information significance diagnosis module in the human factor diagnosis unit obtains an information significance diagnosis result by calculating a Pearson correlation coefficient between the visual activity description index and the task performance description index in a sliding window of 30 seconds;
a control efficiency diagnosis module in the human factor diagnosis unit calculates a Pearson correlation coefficient between the control activity description index and the task performance description index in a sliding window of 30 seconds to obtain a control efficiency diagnosis result;
the man-machine work efficiency comprehensive evaluation unit fuses visual activity description results, control activity description results, workload description results and task performance description result characteristics; firstly, calculating the eigenvalue and the eigenvector of a matrix formed by the data, and projecting the original data to the direction of the eigenvector to obtain each principal component data; and then, carrying out normalization processing on the characteristic values to represent the variance interpretation rate of the main components, and weighting and summing the main components by using the variance interpretation rate to obtain the result of the man-machine work efficiency comprehensive evaluation index.
CN202010813551.5A 2020-08-13 2020-08-13 Multi-dimensional monitoring and evaluating system for human-machine work efficiency of aircraft cockpit Active CN111887803B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010813551.5A CN111887803B (en) 2020-08-13 2020-08-13 Multi-dimensional monitoring and evaluating system for human-machine work efficiency of aircraft cockpit

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010813551.5A CN111887803B (en) 2020-08-13 2020-08-13 Multi-dimensional monitoring and evaluating system for human-machine work efficiency of aircraft cockpit

Publications (2)

Publication Number Publication Date
CN111887803A true CN111887803A (en) 2020-11-06
CN111887803B CN111887803B (en) 2024-01-26

Family

ID=73230339

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010813551.5A Active CN111887803B (en) 2020-08-13 2020-08-13 Multi-dimensional monitoring and evaluating system for human-machine work efficiency of aircraft cockpit

Country Status (1)

Country Link
CN (1) CN111887803B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113655883A (en) * 2021-08-17 2021-11-16 中国人民解放军军事科学院战争研究院 Human-computer interface eye movement interaction mode ergonomics experimental analysis system and method
CN116196002A (en) * 2023-04-28 2023-06-02 中国人民解放军空军特色医学中心 Method and system for evaluating physiological stress of pilot in air

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130268146A1 (en) * 2012-04-04 2013-10-10 Eurocopter Method and a device for adapting the man-machine interface of an aircraft depending on the level of the pilot's functional state
CN106022631A (en) * 2016-05-30 2016-10-12 南京航空航天大学 Index weight analysis method
KR20180134310A (en) * 2017-06-08 2018-12-18 고려대학교 산학협력단 Appratus for controlling integrated supervisory of pilots status and method for guiding task performance ability of pilots using the same
US20190090800A1 (en) * 2017-09-22 2019-03-28 Aurora Flight Sciences Corporation Systems and Methods for Monitoring Pilot Health

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130268146A1 (en) * 2012-04-04 2013-10-10 Eurocopter Method and a device for adapting the man-machine interface of an aircraft depending on the level of the pilot's functional state
CN106022631A (en) * 2016-05-30 2016-10-12 南京航空航天大学 Index weight analysis method
KR20180134310A (en) * 2017-06-08 2018-12-18 고려대학교 산학협력단 Appratus for controlling integrated supervisory of pilots status and method for guiding task performance ability of pilots using the same
US20190090800A1 (en) * 2017-09-22 2019-03-28 Aurora Flight Sciences Corporation Systems and Methods for Monitoring Pilot Health

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Liu Wei, Yuan Xiugan, Liu Zhongqi, Kang Weiyong, Ma Rui: "Comprehensive evaluation indexes and evaluation methods for the adaptability of human-machine display/control interfaces", China Safety Science Journal, no. 04, pages 32-35 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113655883A (en) * 2021-08-17 2021-11-16 中国人民解放军军事科学院战争研究院 Human-computer interface eye movement interaction mode ergonomics experimental analysis system and method
CN113655883B (en) * 2021-08-17 2022-10-14 中国人民解放军军事科学院战争研究院 Human-computer interface eye movement interaction mode ergonomics experimental analysis system and method
CN116196002A (en) * 2023-04-28 2023-06-02 中国人民解放军空军特色医学中心 Method and system for evaluating physiological stress of pilot in air

Also Published As

Publication number Publication date
CN111887803B (en) 2024-01-26

Similar Documents

Publication Publication Date Title
Gateau et al. In silico vs. over the clouds: on-the-fly mental state estimation of aircraft pilots, using a functional near infrared spectroscopy based passive-BCI
CN111887803B (en) Multi-dimensional monitoring and evaluating system for human-machine work efficiency of aircraft cockpit
Kuravsky et al. Mathematical foundations of flight crew diagnostics based on videooculography data
CN103431859A (en) Experimental method for determining brain load in multitask visual cognition
CN111951637A (en) Task scenario-related unmanned aerial vehicle pilot visual attention distribution mode extraction method
Gao et al. Effects of mental workload and risk perception on pilots’ safety performance in adverse weather contexts
CN115191018A (en) Evaluation of a person or system by measuring physiological data
Li et al. Estimation of cognitive workload by approximate entropy of EEG
Jiang et al. Transformer network intelligent flight situation awareness assessment based on pilot visual gaze and operation behavior data
Shao et al. The influence of pilot's attention allocation on instrument reading during take-off: The mediating effect of attention span
Mengtao et al. Leveraging eye-tracking technologies to promote aviation safety-a review of key aspects, challenges, and future perspectives
Wang et al. Research on influencing factor selection of pilot’s intention
CN102551741A (en) Experimental system for measuring brain load in multi-task visual cognition and method
Xinyao et al. Measuring the Situation Awareness of Tower Controllers by Using Eye Movement Analysis.
Alaimo et al. Human heart-related indexes behavior study for aircraft pilots allowable workload level assessment
Jiang et al. Mental workload artificial intelligence assessment of pilots’ EEG based on multi-dimensional data fusion and LSTM with attention mechanism model
CN109145485B (en) Man-machine efficiency testing method and system
Chen et al. A pilot workload evaluation method based on EEG data and physiological data
Wei et al. A theoretical model of mental workload in pilots based on multiple experimental measurements
Sun et al. The Influence of HUD Information Visual Coding on pilot's Situational Awareness
Grandi et al. Application of innovative tools to design ergonomic control dashboards
Shmelova et al. Application Artificial Intelligence for Real-Time Monitoring, Diagnostics, and Correction Human State.
Hebbar et al. Using Eye Tracker To Evaluate Cockpit Design--A Flight Simulation Study
Li et al. Usability evaluation of hybrid 2D-3D visualization tools in basic air traffic control operations
Wang et al. Automatic Landing Control of Aircraft Based on Cognitive Load Theory and DDPG

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant