CN107562202B - Method and device for identifying human errors of process operators based on sight tracking - Google Patents



Publication number
CN107562202B
CN107562202B (application CN201710828111.5A)
Authority
CN
China
Prior art keywords
cognitive state
determining
identified
process operator
percentage
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710828111.5A
Other languages
Chinese (zh)
Other versions
CN107562202A (en
Inventor
胡瑾秋
张来斌
胡静桦
张鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China University of Petroleum Beijing
Original Assignee
China University of Petroleum Beijing
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China University of Petroleum Beijing filed Critical China University of Petroleum Beijing
Priority application: CN201710828111.5A
Publication of application: CN107562202A
Application granted; publication of granted patent: CN107562202B
Legal status: Active

Landscapes

  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application provide a method and a device for identifying human errors of process operators based on gaze tracking. The method comprises the following steps: acquiring eye movement data of a process operator to be identified in real time, the eye movement data comprising the operator's gaze points on the operation interface at each sampling moment; determining the interest area corresponding to each gaze point on the operation interface, and determining the gaze fixation duration percentage of the operator in each interest area on the operation interface; constructing a sample feature matrix from the fixation duration percentages; and determining the difference between the sample feature matrix and a preset cognitive-state feature matrix, and determining the cognitive state of the operator to be identified according to that difference. The method and the device can improve the accuracy of recognizing the cognitive state of a process operator.

Description

Method and device for identifying human errors of process operators based on sight tracking
Technical Field
The application relates to the technical field of oil and gas production accident prevention, in particular to a method and a device for identifying human errors of process operators based on sight tracking.
Background
According to statistics, 70% of chemical accidents are related to human operation errors in the oil and gas production process. Human error mainly comprises mistaken operation and operation in violation of procedures by operators; the consequences range from equipment shutdown and plant shutdown to disasters and major accidents. Therefore, in order to ensure the safe operation of the production process, it is necessary to study the human errors of process operators in chemical production.
Currently, researchers have studied the types of human error and the probability of error from qualitative or quantitative perspectives. Qualitative methods include the safety checklist method and Hazard and Operability (HAZOP) analysis; they mainly analyze the actions of process operators during production and the types of human error that may occur. Quantitative methods mainly use human error probability models to calculate the operating reliability of a process operator in a specific task. However, the above methods depend heavily on the subjective experience of the analyst and lack perception and measurement of human cognitive behavior, so it is difficult for them to accurately identify the abnormal cognitive behavior of a process operator.
Disclosure of Invention
An object of the embodiments of the present application is to provide a method and an apparatus for identifying human errors of process operators based on gaze tracking, so as to improve the accuracy of recognizing the cognitive state of the process operators.
In order to achieve the above object, in one aspect, an embodiment of the present application provides a method for identifying human errors of a process operator based on gaze tracking, including:
acquiring eye movement data of a process operator to be identified in real time, the eye movement data comprising the operator's gaze points on the operation interface at each sampling moment;
determining the interest area corresponding to each gaze point on the operation interface, and determining the gaze fixation duration percentage of the operator to be identified in each interest area on the operation interface;
constructing a sample feature matrix from the gaze fixation duration percentages;
and determining the difference between the sample feature matrix and a preset cognitive-state feature matrix, and determining the cognitive state of the process operator to be identified according to the difference.
Preferably, the acquiring the eye movement data of the process operator to be identified in real time includes:
and acquiring a sight line falling point of the process operator to be identified on the operation interface, which is acquired by the eye tracker in real time, and converting the sight line falling point into coordinate information corresponding to the operation interface.
Preferably, the determining the percentage of the gaze fixation time length of the process operator to be identified in each region of interest on the operation interface includes:
and determining the average value and the standard deviation of the percentage of the sight line fixation time length of the process operator to be identified in each interest area on the operation interface.
Preferably, the cognitive state feature matrix is obtained in advance by:
dividing interest areas of the operation interface, and determining the range of each interest area on the operation interface;
determining, according to historical eye movement data of different process operators, the interest areas on the operation interface corresponding to the gaze points in the historical eye movement data;
determining the average value and the standard deviation of the percentage of the watching duration of the process operating personnel in each interest area under various cognitive states;
establishing a corresponding cognitive-state feature matrix from the mean and standard deviation of the fixation duration percentages of the process operators in each interest area under each cognitive state:

$$Y = \begin{bmatrix} x_{11} & x_{12} & \cdots & x_{1i} \\ x_{21} & x_{22} & \cdots & x_{2i} \\ \vdots & \vdots & & \vdots \\ x_{n1} & x_{n2} & \cdots & x_{ni} \end{bmatrix}$$

where $x_{ni}$ is the fixation duration percentage of the process operator in the $i$-th interest area under the $n$-th cognitive state.
Preferably, the weight matrix of the cognitive-state feature matrix is:

$$W = \begin{bmatrix} w_{11} & w_{12} & \cdots & w_{1i} \\ w_{21} & w_{22} & \cdots & w_{2i} \\ \vdots & \vdots & & \vdots \\ w_{n1} & w_{n2} & \cdots & w_{ni} \end{bmatrix}$$

where $w_{ni}$ is the weight of the element in row $n$, column $i$ of the cognitive-state feature matrix.
Preferably, each weight in the weight matrix is obtained by the following formula:

$$w_{ni} = \frac{V_{ni}}{\sum_{i} V_{ni}}$$

where $V_{ni}$ is the coefficient of variation of the element in row $n$, column $i$ of the cognitive-state feature matrix:

$$V_{ni} = \frac{\sigma_{ni}}{\bar{x}_{ni}}$$

in which $\bar{x}_{ni}$ is the mean fixation duration percentage of process operators in the $i$-th interest area under the $n$-th cognitive state, and $\sigma_{ni}$ is the corresponding standard deviation.
Preferably, the determining the difference between the sample feature matrix and a preset cognitive state feature matrix, and determining the cognitive state of the process operator to be identified according to the difference comprises:
determining the weighted Euclidean distance between the sample characteristic matrix and a preset cognitive state characteristic matrix, and determining the cognitive state of the process operator to be identified according to the weighted Euclidean distance.
Preferably, the weighted Euclidean distance between the sample feature matrix and the preset cognitive-state feature matrix is determined according to the following formula:

$$d_{mn} = \sqrt{\sum_{i} w_{ni}\,(X_{mi} - X_{ni})^{2}}$$

where $d_{mn}$ is the weighted Euclidean distance between the sample feature matrix $m$ and the cognitive-state feature matrix $n$, $w_{ni}$ is the weight of the element in row $n$, column $i$ of the cognitive-state feature matrix, $X_{mi}$ is the fixation duration percentage in the $i$-th interest area under the $m$-th cognitive state in the sample feature matrix, and $X_{ni}$ is the fixation duration percentage in the $i$-th interest area under the $n$-th cognitive state in the cognitive-state feature matrix.
Preferably, the determining the cognitive state of the process operator to be identified according to the weighted euclidean distance includes:
determining weighted Euclidean distances between each column vector in the sample characteristic matrix and each corresponding column vector in the cognitive state characteristic matrix to obtain a weighted Euclidean distance set;
and determining the smallest one in the weighted Euclidean distance set, and determining the cognitive state corresponding to the smallest one as the cognitive state of the process operator to be identified.
On the other hand, the embodiment of the present application further provides a device for identifying human errors of process operators based on gaze tracking, including:
the eye movement data acquisition module is used for acquiring eye movement data of a process operator to be identified in real time; the eye movement data comprise sight line falling points of the process operators to be identified on the operation interface at each sampling moment;
the watching duration acquisition module is used for determining an interest area corresponding to the sight line drop point on the operation interface and determining the sight line watching duration percentage of the process operator to be identified in each interest area on the operation interface;
the feature matrix construction module is used for constructing a sample feature matrix according to the sight gaze duration percentage;
and the cognitive state identification module is used for determining the difference between the sample characteristic matrix and a preset cognitive state characteristic matrix and determining the cognitive state of the process operator to be identified according to the difference.
According to the technical solution provided by the embodiments of the application, the embodiments first acquire, in real time, the eye movement data of the process operator to be identified, the eye movement data comprising the operator's gaze points on the operation interface at each sampling moment; secondly, the interest area corresponding to each gaze point on the operation interface is determined, along with the gaze fixation duration percentage of the operator in each interest area; a sample feature matrix is then constructed from the fixation duration percentages; finally, the difference between the sample feature matrix and a preset cognitive-state feature matrix is determined, and the cognitive state of the operator to be identified is determined from that difference. Because the cognitive-state feature matrix reflects the characteristics of each cognitive state, comparing the sample feature matrix against it allows the cognitive state of the process operator to be identified accurately.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, it is obvious that the drawings in the following description are only some embodiments described in the present application, and for those skilled in the art, other drawings can be obtained according to the drawings without any creative effort. In the drawings:
FIG. 1 is a flow chart illustrating a method for identifying human errors of a process operator based on line-of-sight tracking according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram of interest region division in an embodiment of the present application;
FIG. 3 is a schematic view of eye movement data collected in an embodiment of the present application;
FIG. 4 is a diagram illustrating weighted Euclidean distances obtained in an embodiment of the present application;
fig. 5 is a block diagram of an apparatus for identifying human errors of a process operator based on line-of-sight tracking according to an embodiment of the present disclosure.
Detailed Description
In order to make those skilled in the art better understand the technical solutions in the present application, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In some embodiments of the application, before the cognitive state of a process operator can be identified, a cognitive-state feature matrix characterizing the various cognitive states needs to be established in advance, providing a basis for the subsequent identification. The cognitive states may include a normal state and human error states. Different fields of application define human error differently; for example, the science of safety behavior classifies all human errors into deviation, inattention and error, which in practice may correspond to three common human error states: distraction, high tension and careless operation, respectively.
How to establish the cognitive state feature matrix is explained below.
Firstly, dividing interest areas of an operation interface, and determining the range of each interest area on the operation interface. Since the same human error state may be represented differently in different regions of interest, it is necessary to perform region of interest division on the operation interface to facilitate accurate identification of the human error state type. Generally, the operation interface can be divided into different interest areas according to the functions of the operation interface. Taking a certain operation interface shown in fig. 2 as an example, according to different functions, the operation interface can be divided into a slider interest area, a parameter display interest area, a trend window interest area and an alarm window interest area. After the interest areas are divided, the occupied range of each interest area on the operation interface is determined.
Secondly, according to historical eye movement data of different process operators, the interest areas on the operation interface corresponding to the gaze points in the historical eye movement data are determined. These historical eye movement data may include eye movement data of different process operators in the normal state and in various types of error states. The eye movement data can be acquired with an eye-tracker-based gaze tracking technique; for example, the eye movement of a process operator is tracked by the pupil-corneal reflection method, so that the operator's gaze point on the operation interface is acquired, and the gaze point can be converted into x and y coordinates in the coordinate system of the operation interface.
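The mapping from raw gaze coordinates to interest areas can be sketched as below. The ROI names and pixel rectangles are invented for illustration and do not come from the patent; a real deployment would use the regions of the actual operation interface in fig. 2.

```python
# Hypothetical interest areas as axis-aligned rectangles on the operation
# interface, given as (x_min, y_min, x_max, y_max) in screen pixels.
ROIS = {
    "slider":       (0,    0,   200,  600),
    "parameter":    (200,  0,   800,  400),
    "trend_window": (200,  400, 800,  600),
    "alarm_window": (800,  0,   1024, 600),
}

def roi_of(x, y):
    """Return the name of the interest area containing gaze point (x, y),
    or None if the point falls outside every defined region."""
    for name, (x0, y0, x1, y1) in ROIS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None
```

Each sampled gaze point is then reduced to an ROI label, which is all the later percentage statistics need.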
Then, the mean and standard deviation of the fixation duration percentage of the process operators in each interest area are determined for each cognitive state. Taking the operation interface shown in fig. 2 as an example, with the slider interest area, parameter display interest area, trend window interest area and alarm window interest area as units, the process operators in different cognitive states are grouped into different error modes (i.e., human error state modes): a drowsy operator may be considered distracted, an operator whose actions are not carried out properly may be considered careless, and so on. The fixation time distribution of the process operators in each interest area under each error state can then be calculated group by group, for example the mean $\bar{x}_{ni}$ and the standard deviation $\sigma_{ni}$ of the fixation duration percentages in each interest area under the different cognitive states; $\bar{x}_{12}$, for instance, represents the average fixation duration percentage in the second type of interest area over all process operators in the first type of error state. Here, the fixation duration percentage refers to the percentage of the total operating time spent fixating within each interest area.
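The grouped statistics described above can be sketched as follows. The ROI names, the sampled gaze sequences and the group of two operators are invented for illustration; the patent's real groups would contain many operators per cognitive state.

```python
from statistics import mean, pstdev

def fixation_percentages(samples, rois):
    """samples: one ROI label per equally spaced gaze sample.
    Returns the fraction of total operating time spent in each ROI
    (the 'fixation duration percentage')."""
    total = len(samples)
    return {r: sum(1 for s in samples if s == r) / total for r in rois}

rois = ["slider", "parameter", "trend", "alarm"]

# Two hypothetical operators in the same cognitive state, 10 samples each:
group = [
    fixation_percentages(["slider"] * 6 + ["alarm"] * 4, rois),
    fixation_percentages(["slider"] * 5 + ["alarm"] * 5, rois),
]

# Per-ROI group mean and standard deviation (the x̄_ni and σ_ni of the text):
x_bar = {r: mean(g[r] for g in group) for r in rois}
sigma = {r: pstdev(g[r] for g in group) for r in rois}
```

Repeating this per cognitive-state group yields one row of the cognitive-state feature matrix per state.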
Next, a corresponding cognitive-state feature matrix is established from the mean and standard deviation of the fixation duration percentages of the process operators in each interest area under each cognitive state:

$$Y = \begin{bmatrix} x_{11} & x_{12} & \cdots & x_{1i} \\ x_{21} & x_{22} & \cdots & x_{2i} \\ \vdots & \vdots & & \vdots \\ x_{n1} & x_{n2} & \cdots & x_{ni} \end{bmatrix}$$

Each row of the cognitive-state feature matrix represents a different cognitive state, and each column represents the fixation duration percentage of the operator's gaze in a different interest area; $x_{ni}$ denotes the fixation duration percentage of the process operator in the $i$-th interest area under the $n$-th cognitive state.
Meanwhile, a weight matrix for the elements of the cognitive-state feature matrix can be established:

$$W = \begin{bmatrix} w_{11} & w_{12} & \cdots & w_{1i} \\ w_{21} & w_{22} & \cdots & w_{2i} \\ \vdots & \vdots & & \vdots \\ w_{n1} & w_{n2} & \cdots & w_{ni} \end{bmatrix}$$

where $w_{ni}$ is the weight of the element in row $n$, column $i$ of the cognitive-state feature matrix. Each row of the weight matrix represents a different cognitive state, and each column represents the index importance of the fixation duration in a different interest area under that cognitive state. In an exemplary embodiment of the application, the weight of each element can be determined by the coefficient-of-variation method, whose basic idea is as follows:

In an index system formed by characteristic parameters, the indexes whose values fluctuate more markedly are the indexes that are harder to attain, and they carry a higher degree of importance because they better reflect the differences between the units being evaluated. The embodiment therefore introduces the coefficient of variation of each index,

$$V_{ni} = \frac{\sigma_{ni}}{\bar{x}_{ni}},$$

and measures the degree of difference of each index value by calculating its coefficient of variation. Here $V_{ni}$ is the coefficient of variation of the element in row $n$, column $i$ of the cognitive-state feature matrix; $\bar{x}_{ni}$ is the mean fixation duration percentage of process operators in the $i$-th interest area under the $n$-th cognitive state, and $\sigma_{ni}$ is the corresponding standard deviation. The weight $w_{ni}$ of the element in row $n$, column $i$ of the cognitive-state feature matrix can then be expressed as

$$w_{ni} = \frac{V_{ni}}{\sum_{i} V_{ni}}.$$
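A minimal sketch of the coefficient-of-variation weighting for one row of the feature matrix (one cognitive state); the means and standard deviations below are illustrative numbers, not the patent's statistics.

```python
def cv_weights(means_row, stds_row):
    """Coefficient-of-variation weights for one cognitive state n:
    V_ni = sigma_ni / x̄_ni, then w_ni = V_ni / sum_i V_ni,
    so the weights of the row sum to 1."""
    v = [s / m for m, s in zip(means_row, stds_row)]  # V_ni per interest area
    total = sum(v)
    return [vi / total for vi in v]                   # normalised w_ni

# Hypothetical x̄_ni and σ_ni over four interest areas:
w = cv_weights([0.40, 0.30, 0.20, 0.10], [0.04, 0.06, 0.02, 0.03])
```

Areas whose fixation percentage fluctuates more relative to its mean (here the fourth area, V = 0.3) receive the largest weight.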
On the basis of establishing a cognitive state feature matrix, referring to fig. 1, a method for identifying human errors of a process operator based on gaze tracking according to an embodiment of the present application may include the following steps:
s101, acquiring eye movement data of a process operator to be identified in real time; the eye movement data comprise sight line falling points of the process operators to be identified on the operation interface at each sampling moment.
In some embodiments of the present application, obtaining the eye movement data of the process operator to be identified in real time may include: acquiring, in real time, the operator's gaze point on the operation interface as collected by the eye tracker, and converting the gaze point into coordinate information in the coordinate system of the operation interface. In an exemplary embodiment of the present application, the collected eye movement data are shown in fig. 3.
S102, determining the interest areas corresponding to the sight line falling points on the operation interface, and determining the sight line watching time length percentage of the process operators to be identified in each interest area on the operation interface.
After the interest areas are divided and the plane coordinate system of the operation interface is established, the coordinate range contained in each interest area on the operation interface is determined, so that the interest area to which the corresponding sight line drop point belongs can be easily determined after the coordinate information of the sight line drop point of the process operator to be identified on the operation interface is determined.
S103, constructing a sample characteristic matrix according to the sight line fixation time length percentage.
In some embodiments of the present application, the constructed sample feature matrix and the pre-established cognitive state feature matrix have the same dimension, and the corresponding dimension attributes are the same. Therefore, the step of constructing the sample feature matrix can be referred to the above section of establishing the cognitive state feature matrix.
S104, determining the difference between the sample characteristic matrix and a preset cognitive state characteristic matrix, and determining the cognitive state of the process operator to be identified according to the difference.
In some embodiments of the application, the difference between the sample feature matrix and a preset cognitive state feature matrix is determined, and the cognitive state of the process operator to be identified is determined according to the difference. The difference may be, for example, a weighted euclidean distance between the sample feature matrix and a preset cognitive state feature matrix. Correspondingly, the cognitive state of the process operator to be identified can be determined according to the weighted euclidean distance, and specifically, the method may include:
according to the formula

$$d_{mn} = \sqrt{\sum_{i} w_{ni}\,(X_{mi} - X_{ni})^{2}},$$

determining the weighted Euclidean distances between each column vector in the sample feature matrix and each corresponding column vector in the cognitive-state feature matrix to obtain a set of weighted Euclidean distances (for example, as shown in fig. 4);
and determining the smallest element of the weighted Euclidean distance set, the cognitive state corresponding to which is determined as the cognitive state of the process operator to be identified.
Here, $d_{mn}$ is the weighted Euclidean distance between the sample feature matrix $m$ and the cognitive-state feature matrix $n$, $w_{ni}$ is the weight of the element in row $n$, column $i$ of the cognitive-state feature matrix, $X_{mi}$ is the fixation duration percentage in the $i$-th interest area under the $m$-th cognitive state in the sample feature matrix, and $X_{ni}$ is the fixation duration percentage in the $i$-th interest area under the $n$-th cognitive state in the cognitive-state feature matrix.
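The nearest-state classification by weighted Euclidean distance can be sketched as follows. The state names and every numeric value in Y, W and the sample row are illustrative only; the patent derives them from grouped eye-movement statistics.

```python
import math

def weighted_distance(sample_row, state_row, weights):
    """d_mn = sqrt(sum_i w_ni * (X_mi - X_ni)^2)."""
    return math.sqrt(sum(w * (xm - xn) ** 2
                         for w, xm, xn in zip(weights, sample_row, state_row)))

def classify(sample_row, Y, W, states):
    """Return the cognitive state whose feature row is nearest to the
    sample, along with all distances (cf. fig. 4)."""
    dists = [weighted_distance(sample_row, Y[n], W[n])
             for n in range(len(states))]
    return states[dists.index(min(dists))], dists

states = ["normal", "careless", "distracted", "tense"]
Y = [[0.4, 0.3, 0.2, 0.1],   # hypothetical fixation-percentage profiles
     [0.1, 0.5, 0.3, 0.1],
     [0.2, 0.2, 0.2, 0.4],
     [0.3, 0.1, 0.4, 0.2]]
W = [[0.25, 0.25, 0.25, 0.25]] * 4  # uniform weights for simplicity

state, dists = classify([0.38, 0.31, 0.21, 0.10], Y, W, states)
```

The sample row here is closest to the "normal" profile, so that state is returned.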
In some embodiments of the application, the similarity between the sample feature matrix and the preset cognitive-state feature matrix may be determined instead, and the cognitive state of the process operator identified from that similarity. In contrast to the difference measure, the cognitive state corresponding to the column of the cognitive-state feature matrix with the largest similarity to the corresponding column vector of the sample feature matrix is then determined as the cognitive state of the process operator to be identified.
To facilitate an understanding of the present application, an exemplary embodiment is described below, using the interest area division of the operation interface shown in fig. 2. In fig. 2, V101, V102, V201, V301 and V401 are all valves; P101 is a compressor; M is a motor; HX101 is a condenser; HX102 is a reboiler; and the CSTR is a continuous stirred-tank reactor. The remaining instrument tags (flow meters, thermometers, a concentration meter and a liquid level gauge) appear only as images in the original figure. The cognitive states to be recognized are preset to include careless operation, distraction, high tension, and the like. On the basis of the divided interest areas and these cognitive states, the gaze behaviour of the process operators to be identified is counted group by group, and the fixation duration percentage of each cognitive state in each area and the index importance of the fixation duration in the different areas are calculated. Finally, the cognitive-state feature matrix Y and the weight matrix W of its elements are established from the statistical results.
[The numerical matrices Y and W obtained from the statistics appear as images in the original document.]
After the cognitive-state feature matrix Y and the weight matrix W have been established, the 8 test samples in table 1 below are tested. The true cognitive states during the trial operations are: normal for samples 1 and 5, careless operation for samples 2 and 6, distraction for samples 3 and 7, and high tension for samples 4 and 8.
Table 1. Test samples of operator operating conditions
[The data of table 1 appear as images in the original document.]
Table 2. Identification results for each group of samples (columns 2-5 give the weighted Euclidean distance to each cognitive state; the state with the smallest distance is taken as the identified state)

Sample | Normal | Careless operation | High tension | Distraction | Identified state   | Correct?
1      | 0.0577 | 0.1121             | 0.1028       | 0.1087      | Normal             | Yes
2      | 0.1316 | 0.0260             | 0.1080       | 0.0735      | Careless operation | Yes
3      | 0.1410 | 0.0801             | 0.1093       | 0.0486      | Distraction        | Yes
4      | 0.0539 | 0.1310             | 0.0536       | 0.1090      | High tension       | Yes
5      | 0.0471 | 0.1168             | 0.1019       | 0.1202      | Normal             | Yes
6      | 0.1434 | 0.0505             | 0.1315       | 0.0832      | Careless operation | Yes
7      | 0.1499 | 0.1044             | 0.1099       | 0.0579      | Distraction        | Yes
8      | 0.0689 | 0.1340             | 0.0470       | 0.1120      | High tension       | Yes
The gaze-tracking-based human error identification method described above was applied to the operation samples of a number of process operators; the identification results in table 2 verify the accuracy of the method.
While the process flows described above include operations occurring in a particular order, it should be appreciated that the processes may include more or fewer operations, which may be executed sequentially or in parallel (e.g., using parallel processors or a multi-threaded environment).
Referring to fig. 5, an apparatus for identifying human errors of a process operator based on line-of-sight tracking according to an embodiment of the present disclosure may include:
the eye movement data acquisition module 51 may be configured to acquire eye movement data of a process operator to be identified in real time; the eye movement data comprise sight line falling points of the process operators to be identified on the operation interface at each sampling moment;
the gaze duration acquisition module 52 may be configured to determine an interest region corresponding to the gaze drop point on the operation interface, and determine a gaze duration percentage of the to-be-identified process operator in each interest region on the operation interface;
a feature matrix construction module 53, configured to construct a sample feature matrix according to the gaze duration percentage;
the cognitive state identification module 54 may be configured to determine a difference between the sample feature matrix and a preset cognitive state feature matrix, and determine the cognitive state of the process operator to be identified according to the difference.
For convenience of description, the above devices are described as being divided into various units by function, and are described separately. Of course, the functionality of the units may be implemented in one or more software and/or hardware when implementing the present application.
The present invention has been described with reference to flowchart illustrations and/or block diagrams of methods and apparatus according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The embodiments in this specification are described in a progressive manner; identical or similar parts among the embodiments may be referred to one another, and each embodiment focuses on its differences from the others. In particular, since the system embodiment is substantially similar to the method embodiment, its description is brief; for relevant points, reference may be made to the corresponding description of the method embodiment.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.
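To make the matching step of the description concrete, the following is a minimal sketch of the coefficient-of-variation weighting and weighted-Euclidean-distance classification on which the claims below rely. The array shapes, state labels, and numeric values are illustrative assumptions only:

```python
import numpy as np

def cv_weights(mean, std):
    """Weights from coefficients of variation V = sigma / mean, normalized
    over the interest regions so that each cognitive state's row sums to 1."""
    V = std / mean
    return V / V.sum(axis=1, keepdims=True)

def classify(sample, state_matrix, weights):
    """Index of the cognitive state whose feature row has the smallest
    weighted Euclidean distance d_mn = sqrt(sum_i w_ni (X_mi - X_ni)^2)."""
    d = np.sqrt((weights * (sample - state_matrix) ** 2).sum(axis=1))
    return int(np.argmin(d))

# Two hypothetical cognitive states, three interest regions:
# rows are states, columns are gaze duration percentages per region.
mean = np.array([[0.5, 0.3, 0.2],
                 [0.2, 0.3, 0.5]])
std = np.array([[0.05, 0.06, 0.04],
                [0.04, 0.06, 0.05]])
W = cv_weights(mean, std)
sample = np.array([0.48, 0.32, 0.20])
print(classify(sample, mean, W))  # 0: the sample is closest to the first state
```

The coefficient-of-variation weighting gives more influence to interest regions whose gaze share is stable within a cognitive state, which is why the minimum weighted distance is taken as the recognized state.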

Claims (8)

1. A method for identifying human errors of process operators based on sight tracking is characterized by comprising the following steps:
acquiring eye movement data of a process operator to be identified in real time; the eye movement data comprise sight line falling points of the process operators to be identified on the operation interface at each sampling moment;
determining interest areas corresponding to the sight line drop points on the operation interface, and determining sight line watching time length percentages of the process operators to be identified in the interest areas on the operation interface;
constructing a sample characteristic matrix according to the sight line fixation time length percentage;
determining the difference between the sample characteristic matrix and a preset cognitive state characteristic matrix, and determining the cognitive state of the process operator to be identified according to the difference;
the cognitive state feature matrix is obtained in advance through the following modes:
dividing interest areas of the operation interface, and determining the range of each interest area on the operation interface;
determining, according to historical eye movement data of different process operators, the interest areas on the operation interface corresponding to the sight line falling points in the historical eye movement data;
determining the average value and the standard deviation of the percentage of the watching duration of the process operating personnel in each interest area under various cognitive states;
establishing a corresponding cognitive state feature matrix according to the average value and standard deviation of the gaze duration percentage of the process operator in each interest area under the various cognitive states:

$$X = \begin{bmatrix} x_{11} & x_{12} & \cdots & x_{1i} \\ x_{21} & x_{22} & \cdots & x_{2i} \\ \vdots & \vdots & \ddots & \vdots \\ x_{n1} & x_{n2} & \cdots & x_{ni} \end{bmatrix}$$

wherein $x_{ni}$ is the gaze duration percentage of the process operator fixating on the i-th interest area in the n-th cognitive state;
the weight matrix of the cognitive state feature matrix is:

$$W = \begin{bmatrix} w_{11} & w_{12} & \cdots & w_{1i} \\ w_{21} & w_{22} & \cdots & w_{2i} \\ \vdots & \vdots & \ddots & \vdots \\ w_{n1} & w_{n2} & \cdots & w_{ni} \end{bmatrix}$$

wherein $w_{ni}$ is the weight of the element in row n, column i of the cognitive state feature matrix;
each weight in the weight matrix is obtained by the following formula:

$$w_{ni} = \frac{V_{ni}}{\sum_{i} V_{ni}}$$

wherein $V_{ni}$ is the coefficient of variation of the element in row n, column i of the cognitive state feature matrix, and

$$V_{ni} = \frac{\sigma_{ni}}{\bar{x}_{ni}}$$

wherein $\bar{x}_{ni}$ is the mean gaze duration percentage of the process operator fixating on the i-th interest area in the n-th cognitive state, and $\sigma_{ni}$ is the standard deviation of that gaze duration percentage;
determining the weighted Euclidean distance between the sample feature matrix and a preset cognitive state feature matrix according to the following formula:

$$d_{mn} = \sqrt{\sum_{i} w_{ni}\,\left(X_{mi} - X_{ni}\right)^{2}}$$

wherein $d_{mn}$ is the weighted Euclidean distance between the sample feature matrix m and the cognitive state feature matrix n; $w_{ni}$ is the weight of the element in row n, column i of the cognitive state feature matrix; $X_{mi}$ is the gaze duration percentage for the i-th interest area in the m-th cognitive state in the sample feature matrix; and $X_{ni}$ is the gaze duration percentage for the i-th interest area in the n-th cognitive state in the cognitive state feature matrix;
determining the cognitive state of the process operator to be identified according to the weighted Euclidean distance, comprising the following steps of:
determining weighted Euclidean distances between each column vector in the sample characteristic matrix and each corresponding column vector in the cognitive state characteristic matrix to obtain a weighted Euclidean distance set;
and determining the smallest value in the weighted Euclidean distance set, and determining the cognitive state corresponding to that smallest value as the cognitive state of the process operator to be identified.
2. The method for identifying human error of a process operator based on gaze tracking according to claim 1, wherein said obtaining eye movement data of the process operator to be identified in real time comprises:
and acquiring a sight line falling point of the process operator to be identified on the operation interface, which is acquired by the eye tracker in real time, and converting the sight line falling point into coordinate information corresponding to the operation interface.
3. The method for identifying human error of a process operator based on gaze tracking according to claim 1, wherein said determining the percentage of time duration of gaze fixation of said process operator to be identified in each region of interest on said operator interface comprises:
and determining the average value and the standard deviation of the percentage of the sight line fixation time length of the process operator to be identified in each interest area on the operation interface.
4. The method for identifying human errors of process operators based on sight line tracking according to claim 1, wherein the step of determining the difference between the sample feature matrix and a preset cognitive state feature matrix and determining the cognitive state of the process operator to be identified according to the difference comprises the following steps:
determining the weighted Euclidean distance between the sample characteristic matrix and a preset cognitive state characteristic matrix, and determining the cognitive state of the process operator to be identified according to the weighted Euclidean distance.
5. A process operator human error identification device based on eye tracking, comprising:
the eye movement data acquisition module is used for acquiring eye movement data of a process operator to be identified in real time; the eye movement data comprise sight line falling points of the process operators to be identified on the operation interface at each sampling moment;
the watching duration acquisition module is used for determining an interest area corresponding to the sight line drop point on the operation interface and determining the sight line watching duration percentage of the process operator to be identified in each interest area on the operation interface;
the feature matrix construction module is used for constructing a sample feature matrix according to the sight gaze duration percentage;
the cognitive state identification module is used for determining the difference between the sample characteristic matrix and a preset cognitive state characteristic matrix and determining the cognitive state of the process operator to be identified according to the difference;
the cognitive state feature matrix is obtained in advance through the following modes:
dividing interest areas of the operation interface, and determining the range of each interest area on the operation interface;
determining, according to historical eye movement data of different process operators, the interest areas on the operation interface corresponding to the sight line falling points in the historical eye movement data;
determining the average value and the standard deviation of the percentage of the watching duration of the process operating personnel in each interest area under various cognitive states;
establishing a corresponding cognitive state feature matrix according to the average value and standard deviation of the gaze duration percentage of the process operator in each interest area under the various cognitive states:

$$X = \begin{bmatrix} x_{11} & x_{12} & \cdots & x_{1i} \\ x_{21} & x_{22} & \cdots & x_{2i} \\ \vdots & \vdots & \ddots & \vdots \\ x_{n1} & x_{n2} & \cdots & x_{ni} \end{bmatrix}$$

wherein $x_{ni}$ is the gaze duration percentage of the process operator fixating on the i-th interest area in the n-th cognitive state;
the weight matrix of the cognitive state feature matrix is:

$$W = \begin{bmatrix} w_{11} & w_{12} & \cdots & w_{1i} \\ w_{21} & w_{22} & \cdots & w_{2i} \\ \vdots & \vdots & \ddots & \vdots \\ w_{n1} & w_{n2} & \cdots & w_{ni} \end{bmatrix}$$

wherein $w_{ni}$ is the weight of the element in row n, column i of the cognitive state feature matrix;
each weight in the weight matrix is obtained by the following formula:

$$w_{ni} = \frac{V_{ni}}{\sum_{i} V_{ni}}$$

wherein $V_{ni}$ is the coefficient of variation of the element in row n, column i of the cognitive state feature matrix, and

$$V_{ni} = \frac{\sigma_{ni}}{\bar{x}_{ni}}$$

wherein $\bar{x}_{ni}$ is the mean gaze duration percentage of the process operator fixating on the i-th interest area in the n-th cognitive state, and $\sigma_{ni}$ is the standard deviation of that gaze duration percentage;
determining the weighted Euclidean distance between the sample feature matrix and a preset cognitive state feature matrix according to the following formula:

$$d_{mn} = \sqrt{\sum_{i} w_{ni}\,\left(X_{mi} - X_{ni}\right)^{2}}$$

wherein $d_{mn}$ is the weighted Euclidean distance between the sample feature matrix m and the cognitive state feature matrix n; $w_{ni}$ is the weight of the element in row n, column i of the cognitive state feature matrix; $X_{mi}$ is the gaze duration percentage for the i-th interest area in the m-th cognitive state in the sample feature matrix; and $X_{ni}$ is the gaze duration percentage for the i-th interest area in the n-th cognitive state in the cognitive state feature matrix;
determining the cognitive state of the process operator to be identified according to the weighted Euclidean distance, comprising the following steps of:
determining weighted Euclidean distances between each column vector in the sample characteristic matrix and each corresponding column vector in the cognitive state characteristic matrix to obtain a weighted Euclidean distance set;
and determining the smallest value in the weighted Euclidean distance set, and determining the cognitive state corresponding to that smallest value as the cognitive state of the process operator to be identified.
6. The sight tracking based human error identification apparatus for a process operator according to claim 5, wherein said obtaining eye movement data of the process operator to be identified in real time comprises:
and acquiring a sight line falling point of the process operator to be identified on the operation interface, which is acquired by the eye tracker in real time, and converting the sight line falling point into coordinate information corresponding to the operation interface.
7. The human-error identification apparatus for a process operator based on gaze tracking according to claim 5, wherein said determining the percentage of time duration of gaze fixation of said process operator to be identified in each region of interest on said operator interface comprises:
and determining the average value and the standard deviation of the percentage of the sight line fixation time length of the process operator to be identified in each interest area on the operation interface.
8. The human error recognition apparatus for process operators based on eye gaze tracking of claim 5 wherein said determining the magnitude of the difference between said sample feature matrix and a predetermined cognitive state feature matrix and determining the cognitive state of said process operator to be recognized based on said magnitude of difference comprises:
determining the weighted Euclidean distance between the sample characteristic matrix and a preset cognitive state characteristic matrix, and determining the cognitive state of the process operator to be identified according to the weighted Euclidean distance.
CN201710828111.5A 2017-09-14 2017-09-14 Method and device for identifying human errors of process operators based on sight tracking Active CN107562202B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710828111.5A CN107562202B (en) 2017-09-14 2017-09-14 Method and device for identifying human errors of process operators based on sight tracking

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710828111.5A CN107562202B (en) 2017-09-14 2017-09-14 Method and device for identifying human errors of process operators based on sight tracking

Publications (2)

Publication Number Publication Date
CN107562202A CN107562202A (en) 2018-01-09
CN107562202B true CN107562202B (en) 2020-03-13

Family

ID=60979939

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710828111.5A Active CN107562202B (en) 2017-09-14 2017-09-14 Method and device for identifying human errors of process operators based on sight tracking

Country Status (1)

Country Link
CN (1) CN107562202B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108985164A (en) * 2018-06-11 2018-12-11 厦门大学 Eye based on object moving state pays close attention to prediction technique
CN108921199A (en) * 2018-06-11 2018-11-30 厦门大学 Eye based on object table symptom state pays close attention to preference prediction technique
CN110458441A (en) * 2019-08-06 2019-11-15 北京七鑫易维信息技术有限公司 Checking method, device, system and the storage medium of quality inspection
CN110327061B (en) * 2019-08-12 2022-03-08 北京七鑫易维信息技术有限公司 Character determining device, method and equipment based on eye movement tracking technology
CN116071812A (en) * 2023-01-05 2023-05-05 中国石油大学(北京) Drilling operation unsafe behavior early warning method, device, equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101779960A (en) * 2010-02-24 2010-07-21 沃建中 Test system and method of stimulus information cognition ability value
CN106251065A (en) * 2016-07-28 2016-12-21 南京航空航天大学 A kind of Effectiveness of Regulation appraisal procedure moving behavioral indicator system based on eye


Also Published As

Publication number Publication date
CN107562202A (en) 2018-01-09

Similar Documents

Publication Publication Date Title
CN107562202B (en) Method and device for identifying human errors of process operators based on sight tracking
US9984334B2 (en) Method for anomaly detection in time series data based on spectral partitioning
US9779495B2 (en) Anomaly diagnosis method and apparatus
EP1116150B1 (en) Method and multidimensional system for statistical process control
US11170332B2 (en) Data analysis system and apparatus for analyzing manufacturing defects based on key performance indicators
US8046318B2 (en) Automated system for checking proposed human adjustments to operational or planning parameters at a plant
EP4160342A1 (en) Abnormal modulation cause identification device, abnormal modulation cause identification method, and abnormal modulation cause identification program
CN115691722B (en) Quality control method, device, equipment, medium and program product for medical data detection
EP4160339A1 (en) Abnormality/irregularity cause identifying apparatus, abnormality/irregularity cause identifying method, and abnormality/irregularity cause identifying program
JP2018151821A (en) Abnormality diagnosis system of facility apparatus
CN110188793B (en) Data anomaly analysis method and device
Chiang et al. Industrial implementation of on-line multivariate quality control
CN113052475B (en) Engineering machinery icon visual performance test method, device and storage medium
US8732528B1 (en) Measuring test effects using adjusted outlier data
CN115932144B (en) Chromatograph performance detection method, chromatograph performance detection device, chromatograph performance detection equipment and computer medium
CN112102903A (en) Quality control system based on clinical laboratory testing result
US20180060281A1 (en) Graphs with normalized actual value measurements and baseline bands representative of normalized measurement ranges
Rooker et al. Machining centre performance monitoring with calibrated artefact probing
RU2470352C1 (en) Statistical process control method (versions)
US10316690B2 (en) Method for validation of an investigated sensor and corresponding machine
CN114004138A (en) Building monitoring method and system based on big data artificial intelligence and storage medium
CN108762959B (en) Method, device and equipment for selecting system parameters
Kachroo et al. Model-based methodology for validation of traffic flow detectors by minimizing human bias in video data processing
EP3712777A1 (en) Data classification device
CN111735976B (en) Automatic data result display method based on detection equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant