CN112890834B - Attention-recognition-oriented machine learning-based eye electrical signal classifier - Google Patents

Attention-recognition-oriented machine learning-based eye electrical signal classifier

Info

Publication number
CN112890834B
Authority
CN
China
Prior art keywords
signal
electro
eye
ocular
classifier
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110226571.7A
Other languages
Chinese (zh)
Other versions
CN112890834A (en)
Inventor
张文萱
李玉榕
郑楠
施正义
林栋�
林丽莉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujian Maixing Life Medical Technology Co ltd
Original Assignee
Fujian Maixing Life Medical Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujian Maixing Life Medical Technology Co ltd filed Critical Fujian Maixing Life Medical Technology Co ltd
Priority to CN202110226571.7A priority Critical patent/CN112890834B/en
Publication of CN112890834A publication Critical patent/CN112890834A/en
Application granted granted Critical
Publication of CN112890834B publication Critical patent/CN112890834B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16: Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/163: Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
    • A61B 5/168: Evaluating attention deficit, hyperactivity
    • A61B 5/72: Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7203: Signal processing for noise prevention, reduction or removal
    • A61B 5/7225: Details of analog processing, e.g. isolation amplifier, gain or sensitivity adjustment, filtering, baseline or drift compensation
    • A61B 5/7235: Details of waveform analysis
    • A61B 5/725: Details of waveform analysis using specific filters therefor, e.g. Kalman or adaptive filters
    • A61B 5/7264: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B 5/7267: Classification of physiological signals or data involving training the classification device

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Psychiatry (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Signal Processing (AREA)
  • General Health & Medical Sciences (AREA)
  • Physiology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Developmental Disabilities (AREA)
  • Hospice & Palliative Care (AREA)
  • Child & Adolescent Psychology (AREA)
  • Educational Technology (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Power Engineering (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The invention provides an attention-recognition-oriented machine-learning-based electro-ocular signal classifier, constructed by the following steps. Step S1: select electro-ocular signals for building a training set and a validation set, and preprocess the selected raw electro-ocular signals. Step S2: perform primary feature extraction on each preprocessed electro-ocular signal segment, covering time-domain, frequency-domain and time-frequency-domain features. Step S3: perform feature selection with a genetic algorithm to obtain an optimal feature set of d features. Step S4: train the classifiers: after generating an input sample set suitable for the classifiers, determine the sample labels by manual judgment against an eye tracker, and feed the labelled samples into the classifiers for training. Step S5: send the validation set into the trained electro-ocular classifiers for classification verification, and vote for the optimal classifier according to the evaluation indices chosen by the operator. The invention can determine in real time whether the acquired electro-ocular signal corresponds to a state of focused attention.

Description

Attention-recognition-oriented machine learning-based eye electrical signal classifier
Technical Field
The invention belongs to the technical field of machine learning and computer data processing, and particularly relates to an eye electrical signal classifier based on machine learning and oriented to attention recognition.
Background
In daily life, more than 80% of the information exchanged between the human body and the environment passes through the eyes. Eye movement can express human intention to a certain extent and is therefore applied in areas such as alertness analysis, mobility aids for the disabled, and appliance control. The trajectory and pattern of eyeball movement also carry a large amount of information, from which one can judge whether a user is interested in a displayed object and, in turn, whether the user's attention is focused. In criminal investigation, technicians decode the psychological dynamics of a suspect by analyzing eye-movement trajectories; in personalized marketing, a computer acquires the user's gaze point, analyzes the regions the user is interested in, and arranges advertisement web pages accordingly to promote products to consumers in a targeted way.
A healthy individual can consciously keep attention focused to achieve a specific goal, but it is difficult for people with mental disorders to maintain focused attention autonomously for long periods. Take the autistic population, whose incidence is rising and whose age of onset is falling, as an example. Under current medical conditions there is no specific curative medication; treatment mostly relies on a professional physician observing the patient's degree of concentration and carrying out intervention training or acupuncture while the patient is in a conscious, attentive state, which yields better therapeutic results. However, such patients often lack normal self-care cognition and language expression in daily life, so determining whether their attention is focused, which is the prerequisite for intervention, is extremely difficult. Being able to judge the degree of concentration objectively and in a timely manner is therefore an important research topic for both healthy people and people with mental disorders.
In the prior art, most results rely on gaze-point trajectory description by an eye tracker to judge the user's mental state. However, the eye tracker is a piece of medical equipment: applying it to daily household monitoring is cumbersome, sacrifices portability and ease of operation, and even a portable eye tracker can affect the test result because of lens reflections.
Electro-oculography has become an interdisciplinary research hotspot because it is non-invasive, portable, low-cost and unaffected by illumination. Published work also shows that electro-ocular signals carry many kinds of information, including attention, which is of great significance for judging the moments when a user's attention is focused and, on that basis, for operating devices or delivering long-term, stable treatment.
In the publicly used technology, the data sources are mostly healthy individuals in normal mental states and rarely extend to patients with mental disorders; long-term wearing of an eye tracker imposes a physiological and economic burden on individual users and their families; and the prior art mostly pursues high accuracy with deep networks used for offline classification, so that, for practical deployment, the effective rate of signal acquisition and real-time operation still need to be improved. In addition, because of individual differences between users, high-accuracy attention discrimination can only be achieved by extracting signal features with high discriminability for each individual's eye movements.
In summary, the disclosed or publicly used techniques have the following problems and disadvantages:
1. Application data: the data used are limited, mostly concentrated on the broad healthy population, and neither preprocessing nor feature selection is analyzed for a specific population;
2. Application field: eye-movement angle estimation is combined with electro-oculography to judge gaze direction or to operate instruments, but the field of attention discrimination is not involved;
3. Application equipment: eye-movement detection and classification rely on long-term invasive methods or on wearing professional equipment such as an eye tracker, and electro-ocular signals are not properly exploited, so safety, portability and ease of operation are poor;
4. Application algorithms: existing electro-ocular signal classification mainly adopts transfer learning, template matching or deep learning. Transfer learning must first accumulate a certain number of samples and then generalize to other individuals, which reduces accuracy; the recognition accuracy of template matching depends on the number of templates, and template selection is troublesome; deep learning, which pursues classification accuracy, is mostly applied to offline electro-ocular classification and performs less well for real-time judgment; and in existing machine-learning approaches a single classifier makes a one-shot judgment, so the probability of misjudgment is high.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and, addressing the problems that data sources are limited, that an eye tracker is inconvenient as an eye-movement judgment device, that deep-learning classification is poorly suited to real-time judgment, and that template selection is complex, provides an attention-recognition-oriented machine-learning-based electro-ocular signal classifier.
The invention specifically adopts the following technical scheme:
an attention recognition-oriented machine learning-based eye electrical signal classifier is characterized in that a construction method thereof comprises the following steps:
step S1: selecting an electro-ocular signal for constructing a training set and a verification set, and preprocessing the selected original electro-ocular signal, which comprises the following steps: signal difference processing, eye electrical signal noise reduction, sliding window extraction and normalization processing;
step S2: performing primary feature extraction on each preprocessed electro-ocular signal segment, wherein the primary feature extraction comprises time domain features, frequency domain features and time-frequency domain features;
step S3: carrying out feature selection by using a genetic algorithm to obtain an optimal feature set consisting of d features;
step S4: training a classifier: after an input sample set which can be used for inputting a classifier is generated, manually judging and determining a sample label through an eye tracker, and inputting the sample label into the classifier for training; the classifier is a neural network and/or a linear discriminant analysis and/or a support vector machine;
step S5: and sending the verification set into a trained electro-oculogram classifier for classification verification, and voting out an optimal classifier according to the evaluation index selected by the operator.
Preferably, in step S1, the signal difference processing is to perform difference processing on the original horizontal signals of the left and right channels to obtain horizontal eye electrical signals;
the ocular signal noise reduction comprises:
(1) aiming at 50Hz power frequency noise generated by alternating current, a 4-order Butterworth filter is adopted to remove interference;
(2) for baseline drift and other bioelectrical signal noises, noise reduction is completed by utilizing wavelet analysis;
(3) setting the area between the blink starting point and the blink stopping point to zero for processing aiming at the blink disturbance signal;
(4) aiming at the secondary eye electric signal part above 10Hz, filtering by using a Chebyshev low-pass filter;
the sliding window extracts an electro-ocular signal obtained by using a sampling frequency of 250Hz, the length of the sliding window is 500ms, the sliding window slides once every 250ms, and a signal in the time window is extracted to generate an electro-ocular image;
in the normalization processing, let h be the horizontal-channel electro-ocular signal extracted by the sliding window at a given moment; the normalization formula is:
h' = (2h − h_m − h_n) / (h_m − h_n)
wherein h_m and h_n are the maximum and minimum amplitudes of the horizontal signal segment h, respectively.
Preferably, the specific process of performing denoising by using wavelet analysis is as follows:
step S11: removing the baseline drift phenomenon of the electro-oculogram signal by adopting discrete wavelet transform:
performing p-layer wavelet decomposition on the electro-ocular signals, wherein if the frequency band of the low-frequency coefficient of the p-th layer wavelet is closest to 0.05Hz, the signal corresponding to the wavelet coefficient is the baseline drift component of the electro-ocular signals; directly subtracting the baseline component from the original signal to obtain a signal, namely the base-line removed electrooculogram signal;
step S12: denoising by using wavelet soft threshold filtering:
the soft threshold method is adopted to carry out noise reduction processing on the electro-oculogram signals, and wavelet coefficients are as follows:
η_{j,k} = sgn(d_{j,k})·(|d_{j,k}| − λ_j) when |d_{j,k}| ≥ λ_j, and η_{j,k} = 0 when |d_{j,k}| < λ_j
wherein d_{j,k} is the estimated wavelet coefficient, η_{j,k} is the wavelet coefficient after processing, and λ_j is the threshold of the j-th layer;
step S13: and performing wavelet inverse transformation by using the processed wavelet coefficient to obtain a processed reconstructed signal, so as to obtain an optimally restored pure electro-oculogram signal.
Preferably, the specific process of determining the blink disturbance signal is as follows:
step S14: determining a blink starting point and a blink stopping point:
selecting the short-time energy as the threshold discrimination signal for the blink starting point, windowing the processed electro-ocular signal and dividing it into several data segments of equal length: the time-domain discrete electro-ocular signal x(k) is windowed and framed, and the n-th frame signal x_n(m) satisfies:
x_n(m) = w(m)·x(n + m), 0 ≤ m < N
wherein N is the frame length, n is the frame index, and w(m) is a cosine window function that shapes the signal spectrum; the n-th frame electro-ocular signal obtained after framing is x_n(m), and the sum of the squares of its signal amplitudes is the energy of that frame, i.e.:
E_n = Σ_{m=0}^{N−1} x_n(m)^2
determining the energy E_i of each frame, i = 1, 2, ..., N, frame by frame from the beginning of the signal; defining the frame length as 8 sampling points and the frame step as 4 sampling points; median-filtering the short-time energy to remove outliers, recorded as E_i'; setting the threshold S = 100: when E_i' > S the signal is set to 1, and when E_i' < S it is set to 0; differencing the thresholded signal, denoted F; the point where F = 1 corresponds to the blink starting point; when F = −1, entering the termination-point judgment;
introducing monotonicity detection of the signal, and judging that the eye movement is terminated if and only if the signal simultaneously meets a threshold condition and a monotonicity condition; the algorithm for the introduced signal monotonicity detection is as follows:
(1) find the candidate termination point E_1;
(2) compute the monotonicity of the electro-ocular signal at E_1, i.e. the difference d_0 between two adjacent sampling points;
(3) when |d_0| < 2 μV, the monotonic increase or decrease has ended, and E_1 is the blink termination point E;
(4) when |d_0| ≥ 2 μV, count a further 15 sampling points after E_1 and record the first moment at which |d_0| < 2 μV as the termination point E; otherwise, no blink is detected this time;
step S15: the blink and the eye jump are distinguished to prevent misjudgment:
the absolute values of the maximum and minimum of the electro-ocular signal between the starting point and the termination point are denoted |x_1| and |x_2|, and x_3 = |x_1| / |x_2| is formed; within the suspected blink interval of the electro-ocular signal, t denotes the time interval between the occurrence of the maximum and the occurrence of the minimum; when 0.5 ≤ x_3 ≤ 2 and t ≤ 150 ms, a blink is judged; when 0.5 ≤ x_3 ≤ 2 and t > 150 ms, noise is judged; when x_3 > 2, a leftward eye jump is judged; when x_3 < 0.5, a rightward eye jump is judged.
Preferably, in step S2, the time-domain features include: the mean, standard deviation, mean absolute value, root-mean-square value, number of zero crossings, variance and peak value of the amplitudes of the sampled electro-ocular signal; the frequency-domain features include: the mean power frequency and median frequency of the sampled electro-ocular signal; and the adopted time-frequency-domain analysis method is the Mallat decomposition-coefficient algorithm.
Preferably, step S3 specifically includes:
the features in the primary feature set are binary-encoded to form an individual with a fixed ordering: when the q-th bit of the vector is 0 the corresponding feature is discarded, and when it is 1 the feature is selected; the population size is set to a fixed value;
the correlation coefficient is used as the fitness function to evaluate the dimensionality-reduction performance of an individual; the termination condition is that the correlation coefficient exceeds 0.5 or the number of generations exceeds 100; roulette-wheel selection is used, with a uniform crossover probability of 0.7 and a mutation probability of 0.0001; the operations are repeated until the termination condition is reached, and the d-dimensional feature subset with the best performance is selected.
Preferably, in step S4, if more than 4 out of 6 consecutive sampling windows are judged to be in the attention state, the corresponding electro-ocular signal segment is determined to be the target electro-ocular signal and is selected for the attention-focused electro-ocular training set; otherwise the segment is determined to be an attention-unfocused electro-ocular signal.
Preferably, in step S5, the optimal classifier is voted for according to the evaluation indices selected by the operator.
Compared with the prior art, the present scheme and its preferred versions remove invalid and unconscious signal interference by appropriate denoising, then perform sliding-window extraction and normalization, extract features for the primary training set, train 3 machine-learning classifiers with the optimal feature set obtained after feature selection, feed the validation set into the classifiers, and vote for the optimal classifier for each user according to the judgment criteria; in the test stage, the sliding window is used to judge in real time whether the acquired electro-ocular signal is in the attention-focused state.
The discriminator can effectively judge whether the user's attention is focused, and performs well in terms of portability for daily monitoring, individualized feature-set screening, and real-time judgment of the attention state. In particular, the advantages of the invention are mainly:
1. Application data: the method is not limited to healthy individuals; users with mental disorders are considered in both the preprocessing and classification stages, broadening data coverage.
2. Application field: electro-oculography is applied to attention discrimination, with emphasis on recognizing the conscious moments when the user's attention is focused.
3. Application equipment: electro-ocular signals replace invasive methods and professional equipment such as eye trackers, so users can easily complete daily detection, judgment and classification, improving safety, portability and operability; at the same time, the economic pressure on a user's family from long-term eye-tracker-based attention judgment is relieved, and continuity of daily monitoring can be guaranteed.
4. Application algorithms: features are screened for each individual user and personalized feature sets are optimized for training, overcoming the poor generalization of transfer learning; the machine-learning approach avoids the complexity of template matching and improves discrimination efficiency, which benefits real-time attention discrimination; and using the classifier with the best judgment performance for real-time detection improves attention-discrimination accuracy and avoids accidental misjudgment.
In conclusion, the classifier can be used in a human-computer interaction control system to decode electro-ocular signals, analyze the user's attention in real time, and accurately determine the conscious moments when the user's attention is focused.
Drawings
The invention is described in further detail below with reference to the following figures and detailed description:
FIG. 1 is a schematic diagram of an electro-ocular data collection site according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of left-right tracking eye-scanning electrical signals and the generation principle of the present invention;
FIG. 3 is a general flow chart of the classifier construction process and application according to the embodiment of the present invention;
FIG. 4 is a schematic diagram of a wavelet soft threshold filtering denoising process according to an embodiment of the present invention;
FIG. 5 is a schematic diagram illustrating an exemplary embodiment of a blink signal removal process;
FIG. 6 is a flow chart of a signal preprocessing stage according to an embodiment of the invention.
Detailed Description
In order to make the features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in detail as follows:
considering that the eye electrical data contains a large amount of eye movement information, the prior art does not apply the eye electrical data to perform real-time attention discrimination, does not provide a targeted measure for improving attention discrimination by using a personalized feature set, and does not get rid of the complexity of template matching, the embodiment provides an eye electrical signal classifier based on machine learning and oriented to attention recognition, and details on the construction process and application of the eye electrical signal classifier.
In the construction process, the effective rate of the eye electrical signals is improved through preprocessing, the optimal feature set for the individual is screened out, three kinds of classic machine learning classifiers are used as classification models, the optimal classifier for the individual user is determined through training and a verification set, the eye electrical signals are extracted through a sliding window and sent to the classifier for real-time analysis, and real-time attention discrimination is achieved.
As shown in fig. 1, the type of electro-ocular signal to be used must first be determined. Electro-ocular signals are divided into a vertical channel and a horizontal channel, but the range and flexibility of eyeball movement in the vertical direction are smaller than in the horizontal direction, and the vertical channel is affected by the muscles around the eyebrows and cheeks, so its practical value cannot compare with that of the horizontal signal. Moreover, the amplitude of a horizontal or vertical channel electro-ocular signal is related only to the deflection angle of the eyeball in that direction and not to the angular component in the other direction. Therefore, the classifier of this embodiment uses only the horizontal electro-ocular signal for analysis. The channel data come from the right horizontal electrode (R), the left horizontal electrode (L) and the signal common terminal (F).
This embodiment divides the basic forms of eye movement into two types: attention unfocused (labelled 0) and attention focused (labelled 1). When a user is dazed or lost in thought, the gaze rests unconsciously at a certain position and the electro-ocular signal becomes relatively stable. The eyeball is never completely still, however: natural physiological reactions such as ocular tremor and drift occur, at frequencies below 3 Hz. When the user is suddenly attracted to another object while concentrating and then distracted, the line of sight shifts rapidly in discontinuous jerks rather than smooth movements. When a user is attracted by an object for a long time, the two eyeballs move synchronously left and right to keep acquiring information, which appears as saccades within a certain range, regular back-and-forth movement, and so on. Clinically it is considered that, if during an observation period the user is in the attention state for more than two thirds of the time (continuous attention is not required), the user can be regarded as consciously watching during that period, and the electro-ocular signal generated by the eyeball motion can be used as the target electro-ocular signal that triggers the therapeutic apparatus.
Taking left-right pursuit scanning as an example, suppose the user scans at a constant speed from left to right for 3 seconds and then back for 3 seconds. The electro-ocular data in a relatively focused state show strong regularity and a certain periodicity, and differ greatly from the data at moments of unfocused attention. The retina is electrically negative relative to the cornea and the cornea positive relative to the retina, so rotation of the eyeball changes the potential difference between the two, which is the electro-ocular signal. When the eyeball rotates from left to right an electro-ocular signal with positive amplitude is produced, and in the reverse direction a negative one; the simply processed scanning electro-ocular signal is shown in fig. 2.
The construction process of the classifier comprises the following specific steps:
the first step is to pre-process the original signal. The second step is the primary feature set extraction. The third step is feature selection. The fourth step is to train the classifier. And the fifth step is to verify the trained electro-ocular classifier and vote to select the optimal classifier. The flow chart of the implementation of each step is shown in FIG. 3.
The first step is as follows: the original signal is pre-processed. This step includes four aspects, respectively: signal difference processing, eye electrical signal noise reduction, sliding window extraction and normalization processing.
1. And (5) signal differential processing.
And carrying out differential processing on the original horizontal signals of the left channel and the right channel to obtain horizontal eye electrical signals.
2. The electro-ocular signal is denoised.
(1) Aiming at 50Hz power frequency noise generated by alternating current, a 4-order Butterworth filter is adopted to remove interference;
(2) for baseline drift and other bioelectrical signal noises, noise reduction is completed by utilizing wavelet analysis;
the electro-oculogram is not only influenced by the brain electricity, the myoelectricity and the electrocardio, but also the baseline amplitude is not stable in the resting state, and the baseline drift can be generated along with the change of factors such as time, environment and the like. In addition, the electro-ocular signals of different users have different baseline amplitudes. These characteristics all affect the effectiveness of looking at the electrical signal.
Because of its multi-resolution property, sensitivity to local changes in the time domain, flexibility in basis selection and decorrelation, wavelet filtering is suitable for processing the non-stationary electro-ocular signal; it achieves noise reduction while preserving abrupt transitions and signal edges in the retained effective signal.
First, the baseline shift phenomenon of the electro-ocular signal is removed by discrete wavelet transform.
When the eyes look straight ahead, the signal baseline is not a straight line near or parallel to the zero line, and the potential-difference changes caused by eye deflection are superimposed on the baseline drift, which is typically the signal component below 0.05 Hz. A 6-layer wavelet decomposition is applied to the electro-ocular signal (the specific number of decomposition layers N can be chosen according to the individual user); if the frequency band of the 6th-layer low-frequency (approximation) coefficients is closest to 0.05 Hz, the signal corresponding to those wavelet coefficients is the baseline-drift component. Subtracting this baseline component directly from the original signal yields the baseline-corrected electro-ocular signal.
Then, denoising is performed by using wavelet soft threshold filtering. The flow is shown in fig. 4.
After wavelet transformation, the wavelet coefficients of the signal consist of coefficients of the effective signal and of the interference signal. Because the two differ in their properties and structure at each scale after decomposition, a suitable threshold can be learned at each scale. Taking the threshold as the reference, coefficients whose magnitude exceeds it are shrunk appropriately, while the remaining coefficients are set to zero; in this way the noise coefficients are suppressed or even eliminated while the effective-signal coefficients are preserved as much as possible, and high-frequency content is not blindly removed, avoiding information loss. When a hard threshold is applied, discontinuities easily appear in the reconstruction and cause oscillation, whereas a soft threshold does not; therefore the soft-threshold method is used to denoise the electro-ocular signal, and the wavelet coefficients are processed as:
η_{j,k} = sgn(d_{j,k})·(|d_{j,k}| − λ_j) when |d_{j,k}| ≥ λ_j, and η_{j,k} = 0 when |d_{j,k}| < λ_j,
where d_{j,k} is the estimated wavelet coefficient, η_{j,k} is the wavelet coefficient after processing, and λ_j is the threshold of the j-th layer.
In setting the coefficients, the parameter choice is judged mainly by the quality of the processed result, which in turn is evaluated by the correlation and spectral analysis of the signals before and after processing. The processed wavelet coefficients are then inverse-transformed to obtain the reconstructed signal, giving the best-restored clean electro-ocular signal.
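For illustration only (not part of the patent text), this wavelet step can be sketched in Python with PyWavelets. The wavelet family 'db4', the fixed 6 decomposition levels and the per-level universal threshold are placeholder assumptions; the patent instead trains the threshold λ_j per scale.

```python
import numpy as np
import pywt

def wavelet_denoise_eog(eog, wavelet="db4", level=6):
    """Remove baseline drift and noise from a 1-D EOG segment (sketch)."""
    coeffs = pywt.wavedec(eog, wavelet, level=level)  # [cA6, cD6, ..., cD1]

    # Baseline drift: reconstruct the signal from the level-6 approximation only
    baseline_coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    baseline = pywt.waverec(baseline_coeffs, wavelet)[: len(eog)]
    debased = eog - baseline

    # Soft-threshold the detail coefficients of the baseline-free signal
    coeffs = pywt.wavedec(debased, wavelet, level=level)
    denoised = [coeffs[0]]
    for d in coeffs[1:]:
        # Universal threshold per level (an assumption; the patent trains lambda_j per scale)
        sigma = np.median(np.abs(d)) / 0.6745
        lam = sigma * np.sqrt(2.0 * np.log(len(d)))
        denoised.append(pywt.threshold(d, value=lam, mode="soft"))

    return pywt.waverec(denoised, wavelet)[: len(eog)]

if __name__ == "__main__":
    fs = 250
    t = np.arange(0, 4, 1 / fs)
    demo = 0.3 * t + np.sin(2 * np.pi * 2 * t) + 0.05 * np.random.randn(t.size)
    print(wavelet_denoise_eog(demo).shape)
```

Switching `mode="soft"` to `"hard"` would reproduce the hard-threshold behaviour the text argues against.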
(3) Setting the area between the blink starting point and the blink stopping point to zero for processing aiming at the blink disturbance signal;
the eye electric signals of the blinks in the horizontal and vertical directions have potential changes, the normal blinking frequency is about 12-19 times per minute under the condition of no external stimulation, the time of each blinking is about 0.1s-0.4s, the waveform contains obvious peaks, and the maximum difference between the conscious blinking and the unconscious blinking is only the difference of eye movement amplitude. The elimination of the blink disturbance is divided into two steps:
first, the blink start and stop points are determined.
The short-time energy is selected as the threshold discrimination signal for the blink starting point. The processed electro-ocular signal is windowed and divided into several data segments of equal length; only the segment inside the window is observed and the signal outside the window is ignored. The time-domain discrete electro-ocular signal x(k) is windowed and framed, and the n-th frame signal is denoted x_n(m), which satisfies:
x_n(m) = w(m)·x(n + m), 0 ≤ m < N
where N is the frame length, n is the frame index, and w(m) is the window function that shapes the signal spectrum. Using an appropriate window function is the basis for obtaining stable and valid signal features. In practice a rectangular window not only risks introducing high-frequency artifacts but can also cause spectral leakage, whereas the cosine window adopted by the invention keeps the acquired signal at high resolution and effectively preserves its important information. The n-th frame electro-ocular signal obtained after framing is x_n(m), and the sum of the squares of its amplitudes is the energy of the frame, i.e.:
E_n = Σ_{m=0}^{N−1} x_n(m)^2
The energy E_i of each frame, i = 1, 2, ..., N, is determined frame by frame from the beginning of the signal. The frame length is defined as 8 sampling points and the frame step as 4 sampling points. The short-time energy is median-filtered to remove outliers and recorded as E_i'. The threshold is set to S = 100: when E_i' > S the signal is set to 1, and when E_i' < S it is set to 0. The thresholded signal is differenced and denoted F. The point where F = 1 corresponds to the blink starting point; when F = −1, the termination-point judgment is entered.
Detection of the termination point is highly susceptible to noise. Eye-movement detection using velocity, acceleration or a short-time energy threshold alone cannot guarantee an accurate result every time; for example, for a non-main-sequence eye jump with an asymmetric velocity profile it is difficult to locate the end point accurately with a threshold method alone. To locate the end of a single blink more accurately, the method introduces monotonicity detection of the signal on top of the threshold method, and the eye movement is judged to have terminated if and only if the signal satisfies both the threshold condition and the monotonicity condition. The introduced signal-monotonicity detection algorithm is described as follows:
(1) Find the termination judgment point E_1 as described above;
(2) compute the monotonicity of the electro-ocular signal at E_1, i.e. the difference d_0 between two adjacent sampling points;
(3) when |d_0| < 2 μV, the monotonic increase or decrease has ended, and E_1 is the blink termination point E;
(4) when |d_0| ≥ 2 μV, count a further 15 sampling points after E_1 and record the first moment at which |d_0| < 2 μV as the termination point E; otherwise, no blink is detected.
Subsequently, the blinking and the eye jump are distinguished to prevent erroneous judgment.
The rising edge of a blink pulse is easily detected and differs little from the edge of a normal eye jump; it is the characteristic falling edge of a blink that guarantees correct detection, so a necessary condition for recognizing a blink is that it has complete rising and falling edges. Although the peak amplitude of the blink signal varies over a wide range and the corresponding extrema of the differenced signal differ, the ratio of the maximum to the minimum always stays within a small range; the two extrema of the differenced blink signal not only occur close together in time, but also differ little in amplitude. Exploiting this characteristic, the invention distinguishes blinks from left and right eye jumps to prevent misjudging a blink signal. The absolute values of the maximum and minimum of the electro-ocular signal between the starting point and the termination point are denoted |x_1| and |x_2|, and x_3 = |x_1| / |x_2|. Within the suspected blink interval, t denotes the time interval between the occurrence of the maximum and the occurrence of the minimum. When 0.5 ≤ x_3 ≤ 2 and t ≤ 150 ms, a blink is judged; when 0.5 ≤ x_3 ≤ 2 and t > 150 ms, noise is judged; when x_3 > 2, a leftward eye jump; and when x_3 < 0.5, a rightward eye jump. Compared with blink detection using empirical mode decomposition or wavelet analysis, this algorithm is simple. After the starting point and termination point are determined, the region between the two points is set to zero, which removes the blink signal. The flow chart is shown in fig. 5.
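As one concrete reading of this blink-removal procedure, the Python sketch below strings together the short-time energy framing (frame length 8, step 4, threshold S = 100), the 2 μV monotonicity check over the next 15 samples, and the 0.5-2 amplitude-ratio / 150 ms test, then zeroes the detected interval. The Hanning window, the median-filter kernel size and the assumption that the signal is expressed in μV are illustrative choices, not values fixed by the patent.

```python
import numpy as np
from scipy.signal import medfilt

FS = 250          # sampling rate (Hz)
FRAME_LEN = 8     # frame length in samples
FRAME_STEP = 4    # frame step in samples
S = 100.0         # short-time energy threshold
MONO_EPS = 2.0    # 2 uV monotonicity bound (signal assumed to be in uV)

def frame_energy(x):
    """Short-time energy per frame (frame length 8, step 4, cosine window)."""
    n_frames = 1 + (len(x) - FRAME_LEN) // FRAME_STEP
    w = np.hanning(FRAME_LEN)  # one possible cosine-type window
    return np.array([np.sum((w * x[i * FRAME_STEP:i * FRAME_STEP + FRAME_LEN]) ** 2)
                     for i in range(n_frames)])

def remove_blinks(x):
    """Zero out blink intervals found by energy threshold + monotonicity + ratio test."""
    x = x.astype(float).copy()
    e = medfilt(frame_energy(x), kernel_size=5)      # median filter removes outliers
    binary = (e > S).astype(int)
    f = np.diff(binary)                              # +1 = start frame, -1 = end candidate

    for start_f in np.where(f == 1)[0]:
        ends = np.where((f == -1) & (np.arange(len(f)) > start_f))[0]
        if ends.size == 0:
            continue
        start = start_f * FRAME_STEP
        end = ends[0] * FRAME_STEP

        # Monotonicity refinement of the termination point (first |d0| < 2 uV
        # among the current point and the following 15 samples)
        d0 = np.abs(np.diff(x[end:end + 16]))
        below = np.where(d0 < MONO_EPS)[0]
        if below.size == 0:
            continue                                 # no valid end -> not a blink
        end = end + below[0]

        # Blink vs. eye jump: amplitude-ratio and max/min time-interval test
        seg = x[start:end + 1]
        if seg.size < 2 or np.min(np.abs(seg)) == np.max(np.abs(seg)) == 0:
            continue
        x1, x2 = np.max(seg), np.min(seg)
        if x2 == 0:
            continue
        x3 = abs(x1) / abs(x2)
        t_ms = abs(int(np.argmax(seg)) - int(np.argmin(seg))) / FS * 1000.0
        if 0.5 <= x3 <= 2.0 and t_ms <= 150.0:
            x[start:end + 1] = 0.0                   # judged a blink -> zero the interval
    return x
```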
(4) Aiming at the secondary eye electric signal part above 10Hz, filtering by using a Chebyshev low-pass filter;
the frequency of the EOG signal is concentrated at 0.1-38Hz, the main component is at 1-10Hz, and secondary signals need to be filtered out in order to reduce the data processing amount.
3. Sliding window extraction
In the embodiment, an electrooculogram signal obtained by sampling at a frequency of 250Hz is used, a time window is 500ms long, and the electrooculogram signal is extracted to generate an electrooculogram image after sliding once every 250 ms. The data extracted by the sliding window keeps the same number of sampling points, the data length of each training can be reduced, the calculation amount is reduced for the subsequent feature extraction, and the real-time judgment efficiency is improved.
4. Normalization process
The ocular electrical signals of the same type have relatively specific waveforms, but the ocular electrical signals collected at different time intervals belonging to the same type may have different amplitudes, so that after an ocular electrical signal segment is determined through sliding window processing, the amplitudes of the ocular electrical signals need to be subjected to [ -1, 1] interval normalization processing for convenience in subsequent analysis and processing. And (3) setting the horizontal channel eye electrical signal extracted by the sliding window to be h at a certain moment, wherein the normalization formula is as follows:
h' = (2h − h_m − h_n) / (h_m − h_n)
where h_m and h_n are the maximum and minimum amplitudes of the horizontal signal segment h, respectively.
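To make the sliding-window and normalization step concrete: at 250 Hz a 500 ms window is 125 samples and a 250 ms step rounds to 62 samples, and each window is mapped into [-1, 1] using its own extrema h_m and h_n. A minimal sketch follows (illustrative, with `horizontal_eog` standing for the preprocessed horizontal channel):

```python
import numpy as np

def sliding_windows(h, fs=250, win_ms=500, step_ms=250):
    """Yield successive windows of the horizontal EOG signal."""
    win = int(fs * win_ms / 1000)    # 125 samples
    step = int(fs * step_ms / 1000)  # 62 samples (250 ms rounded down)
    for start in range(0, len(h) - win + 1, step):
        yield h[start:start + win]

def normalize_window(w):
    """Map one window into [-1, 1] using its own max (h_m) and min (h_n)."""
    h_m, h_n = np.max(w), np.min(w)
    if h_m == h_n:                   # flat segment: avoid division by zero
        return np.zeros_like(w, dtype=float)
    return (2.0 * w - h_m - h_n) / (h_m - h_n)

# usage, with `horizontal_eog` an assumed NumPy array of the preprocessed channel:
# segments = [normalize_window(w) for w in sliding_windows(horizontal_eog)]
```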
The flow of the signal preprocessing stage is shown in fig. 6.
The second step is that: primary feature set extraction
And calculating the characteristics of each preprocessed eye electric signal, and comparing the influence of different characteristics on the attention judgment of the eye electric signal.
Time-domain analysis is easy to understand and simple to compute when extracting feature values from the electro-ocular signal; the feature-vector parameters include the following:
① Mean (AV): AV = (1/N)·Σ_{i=1}^{N} x_i
② Standard deviation (STD): STD = sqrt( (1/(N−1))·Σ_{i=1}^{N} (x_i − x̄)^2 )
③ Mean absolute value (AVA): AVA = (1/N)·Σ_{i=1}^{N} |x_i|
④ Root mean square value (RMS): RMS = sqrt( (1/N)·Σ_{i=1}^{N} x_i^2 )
⑤ Zero-crossing count (ZC): ZC = Σ_{i=1}^{N−1} |sgn(x_i) − sgn(x_{i+1})| / 2, where sgn(x) is the sign function
⑥ Variance (VAR): VAR = (1/(N−1))·Σ_{i=1}^{N} (x_i − x̄)^2
⑦ Peak value (TPS).
Here x_i is the amplitude of the electro-ocular signal at the i-th sampling point, N is the number of sampling points, and x̄ is the sample mean.
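A short sketch computing these seven time-domain features for one normalized window follows; the concrete definitions of the zero-crossing count and of the peak value TPS (taken here as the maximum absolute amplitude) are standard interpretations and therefore assumptions:

```python
import numpy as np

def time_domain_features(x):
    """Seven time-domain features of one EOG window (order follows the list above)."""
    x = np.asarray(x, dtype=float)
    mean = np.mean(x)                                   # AV
    std = np.std(x, ddof=1)                             # STD
    ava = np.mean(np.abs(x))                            # mean absolute value
    rms = np.sqrt(np.mean(x ** 2))                      # RMS
    zc = np.sum(np.abs(np.diff(np.sign(x))) > 0)        # zero-crossing count
    var = np.var(x, ddof=1)                             # VAR
    tps = np.max(np.abs(x))                             # peak value (interpretation)
    return np.array([mean, std, ava, rms, zc, var, tps])
```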
The purpose of the frequency-domain transform is to convert the signal from an amplitude that varies with time to one that varies with frequency, and to find the correspondence between signal amplitude or power and the different frequencies. The invention adopts two common frequency-domain feature values:
Mean power frequency (MPF): MPF = ∫ f·P(f) df / ∫ P(f) df
Median frequency (MF): the frequency satisfying ∫_0^{MF} P(f) df = (1/2)·∫_0^{∞} P(f) df
where P(f) is the power spectral density of the signal.
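A sketch of the two frequency-domain features, using a Welch estimate of the power spectral density P(f) (the estimator itself is an assumption; the patent does not specify how P(f) is obtained):

```python
import numpy as np
from scipy.signal import welch

def frequency_domain_features(x, fs=250):
    """Mean power frequency (MPF) and median frequency (MF) from a Welch PSD estimate."""
    f, p = welch(x, fs=fs, nperseg=min(len(x), 64))
    total = np.sum(p)
    mpf = np.sum(f * p) / total                 # power-weighted mean frequency
    cdf = np.cumsum(p) / total
    mf = f[np.searchsorted(cdf, 0.5)]           # frequency splitting the power in half
    return np.array([mpf, mf])
```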
The time-frequency domain feature extraction method not only retains the characteristics of the signal in the time domain, but also provides the frequency domain characteristics. The time-frequency domain analysis method adopted by the invention comprises the following steps: mallat decomposition coefficient algorithm.
The basic idea of the algorithm is as follows: let the original discrete signal be S, the discrete approximation signal be A and the detail signal be D; after decomposition the original discrete signal can be expressed as S = A_n + D_n + D_{n−1} + ... + D_1, where A_n contains the important information of the EOG. In MATLAB, this embodiment implements the multi-level wavelet decomposition of the original signal with the function wavedec.
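The patent points to MATLAB's wavedec; an equivalent hedged sketch with PyWavelets is shown below, where the wavelet 'db4', the 4 decomposition levels and the per-band energy statistic are illustrative choices rather than the patent's exact feature definition:

```python
import numpy as np
import pywt

def wavelet_features(x, wavelet="db4", level=4):
    """Time-frequency features: one summary statistic per Mallat decomposition band."""
    coeffs = pywt.wavedec(x, wavelet, level=level)   # [A_n, D_n, ..., D_1]
    # One simple descriptor per band (energy); the patent does not fix the exact statistic.
    return np.array([np.sum(np.asarray(c) ** 2) for c in coeffs])
```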
The third step: and (4) selecting the characteristics.
Feature selection retains the beneficial parameters that contribute most, reduces the dimensionality of the feature space and speeds up algorithm execution. Because different users differ to some extent in their electro-ocular characteristics and in the waveform of the target electro-ocular signal, a preliminary experiment is needed to establish a separate feature set for each user in order to obtain better recognition.
To reduce the feature dimension and improve recognition accuracy, a wrapper-type feature-selection dimensionality-reduction algorithm, namely a genetic algorithm, is used to select from the primary feature set of 10 features extracted in the second step a d-dimensional feature subset that performs better for each individual. The n features in the primary feature set are binary-encoded into an individual (an n-dimensional vector, n being the total number of primary features, 10 in the present invention) with a fixed ordering; when the i-th bit of the vector is 0 the feature is discarded, and when it is 1 the feature is selected. The population size is set to a fixed value.
The correlation coefficient is used as the fitness function to evaluate the dimensionality-reduction performance of an individual. The termination condition is a correlation-coefficient value greater than 0.5 or a generation count exceeding 100 (these parameter values can be adjusted to the individual in practice). Roulette-wheel selection is used, with a uniform crossover probability of 0.7 and a mutation probability of 0.0001. The operations are repeated until the termination condition is reached, and the d-dimensional feature subset with the best performance is selected.
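A compact sketch of this genetic-algorithm feature selection is given below. The roulette-wheel selection, the uniform crossover probability of 0.7, the mutation probability of 0.0001 and the stopping rule (correlation above 0.5 or 100 generations) follow the text; the population size of 20 and the particular correlation-based fitness (absolute Pearson correlation between a simple combination of the selected, z-scored features and the labels) are assumptions, since the patent does not spell them out.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(mask, X, y):
    """Correlation-based fitness (one reading of the patent's criterion, an assumption):
    absolute Pearson correlation between the mean of the selected z-scored features
    and the class labels."""
    if not mask.any():
        return 0.0
    sel = X[:, mask.astype(bool)]
    z = (sel - sel.mean(axis=0)) / (sel.std(axis=0) + 1e-12)
    c = np.corrcoef(z.mean(axis=1), y)[0, 1]
    return abs(c) if np.isfinite(c) else 0.0

def ga_select(X, y, pop_size=20, p_cross=0.7, p_mut=0.0001, max_gen=100, target=0.5):
    n = X.shape[1]
    pop = rng.integers(0, 2, size=(pop_size, n))
    for _ in range(max_gen):
        fit = np.array([fitness(ind, X, y) for ind in pop])
        if fit.max() > target:                       # termination: correlation > 0.5
            break
        probs = fit / fit.sum() if fit.sum() > 0 else np.full(pop_size, 1 / pop_size)
        parents = pop[rng.choice(pop_size, size=pop_size, p=probs)]   # roulette wheel
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):          # uniform crossover with prob 0.7
            if rng.random() < p_cross:
                swap = rng.random(n) < 0.5
                children[i, swap], children[i + 1, swap] = parents[i + 1, swap], parents[i, swap]
        flip = rng.random(children.shape) < p_mut    # bit-flip mutation
        children[flip] = 1 - children[flip]
        pop = children
    fit = np.array([fitness(ind, X, y) for ind in pop])
    return pop[np.argmax(fit)].astype(bool)          # best d-dimensional feature mask

# usage: mask = ga_select(feature_matrix, labels)    # feature_matrix, labels are assumed arrays
```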
The fourth step: and training a classifier.
First, input samples that meet the classifier's input format are generated from the preprocessed, feature-extracted electro-ocular signal segments, and the label of each sample (i.e. whether the user's attention is focused) is judged by a professional physician who reviews the eye tracker's gaze trajectory, scatter plot or gaze heat map in the eye tracker's companion software as a reference. The data are split into training and validation sets at a ratio of 4:1 and stored separately as attention-focused and attention-unfocused data. For a patient with a mental disorder it is difficult to stay attentive for long; in general, if within 6 consecutive sampling windows more than 4 windows are in the attention state (continuous attention is not required), the user is considered attentive during that period, the electro-ocular signal produced in that period is the target electro-ocular signal, and the period can be selected for the attention-focused electro-ocular training set.
The labeled training set is then input to a classifier for training. The invention adopts three classifiers of a neural network (BP), a Linear Discriminant Analysis (LDA) and a Support Vector Machine (SVM) for training, and stores the trained classifiers.
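An illustrative scikit-learn sketch of this training step is shown below; the hyperparameters (hidden-layer size, SVM kernel, feature scaling) are assumptions, as the patent only names the three model families:

```python
from sklearn.neural_network import MLPClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def train_three_classifiers(X_train, y_train):
    """Train the BP neural network, LDA and SVM on the labelled optimal-feature samples."""
    models = {
        "BP": make_pipeline(StandardScaler(),
                            MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                          random_state=0)),
        "LDA": LinearDiscriminantAnalysis(),
        "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
    }
    for model in models.values():
        model.fit(X_train, y_train)
    return models
```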
The fifth step: and inputting the verification set into the trained electro-oculogram classifier for classification verification, and voting out an optimal classifier according to the evaluation indexes.
Each trained classifier is tested offline with the validation-set electro-ocular signals from step four. As with the training samples, the labels are judged by a professional who reviews the eye tracker's gaze trajectory, scatter plot or gaze heat map as a reference in the eye tracker's companion software. According to the chosen evaluation indices, the optimal classifier is selected as the one used for the subsequent online intention decoding, and the training-set construction scheme that trained it is taken as the optimal scheme.
The classification results are evaluated with Sensitivity, Specificity and Accuracy as the evaluation indices. Sensitivity reflects the ability of the classification algorithm to identify the targets; Specificity describes the ability to exclude non-targets; Accuracy measures the overall correctness of the classification. Since this is a binary classification problem, TP and TN are both counts of correctly classified samples, where TP is the number of positive samples classified as positive and TN is the number of negative samples classified as negative; FP and FN are both counts of misclassified samples, where FP is the number of negative samples classified as positive (false alarms) and FN is the number of positive samples classified as negative (misses). The formulas are:
Sensitivity = TP / (TP + FN)
Specificity = TN / (TN + FP)
Accuracy = (TP + TN) / (TP + TN + FP + FN)
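These indices, and the subsequent vote for the optimal classifier, can be sketched as follows; treating the "vote" as simply keeping the model with the highest value of the operator-chosen index is one plausible reading, not the patent's exact procedure:

```python
import numpy as np

def evaluate(model, X_val, y_val):
    """Sensitivity, specificity and accuracy of one classifier on the validation set."""
    y_val = np.asarray(y_val)
    pred = model.predict(X_val)
    tp = np.sum((pred == 1) & (y_val == 1))
    tn = np.sum((pred == 0) & (y_val == 0))
    fp = np.sum((pred == 1) & (y_val == 0))
    fn = np.sum((pred == 0) & (y_val == 1))
    sens = tp / (tp + fn) if (tp + fn) else 0.0
    spec = tn / (tn + fp) if (tn + fp) else 0.0
    acc = (tp + tn) / len(y_val)
    return {"sensitivity": sens, "specificity": spec, "accuracy": acc}

def pick_best(models, X_val, y_val, index="accuracy"):
    """Keep the classifier that scores highest on the operator-chosen index."""
    scores = {name: evaluate(m, X_val, y_val)[index] for name, m in models.items()}
    best = max(scores, key=scores.get)
    return best, models[best], scores
```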
according to the steps, the optimal classifier for the specific user can be obtained, and the attention type of the electro-oculogram signal can be judged.
The process of online decoding attention discrimination is as follows:
and in the online identification stage, pure electro-oculogram data is obtained according to the pre-processing in the first step for the electro-oculogram signals needing to be classified, the same sliding window scheme is used for carrying out sliding window extraction and normalization processing, characteristic values are calculated according to the optimal sub-characteristic set of each user, and the characteristic values are input into the trained optimal classifier. And the optimal classifier performs two classifications of whether the attention is concentrated or not to obtain a real-time judgment result. For each user, the acquisition environment should be as consistent as possible with the stages of collecting training sets and verification sets, including light and shade, personnel walking, gaze distance, and the like, to improve the accuracy of the ocular electrical signals.
In summary, the scheme of this embodiment can be understood as follows: first, electro-ocular signals are selected to build the training and validation sets, and the selected raw signals are preprocessed by differencing, denoising, sliding-window extraction and normalization; next, features are extracted for the individual, and a genetic algorithm selects an individualized optimal feature set of d features; after the input sample set for the classifiers is generated, a professional physician determines the sample labels with reference to the eye tracker, and the classifiers are trained offline with the constructed training set; then the validation-set electro-ocular signals are used for offline testing, and the optimal classifier for that individual user is chosen according to the selected evaluation indices; the optimal classifier from the offline test is used for online real-time judgment; during real-time discrimination the data are preprocessed in the same way as the training and validation sets, the optimal feature set is extracted and the optimal classifier gives the attention-discrimination result; finally, the result of online judgment is used as the control signal of a human-machine interface to control downstream equipment in real time.
The present invention is not limited to the above preferred embodiments, and other various types of attention-oriented machine learning-based eye electrical signal classifiers can be derived by anyone in light of the present invention.

Claims (6)

1. An attention recognition-oriented machine learning-based eye electrical signal classifier is characterized in that a construction method comprises the following steps:
step S1: selecting an electro-ocular signal for constructing a training set and a verification set, and preprocessing the selected original electro-ocular signal, which comprises the following steps: signal difference processing, eye electric signal noise reduction, sliding window extraction and normalization processing;
step S2: performing primary feature extraction on each preprocessed electro-ocular signal segment, wherein the primary feature extraction comprises time domain features, frequency domain features and time-frequency domain features;
step S3: carrying out feature selection by using a genetic algorithm to obtain an optimal feature set consisting of d features;
step S4: training classifiers: after an input sample set suitable for classifier input is generated, manually judging and determining the sample labels by means of an eye tracker, and inputting the labelled samples into the 3 kinds of classifiers for training; the classifiers are a neural network and/or linear discriminant analysis and/or a support vector machine;
step S5: sending the verification set into a trained electro-oculogram classifier for classification verification, and voting to select an optimal classifier according to evaluation indexes selected by an operator;
in step S1, the signal difference processing is to perform difference processing on the original horizontal signals of the left and right channels to obtain horizontal eye electrical signals;
the ocular signal noise reduction comprises:
(1) aiming at 50Hz power frequency noise generated by alternating current, a 4-order Butterworth filter is adopted to remove interference;
(2) for baseline drift and other bioelectrical signal noises, noise reduction is completed by utilizing wavelet analysis;
(3) for the blink disturbance signal, the region between the blink starting point and the blink termination point is set to zero;
(4) for the secondary electro-ocular signal components above 10 Hz, a Chebyshev low-pass filter is used for filtering;
the sliding-window extraction uses electro-ocular signals acquired at a sampling frequency of 250 Hz; the sliding window is 500 ms long, slides once every 250 ms, and the signal within the time window is extracted to generate an electro-oculogram segment;
in the normalization processing, at a certain moment, the horizontal channel eye electrical signal extracted by the sliding window is set as h, and the normalization formula is as follows:
h' = (h − h_n) / (h_m − h_n)
wherein h_m and h_n are respectively the maximum and minimum amplitude values of the horizontal signal segment h;
the specific process of judging the blink disturbance signal is as follows:
step S14: determining a blink starting point and a blink stopping point:
selecting short-time energy as the threshold discrimination signal for the blink starting point, windowing the processed electro-ocular signal and dividing it into a plurality of data segments of equal length: the time-domain discrete electro-ocular signal x(k) is windowed and framed, and the signal of the nth frame x_n(m) satisfies:
x_n(m) = w(m)x(n + m), 0 ≤ m < N
wherein N is the frame length, n is the frame index, and w(m) is a cosine window function capable of shaping the signal spectrum; the nth frame electro-ocular signal obtained after framing is x_n(m), and the sum of the squares of the signal amplitudes it contains is the energy of that frame, i.e.:
E_n = Σ_{m=0}^{N−1} x_n^2(m)
the energy E_i of each frame, i = 1, 2, ..., N, is determined frame by frame from the beginning of the signal; the frame length is defined as 8 sampling points and the frame step as 4 sampling points; median filtering is applied to the short-time energy to remove outliers, and the result is recorded as E_i'; the threshold is set to S = 100; when E_i' > S the signal is set to 1, and when E_i' < S it is set to 0; the thresholded signal is differenced and recorded as F; the point where F = 1 corresponds to the blink starting point; when F = −1, the termination-point judgment is entered;
introducing monotonicity detection of the signal, and judging that the eye movement is terminated if and only if the signal simultaneously meets a threshold condition and a monotonicity condition; the algorithm for the introduced signal monotonicity detection is as follows:
(1) find the termination decision point E_1;
(2) calculate the monotonicity of the electro-ocular signal corresponding to E_1, i.e. the difference d_0 between two adjacent sampling points;
(3) when |d_0| < 2 μV, the signal has finished its monotonic increase or decrease, and E_1 is the blink termination point E;
(4) when |d_0| > 2 μV, continue the calculation over the 15 sampling points following E_1 and mark the first moment at which |d_0| < 2 μV as the termination point E; otherwise, no blink is detected this time;
step S15: the blink and the eye jump are distinguished to prevent misjudgment:
the absolute values of the maximum and the minimum of the electro-ocular signal between the starting point and the termination point are denoted |x_1| and |x_2|, and x_3 = |x_1|/|x_2|; within a suspected blink interval of the electro-ocular signal, t denotes the time interval between the occurrence of the maximum value and the occurrence of the minimum value; when 0.5 ≤ x_3 ≤ 2 and t ≤ 150 ms, a blink is judged; when 0.5 ≤ x_3 ≤ 2 and t > 150 ms, noise is judged; when x_3 > 2, a leftward eye jump is judged; when x_3 < 0.5, a rightward eye jump is judged.
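A minimal sketch of the short-time-energy blink detection and the blink/eye-jump discrimination described in this claim is given below. The frame length, frame step, threshold S = 100, the 2 μV monotonicity bound and the x_3/t rule follow the claim; the Hanning window choice, the median-filter kernel size and the assumption that amplitudes are expressed in microvolts are illustrative.

```python
import numpy as np
from scipy.signal import medfilt

FRAME_LEN, FRAME_STEP = 8, 4   # sampling points, as in the claim
ENERGY_THRESHOLD = 100.0       # S = 100
MONO_EPS = 2.0                 # 2 uV, assuming amplitudes are in microvolts

def short_time_energy(x):
    """Frame the signal with a Hanning (cosine) window and return per-frame energy."""
    w = np.hanning(FRAME_LEN)
    n_frames = (len(x) - FRAME_LEN) // FRAME_STEP + 1
    return np.array([np.sum((w * x[i * FRAME_STEP:i * FRAME_STEP + FRAME_LEN]) ** 2)
                     for i in range(n_frames)])

def detect_blink(x):
    """Return (start, end) sample indices of a detected blink, or None."""
    e = medfilt(short_time_energy(x), kernel_size=5)     # remove outliers
    mask = (e > ENERGY_THRESHOLD).astype(int)
    f = np.diff(mask)                                    # F = +1 at start, -1 at stop
    starts, ends = np.flatnonzero(f == 1), np.flatnonzero(f == -1)
    if len(starts) == 0 or len(ends) == 0:
        return None
    start = starts[0] * FRAME_STEP
    e1 = ends[0] * FRAME_STEP                            # termination decision point E1
    # monotonicity check: accept E1, or search the 15 following samples
    for k in range(e1, min(e1 + 16, len(x) - 1)):
        if abs(x[k + 1] - x[k]) < MONO_EPS:
            return start, k
    return None

def classify_event(x, start, end, fs=250.0):
    """Apply the x3/t rule: blink, noise, or left/right eye jump."""
    seg = x[start:end + 1]
    i_max, i_min = int(np.argmax(seg)), int(np.argmin(seg))
    x3 = abs(seg[i_max]) / (abs(seg[i_min]) + 1e-12)
    t_ms = abs(i_max - i_min) / fs * 1000.0
    if 0.5 <= x3 <= 2:
        return "blink" if t_ms <= 150 else "noise"
    return "eye jump left" if x3 > 2 else "eye jump right"
```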
2. The attention-recognition-oriented machine-learning-based ocular electrical signal classifier of claim 1, wherein: the specific process of using wavelet analysis to complete denoising is as follows:
step S11: removing the baseline drift phenomenon of the electro-oculogram signal by adopting discrete wavelet transform:
performing p-layer wavelet decomposition on the electro-ocular signal; if the frequency band of the low-frequency (approximation) coefficients of the pth layer is closest to 0.05 Hz, the signal corresponding to those wavelet coefficients is the baseline drift component of the electro-ocular signal; the baseline component is subtracted directly from the original signal, and the resulting signal is the baseline-corrected electro-ocular signal;
step S12: denoising by using wavelet soft threshold filtering:
the soft threshold method is adopted to carry out noise reduction processing on the electro-oculogram signals, and wavelet coefficients are as follows:
η_{j,k} = sign(d_{j,k})(|d_{j,k}| − λ_j) when |d_{j,k}| ≥ λ_j, and η_{j,k} = 0 when |d_{j,k}| < λ_j
wherein d_{j,k} is the estimated wavelet coefficient; η_{j,k} is the wavelet coefficient after processing; λ_j is the threshold of the jth layer;
step S13: and performing wavelet inverse transformation by using the processed wavelet coefficient to obtain a processed reconstructed signal, so as to obtain an optimally restored pure electro-oculogram signal.
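The wavelet preprocessing of claim 2 can be sketched with the PyWavelets package as follows; the 'db4' wavelet, the universal threshold rule and the decomposition depths are illustrative assumptions, with only the roughly 0.05 Hz baseline target and the soft-threshold form taken from the claim.

```python
import numpy as np
import pywt

def remove_baseline(x, fs=250.0, target_hz=0.05, wavelet="db4"):
    """Subtract the approximation component whose band edge is closest to ~0.05 Hz."""
    # after p decomposition levels the approximation band is roughly [0, fs / 2**(p + 1)] Hz
    p = int(round(np.log2(fs / target_hz) - 1))
    p = min(p, pywt.dwt_max_level(len(x), pywt.Wavelet(wavelet).dec_len))
    coeffs = pywt.wavedec(x, wavelet, level=p)
    baseline_coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    baseline = pywt.waverec(baseline_coeffs, wavelet)[:len(x)]
    return x - baseline

def soft_threshold_denoise(x, wavelet="db4", level=5):
    """Soft-threshold the detail coefficients and reconstruct the signal."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    processed = [coeffs[0]]
    for d in coeffs[1:]:
        sigma = np.median(np.abs(d)) / 0.6745           # per-level noise estimate
        lam = sigma * np.sqrt(2.0 * np.log(len(x)))     # universal threshold (assumed rule)
        processed.append(pywt.threshold(d, lam, mode="soft"))
    return pywt.waverec(processed, wavelet)[:len(x)]
```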
3. The attention-recognition-oriented machine-learning-based ocular electrical signal classifier of claim 1, wherein: in step S2, the time-domain features comprise: the mean, standard deviation, mean absolute value, root-mean-square value, number of zero-crossing points, variance and peak value of the amplitudes of the sampled electro-ocular signal; the frequency-domain features comprise: the mean power frequency and the median frequency of the sampled electro-ocular signal; the adopted time-frequency-domain analysis method is the Mallat decomposition coefficient algorithm.
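The primary feature extraction of claim 3 can be sketched as follows; the Welch PSD settings and the 'db4' wavelet at 3 levels stand in for the Mallat decomposition details, which the claim does not fix.

```python
import numpy as np
import pywt
from scipy.signal import welch

def extract_features(seg, fs=250.0):
    """Time-domain, frequency-domain and Mallat-decomposition features of one segment."""
    feats = {
        "mean": float(np.mean(seg)),
        "std": float(np.std(seg)),
        "mean_abs": float(np.mean(np.abs(seg))),
        "rms": float(np.sqrt(np.mean(seg ** 2))),
        "zero_crossings": int(np.count_nonzero(np.diff(np.signbit(seg).astype(np.int8)))),
        "variance": float(np.var(seg)),
        "peak": float(np.max(np.abs(seg))),
    }
    # frequency-domain features from the power spectral density
    f, pxx = welch(seg, fs=fs, nperseg=min(len(seg), 128))
    feats["mean_power_freq"] = float(np.sum(f * pxx) / np.sum(pxx))
    cumulative = np.cumsum(pxx)
    feats["median_freq"] = float(f[np.searchsorted(cumulative, cumulative[-1] / 2.0)])
    # time-frequency features: energy of each Mallat (DWT) coefficient band
    for i, c in enumerate(pywt.wavedec(seg, "db4", level=3)):
        feats[f"dwt_band_{i}_energy"] = float(np.sum(c ** 2))
    return feats
```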
4. The attention-recognition-oriented machine-learning-based ocular electrical signal classifier of claim 1, wherein: step S3 specifically includes:
a plurality of features in the primary feature set are binary-coded into an individual with a fixed arrangement order; when the q-th bit of the vector is 0 the corresponding feature is discarded, and when the q-th bit is 1 the feature is selected; the population size is given by the formula reproduced as an image in the original publication;
the correlation coefficient is used as the fitness function to evaluate the dimensionality-reduction performance of an individual; the termination condition is that the correlation coefficient value exceeds 0.5 or the number of generations exceeds 100; roulette-wheel selection is used, with a uniform crossover probability of 0.7 and a mutation probability of 0.0001; the above operations are repeated until the termination condition is reached, and the d-dimensional feature set with optimal performance is selected.
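The genetic-algorithm feature selection of claim 4 can be sketched as follows; the roulette-wheel selection, crossover probability 0.7, mutation probability 0.0001 and the termination conditions follow the claim, while the population size of 20 and the correlation-based fitness score are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(mask, X, y):
    """Mean absolute correlation between the selected features and the labels."""
    if mask.sum() == 0:
        return 0.0
    corrs = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in np.flatnonzero(mask)]
    return float(np.nan_to_num(np.mean(corrs)))

def ga_select(X, y, pop_size=20, max_gen=100, pc=0.7, pm=1e-4, target=0.5):
    """Return the best binary feature mask found by the genetic algorithm."""
    n_feat = X.shape[1]
    pop = rng.integers(0, 2, size=(pop_size, n_feat))
    for _ in range(max_gen):
        fit = np.array([fitness(ind, X, y) for ind in pop])
        if fit.max() > target:                        # termination: correlation > 0.5
            return pop[fit.argmax()]
        probs = fit / fit.sum() if fit.sum() > 0 else np.full(pop_size, 1.0 / pop_size)
        parents = pop[rng.choice(pop_size, size=pop_size, p=probs)]   # roulette wheel
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):           # uniform crossover with pc = 0.7
            if rng.random() < pc:
                swap = rng.random(n_feat) < 0.5
                children[i, swap], children[i + 1, swap] = parents[i + 1, swap], parents[i, swap]
        flip = rng.random(children.shape) < pm        # bit-flip mutation with pm = 0.0001
        pop = np.where(flip, 1 - children, children)
    fit = np.array([fitness(ind, X, y) for ind in pop])
    return pop[fit.argmax()]
```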
5. The attention-recognition-oriented machine-learning-based ocular electrical signal classifier of claim 1, wherein: in step S4, if the electro-ocular signal is classified as being in the attention-focused state in more than 4 of 6 consecutive sampling windows, it is determined to be a target electro-ocular signal and that region is selected into the attention-focused electro-ocular training set; otherwise it is determined to be a non-attention-focused electro-ocular signal.
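The labelling rule of claim 5 can be sketched as follows; the non-overlapping grouping of windows is an illustrative assumption, and the window-level judgments are assumed to be 0/1 flags.

```python
import numpy as np

def label_regions(window_flags, group=6, min_hits=4):
    """Label each group of consecutive windows as focused (1) when more than
    min_hits of the group's windows were judged attention-focused."""
    flags = np.asarray(window_flags, dtype=int)
    labels = []
    for start in range(0, len(flags) - group + 1, group):
        labels.append(int(flags[start:start + group].sum() > min_hits))
    return labels
```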
6. The attention-recognition-oriented machine-learning-based ocular electrical signal classifier of claim 1, wherein: in step S5, the classifier with the highest judgment accuracy under the selected evaluation index is chosen.
CN202110226571.7A 2021-03-01 2021-03-01 Attention-recognition-oriented machine learning-based eye electrical signal classifier Active CN112890834B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110226571.7A CN112890834B (en) 2021-03-01 2021-03-01 Attention-recognition-oriented machine learning-based eye electrical signal classifier

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110226571.7A CN112890834B (en) 2021-03-01 2021-03-01 Attention-recognition-oriented machine learning-based eye electrical signal classifier

Publications (2)

Publication Number Publication Date
CN112890834A CN112890834A (en) 2021-06-04
CN112890834B true CN112890834B (en) 2022-05-13

Family

ID=76107264

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110226571.7A Active CN112890834B (en) 2021-03-01 2021-03-01 Attention-recognition-oriented machine learning-based eye electrical signal classifier

Country Status (1)

Country Link
CN (1) CN112890834B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113157101B (en) * 2021-06-07 2022-08-19 成都华脑科技有限公司 Fragmentation reading habit identification method and device, readable medium and electronic equipment
CN113344295B (en) * 2021-06-29 2023-02-14 华南理工大学 Method, system and medium for predicting residual life of equipment based on industrial big data
CN114305420A (en) * 2021-12-30 2022-04-12 上海卓道医疗科技有限公司 Cognitive rehabilitation training evaluation system and method
CN114343638B (en) * 2022-01-05 2023-08-22 河北体育学院 Fatigue degree assessment method and system based on multi-mode physiological parameter signals
CN115227504B (en) * 2022-07-18 2023-05-26 浙江师范大学 Automatic lifting sickbed system based on electroencephalogram-eye electric signals
CN116548928B (en) * 2023-07-11 2023-09-08 西安浩阳志德医疗科技有限公司 Nursing service system based on internet
CN116602691B (en) * 2023-07-14 2023-10-10 北京元纽科技有限公司 Denoising method and device for electroencephalogram signals, electronic equipment and storage medium
CN116661609B (en) * 2023-07-27 2024-03-01 之江实验室 Cognitive rehabilitation training method and device, storage medium and electronic equipment

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102968072A (en) * 2012-11-09 2013-03-13 上海大学 Electro-oculogram control system and method based on correction/training
CN109032347A (en) * 2018-07-06 2018-12-18 昆明理工大学 One kind controlling mouse calibration method based on electro-ocular signal
CN109934089B (en) * 2018-10-31 2020-10-23 北京航空航天大学 Automatic multi-stage epilepsia electroencephalogram signal identification method based on supervised gradient raiser
CN110275621A (en) * 2019-06-26 2019-09-24 陕西科技大学 Barrier movable platform is helped based on electro-ocular signal control

Also Published As

Publication number Publication date
CN112890834A (en) 2021-06-04

Similar Documents

Publication Publication Date Title
CN112890834B (en) Attention-recognition-oriented machine learning-based eye electrical signal classifier
Liu et al. Subject-independent emotion recognition of EEG signals based on dynamic empirical convolutional neural network
Karthikeyan et al. Detection of human stress using short-term ECG and HRV signals
Aboalayon et al. Efficient sleep stage classification based on EEG signals
CN114781465B (en) rPPG-based non-contact fatigue detection system and method
Sengupta et al. A multimodal system for assessing alertness levels due to cognitive loading
Tong et al. Emotion recognition based on photoplethysmogram and electroencephalogram
Ren et al. Comparison of the use of blink rate and blink rate variability for mental state recognition
Baghdadi et al. Dasps: a database for anxious states based on a psychological stimulation
CN112328072A (en) Multi-mode character input system and method based on electroencephalogram and electrooculogram
Hu et al. A real-time electroencephalogram (EEG) based individual identification interface for mobile security in ubiquitous environment
Li et al. Eye-tracking signals based affective classification employing deep gradient convolutional neural networks
Li et al. A novel spatio-temporal field for emotion recognition based on EEG signals
Trigka et al. A survey on signal processing methods for EEG-based brain computer interface systems
CN112869743B (en) Exercise initiation intention neural analysis method considering cognitive distraction
CN117770821A (en) Human-caused intelligent cabin driver fatigue detection method, device, vehicle and medium
Ouyang et al. Design and implementation of a reading auxiliary apparatus based on electrooculography
Wu et al. Stress detection using wearable devices based on transfer learning
Gao et al. An ICA/HHT hybrid approach for automatic ocular artifact correction
Ouyang et al. Vigilance analysis based on continuous wavelet transform of eeg signals
Murad et al. Unveiling Thoughts: A Review of Advancements in EEG Brain Signal Decoding into Text
Wang et al. The effects of channel number on classification performance for sEMG-based speech recognition
ElSayed et al. Multimodal analysis of electroencephalographic and electrooculographic signals
Savarimuthu et al. An investigation on mental stress detection from various physiological signals
Romaniuk et al. Intelligent eye gaze localization method based on EEG analysis using wearable headband

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220216

Address after: Room 283, west side, 1st floor, old neighborhood committee, lianpanxin village, 435 Guohuo East Road, Xiangyuan street, Jin'an District, Fuzhou City, Fujian Province, 350000

Applicant after: Fujian maixing Life Medical Technology Co.,Ltd.

Address before: Fuzhou University, No.2, wulongjiang North Avenue, Fuzhou University Town, Minhou County, Fuzhou City, Fujian Province

Applicant before: FUZHOU University

GR01 Patent grant