CN107348958B - Robust glance EOG signal identification method and system - Google Patents


Info

Publication number
CN107348958B
CN107348958B CN201710695426.7A CN201710695426A CN107348958B
Authority
CN
China
Prior art keywords
independent
matrix
eye movement
frequency
movement data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710695426.7A
Other languages
Chinese (zh)
Other versions
CN107348958A (en)
Inventor
吕钊
张贝贝
吴小培
周蚌艳
张超
高湘萍
郭晓静
卫兵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anhui University
Original Assignee
Anhui University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anhui University filed Critical Anhui University
Priority to CN201710695426.7A priority Critical patent/CN107348958B/en
Publication of CN107348958A publication Critical patent/CN107348958A/en
Application granted granted Critical
Publication of CN107348958B publication Critical patent/CN107348958B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/24Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316Modalities, i.e. specific diagnostic methods
    • A61B5/398Electrooculography [EOG], e.g. detecting nystagmus; Electroretinography [ERG]
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7203Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7225Details of analog processing, e.g. isolation amplifier, gain or sensitivity adjustment, filtering, baseline or drift compensation
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235Details of waveform analysis
    • A61B5/7253Details of waveform analysis characterised by using transforms
    • A61B5/7257Details of waveform analysis characterised by using transforms using Fourier transforms
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235Details of waveform analysis
    • A61B5/7264Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Signal Processing (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physiology (AREA)
  • Psychiatry (AREA)
  • Mathematical Physics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Human Computer Interaction (AREA)
  • Power Engineering (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a robust glance EOG signal identification method and system, belonging to the technical field of electro-oculography. The method comprises the steps of: collecting multi-channel EOG eye movement data to obtain eye movement data in the time domain; preprocessing the time-domain eye movement data to obtain eye movement data in the frequency domain; performing blind source separation on the eye movement data in the frequency domain with a complex-valued ICA algorithm to obtain the frequency-domain independent component of each source signal at the corresponding frequency point; compensating the independent components at each frequency point to restore the real proportion of the independent components in the observed components, and solving the sorting ambiguity problem with a constrained DOA algorithm; applying a short-time inverse Fourier transform to the compensated and sorted independent components of each frequency point to recover multi-channel eye movement data in the time domain; and extracting power spectral density features from the time-domain multi-channel eye movement data and sending the extracted features into a support vector machine (SVM) for identification. The invention can accurately identify the EOG signal.

Description

Robust glance EOG signal identification method and system
Technical Field
The invention relates to the technical field of Electrooculography (EOG), in particular to a robust saccade EOG signal identification method and system based on multi-channel blind deconvolution.
Background
Human-Computer Interaction (HCI) based on bioelectricity is a supplement to conventional human-computer interaction methods and has strong practical value in special application scenarios such as interaction between disabled people and the external environment, monitoring of clinical patients, communication in special environments, and driver fatigue detection. Because the EOG can reflect eye movement patterns in different behavior states, EOG-based human-computer interaction technology has become a new research hotspot.
The eyeball is a bipolar sphere: in the eyeball system the cornea presents a positive potential and the retina a negative potential, and the so-called eye movement (EOG) signal is caused by the potential difference generated between the cornea and the retina when the eyeball rotates. The potential originates from the retinal pigment epithelium and the photoreceptor cells; its anode lies at the photoreceptor end and its cathode at the retinal pigment epithelium end, and the resulting current flows from the retinal end to the corneal end, forming a potential with an amplitude of about 0.4 mV to 10 mV, with the cornea as anode and the retina as cathode.
The eye movements reflected in EOG signals are mainly classified into three categories: saccades, fixations and blinks. Saccadic movements play a crucial role in implementing EOG-based human behavior recognition systems: they occur frequently and carry a large amount of behavior information. However, in the process of acquiring the EOG signal, noise interference inevitably occurs, including baseline wander, electromyogram (EMG), electrocardiogram (ECG), electroencephalogram (EEG) and slight movement of the electrode positions. To achieve robust recognition of the saccade EOG signal and improve the performance of EOG-based human behavior recognition systems, band-pass filtering has been used to remove the blink signal; but since the saccade EOG signal and the blink signal overlap in frequency band, part of the information of the saccade EOG signal is lost while the blink signal is filtered, so the recognition effect is not ideal and the method is difficult to put into practical use.
Disclosure of Invention
The invention aims to provide a robust glance EOG signal identification method and system so as to improve the accuracy of EOG signal identification.
To achieve the above object, in a first aspect of the present invention, there is provided a robust glance EOG signal identification method, including the steps of:
s1, acquiring multi-channel EOG data to obtain eye movement data in a time domain;
s2, preprocessing the eye movement data in the time domain to obtain eye movement data in the frequency domain;
s3, in the frequency domain, blind source separation is carried out on the eye movement data of each frequency point by adopting a complex value ICA algorithm, and frequency domain independent components of each independent source signal on the corresponding frequency point are obtained;
s4, carrying out scale compensation on the independent components on each frequency point, and restoring the real proportional components of the independent components in the observed components;
s5, processing the compensated independent components by adopting a constrained DOA algorithm, so that the independent sources on each frequency point f are arranged from small to large according to the direction angle;
s6, carrying out short-time Fourier inverse transformation processing on the independent components of the frequency points after scale compensation and sequencing to obtain complete time signals of independent sources of each channel in the time domain;
and S7, extracting power spectral density characteristics of the complete time signal of each channel independent source in the time domain, and sending the extracted characteristics into a Support Vector Machine (SVM) for identification to obtain an EOG signal identification result.
Wherein, step S2 further includes:
performing band-pass filtering and mean value removing processing on the eye movement data in the time domain to obtain processed eye movement data;
performing a short-time Fourier transform on the processed eye movement data using a sliding window with a window length of 256 and a window shift of 128, transforming the data from the time domain to the frequency domain, and obtaining enough frequency-domain observation data at each frequency point f_k.
In step S4, the performing scale compensation on the independent component at each frequency point specifically includes:
obtaining a mixed matrix of corresponding frequency points according to a separation matrix of each frequency point in a complex value ICA algorithm, wherein the separation matrix and the mixed matrix are inverse matrixes;
and compensating the independent components of the frequency points by using the coefficients of the mixed matrix to obtain the compensated independent components of the frequency points.
In step S5, sorting the compensated independent components with the constrained DOA algorithm specifically includes:
a. initializing an angle for each independent source;
b. applying the Root-MUSIC algorithm to the different rows of the separation matrix at each frequency point to obtain an estimate of each source direction, where the rows of the separation matrix correspond to different independent sources;
c. setting the proximity measure between the direction angle of each independent source and its initialization angle as ε(y, θ), and judging in the iteration process whether the angle of each independent source is the same as the initialization angle;
d. if they are the same, executing step e; if they are not the same, executing step f;
e. setting ε(y_j, θ_j) to 0, and setting a direction angle matrix T to calculate an adjustment matrix Q;
f. setting ε(y_j, θ_j) to 1, and returning to the iteration process to recalculate the separation matrix W;
where j = 1, 2, 3, …, M, and M represents the number of independent sources s_j.
Wherein, step e specifically comprises:
setting the direction angle matrix T according to the order of the independent sources at each frequency point f from small to large direction angle;
calculating the adjustment matrix Q = T·P^(-1) from the direction angle matrix T, where P is the permutation matrix;
judging from the adjustment matrix Q = T·P^(-1) whether the permutation matrix P is the same as the direction angle matrix T;
if they are the same, determining that the independent sources at each frequency point are already arranged in order of increasing angle;
if they are not the same, left-multiplying the permutation matrix P by the adjustment matrix Q to obtain a new permutation matrix P';
according to the new permutation matrix P', the independent sources at each frequency point f are, after recalculation, arranged in order of increasing direction angle.
The specific steps of the short-time inverse fourier transform in step S6 include:
carrying out short-time inverse Fourier transform processing on the independent components of the frequency points after the scale compensation and the sequencing;
carrying out inversion operation on the obtained time-frequency matrix according to columns to obtain time signals of the eye movement data at different window positions;
and splicing the time signals in order from the earliest window to the latest window to obtain the complete time signal of each channel's independent source in the time domain.
Wherein, step S7 specifically includes:
calculating the kurtosis value of each channel in a time domain according to the fact that the kurtosis value of the saccade EOG signal is larger than that of other signal sources;
selecting two channel independent sources with the largest kurtosis value to extract features according to the kurtosis value sequence of each independent source calculated in the time domain;
and sending the extracted features into a Support Vector Machine (SVM) for recognition to obtain an EOG signal recognition result.
In a second aspect, the present invention provides a robust glance EOG signal identification system comprising: an acquisition module, a preprocessing module, a blind source separation module, a compensation module, a sorting module, a time domain reduction module and an EOG signal identification module;
the acquisition module acquires the multi-channel EOG data to obtain eye movement data in a time domain;
the preprocessing module is connected with the acquisition module to preprocess the eye movement data on the time domain to obtain the eye movement data on the frequency domain;
the blind source separation module is connected with the preprocessing module so as to carry out blind source separation on the eye movement data of each frequency point on a frequency domain by adopting a complex value ICA algorithm to obtain frequency domain independent components of each independent source signal on the corresponding frequency point;
the compensation module is connected with the blind source separation module and is used for carrying out scale compensation on the independent components on each frequency point and restoring the real proportional components of the independent components in the observed components;
the sorting module is connected with the compensation module and is used for processing the compensated independent components by adopting a constrained DOA algorithm, so that the independent sources on each frequency point f are arranged from small to large according to the direction angle;
the time domain reduction module is connected with the sequencing module and is used for carrying out short-time Fourier inverse transformation processing on the independent components of the frequency points after the scale compensation and the sequencing to obtain complete time signals of each independent source in the time domain;
and the EOG signal identification module is connected with the time domain restoration module so as to extract power spectral density characteristics of each channel independent source in the time domain, and the extracted characteristics are sent to a Support Vector Machine (SVM) for identification to obtain an EOG signal identification result.
Wherein, the compensation module is specifically used for:
obtaining a mixed matrix of corresponding frequency points according to a separation matrix of each frequency point in a complex value ICA algorithm, wherein the separation matrix and the mixed matrix are inverse matrixes;
and carrying out scale compensation on the independent components of the frequency points by using the coefficient of the mixed matrix to obtain the independent components of the frequency points after the scale compensation.
Compared with the prior art, the invention has the following technical effects: the invention performs blind source separation on the eye movement data at each frequency point in the frequency domain. After blind source separation, to address the scale uncertainty problem of blind source separation, the obtained mixing matrix coefficients are used to perform scale compensation on the independent components at each frequency point, restoring the real proportional components of the independent components in the observed components and thereby solving the scale uncertainty inherent to complex-valued ICA (Independent Component Analysis). Meanwhile, to address the sorting ambiguity problem of blind source separation, the independent sources at each frequency point f are arranged in order of increasing direction angle by a constrained DOA (Direction of Arrival) algorithm, solving the sorting ambiguity inherent to ICA. The integrity of the independent components output at each frequency point is ensured, and the accuracy of the information carried in the EOG signal is ensured.
Drawings
The following detailed description of embodiments of the invention refers to the accompanying drawings in which:
FIG. 1 is a schematic diagram of the distribution of electrodes on the face of a subject during the acquisition of EOG signals in the present invention;
FIG. 2 is a schematic diagram of an experimental paradigm for experimentally acquiring EOG signals in accordance with the present invention;
FIG. 3 is a flow chart illustrating a robust glance EOG signal identification method of the present invention;
FIG. 4 is a schematic diagram of the basic principle of blind source separation in the present invention;
FIG. 5 is a schematic flow chart illustrating the EOG signal recognition according to the present invention;
FIG. 6 is a schematic diagram comparing time and frequency domain waveforms of an original glance EOG signal and a band-pass filtered time and frequency domain waveform of the glance EOG signal in accordance with the present invention;
FIG. 7 is a schematic diagram showing the comparison of blind source separation results of linear ICA and convolution ICA in the present invention;
FIG. 8 is a time and frequency domain waveform of a second channel segment of a saccade EOG signal after separation of a linear ICA and a convolutional ICA blind source, respectively, in accordance with the present invention;
FIG. 9 is a schematic diagram illustrating the comparison of EOG signals obtained before and after blind source separation of multi-channel eye movement data by using a convolution ICA algorithm in the present invention;
FIG. 10 is a graph comparing the results of EOG signal identification using different algorithms in the present invention;
FIG. 11 is a time domain waveform diagram of six adjacent frequency points of two independent sources obtained after blind source separation in the present invention;
FIG. 12 is a schematic diagram showing a comparison of the flip matrices before and after sorting in accordance with the present invention;
FIG. 13 is a diagram of the waveforms of the EOG signal at different frequency points after separation according to the present invention.
Detailed Description
To further illustrate the features of the present invention, refer to the following detailed description of the invention and the accompanying drawings. The drawings are for reference and illustration purposes only and are not intended to limit the scope of the present disclosure.
First, it should be noted that, in the present invention, before identifying an EOG signal, a process of acquiring the EOG signal is as shown in fig. 1 to 3:
as shown in fig. 1, the EOG signal of the subject is collected by using electrodes, the collection of the eye electrical signal uses Ag/AgCl electrodes, and 9 electrodes are used in the current collection in order to obtain the eye movement information and more spatial position information of the subject in four directions, namely, up, down, left and right, wherein the electrode V1 and the electrode V2 are placed at 1.5cm above and 1.5cm below the eyeball at the left side (or right side) of the subject for collecting the vertical EOG signal; electrode H1 and electrode H2 were placed 1.5cm left and 1.5cm right of the subject's left and right eyes, respectively, to acquire horizontal EOG signals; the electrode Fp1 and the electrode Fp2 are arranged at the forehead position to enhance the spatial information; the reference electrodes C1 and C2 are respectively arranged at the left and right breast bulges, and the grounding electrode D is arranged at the center of the vertex.
As shown in FIG. 2, during the experimental acquisition the subject sits in front of the screen facing it. A "prepare" cue appears on the screen together with a "beep" alert sound; 1 second later the subject sees a red arrow prompt (an upward, downward, leftward or rightward arrow), which stays on the screen for 6 seconds. During this period the subject is required to rotate the eyeballs in the direction indicated by the arrow after seeing it, rotate them back to the central fixation point, and not blink in the process. After this there is a brief 2-second rest, during which the subject may blink and relax.
Specifically, in the experimental acquisition process of the saccade EOG signals, 9 bioelectrodes are arranged on the face of a subject to be used for acquiring the saccade EOG signals, a camera is used for capturing video information in the experimental process, and a Neuroscan amplifier is mainly used for amplifying the acquired saccade EOG signals; the computer records and displays the collected saccade EOG signals.
Example one
As shown in fig. 3, the present embodiment discloses a robust glance EOG signal identification method, which includes the following steps S1 to S7:
s1, acquiring multi-channel EOG data to obtain eye movement data in a time domain;
it should be noted that here the source signals of M independent sources s_j (j = 1, 2, 3, …, M) are collected through N electrode pairs and amplified to obtain the observed signal vector X_i (i = 1, 2, …, N).
S2, preprocessing the multi-channel eye movement data in the time domain to obtain eye movement data in the frequency domain;
specifically, step S2 includes the following subdivided steps:
and performing band-pass filtering and mean value removing processing on the eye movement data in the time domain to obtain the processed eye movement data, wherein the cut-off frequency of the band-pass filter is 0.01 Hz-8 Hz.
And carrying out short-time Fourier transform on the processed eye movement data to obtain the eye movement data on the frequency domain.
In this embodiment, a sliding window with a window length of 256 and a window shift of 128 is used to perform the short-time Fourier transform on the processed eye movement data, transforming it from the time domain to the frequency domain and obtaining enough frequency-domain observation data at each frequency point f_k.
In this embodiment, band-pass filtering and mean removal of the eye movement data remove interference from signals including baseline drift, electromyogram (EMG), electrocardiogram (ECG) and electroencephalogram (EEG), reducing the interference of different noise signals with the original multi-channel eye movement data and further improving the accuracy of EOG signal identification.
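As an illustration of the preprocessing just described, the following is a minimal Python sketch. The 0.01-8 Hz pass band, the window length of 256 and the window shift of 128 follow the values given above, while the 250 Hz sampling rate, the Hann window and the filter order are illustrative assumptions not fixed by the text.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, stft

def preprocess_eog(x, fs=250.0, low=0.01, high=8.0):
    """Band-pass filter, remove the mean and transform multi-channel
    EOG data (channels x samples) into the time-frequency domain."""
    # Band-pass filtering with the 0.01-8 Hz pass band given above
    sos = butter(4, [low, high], btype="band", fs=fs, output="sos")
    x = sosfiltfilt(sos, x, axis=-1)
    # Mean removal per channel
    x = x - x.mean(axis=-1, keepdims=True)
    # Short-time Fourier transform: window length 256, window shift 128
    f, t, X = stft(x, fs=fs, window="hann", nperseg=256, noverlap=128, axis=-1)
    # X[:, k, :] holds the frequency-domain observations at frequency point f_k
    return f, t, X
```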
S3, in the frequency domain, blind source separation is carried out on the eye movement data of each frequency point by adopting a complex value ICA algorithm, and frequency domain independent components of each independent source signal on the corresponding frequency point are obtained;
as shown in fig. 4, the process of performing blind source separation on the eye movement data in the frequency domain specifically includes:
1) From the multi-channel observation data X_i (i = 1, 2, …, N), the covariance matrix R_x of the observed data is calculated as R_x = E{(X − m_x)(X − m_x)^T}, where X is the observed data, m_x is the mean of the observed data, (·)^T denotes the transpose and E{·} denotes the expectation. After the covariance matrix R_x of the observed data has been determined, the observation data need to be whitened in order to orthogonalize the mixing matrix; the whitening matrix V is calculated as follows:
The covariance matrix R_x is decomposed as R_x = E·D·E^T, where E is the matrix of orthonormal eigenvectors of R_x and D = diag(λ_1, λ_2, …, λ_N) is the diagonal matrix formed by the corresponding eigenvalues.
The resulting whitening matrix is then V = D^(-1/2)·E^T.
2) The observation data are whitened with the whitening matrix through Z(t) = V·X(t) (a sketch of this whitening step is given after this list); the fourth-order cumulants of the whitened data are then calculated, and the set {λ_r, N_r | 1 ≤ r ≤ M} of the M most significant eigenpairs is determined, where λ_r denotes the eigenvalues, N_r the corresponding eigenmatrices, M is the number of independent sources and r is an integer not exceeding the number of independent sources;
3) A unitary matrix U is used to jointly diagonalize the set {λ_r, N_r | 1 ≤ r ≤ M}, and the mixing matrix A is then calculated from the whitening matrix and U as A = V^(-1)·U;
4) Since the mixing matrix A and the separation matrix W are inverses of each other, the separation matrix is W = A^(-1), and blind source separation can be carried out on the observation data at each frequency point according to the separation matrix W.
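The covariance estimation and whitening of steps 1) and 2) can be sketched as follows for the complex-valued data at one frequency point f_k. This is a minimal sketch under the assumption that, for complex data, the Hermitian transpose replaces the plain transpose written above for the real-valued case; the joint diagonalization with the unitary matrix U (step 3) and the inversion W = A^(-1) (step 4) are not reproduced here.

```python
import numpy as np

def whiten_frequency_bin(Xk):
    """Whiten the complex-valued observations at one frequency point f_k.
    Xk: (N channels, T windows) complex matrix of STFT coefficients."""
    # Covariance matrix R_x = E{(X - m_x)(X - m_x)^H}
    m = Xk.mean(axis=1, keepdims=True)
    Xc = Xk - m
    Rx = Xc @ Xc.conj().T / Xc.shape[1]
    # Eigendecomposition R_x = E D E^H (eigh handles Hermitian matrices)
    d, E = np.linalg.eigh(Rx)
    # Whitening matrix V = D^(-1/2) E^H
    V = np.diag(1.0 / np.sqrt(d)) @ E.conj().T
    Z = V @ Xc          # whitened observations Z(t) = V X(t)
    return Z, V
```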
S4, carrying out scale compensation on the independent components on each frequency point, restoring the real proportion of the independent components in the observed components and outputting the compensated independent components of each frequency point;
specifically, step S4 includes the following subdivided steps:
obtaining a mixed matrix of corresponding frequency points according to a separation matrix of each frequency point in a complex value ICA algorithm, wherein the separation matrix and the mixed matrix are inverse matrixes;
and compensating the independent components of the frequency points by using the coefficients of the mixed matrix to obtain the independent components of the frequency points after the scale compensation.
Specifically, taking the two-dimensional ICA problem as an example, let the observed signals be x_1, x_2 and the sources be s_1, s_2; the observed signals can then be expressed as:
x_1 = a_11·s_1 + a_12·s_2 = v_11 + v_12
x_2 = a_21·s_1 + a_22·s_2 = v_21 + v_22
where v_ij = a_ij·s_j denotes the true component of independent source s_j in the observed signal x_i, i.e. the real proportion of the independent source s_j in the observed signal x_i. Since v_11 and v_21 both come from the independent source s_1, they differ from s_1 only in amplitude; likewise, v_12 and v_22 come from the independent source s_2 and have the same relationship to it. Therefore, if W(f_k) is the separation matrix estimated at a certain frequency point, the mixing matrix at that frequency point is A(f_k) = W^(-1)(f_k). The obtained mixing matrix coefficients can then be used to compensate the independent components at each frequency point, namely:
v_ij(f_k, τ) = A_ij(f_k)·Y_j(f_k, τ), where Y_j(f_k, τ) denotes the independent component of the j-th channel separated before scale compensation, and v_ij(f_k, τ) denotes the real component of the j-th independent component in the i-th observed signal after scale compensation. According to this analysis, after the independent components at a certain frequency point f_k are scale-compensated with this formula, one frequency-domain independent component produces N compensated outputs; these N compensated outputs are then subjected to the subsequent processing (elimination of sorting ambiguity, combination of the different frequency points, inverse transformation, etc.) to obtain N pure signals from the same independent source.
In practical applications, N pure signals from the same independent source may be alternatively output, or N signals from the same independent source may be averaged and output.
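A minimal sketch of this scale compensation at one frequency point is given below, assuming a square separation matrix so that the mixing matrix can be obtained by direct inversion; all names are illustrative.

```python
import numpy as np

def scale_compensate(W_k, Y_k):
    """Scale compensation at one frequency point f_k.
    W_k: (M, M) separation matrix estimated at f_k.
    Y_k: (M, T) separated independent components Y_j(f_k, tau).
    Returns V with V[i, j, :] = A_ij(f_k) * Y_j(f_k, tau), the true
    component of source j in observation channel i."""
    A_k = np.linalg.inv(W_k)            # mixing matrix A(f_k) = W^(-1)(f_k)
    M, T = Y_k.shape
    V = np.empty((A_k.shape[0], M, T), dtype=Y_k.dtype)
    for i in range(A_k.shape[0]):
        for j in range(M):
            V[i, j, :] = A_k[i, j] * Y_k[j]   # restore the true proportion
    return V
```

Each source thus yields N compensated copies, which can be output individually or averaged as noted above.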
S5, processing the compensated independent components by adopting a constrained DOA algorithm, so that the independent sources on each frequency point f are arranged from small to large according to the direction angle;
the method specifically comprises the following steps:
a. initializing an angle for each independent source;
b. applying the Root-MUSIC algorithm to the different rows of the separation matrix at each frequency point to obtain an estimate of each source direction, where the rows of the separation matrix correspond to different independent sources;
c. setting the proximity measure between the direction angle of each independent source and its initialization angle as ε(y, θ), and judging in the iteration process whether the angle of each independent source is the same as the initialization angle;
d. if they are the same, executing step e; if they are not the same, executing step f;
e. setting ε(y_j, θ_j) to 0, and setting a direction angle matrix T to calculate the adjustment matrix Q;
f. setting ε(y_j, θ_j) to 1, and returning to the iteration process to recalculate the separation matrix W.
Note that an angle θ_j is initialized for each independent source s_j. Because of the positional uncertainty of the independent sources, the angle of the i-th independent source is set smaller than the angle of the (i+1)-th source, and the initialized angles r(θ) are taken as the constraint condition. On the premise that the separation at each frequency point f is successful, the rows of the separation matrix W correspond to different independent sources, and applying the Root-MUSIC algorithm to the different rows at each frequency point yields an estimate of each source direction.
To let the constrained DOA algorithm effectively detect and correct ordering errors, the proximity measure between the angle obtained at each frequency point and the initialized angle is set as ε(y_j, θ_j), where y_j is the estimate of the corresponding source direction. The two angles are compared in the iterative process; if they are not identical, i.e. ε(y_j, θ_j) = 1, the algorithm returns to the iterative process to recalculate the separation matrix W.
If the two angles are the same, i.e. ε(y_j, θ_j) = 0, a direction angle matrix T needs to be set to calculate the adjustment matrix Q.
The independent sources at each frequency point f are to be arranged in order of increasing angle, and the direction angle matrix T is set accordingly:
In the direction angle matrix T, the diagonal reflects the angle ordering. After scale compensation of the signals obtained from blind source separation, an estimate y of the source signal S can be obtained:
y = P·Λ·S = P·V,
where P is the permutation matrix, Λ is a diagonal matrix, V = Λ·S and S is the source signal. For convenience of analysis, substituting X = A·S into y = W·X gives y = W·A·S = D·S, where W is the separation matrix, X the observation data, A the mixing matrix and S the source signal. From the uncertainty of ICA it follows that each row and each column of the matrix D contains exactly one non-zero element, so the matrix can be written as D = P·Λ, where P is a permutation matrix and Λ a diagonal matrix. P and Λ introduce uncertainty in the ordering and in the amplitude of the ICA output, respectively. Therefore, this embodiment sets an adjustment matrix Q to adjust the matrix P, thereby solving the ordering uncertainty problem of ICA.
Further, the process of calculating the adjustment matrix Q by the direction angle matrix T is as follows:
Q = T·P^(-1)
At this time, if the permutation matrix P is the same as the direction angle matrix T, the independent sources at each frequency point are already arranged in order of increasing angle, so no further adjustment is required;
if the permutation matrix P is different from the direction angle matrix T, the permutation matrix P is left-multiplied by the adjustment matrix Q to obtain a new permutation matrix P';
that is, the permutation matrix P is processed by P' = Q·P = T·P^(-1)·P = T to obtain the new permutation matrix P', and the direction angles of the independent sources at the frequency point are then obtained again from y = P'·Λ·S = P'·V. The independent sources at each frequency point f are now sorted in order of increasing direction angle, so the sorting ambiguity problem of ICA is solved.
Therefore, in this embodiment, the proximity between the direction angle of the independent source obtained at each frequency point and the initialization angle of the corresponding independent source is measured; when the two angles are found to be inconsistent, the algorithm returns to the iteration process to recalculate the separation matrix W. Otherwise, the direction angles of the independent sources are adjusted by the adjustment matrix Q so that they are arranged in order, which solves the sorting ambiguity of the independent-source direction angles and further improves the accuracy of EOG signal identification. The uncertainty in the ordering of the ICA output (permutation ambiguity) is an inherent limitation of the ICA algorithm. Time-frequency-domain blind deconvolution involves blind separation at many frequency points over different windows; unless the ICA separation results at all frequency points are matched so that frequency-domain independent components belonging to the same source are combined together, sub-band signals from different sources will be spliced together by mistake, which greatly degrades the final separation, leaves the signals recovered to the time domain disordered, and in turn affects the identification result of the EOG signal. After the scale-compensated independent components at each frequency point are sorted and adjusted by the constrained DOA algorithm, the independent sources at each frequency point f are arranged in order of increasing direction angle; the sorting ambiguity at each frequency point is thus effectively resolved, the quality of blind source separation is improved, and the recognition rate benefits.
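The permutation-adjustment step can be sketched as follows. This is a minimal sketch that assumes the Root-MUSIC direction estimation has already produced the angles, and that the direction angle matrix T takes the form of the permutation that re-orders the sources by increasing angle (an assumption about its concrete form); the iterative recomputation of W is not shown.

```python
import numpy as np

def adjust_permutation(P, angles):
    """Adjust the permutation matrix P at one frequency point so that the
    independent sources end up ordered by increasing direction angle.
    P: (M, M) permutation matrix from the ICA model y = P * Lambda * S.
    angles: length-M direction angle estimates (e.g. from Root-MUSIC)."""
    M = len(angles)
    # Direction angle matrix T: here taken as the permutation that sorts
    # the sources by increasing angle (an assumed concrete form of T)
    T = np.zeros((M, M))
    T[np.arange(M), np.argsort(angles)] = 1.0
    if np.array_equal(P, T):
        return P                      # already ordered, no adjustment needed
    Q = T @ np.linalg.inv(P)          # adjustment matrix Q = T * P^(-1)
    return Q @ P                      # new permutation P' = Q * P (= T)
```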
S6, carrying out short-time Fourier inverse transformation processing on the independent components of the frequency points after scale compensation and sequencing to obtain complete time signals of independent sources of each channel in the time domain;
It should be noted that the short-time inverse Fourier transform is performed only after the arrangement of the components corresponding to the different sources at each frequency point has been made correct and the amplitudes have been recovered; finally, the obtained time-domain signals are re-segmented and combined to obtain the estimate of the source signals.
The short-time inverse Fourier transform of the signal proceeds as follows:
the obtained time-frequency matrix is inverse-transformed column by column to obtain the time signals at the different window positions, and these time signals are then spliced in order from the earliest window to the latest window to obtain the complete time signal of a source.
During this operation the time signals in adjacent windows partially overlap; the overlap length was fixed when the original observation signal was windowed and framed at the start, and equals half of the frame length. Overlapping data in adjacent windows are typically handled by adding the second half of the previous window to the first half of the next window and dividing by 2 to take the average.
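A minimal sketch of this window re-assembly is given below, assuming the 50% overlap (window shift equal to half the frame length) used during analysis; the per-window inverse transform itself is assumed to have been done already.

```python
import numpy as np

def overlap_average(frames, hop):
    """Re-assemble the time signal from per-window inverse transform results.
    frames: (num_windows, frame_len) time-domain frames obtained by
    inverse-transforming each column of the time-frequency matrix.
    hop: window shift (half the frame length, as used in the analysis)."""
    num_windows, frame_len = frames.shape
    out = np.zeros((num_windows - 1) * hop + frame_len)
    out[:frame_len] = frames[0]
    for w in range(1, num_windows):
        start = w * hop
        # average the second half of the previous window with the
        # first half of the current window, then append the rest
        out[start:start + hop] = (out[start:start + hop] + frames[w][:hop]) / 2.0
        out[start + hop:start + frame_len] = frames[w][hop:]
    return out
```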
And S7, extracting power spectral density characteristics of each channel independent source in the time domain, and sending the extracted characteristics into a Support Vector Machine (SVM) for identification to obtain an EOG signal identification result.
It should be noted that, in this embodiment, the EOG signal features in the glance signal are extracted and sent to the SVM for recognition. The specific identification process is as follows:
calculating the kurtosis value of each channel eye movement data of the multi-channel eye movement data restored to the time domain according to the characteristic that the kurtosis value of the saccade EOG signal is greater than that of other signal sources;
according to the calculated magnitude sequence of the kurtosis values of the eye movement data of each channel, selecting two channels of EOG data with the largest kurtosis values to perform feature extraction;
and sending the extracted features into a Support Vector Machine (SVM) for recognition to obtain an EOG signal recognition result.
By extracting features from the EOG data of only the two channels with the largest kurtosis values, this embodiment reduces the amount of computation while keeping the identification accurate. During acquisition the EOG signal is inevitably contaminated by artifact signals such as electromyogram (EMG), baseline drift, electroencephalogram (EEG), blink signals, electrocardiogram (ECG) and slight movement of the electrode positions, and such artifacts are difficult to filter out completely by preprocessing. The observation data actually obtained are therefore a mixture of several source signals; after blind source separation with the ICA algorithm the different source signals can be separated, i.e. the required saccade EOG data can be separated from the other source signals. If features were extracted from the EOG data of all channels, signal sources other than the saccade EOG, such as artifact signals, would also be included and would affect the recognition result. Extracting features only from the EOG data of the two channels in which the saccade signal is most prominent effectively avoids interference from other source signals, improves data precision, reduces the amount of computation and improves identification accuracy.
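A minimal sketch of the kurtosis-based channel selection, power spectral density feature extraction and SVM training described above follows. The sampling rate, Welch segment length, RBF kernel and the absence of a train/test split are illustrative assumptions; labels stand for the saccade directions.

```python
import numpy as np
from scipy.stats import kurtosis
from scipy.signal import welch
from sklearn.svm import SVC

def classify_saccades(sources, labels, fs=250.0):
    """sources: (num_trials, num_channels, num_samples) independent sources
    recovered in the time domain; labels: saccade direction per trial."""
    feats = []
    for trial in sources:
        # keep the two channels with the largest kurtosis (the saccade EOG
        # has a larger kurtosis than the other separated sources)
        k = kurtosis(trial, axis=-1)
        top2 = np.argsort(k)[-2:]
        # power spectral density features of the selected channels
        _, psd = welch(trial[top2], fs=fs, nperseg=256)
        feats.append(psd.ravel())
    feats = np.asarray(feats)
    clf = SVC(kernel="rbf").fit(feats, labels)   # SVM classifier
    return clf
```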
It should be noted that in this embodiment, while the eye movement data undergo traditional band-pass filtering to remove the blink signal, blind source separation is also performed on the eye movement data at each frequency point in order to avoid losing part of the information of the saccade EOG signal: noise signals are filtered out, each source signal is separated from the observed signal, and the separated independent component at each frequency point is compensated, which ensures the integrity of each separated source signal and avoids losing the information carried by the saccade EOG signal. The role of blind source separation here is mainly to recover the source signals hidden in the observed signal. The scale uncertainty of the ICA output (scaling ambiguity) causes gain errors at the different frequency points of the same independent source, distorts the spectrum, and ultimately severely degrades the separation quality. Performing amplitude compensation on the independent components at each frequency point after blind source separation, by restoring the real proportional components of the independent components in the observed components, mitigates the error caused by the ICA amplitude uncertainty and helps improve the quality of blind source separation.
Example two
This embodiment discloses a robust glance EOG signal identification system comprising: an acquisition module, a preprocessing module, a blind source separation module, a compensation module, a sorting module, a time domain reduction module and an EOG signal identification module;
the acquisition module acquires the EOG multichannel eye movement data to obtain the eye movement data in a time domain;
the preprocessing module is connected with the acquisition module to preprocess the eye movement data on the time domain to obtain the eye movement data on the frequency domain;
the blind source separation module is connected with the preprocessing module so as to carry out blind source separation on the eye movement data of each frequency point on a frequency domain by adopting a complex value ICA algorithm to obtain frequency domain independent components of each independent source signal on the corresponding frequency point;
the compensation module is connected with the blind source separation module and is used for carrying out scale compensation on the independent components on each frequency point and restoring the real proportional components of the independent components in the observed components;
the sorting module is connected with the compensation module and is used for processing the compensated independent components by adopting a constrained DOA algorithm, so that the independent sources on each frequency point f are arranged from small to large according to the direction angle;
the time domain reduction module is connected with the sequencing module and is used for carrying out short-time Fourier inverse transformation processing on the independent components of the frequency points after the scale compensation and the sequencing to obtain complete time signals of each independent source in the time domain;
and the EOG signal identification module is connected with the time domain restoration module so as to extract power spectral density characteristics of each channel independent source in the time domain, and the extracted characteristics are sent to a Support Vector Machine (SVM) for identification to obtain an EOG signal identification result.
Further, the compensation module is specifically configured to:
obtaining a mixed matrix of corresponding frequency points according to a separation matrix of each frequency point in a complex value ICA algorithm, wherein the separation matrix and the mixed matrix are inverse matrixes;
and carrying out scale compensation on the independent components of the frequency points by using the coefficient of the mixed matrix to obtain the independent components of the frequency points after the scale compensation.
It should be noted that, after the conventional band-pass filtering of the EOG signal, blind source separation is performed on the eye movement data in the frequency domain; after blind source separation, scale compensation is performed on the independent components at each frequency point to ensure the scale integrity of each separated independent source signal, and meanwhile the direction angles of the independent sources are reordered by the constrained DOA algorithm, thereby solving the sorting ambiguity problem inherent to the ICA algorithm. The integrity of the information carried by the EOG signal can therefore be ensured, and so can the accuracy of EOG signal identification.
Specifically, FIGS. 6-(a) and 6-(b) are the time- and frequency-domain waveforms of the original multi-channel EOG saccade signal and of the band-pass filtered multi-channel EOG saccade signal, respectively, in this example. The abscissa of the time-domain waveform represents the sampling points and the ordinate the signal amplitude; the abscissa of the frequency-domain waveform represents frequency and the ordinate the signal amplitude. The comparison shows that the original multi-channel EOG saccade signal contains blink artifacts and occupies a wide frequency band. Band-pass filtering suppresses the blink artifacts in the multi-channel EOG saccade signal to a certain extent, but cannot remove them completely, because the blink source and the EOG saccade source overlap in frequency band; the residual blink artifacts therefore affect the recognition rate of saccadic movements and lower it.
FIGS. 7 and 8 are schematic comparisons of the blind source separation results obtained in this example with the linear ICA algorithm and the convolutional ICA algorithm, respectively. FIGS. 7-(a) and 7-(b) show the waveforms of the multi-channel EOG saccade signal after separation by linear ICA and by convolutional ICA, respectively, where the abscissa shows the sampling points and the ordinate the signal amplitude. FIGS. 8-(a) and 8-(b) are the time- and frequency-domain waveforms of a segment cut from the second channel of FIGS. 7-(a) and 7-(b), respectively; the abscissa of the time-domain waveform represents the sampling points and the ordinate the signal amplitude, while the abscissa of the frequency-domain waveform represents frequency and the ordinate the signal amplitude. The two figures make clear that the separation by linear ICA is not "clean": a blink signal still remains, and the frequency band of the saccade EOG signal after linear ICA separation is wider than after convolutional ICA separation.
Therefore, in the invention, the convolution ICA algorithm is adopted to carry out blind source separation on the multi-channel EOG data on the time domain, and waveform diagrams of EOG signals obtained before and after the blind separation are shown in FIG. 9. Wherein, 9- (a) is the multi-channel EOG signal waveform diagram before blind source separation, and 9- (b) is the multi-channel EOG signal waveform diagram obtained after blind source separation, the abscissa represents the sampling point, and the ordinate represents the amplitude of the signal. Comparing 9- (a) and 9- (b), the blink artifact source signal is separated after the convolution ICA separation.
It should be noted that the average recognition rate of the present invention using the convolution ICA and the different algorithms is shown in fig. 10. The abscissa represents the order of the subjects, and the ordinate represents the average recognition rate. It can be shown that the average recognition rate obtained by the convolution ICA method is 97.254%, which is improved by 4.854%, 7.168% and 2.64% respectively compared with the band-pass filtering method, the wavelet de-noising method and the linear ICA method. The method has certain effectiveness.
As shown in fig. 11- (a) and 11- (b), in this example, after the two-channel EOG signals are blindly separated by the complex value ICA algorithm, the two independent sources respectively obtain time-frequency domain waveform diagrams of six adjacent frequency points in the frequency domain. The abscissa indicates the position of the sliding window and the ordinate indicates the magnitude of the amplitude of the signal. As can be seen from the two waveform diagrams, the third and fifth channels present a problem of ordering ambiguity.
In the invention, after blind source separation processing is carried out, scale compensation is carried out on the independent components of each frequency point, and the independent sources on each frequency point after the scale compensation are arranged according to the sequence of the direction angles from small to large by a constraint DOA algorithm, so that the problem of fuzzy sequence on each frequency point is solved, and the specific effects are shown in FIGS. 12-13.
FIGS. 12-(a) and 12-(b) are the flip-matrix diagrams before and after sorting, respectively. The abscissa indicates the frequency point. Comparing the two figures, it can be seen that the number of frequency points with ordering ambiguity is 32 before sorting; after sorting with the algorithm proposed by the invention it is reduced to 10, showing that the proposed algorithm is effective in resolving the ordering ambiguity problem.
Fig. 13 is a time-frequency domain waveform diagram of different frequency points of the EOG signal after separation in this example. The abscissa indicates the position of the sliding window and the ordinate indicates the magnitude of the signal amplitude.
FIGS. 13-(a) and 13-(b) show the time-frequency-domain waveforms of the 6-channel EOG saccade signals at five consecutive frequency points (the 65th to 69th) before sorting. It can be seen that the 66th frequency point of the first and fifth channels shows an ordering-ambiguity problem (marked by the red box), and the 67th frequency point of the third and fifth channels also shows an ordering-ambiguity problem (marked by the green box). FIGS. 13-(c) and 13-(d) show the waveforms of the 6-channel EOG signals at the same five consecutive frequency points (65th to 69th) after sorting; it can be seen that the previously disordered components have been re-ordered.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (9)

1. A robust glance EOG signal identification method, comprising:
S1, acquiring multi-channel EOG data to obtain eye movement data in the time domain;
S2, preprocessing the eye movement data in the time domain to obtain eye movement data in the frequency domain;
S3, in the frequency domain, performing blind source separation on the eye movement data at each frequency point using a complex-valued ICA algorithm to obtain the frequency-domain independent component of each independent source signal s_j at the corresponding frequency point, where j = 1, 2, 3, …, M and M represents the total number of independent sources;
S4, performing scale compensation on the independent components at each frequency point, and restoring the real proportional components of the independent components in the observed components;
S5, processing the compensated independent components with a constrained DOA algorithm, so that the independent sources s_j at each frequency point f are all arranged in order of increasing direction angle;
S6, performing a short-time inverse Fourier transform on the scale-compensated and sorted independent components of each frequency point to obtain the complete time signal of each channel's independent source s_j in the time domain;
S7, extracting power spectral density features from the complete time signal of each channel's independent source s_j in the time domain, and sending the extracted features into a support vector machine (SVM) for identification to obtain the EOG signal identification result.
2. The method according to claim 1, wherein the step S2 further comprises:
performing band-pass filtering and mean value removing processing on the eye movement data in the time domain to obtain processed eye movement data;
performing a short-time Fourier transform on the processed eye movement data, transforming it from the time domain to the frequency domain, and acquiring frequency-domain observation data at each frequency point f_k.
3. The method according to claim 1, wherein the performing the scale compensation on the independent component at each frequency point in step S4 specifically includes:
obtaining a mixed matrix of corresponding frequency points according to a separation matrix of each frequency point in a complex value ICA algorithm, wherein the separation matrix and the mixed matrix are inverse matrixes;
and compensating the independent components at each frequency point using the coefficients of the mixing matrix to obtain the compensated independent components at each frequency point.
4. The method according to claim 3, wherein the step S5 of sorting the compensated independent components by using the constrained DOA algorithm specifically includes:
a. initializing an angle for each independent source s_j;
b. calculating the different rows of the separation matrix at each frequency point by the Root-MUSIC algorithm to obtain an estimate of the direction of each independent source s_j, where the rows of the separation matrix correspond to different independent sources s_j;
c. setting the proximity measure between the direction angle of each independent source s_j and its initialization angle as ε(y, θ), and judging in the iteration process whether the angle of each independent source s_j is the same as the initialization angle;
d. if they are the same, executing step e; if they are not the same, executing step f;
e. setting ε(y_j, θ_j) to 0, and setting a direction angle matrix T to calculate an adjustment matrix Q, j = 1, 2, 3, …, M;
f. setting ε(y_j, θ_j) to 1, and returning to the iteration process to recalculate the separation matrix W, j = 1, 2, 3, …, M.
5. The method according to claim 4, wherein said step e specifically comprises:
setting the direction angle matrix T according to the arrangement of the direction angles of the independent sources s_j at each frequency point f from small to large;
calculating the adjustment matrix Q = TP⁻¹ according to the direction angle matrix T, wherein P is a permutation matrix;
judging, according to the adjustment matrix Q = TP⁻¹, whether the permutation matrix P is the same as the direction angle matrix T;
if they are the same, determining that the independent sources s_j at each frequency point are arranged from small to large according to the direction angle;
if they are not the same, multiplying the adjustment matrix Q by the permutation matrix P to obtain a new permutation matrix P';
and according to the new permutation matrix P', arranging the independent sources s_j at each frequency point f from small to large according to the direction angle.
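Claim 5 reaches its result through the direction angle matrix T, the permutation matrix P, and the adjustment matrix Q = TP⁻¹. The simplified sketch below realises only the end result for one frequency bin, ordering the components by ascending direction angle with an argsort-derived permutation matrix, and does not reproduce the T/P/Q bookkeeping; names and shapes are illustrative.

```python
import numpy as np

def sort_by_doa(W, Y, angles):
    """W: (M, M) separation matrix at one bin, Y = W @ X: (M, n_frames) components,
    angles: (M,) estimated direction angles of the independent sources s_j."""
    order = np.argsort(angles)                  # ascending direction angle
    P = np.eye(len(angles))[order]              # permutation matrix realising the ordering
    return P @ W, P @ Y, np.asarray(angles)[order]
```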
6. The method as claimed in claim 1, wherein the inverse short-time Fourier transform in step S6 comprises:
carrying out short-time inverse Fourier transform processing on the independent components of the frequency points after the scale compensation and the sequencing;
performing a column-by-column inverse transform on the obtained time-frequency matrix to obtain the time signals of the eye movement data at the different window positions;
splicing the time signals in order from the earliest time window to the latest to obtain the complete time signal of the independent source s_j of each channel in the time domain.
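For claim 6, SciPy's inverse STFT performs the column-wise inverse transform and the window-by-window splicing in a single call, provided the window parameters match those of the forward transform. The values below are the same placeholders used in the preprocessing sketch, not parameters stated in the patent.

```python
import numpy as np
from scipy.signal import istft

def reconstruct(Y_bins, fs=250.0, nperseg=256, noverlap=192):
    """Y_bins: (M, n_freqs, n_frames) scale-compensated, sorted frequency-domain
    components. Returns the complete time signal of each independent source."""
    _, y_time = istft(Y_bins, fs=fs, nperseg=nperseg, noverlap=noverlap)
    return y_time                        # shape: (M, n_samples), real-valued
```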
7. The method according to any one of claims 1 to 6, wherein the step S7 specifically includes:
calculating the kurtosis value of each channel in the time domain, based on the fact that the kurtosis value of the saccade EOG signal is larger than that of the other signal sources;
ranking the independent sources s_j according to the kurtosis values calculated in the time domain, and selecting the two channel independent sources s_j with the largest kurtosis values for feature extraction;
and sending the extracted features into a Support Vector Machine (SVM) for recognition to obtain an EOG signal recognition result.
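A minimal sketch of claim 7's feature-extraction and recognition stage: kurtosis-based channel selection, Welch power-spectral-density features, and an SVM classifier. The Welch parameters, the SVM kernel, and the training-data names (train_trials, y_train, test_trial) are illustrative assumptions.

```python
import numpy as np
from scipy.stats import kurtosis
from scipy.signal import welch
from sklearn.svm import SVC

def psd_features(sources, fs=250.0, nperseg=128, n_keep=2):
    """sources: (M, n_samples) time-domain independent sources of one trial."""
    k = kurtosis(sources, axis=-1)                 # saccade EOG tends to be the most kurtotic
    top = np.argsort(k)[::-1][:n_keep]             # keep the two channels with the largest kurtosis
    _, psd = welch(sources[top], fs=fs, nperseg=nperseg, axis=-1)
    return psd.ravel()                             # concatenated power spectral density features

# Training and recognition over a set of trials (hypothetical names):
# X_train = np.vstack([psd_features(s) for s in train_trials])
# clf = SVC(kernel="rbf").fit(X_train, y_train)
# prediction = clf.predict(psd_features(test_trial)[None, :])
```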
8. A robust glance EOG signal identification system, comprising: an acquisition module, a preprocessing module, a blind source separation module, a compensation module, a sequencing module, a time domain restoration module and an EOG signal identification module;
the acquisition module acquires the multi-channel EOG data to obtain eye movement data in the time domain;
the preprocessing module is connected with the acquisition module to preprocess the eye movement data in the time domain to obtain eye movement data in the frequency domain;
the blind source separation module is connected with the preprocessing module to carry out blind source separation on the eye movement data at each frequency point in the frequency domain by adopting a complex-valued ICA algorithm, obtaining the frequency-domain independent component of each independent source s_j at the corresponding frequency point;
the compensation module is connected with the blind source separation module and is used for carrying out scale compensation on the independent components on each frequency point and restoring the real proportional components of the independent components in the observed components;
the sequencing module is connected with the compensation module and is used for processing the compensated independent components by adopting a constrained DOA algorithm, so that the independent sources s_j at each frequency point f are all arranged from small to large according to the direction angle;
the time domain restoration module is connected with the sequencing module and is used for carrying out short-time inverse Fourier transform processing on the independent components of the frequency points after the scale compensation and the sequencing to obtain the complete time signal of each independent source s_j in the time domain;
the EOG signal identification module is connected with the time domain restoration module to extract power spectral density characteristics from the independent source s_j of each channel in the time domain, and to send the extracted characteristics into a Support Vector Machine (SVM) for identification to obtain an EOG signal identification result.
9. The system of claim 8, wherein the compensation module is specifically configured to:
obtaining the mixing matrix at each frequency point from the separation matrix at that frequency point in the complex-valued ICA algorithm, wherein the separation matrix and the mixing matrix are inverse matrices of each other;
and carrying out scale compensation on the independent components at each frequency point with the coefficients of the mixing matrix to obtain the scale-compensated independent components at each frequency point.
CN201710695426.7A 2017-08-15 2017-08-15 Robust glance EOG signal identification method and system Active CN107348958B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710695426.7A CN107348958B (en) 2017-08-15 2017-08-15 Robust glance EOG signal identification method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710695426.7A CN107348958B (en) 2017-08-15 2017-08-15 Robust glance EOG signal identification method and system

Publications (2)

Publication Number Publication Date
CN107348958A CN107348958A (en) 2017-11-17
CN107348958B true CN107348958B (en) 2019-12-24

Family

ID=60287351

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710695426.7A Active CN107348958B (en) 2017-08-15 2017-08-15 Robust glance EOG signal identification method and system

Country Status (1)

Country Link
CN (1) CN107348958B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109350030B (en) * 2018-08-17 2020-04-21 西安电子科技大学 System and method for processing human face video heart rate signal based on phase amplification
CN109344754A (en) * 2018-09-21 2019-02-15 电子信息系统复杂电磁环境效应国家重点实验室 A kind of improvement type shortest path is deficient to determine source signal restoration methods
CN110298303B (en) * 2019-06-27 2022-03-25 西北工业大学 Crowd identification method based on long-time memory network glance path learning
CN111657943A (en) * 2020-06-24 2020-09-15 宿州小马电子商务有限公司 Demand perception eye movement information extraction and identification method for children

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7254257B2 (en) * 2002-03-04 2007-08-07 Samsung Electronics Co., Ltd. Method and apparatus of recognizing face using component-based 2nd-order principal component analysis (PCA)/independent component analysis (ICA)
CN101599127B (en) * 2009-06-26 2011-09-14 安徽大学 Method for extracting and identifying characteristics of electro-ocular signal
US8543194B2 (en) * 2010-12-28 2013-09-24 Industrial Technology Research Institute System and method of detecting abnormal movement of a physical object
CN103690163B (en) * 2013-12-21 2015-08-05 哈尔滨工业大学 Based on the automatic eye electrical interference minimizing technology that ICA and HHT merges
CN103892829B (en) * 2014-04-17 2016-04-27 安徽大学 A kind of eye based on common space pattern moves signal recognition system and recognition methods thereof
CN105447475A (en) * 2015-12-21 2016-03-30 安徽大学 Independent component analysis based glancing signal sample optimization method

Also Published As

Publication number Publication date
CN107348958A (en) 2017-11-17

Similar Documents

Publication Publication Date Title
Chen et al. The use of multivariate EMD and CCA for denoising muscle artifacts from few-channel EEG recordings
Somers et al. A generic EEG artifact removal algorithm based on the multi-channel Wiener filter
CN107348958B (en) Robust glance EOG signal identification method and system
Barros et al. Extraction of event-related signals from multichannel bioelectrical measurements
Chen et al. Removal of muscle artifacts from single-channel EEG based on ensemble empirical mode decomposition and multiset canonical correlation analysis
Chen et al. Simultaneous ocular and muscle artifact removal from EEG data by exploiting diverse statistics
Vigon et al. Quantitative evaluation of techniques for ocular artefact filtering of EEG waveforms
CN110269609B (en) Method for separating ocular artifacts from electroencephalogram signals based on single channel
Ghaderi et al. Effects of eye artifact removal methods on single trial P300 detection, a comparative study
Lio et al. Greater robustness of second order statistics than higher order statistics algorithms to distortions of the mixing matrix in blind source separation of human EEG: implications for single-subject and group analyses
Chan et al. The removal of ocular artifacts from EEG signals using adaptive filters based on ocular source components
Zou et al. Automatic EEG artifact removal based on ICA and Hierarchical Clustering
Dasgupta et al. A two-stage framework for denoising electrooculography signals
CN109480832A (en) The removing method of Muscle artifacts in a kind of single pass EEG signals
Jiang et al. Covariance and time-scale methods for blind separation of delayed sources
Kaczorowska et al. Comparison of the ICA and PCA methods in correction of EEG signal artefacts
Turnip Comparison of ICA-based JADE and SOBI methods EOG artifacts removal
Kalantar et al. Adaptive dimensionality reduction method using graph-based spectral decomposition for motor imagery-based brain-computer interfaces
Abd Rahman et al. A review on the current state of artifact removal methods for electroencephalogram signals
Gu et al. AOAR: an automatic ocular artifact removal approach for multi-channel electroencephalogram data based on non-negative matrix factorization and empirical mode decomposition
Zhang et al. Blind source separation and artefact cancellation for single channel bioelectrical signal
Lee et al. Single-trial event-related potential extraction through one-unit ICA-with-reference
Devuyst et al. Removal of ECG artifacts from EEG using a modified independent component analysis approach
Hamner et al. Learning dictionaries of spatial and temporal EEG primitives for brain-computer interfaces
Roy et al. Automatic removal of artifacts from EEG signal based on spatially constrained ICA using daubechies wavelet

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant