CN111966218A - Two-dimensional cursor control method and device - Google Patents

Two-dimensional cursor control method and device

Info

Publication number
CN111966218A
Authority
CN
China
Prior art keywords
signal
eye movement
user
module
classification
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010700861.6A
Other languages
Chinese (zh)
Inventor
范晓丽
王怡静
闫野
印二威
邓宝松
张久松
徐梦菲
谢良
罗治国
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin (binhai) Intelligence Military-Civil Integration Innovation Center
National Defense Technology Innovation Institute PLA Academy of Military Science
Original Assignee
Tianjin (binhai) Intelligence Military-Civil Integration Innovation Center
National Defense Technology Innovation Institute PLA Academy of Military Science
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin (binhai) Intelligence Military-Civil Integration Innovation Center and National Defense Technology Innovation Institute PLA Academy of Military Science
Priority to CN202010700861.6A
Publication of CN111966218A
Legal status: Pending


Classifications

    • G06F3/015: Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G06F18/2134: Feature extraction based on separation criteria, e.g. independent component analysis
    • G06F18/2135: Feature extraction based on approximation criteria, e.g. principal component analysis
    • G06F18/25: Fusion techniques (pattern recognition)
    • G06F3/04812: Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G06N3/044: Recurrent networks, e.g. Hopfield networks
    • G06N3/045: Combinations of networks
    • G06F2218/04: Denoising (preprocessing for signal-processing pattern recognition)
    • G06F2218/08: Feature extraction (signal-processing pattern recognition)
    • G06F2218/12: Classification; Matching (signal-processing pattern recognition)

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Biomedical Technology (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Mathematical Physics (AREA)
  • Biophysics (AREA)
  • Software Systems (AREA)
  • Molecular Biology (AREA)
  • Computational Linguistics (AREA)
  • Computing Systems (AREA)
  • Human Computer Interaction (AREA)
  • Dermatology (AREA)
  • Neurology (AREA)
  • Neurosurgery (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a two-dimensional cursor control method and device. The method comprises the following steps: inputting a first extraction result and a second extraction result into a preset classification model for classification processing to obtain a corresponding classification result; performing regression prediction on the classification result according to a Gaussian process regression model to predict the target position of a two-dimensional cursor; and controlling the two-dimensional cursor to move to the target position. By inputting the first extraction result, which corresponds to the eye movement signal, and the second extraction result, which corresponds to the electroencephalogram signal, into the preset classification model and then performing regression prediction on the classification result with the Gaussian process regression model, the embodiments of the application can accurately predict the target position of the two-dimensional cursor and control it to move to the predicted target position.

Description

Two-dimensional cursor control method and device
Technical Field
The invention relates to the fields of image processing and signal encoding/decoding, combining eye tracking technology, virtual reality technology, and brain-computer interface technology, and in particular to a two-dimensional cursor control method and a two-dimensional cursor control device.
Background
The eye is an important organ through which humans obtain information about the surrounding world: about 80% of the external information a person receives arrives through the visual channel established by the eyes, and thinking and psychological activity are reflected in eye movements. Eye tracking technology tracks eyeball movement by measuring the position of the eye's gaze point or the motion of the eyeball relative to the head. When a person looks in different directions, the eyes change slightly, and these changes produce extractable features; a computer extracts the features through image capture or scanning, tracks the changes of the eyes in real time, and predicts and responds to the state and needs of the user, thereby achieving control of a device with the eyes. It may be said that, within the limits of current technology, eye tracking is the most intuitive and effective way to "look through" to human thinking.
Compared with traditional interaction modes, virtual reality (VR) technology simulates a realistic environment and the real world, giving people the feeling of being personally on the scene. Because virtual reality engages the full range of human perception and offers powerful simulation capability, it enables genuine human-computer interaction, and the user receives highly realistic feedback from the environment during operation. Owing to its immersion, interactivity, multi-sensory perception, imagination, and autonomy, virtual reality is finding ever wider application scenarios. A VR headset combined with eye tracking technology can provide an even more realistic and natural mode of digital interaction. This interaction mode can be used for military simulation training: simulating battlefield environments that are difficult to arrange in practice, coping with the extreme pressure of war, and aiming at and hitting targets under heavy enemy fire. Meanwhile, with the improvement of remote control technology, VR headsets, with their wide viewing angle and strong sense of immersion, will become the best observation and remote-control platform for equipment such as military unmanned aerial vehicles and remote-controlled robots, greatly reducing casualties among combat personnel.
Because the interaction capability of VR depends on the development of stereoscopic display and sensor technology, and existing mechanical and tactile sensing devices are not yet mature, the tracking accuracy and tracking range of VR equipment still need improvement.
Disclosure of Invention
The embodiments of the application provide a two-dimensional cursor control method and device. The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is not an extensive overview and is not intended to identify key or critical elements or to delineate the scope of such embodiments. Its sole purpose is to present some concepts in a simplified form as a prelude to the more detailed description that is presented later.
In a first aspect, an embodiment of the present application provides a two-dimensional cursor control method, where the method includes:
acquiring eye movement signals and electroencephalogram signals collected by a signal acquisition module;
performing first preprocessing on the eye movement signal to obtain a preprocessed eye movement signal, and performing second preprocessing on the electroencephalogram signal to obtain a preprocessed electroencephalogram signal;
performing feature extraction on the preprocessed eye movement signal to obtain a first extraction result at least comprising eye movement signal features; performing feature extraction on the preprocessed electroencephalogram signal to obtain a second extraction result at least comprising electroencephalogram signal features;
inputting the first extraction result and the second extraction result into a preset classification model for classification processing to obtain corresponding classification results;
performing regression prediction on the classification result according to a Gaussian process regression model to predict the target position of the two-dimensional cursor, wherein the Gaussian process regression model is used for predicting the target position of the two-dimensional cursor;
and controlling the two-dimensional cursor to move to the target position.
In a second aspect, an embodiment of the present application provides a two-dimensional cursor control device, where the device includes:
the acquisition module is used for acquiring the eye movement signals and electroencephalogram signals collected by the signal acquisition module;
the preprocessing module is used for performing first preprocessing on the eye movement signal acquired by the acquisition module to obtain a preprocessed eye movement signal and performing second preprocessing on the electroencephalogram signal to obtain a preprocessed electroencephalogram signal;
the feature extraction module is used for performing feature extraction on the preprocessed eye movement signal obtained by the preprocessing module to obtain a first extraction result at least comprising eye movement signal features; performing feature extraction on the preprocessed electroencephalogram signal obtained by the preprocessing module to obtain a second extraction result at least comprising electroencephalogram signal features;
the classification module is used for inputting the first extraction result and the second extraction result extracted by the feature extraction module into a preset classification model for classification processing to obtain corresponding classification results;
the prediction module is used for performing regression prediction on the classification result obtained by the classification module according to a Gaussian process regression model to predict the target position of the two-dimensional cursor, wherein the Gaussian process regression model is used for predicting the target position of the two-dimensional cursor;
and the control module is used for controlling the two-dimensional cursor to move to the target position predicted by the prediction module.
The technical scheme provided by the embodiment of the application can have the following beneficial effects:
on the basis of an eye tracking technology and a virtual reality technology, a brain-computer interface technology is combined, namely a mode of pseudo-random sequence coding stimulation is used for inducing a brain coding modulation visual evoked potential signal. And by fusing the characteristic values of the eye movement signal and the coded modulation visual evoked potential signal, a Gaussian process regression model is established, the target position is predicted, and finally the control movement of the two-dimensional cursor is realized.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
Fig. 1 is a schematic flowchart of a two-dimensional cursor control method according to an embodiment of the present application;
fig. 2 is a schematic usage flow chart of a two-dimensional cursor control method provided in an embodiment of the present application;
fig. 3 is a schematic diagram illustrating an operating principle of a two-dimensional cursor control method according to an embodiment of the present application;
fig. 4 is a flow chart of cursor instruction recognition in a two-dimensional cursor control method according to an embodiment of the present application;
fig. 5 is a schematic diagram illustrating a principle of a flicker-encoded stimulation of a two-dimensional cursor control device according to an embodiment of the present application;
fig. 6 is a block diagram of a two-dimensional cursor control device according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of another two-dimensional cursor control device according to an embodiment of the present application.
Detailed Description
The following description and the drawings sufficiently illustrate specific embodiments of the invention to enable those skilled in the art to practice them.
It should be understood that the described embodiments are only some embodiments of the invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present invention. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the invention, as detailed in the appended claims.
In the description of the present invention, it is to be understood that the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. The specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art. In addition, in the description of the present invention, "a plurality" means two or more unless otherwise specified. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
Up to now, because the tracking accuracy and tracking range of VR devices remain limited, accurate control of a two-dimensional cursor has not been achievable. The present application therefore provides a two-dimensional cursor control method and device to solve the above-mentioned problems in the related art. In the technical scheme provided by the application, on the basis of eye tracking technology and virtual reality technology, brain-computer interface technology is combined; that is, code-modulated visual evoked potentials (cVEPs) signals are evoked by a pseudo-random sequence coding stimulation mode. By fusing the feature values of the eye movement signals and the cVEPs signals, a Gaussian process regression model is established, the target position is predicted, and finally controlled movement of the two-dimensional cursor is realized, as described in detail in the following exemplary embodiments.
The two-dimensional cursor control method provided by the embodiment of the present application will be described in detail below with reference to fig. 1 to 5.
The core idea of the two-dimensional cursor control method provided by the embodiments of the present disclosure is that, on the basis of eye tracking technology and virtual reality technology, brain-computer interface technology is combined; that is, code-modulated visual evoked potentials (cVEPs) signals are evoked by a pseudo-random sequence coding stimulation mode. By fusing the feature values of the eye movement signals and the cVEPs signals, a Gaussian process regression model is established, the target position is predicted, control of the movement of the two-dimensional cursor is finally realized, and the two-dimensional cursor is moved to the predicted target position.
Referring to fig. 1, a schematic step diagram of a method for controlling a two-dimensional cursor is provided in an embodiment of the present application. As shown in fig. 1, a method for controlling a two-dimensional cursor according to an embodiment of the present application may include the following steps:
s101, acquiring eye movement signals and brain electrical signals acquired by a signal acquisition module;
s102, carrying out first preprocessing on the eye movement signals to obtain preprocessed eye movement signals, and carrying out second preprocessing on the brain electrical signals to obtain preprocessed brain electrical signals;
s103, extracting the features of the preprocessed eye movement signals to obtain a first extraction result at least comprising the features of the eye movement signals; performing feature extraction on the preprocessed electroencephalogram signal to obtain a second extraction result at least comprising electroencephalogram signal features;
in this step, the eye movement signal characteristic includes at least one of:
the user pupil characteristics, the user fixation point number characteristics and the user eye fixation duration characteristics. The above lists only common eye movement signal features, and other eye movement signal features may also be introduced according to the needs of different application scenarios, which are not described herein again.
In one possible implementation manner, the first preprocessing is performed on the eye movement signal, and obtaining a preprocessed eye movement signal includes the following steps:
carrying out data smoothing processing and/or denoising and filtering processing on the eye image to obtain eye to-be-processed data containing eye movement signals;
removing data of an invalid region from the eye to-be-processed data to obtain first valid data corresponding to the valid region;
determining second effective data corresponding to the user region of interest from the first effective data by using a matrix method;
and preprocessing the second effective data through the first preprocessing model to obtain a preprocessed eye movement signal.
It should be noted that the first preprocessing model is a conventional model, and the first preprocessing model can perform convolution preprocessing or normalization preprocessing on the second valid data to obtain a preprocessed eye movement signal.
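As an illustrative sketch (not part of the claimed method), the smoothing and valid-area steps above might look as follows in Python; the moving-average window size and the normalized display bounds are assumptions, since the description leaves them unspecified:

```python
import numpy as np

def smooth_gaze(samples, k=5):
    """Moving-average smoothing over k consecutive samples to damp abrupt noise."""
    kernel = np.ones(k) / k
    return np.column_stack(
        [np.convolve(samples[:, i], kernel, mode="same") for i in range(samples.shape[1])]
    )

def keep_valid(samples, bounds=((0.0, 1.0), (0.0, 1.0))):
    """Drop gaze samples that fall outside the valid (normalized) display area."""
    (x0, x1), (y0, y1) = bounds
    mask = ((samples[:, 0] >= x0) & (samples[:, 0] <= x1) &
            (samples[:, 1] >= y0) & (samples[:, 1] <= y1))
    return samples[mask]
```

Downstream steps (region-of-interest selection, the first preprocessing model) would then operate on the retained samples only.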
After the preprocessed eye movement signals are obtained, feature extraction is carried out on the second effective data in time and space through the first feature extraction model, and corresponding eye movement features are obtained.
In this step, the first feature extraction model is a conventional feature extraction model, and can extract features of the second valid data corresponding to the region of interest of the user in time and space. The extracted eye movement signal features may include user pupil features, and may further include user gazing point number features and user eye gazing duration features.
In a possible implementation manner, before determining, by using a matrix method, second valid data corresponding to a region of interest of a user from the first valid data, the two-dimensional cursor control method provided in the embodiment of the present disclosure further includes the following steps:
determining a user region of interest; in this way, in order to improve the efficiency of data processing, only the second valid data corresponding to the region of interest of the user is subjected to feature extraction, and finally, the first extraction result at least comprising the eye movement signal features can be obtained quickly.
In one possible implementation, determining the user interest region includes the following steps:
under the condition that first monitoring data acquired by a first signal acquisition module for acquiring eye movement signals show that the pupils of the user are in a dilated state, determining that the current area is a region of interest of the user; and/or,
under the condition that second monitoring data acquired by a second signal acquisition module for acquiring eye movement signals show that the number of user gazing points is greater than or equal to a preset threshold for the number of user gazing points, determining that the current area is a region of interest of the user; and/or,
under the condition that third monitoring data acquired by a third signal acquisition module for acquiring eye movement signals show that the user's eye gazing duration is greater than or equal to a preset eye gazing duration threshold, determining that the current area is a region of interest of the user.
The above only lists common method steps for determining the region of interest of the user, and other method steps for determining the region of interest of the user may also be introduced according to the needs of different application scenarios, which are not described herein again.
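The "and/or" decision rules above can be sketched as a single predicate; the concrete threshold values are illustrative assumptions, since the description treats them as presets:

```python
def is_region_of_interest(pupil_dilated, n_gazing_points, gaze_duration_s,
                          gazing_point_threshold=3, gaze_duration_threshold_s=0.8):
    # Per the "and/or" wording, satisfying any one criterion marks the
    # current area as a region of interest.
    return (pupil_dilated
            or n_gazing_points >= gazing_point_threshold
            or gaze_duration_s >= gaze_duration_threshold_s)
```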
In practical application, the process of extracting the features of the eye movement signal is specifically as follows:
in the process of extracting the characteristics of the eye movement signals, the eye movement signal collector mainly comprises an eye movement camera and equipment with an infrared light source and is used for collecting the eyeball position information of a user in real time. The acquired eye movement signal is collected, and data smoothing and denoising filtering are performed firstly. And carrying out mean value processing on the data based on the continuous points to obtain smooth data and reduce abrupt noise interference. Second, the valid area of the data is reserved. In general, the eye movement signal data includes data that deviates from the effective area, and the data that deviates from the effective area is removed during processing. And carrying out image identification by using a matrix model established by a matrix method, and reserving the identified region of interest. For images of the region of interest, 3D convolution can better capture the image's characteristic information both temporally and spatially. The convolutional layer performs a convolution operation using 64 7 × 7 × 5 3D convolution kernels (7 × 7 is the spatial dimension and 5 is the temporal dimension, i.e., 5 frames of images are operated on each time). After the eye image signal passes through the convolution layer, the eye image signal is normalized, namely: BN (Batch Normalization) and ReLU (Rectified Linear Unit), resulting in a profile.
In this step, the electroencephalogram signal characteristic includes at least one of:
the frequency characteristic corresponding to the electroencephalogram signal and the phase characteristic corresponding to the electroencephalogram signal. The above lists only common electroencephalogram characteristics, and other electroencephalogram characteristics can be introduced according to the needs of different application scenarios, which are not described herein again.
In this step, the signal collector records the EEG signals in real time through nine electrodes (Pz, PO5, PO3, POz, PO4, PO6, O1, Oz, and O2) positioned over the parietal and occipital lobes; the reference electrode is placed at Cz, and the signal sampling rate is 1000 Hz. Because the amplitude of the electroencephalogram signal is weak, the acquired signal is first amplified and then correspondingly preprocessed. During preprocessing, the electroencephalogram data are first down-sampled to 250 Hz, a 50 Hz Chebyshev type-I IIR notch filter is then used to remove power-frequency interference, and finally an 8-70 Hz Chebyshev type-I IIR band-pass filter is used for filtering and denoising. After preprocessing is complete, the system extracts the frequency and phase features of the electroencephalogram signal using a feature extraction model constructed with algorithms such as CCA (Canonical Correlation Analysis) and TRCA (Task-Related Component Analysis).
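A minimal Python sketch of this preprocessing chain using SciPy's Chebyshev type-I design; the filter orders (4) and passband ripple (0.5 dB), and the 48-52 Hz stopband around 50 Hz, are assumptions, since the description does not specify them:

```python
import numpy as np
from scipy.signal import cheby1, decimate, filtfilt

def preprocess_eeg(eeg, fs=1000):
    """eeg: (channels, samples) recorded at 1000 Hz. Returns data at 250 Hz."""
    eeg = decimate(eeg, 4, axis=-1)  # down-sample 1000 Hz -> 250 Hz
    fs = 250
    # 50 Hz Chebyshev type-I IIR band-stop to remove power-frequency interference
    b, a = cheby1(4, 0.5, [48, 52], btype="bandstop", fs=fs)
    eeg = filtfilt(b, a, eeg, axis=-1)
    # 8-70 Hz Chebyshev type-I IIR band-pass for denoising
    b, a = cheby1(4, 0.5, [8, 70], btype="bandpass", fs=fs)
    return filtfilt(b, a, eeg, axis=-1)
```

`filtfilt` applies each filter forward and backward, so the sketch is zero-phase, which is a common (assumed) choice for offline EEG analysis.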
In practical applications, besides the CCA and TRCA algorithms, an ICA (Independent Component Analysis) algorithm or a PCA (Principal Component Analysis) algorithm may also be used. These algorithms are conventional, and the process of constructing a feature extraction model based on them is likewise conventional, so it is not described here again.
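For the frequency-feature step, the largest canonical correlation between a preprocessed EEG segment and sin/cos references at a candidate stimulus frequency can be computed as below; this is a generic CCA sketch under assumed reference signals, not the patent's specific model:

```python
import numpy as np

def max_canonical_correlation(X, Y):
    """Largest canonical correlation between column spaces of X (n, p) and Y (n, q)."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(Xc)
    Qy, _ = np.linalg.qr(Yc)
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)[0]

def sine_cosine_reference(freq, fs, n_samples, n_harmonics=3):
    """Sin/cos reference matrix at a candidate frequency and its harmonics."""
    t = np.arange(n_samples) / fs
    cols = []
    for h in range(1, n_harmonics + 1):
        cols.append(np.sin(2 * np.pi * h * freq * t))
        cols.append(np.cos(2 * np.pi * h * freq * t))
    return np.column_stack(cols)
```

The candidate frequency whose reference yields the highest correlation is taken as the frequency feature of the segment.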
And S104, inputting the first extraction result and the second extraction result into a preset classification model for classification processing to obtain corresponding classification results.
In this step, the preset classification model may be a long short-term memory convolutional neural network model.
It should be noted that the eye movement signal features and electroencephalogram signal features obtained by feature extraction are combined to generate a mixed feature vector.
In practical applications, the preset classification model may be a long short-term memory convolutional neural network model, or another neural network model, such as a recurrent neural network model, a time-delay neural network model, a convolutional neural network model, or a deep residual neural network model.
The above-mentioned recurrent, time-delay, convolutional, and deep residual neural network models are conventional technologies and are not described herein again.
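The mixed feature vector mentioned above can be formed, for example, by normalizing each modality's features separately and concatenating them; this is a minimal sketch of one plausible fusion step (the classification model itself is conventional and omitted):

```python
import numpy as np

def fuse_features(eye_features, eeg_features):
    """Z-score each modality separately, then concatenate into one mixed vector,
    so that neither modality dominates purely through scale."""
    def z(v):
        v = np.asarray(v, dtype=float)
        return (v - v.mean()) / (v.std() + 1e-8)
    return np.concatenate([z(eye_features), z(eeg_features)])
```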
And S105, performing regression prediction on the classification result according to a Gaussian process regression model, and predicting the target position of the two-dimensional cursor, wherein the Gaussian process regression model is used for predicting the target position of the two-dimensional cursor.
In this step, the gaussian process regression model is also a conventional model, and is not described herein again. The model may be used to predict a target position of a two-dimensional cursor.
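Although the text treats the Gaussian process regression model as conventional, a minimal sketch of GP posterior-mean prediction of a 2D cursor position may clarify this step; the RBF kernel, length scale, and noise level are assumptions, and each output coordinate is predicted independently:

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0):
    # squared-exponential kernel between rows of A and rows of B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * length_scale ** 2))

def gp_predict(X_train, Y_train, X_test, length_scale=1.0, noise=1e-4):
    """GP regression posterior mean: Y_train columns (cursor x, cursor y)
    are predicted independently with a shared RBF kernel."""
    K = rbf_kernel(X_train, X_train, length_scale) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_test, X_train, length_scale)
    return K_s @ np.linalg.solve(K, Y_train)
```

In use, `X_train` would hold classification-stage feature vectors from calibration trials, `Y_train` the corresponding known target coordinates, and `X_test` a new trial's features.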
And S106, controlling the two-dimensional cursor to move to the target position.
In this step, an application scenario corresponding to controlling the two-dimensional cursor to move to the target position may be: commanding an unmanned aerial vehicle or a remote-controlled robot to move to the target position through the collected eye movement signals and electroencephalogram signals of the user.
Fig. 2 is a schematic diagram illustrating a usage flow of a two-dimensional cursor control method according to an embodiment of the present application.
The steps in the two-dimensional cursor control method shown in fig. 2 are as follows:
step a1: the user wears the equipment and turns on the equipment switch. The system detects whether each module starts normally and whether the communication between the modules is normal;
step a2: if the equipment cannot run normally, the user is prompted to check the corresponding module;
step a3: the picture effect is adjusted according to the user's own condition to ensure that the picture is clear and recognizable, and then each signal acquisition module starts signal acquisition;
step a4: the acquired signals are transmitted to the signal processing module for preprocessing and feature extraction, and the processed signals are aggregated and transmitted to the information identification module. The information identification module comprehensively processes the eye movement and EEG signals and performs coordinate identification on the mixed features;
step a5: whether the communication environment is normal is detected; if so, the instruction information is transmitted to the two-dimensional cursor control module to complete communication and control of the external equipment; if the communication is abnormal, an abnormal-communication prompt is given and signal acquisition is carried out again;
step a6: after the communication is finished, whether the communication environment is closed is detected; if not, the system enters a standby state.
Fig. 3 is a schematic diagram illustrating an operating principle of a two-dimensional cursor control method according to an embodiment of the present application. The user wears a VR eye tracker and an electrode cap; a flicker stimulation interface coded by a pseudo-random sequence is presented in the VR environment, and the user's eye movement signal and electroencephalogram signal are acquired simultaneously. Because the amplitude of the electroencephalogram signal is weak, the acquired electroencephalogram signal is first amplified. After signal preprocessing is completed, feature extraction is performed on the two signals (the user's eye movement signal and electroencephalogram signal) to obtain the corresponding feature values, which form a mixed feature vector. A Gaussian process regression model is established to perform prediction on the mixed feature vector, the target position watched by the user is determined, and the two-dimensional cursor is controlled to move to the determined target position, thereby completing the user's instruction.
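The amplify → preprocess → extract → fuse chain above can be sketched as follows. The patent names pupil, fixation and gaze-duration features for the eye signal and frequency/phase features for the cVEP response, but fixes no formulas; the sampling rates, thresholds and feature definitions below are therefore illustrative assumptions.

```python
import numpy as np

def eye_features(pupil, gaze_xy, dt=1 / 120):
    """Illustrative eye features: mean pupil diameter, count of low-velocity
    (fixation-like) gaze samples, and total gaze duration in seconds."""
    step = np.linalg.norm(np.diff(gaze_xy, axis=0), axis=1)
    n_fix = float(np.sum(step < 0.5))           # assumed velocity threshold
    return np.array([pupil.mean(), n_fix, len(pupil) * dt])

def eeg_features(eeg, fs=250):
    """Illustrative cVEP features: dominant frequency and its phase."""
    spec = np.fft.rfft(eeg * np.hanning(len(eeg)))
    k = int(np.argmax(np.abs(spec[1:]))) + 1    # skip the DC bin
    freqs = np.fft.rfftfreq(len(eeg), 1 / fs)
    return np.array([freqs[k], np.angle(spec[k])])

# toy trial: steady pupil/gaze and a 15 Hz sinusoidal "cVEP" response
pupil = np.full(240, 3.2)                       # 2 s at 120 Hz
gaze = np.zeros((240, 2))
t = np.arange(500) / 250                        # 2 s at 250 Hz
eeg = np.sin(2 * np.pi * 15 * t)
mixed = np.concatenate([eye_features(pupil, gaze), eeg_features(eeg)])
```

`mixed` plays the role of the hybrid feature vector that the later classification stage consumes.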
Fig. 4 is a schematic diagram illustrating a cursor instruction identification flow of a two-dimensional cursor control method according to an embodiment of the present application.
In the control method shown in fig. 4, recognition of the cursor command is divided into a training mode and an application mode.
In the training mode, flicker stimulation coded by a pseudo-random sequence is presented in the VR environment; when the user gazes at a stimulation target, cVEP signals are induced, and the corresponding eye image information of the subject is collected at the same time. In the signal preprocessing part, the system performs data smoothing and denoising filtering on the acquired image signals and retains the user's region of interest. For the electroencephalogram signals, before preprocessing, the system first amplifies the signals, then retains the valid data segments, for example intercepting only the data segments with a stronger cVEP response for analysis, and then removes noise and artifacts from the signals. Feature extraction is performed on the preprocessed eye and electroencephalogram signals to obtain a mixed feature vector, which is input into an LSTM (Long Short-Term Memory) network for model training.
The LSTM is a recurrent network suitable for processing and predicting significant events with relatively long intervals and delays in a time series. It was proposed to solve the vanishing-gradient problem of the recurrent neural network structure and is a special kind of recurrent neural network. At the same time, the LSTM avoids the long-term dependence problem by design, which is mainly attributed to its well-designed "gate" structures (input gate, forget gate and output gate). A "gate" is a mechanism for selectively passing information, comprising a sigmoid neural network layer and a pointwise multiplication operation. It can remove or add information to the cell state, so that the LSTM can remember long-term information. In the LSTM, the first stage is the forget gate, which decides which information needs to be forgotten from the cell state. The next stage is the input gate, which determines which new information can be deposited into the cell state. The last stage is the output gate, which determines what value is output.
(1) Forget gate: the forget gate takes the output h_{t-1} of the previous step and the sequence data x_t to be input at this step as input, and uses the sigmoid activation function to obtain the output f_t. The value of f_t lies in the interval [0, 1] and represents the probability that the previous cell state is forgotten, where 1 means "completely retain" and 0 means "completely discard":

f_t = σ(W_f · [h_{t-1}, x_t] + b_f)
(2) Input gate: the input gate comprises two parts; the first part uses the sigmoid activation function with output i_t, and the second part uses the tanh activation function with output C̃_t:

i_t = σ(W_i · [h_{t-1}, x_t] + b_i)

C̃_t = tanh(W_C · [h_{t-1}, x_t] + b_C)
So far, f_t, the output of the forget gate, controls the degree to which the cell state C_{t-1} of the previous step is forgotten, and the product i_t · C̃_t of the two outputs of the input gate indicates how much new information is retained. Based on this, the new information can be updated into the cell state C_t of this step:

C_t = f_t * C_{t-1} + i_t * C̃_t
(3) Output gate: the output gate is used to control how much of the cell state of this step is filtered out. First, the sigmoid activation function is used to obtain o_t with values in the interval [0, 1]; then C_t is processed by the tanh activation function and multiplied by o_t, giving the output h_t of this step:

o_t = σ(W_o · [h_{t-1}, x_t] + b_o)

h_t = o_t * tanh(C_t)
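A single forward step implementing the three gate equations above can be written directly in NumPy. This is a sketch of the standard LSTM cell, not the patent's trained network; the dimensions, weight initialization and toy input sequence are assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One LSTM step; W maps the concatenation [h_{t-1}, x_t] to the four
    gate pre-activations (forget, input, output, candidate)."""
    z = W @ np.concatenate([h_prev, x_t]) + b
    f, i, o, g = np.split(z, 4)
    f, i, o = sigmoid(f), sigmoid(i), sigmoid(o)   # f_t, i_t, o_t
    c_tilde = np.tanh(g)                           # candidate cell state
    c_t = f * c_prev + i * c_tilde                 # C_t = f_t*C_{t-1} + i_t*C~_t
    h_t = o * np.tanh(c_t)                         # h_t = o_t * tanh(C_t)
    return h_t, c_t

# run a toy 10-step sequence through one cell (hidden size 8, input size 4)
rng = np.random.default_rng(1)
H, D = 8, 4
W = rng.normal(scale=0.1, size=(4 * H, H + D))
b = np.zeros(4 * H)
h = c = np.zeros(H)
for x_t in rng.normal(size=(10, D)):
    h, c = lstm_step(x_t, h, c, W, b)
```

Because h_t = o_t · tanh(C_t) with o_t in (0, 1), every hidden activation stays strictly inside (−1, 1).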
In the application mode, the trained LSTM model is used for two-dimensional cursor instruction identification. In practical application, the two-dimensional cursor control method provided by the embodiment of the disclosure performs the corresponding preprocessing on the eye and electroencephalogram signals acquired by the signal acquisition module. The electroencephalogram signals are amplified before being preprocessed to enhance the amplitude and the signal-to-noise ratio of the cVEP signals. The processed signals are then fed into the LSTM convolutional network to obtain a classification result, the target position of the two-dimensional cursor is predicted in combination with the Gaussian process regression model, and the two-dimensional cursor is moved to the predicted target position.
Fig. 5 is a schematic diagram illustrating the principle of flicker-coded stimulation of a two-dimensional cursor control device according to an embodiment of the present application. The black rectangle represents the VR display interface, over which snowflake points are uniformly distributed; each snowflake point is a stimulus, and point A is the target stimulus point watched by the user. Typically, all stimuli are in the user's visual field, but primary visual cortex activity is driven mainly by the fovea (the portion shown in the circle), the region of the retina where vision (color discrimination, resolution) is most acute. Thus, the blinking patterns of the stimuli are encoded in the measured EEG signal: the coded stimulation triggers the primary visual cortex and activates the cVEP signal. Templates are then obtained by pre-learning the cVEP responses under different phases, and finally the cVEP signal is extracted from the acquired electroencephalogram signal for template matching to obtain the judgment of the target stimulus.
For the stimulus signal, the two-dimensional cursor control method provided by the embodiment of the disclosure is based on compressive sensing theory: it makes full use of the sparsity of the signal and greatly reduces the number of measurements required to recover a sparse signal. One core problem of compressed sensing is the construction of the coding matrix. Coding matrices generally fall into two categories: random sensing matrices and deterministic coding matrices. Deterministic coding matrices have the advantages of small storage space and convenient hardware implementation; more importantly, compared with a random sensing matrix, a deterministic coding matrix can reconstruct a sparse signal with 100% probability (when noise is considered, the support set of the sparse signal is reconstructed with 100% probability). Therefore, how to construct a deterministic coding matrix has become a key point of current compressed sensing research. The embodiment of the disclosure provides a two-dimensional cursor control method that constructs the compressed sensing coding matrix based on a pseudo-random sequence set with asymptotically optimal correlation. The correlation of the pseudo-random sequence theoretically guarantees the low coherence of the coding matrix, so that the coding matrix has better performance in recovering sparse signals.
Let

s_i = (s_i(0), s_i(1), …, s_i(N-1))

be a sequence of length N in which each element s_i(t) is a complex number of modulus 1. Let

S = {s_0, s_1, …, s_{M-1}},

so that S constitutes a sequence set comprising M sequences. Taking any two sequences s_i and s_j in S, the cross-correlation function of S is defined as:

θ_{i,j}(τ) = Σ_{t=0}^{N-1} s_i(t) · s_j*(t+τ), 0 ≤ τ ≤ N-1,

where x* represents the conjugate of the complex number x, and the addition t+τ in the above equation is taken modulo N. In particular, when i = j,

θ_{i,i}(τ)

is referred to as the autocorrelation function. The maximum autocorrelation value and the maximum cross-correlation value are respectively defined as:

θ_A = max{ |θ_{i,i}(τ)| : 0 ≤ i ≤ M-1, 1 ≤ τ ≤ N-1 }

and

θ_C = max{ |θ_{i,j}(τ)| : 0 ≤ i < j ≤ M-1, 0 ≤ τ ≤ N-1 }.

The maximum correlation value of the sequence set S is defined as:

θ_max = max{θ_A, θ_C}.

S is then also called an (N, M, θ_max) sequence set. Regarding the maximum correlation value θ_max of the sequence set S, there is a well-known Welch bound which constrains it from below. For a sequence set S of M sequences with period N,

θ_max ≥ √N · f(M),

where

f(M) = √( N(M-1) / (MN-1) ).

It is easy to see that f(M) approaches 1 as M becomes larger. This means that, in general,

θ_max ≳ √N.

So far, only a small number of families of sequence sets are known to reach the Welch bound on the maximum correlation value.
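The correlation quantities and the Welch bound defined above can be checked numerically. The sketch below uses an arbitrary random ±1 sequence set purely to exercise the definitions; any such set must sit on or above the bound.

```python
import numpy as np

def cross_corr(si, sj):
    """theta_{i,j}(tau) = sum_t s_i(t) * conj(s_j(t + tau)), index mod N."""
    N = len(si)
    return np.array([np.sum(si * np.conj(np.roll(sj, -tau))) for tau in range(N)])

def theta_max(S):
    """Maximum out-of-phase autocorrelation / cross-correlation magnitude."""
    vals = []
    for i, si in enumerate(S):
        for j, sj in enumerate(S):
            mags = np.abs(cross_corr(si, sj))
            vals.append(mags[1:].max() if i == j else mags.max())
    return max(vals)

def welch_bound(N, M):
    """Lower bound on theta_max: N * sqrt((M-1)/(M*N-1)) = sqrt(N) * f(M)."""
    return N * np.sqrt((M - 1) / (M * N - 1))

rng = np.random.default_rng(2)
N, M = 31, 4
S = [rng.choice([-1.0, 1.0], size=N) for _ in range(M)]
```

For unit-modulus sequences the in-phase autocorrelation is always θ_{i,i}(0) = N, while θ_max of the set is bounded below by the Welch bound.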
The coding matrix is constructed from the sequence set by a cyclic shift method. Specifically, for the above

s_0 = (s_0(0), s_0(1), …, s_0(N-1)),

write

s_0^(0) = (s_0(0), s_0(1), …, s_0(N-1))^T,

where T represents the transpose of the vector. After one cyclic shift, write

s_0^(1) = (s_0(N-1), s_0(0), …, s_0(N-2))^T.

By analogy, after k cyclic shifts (1 ≤ k ≤ N-1):

s_0^(k) = (s_0(N-k), …, s_0(N-1), s_0(0), …, s_0(N-k-1))^T.

That is to say, by cyclic shifting, s_0 can generate an N-order square matrix

(s_0^(0), s_0^(1), …, s_0^(N-1)),

denoted A_0. Using the same method, each sequence in the sequence set S can generate a square matrix, denoted in turn A_0, A_1, …, A_{M-1}; the sequence set S can then generate an N × MN matrix (A_0, A_1, …, A_{M-1}). Typically the sequence set S contains a large number of sequences, so that MN ≫ N. In practical applications it is not necessary to use a matrix with such a large difference between the numbers of rows and columns, and one usually selects part of the column vectors to form the coding matrix. The two-dimensional cursor control method provided by the embodiment of the disclosure adopts a coding matrix based on a Gold sequence set constructed according to the above method, and the columns are selected by taking, in order, the first few columns of (A_0, A_1, …, A_{M-1}).
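The cyclic-shift construction of the coding matrix can be sketched as follows; the two short ±1 sequences used in the demonstration are arbitrary placeholders, not an actual Gold set.

```python
import numpy as np

def shift_matrix(s):
    """N x N matrix whose k-th column is s cyclically shifted k times
    (column k begins with s(N-k), matching the construction above)."""
    return np.stack([np.roll(s, k) for k in range(len(s))], axis=1)

def coding_matrix(S, n_cols):
    """Concatenate A_0, ..., A_{M-1} and keep the first n_cols columns."""
    A = np.hstack([shift_matrix(s) for s in S])
    return A[:, :n_cols]

s0 = np.array([1.0, -1.0, 1.0, 1.0, -1.0, 1.0, -1.0])
s1 = np.array([1.0, 1.0, -1.0, 1.0, -1.0, -1.0, 1.0])
A0 = shift_matrix(s0)
Phi = coding_matrix([s0, s1], 10)    # N = 7 rows, first 10 of MN = 14 columns
```

Column 0 of `A0` is the sequence itself and column k is its k-fold cyclic shift, so `Phi` starts with all of `A0` followed by the first columns of `A1`.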
The Gold sequence is a pseudo-random sequence with good properties proposed on the basis of the m-sequence; it has good autocorrelation and cross-correlation characteristics and yields a large number of sequences. It is formed by the modulo-2 addition of a preferred pair of m-sequences with equal code length and the same code clock rate. When constructing the coding matrix based on a Gold sequence set, let p = 2, M = p^m + 1 and d = 2^k + 1, where m is an odd number and gcd(m, k) = 1. Define

S = {s_0, s_1, …, s_{M-1}},

where, with α a primitive element of GF(2^m), Tr the trace function from GF(2^m) to GF(2) and N = 2^m − 1, the sequences s_i(t) are constructed (in the standard Gold-set form) as:

s_i(t) = (−1)^{Tr(α^t) + Tr(α^{d(t+i)})}, 0 ≤ i ≤ N−1,
s_N(t) = (−1)^{Tr(α^t)},
s_{N+1}(t) = (−1)^{Tr(α^{dt})}.

Then S is a sequence set with θ_max = 1 + 2^{(m+1)/2}, that is, a

(2^m − 1, 2^m + 1, 1 + 2^{(m+1)/2})

sequence set.
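A Gold set with these parameters can be generated concretely for m = 5 (N = 31, M = 33, θ_max = 1 + 2^3 = 9). The sketch below builds one m-sequence with an LFSR for the primitive polynomial x^5 + x^2 + 1, obtains its preferred partner by decimation with d = 2^1 + 1 = 3 (k = 1, gcd(5, 1) = 1), and forms the set by modulo-2 addition of all relative shifts; the particular polynomial and the 0/1-to-±1 mapping are implementation choices, not prescribed by the text.

```python
import numpy as np

def m_sequence(taps, m):
    """Binary m-sequence of length 2^m - 1 from a Fibonacci LFSR."""
    state = [1] * m
    out = []
    for _ in range(2 ** m - 1):
        out.append(state[-1])
        fb = 0
        for t in taps:
            fb ^= state[t - 1]
        state = [fb] + state[:-1]
    return np.array(out)

N = 31
u = m_sequence([5, 2], 5)                 # m-sequence of period 31
v = u[(3 * np.arange(N)) % N]             # decimation by d = 3: preferred pair
gold = [u, v] + [np.bitwise_xor(u, np.roll(v, i)) for i in range(N)]
signs = [1 - 2 * g for g in gold]         # map bits {0,1} to {+1,-1}
```

In the ±1 representation the pairwise periodic correlations of a preferred-pair Gold set are three-valued, so the maximum correlation magnitude of `signs` equals θ_max = 9.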
In general, the performance of a compressed-sensing coding matrix constructed from pseudo-random sequences varies little with the number of columns taken. By contrast, the original square matrix of a random sensing matrix is mostly close to orthogonal, and as the difference between the numbers of rows and columns increases, the cross-correlation of the matrix degrades noticeably. The matrix constructed from a pseudo-random sequence through shift transformation is an approximately orthogonal system close to the theoretical lower bound (the Welch bound), and its cross-correlation is only slightly affected by the number of columns taken when constructing the coding matrix, so the performance of a coding matrix constructed from a pseudo-random sequence is stable. When the compressed sensing coding matrix is constructed on the basis of a pseudo-random sequence set with asymptotically optimal correlation, such as the Gold sequence set, the low coherence of the coding matrix is also guaranteed theoretically, and at the same time the spatial entropy of the coded stimulation signals is maximized.
In the embodiment of the present application, brain-computer interface technology is combined with eye tracking technology and virtual reality technology; that is, a pseudo-random-sequence-coded stimulation paradigm is used to induce code-modulated visual evoked potential (cVEP) signals in the brain. A Gaussian process regression model is established by fusing the feature values of the eye movement signals and the cVEP signals, the target position is predicted, and finally the controlled movement of the two-dimensional cursor is realized.
The following is an embodiment of a two-dimensional cursor control device of the present invention, which can be used to implement embodiments of a two-dimensional cursor control method of the present invention. For details that are not disclosed in the embodiments of the two-dimensional cursor control device of the present invention, please refer to the embodiments of the two-dimensional cursor control method of the present invention.
Referring to fig. 6, a block diagram of a two-dimensional cursor control device according to an exemplary embodiment of the present invention is shown. The two-dimensional cursor control device comprises an acquisition module 10, a preprocessing module 20, a feature extraction module 30, a classification module 40, a prediction module 50, a control module 60 and a signal acquisition module 70.
Specifically, the acquiring module 10 is configured to acquire the eye movement signal and the electroencephalogram signal acquired by the signal acquiring module 70;
the preprocessing module 20 is configured to perform first preprocessing on the eye movement signal acquired by the acquisition module 10 to obtain a preprocessed eye movement signal, and perform second preprocessing on the electroencephalogram signal to obtain a preprocessed electroencephalogram signal;
the feature extraction module 30 is configured to perform feature extraction on the pre-processed eye movement signal obtained by the pre-processing module 20 to obtain a first extraction result at least including features of the eye movement signal; performing feature extraction on the preprocessed electroencephalogram signal obtained by the preprocessing module 20 to obtain a second extraction result at least comprising electroencephalogram signal features;
the classification module 40 is configured to input the first extraction result and the second extraction result extracted by the feature extraction module 30 into a preset classification model for classification processing, so as to obtain corresponding classification results;
the prediction module 50 is configured to perform regression prediction on the classification result obtained by classifying the classification module 40 according to a gaussian process regression model, and predict a target position of the two-dimensional cursor, where the gaussian process regression model is used to predict the target position of the two-dimensional cursor;
and a control module 60 for controlling the two-dimensional cursor to move to the target position predicted by the prediction module 50.
Optionally, the apparatus further comprises:
a reading module (not shown in fig. 6) configured to read eye movement signal features before the classification module 40 inputs the first extraction result and the second extraction result into a preset classification model for classification processing, where the eye movement signal features read by the reading module include at least one of the following features: the user pupil characteristics, the user fixation point number characteristics and the user eye fixation duration characteristics.
Optionally, the reading module is further configured to:
before the classification module 40 inputs the first extraction result and the second extraction result into a preset classification model for classification, reading electroencephalogram signal characteristics, wherein the electroencephalogram signal characteristics read by the reading module comprise at least one of the following items: the frequency characteristic corresponding to the electroencephalogram signal and the phase characteristic corresponding to the electroencephalogram signal.
Optionally, the preprocessing module 20 is specifically configured to:
carrying out data smoothing processing and/or denoising and filtering processing on the eye image to obtain eye to-be-processed data containing eye movement signals;
removing data of an invalid region from the eye to-be-processed data to obtain first valid data corresponding to the valid region;
determining second effective data corresponding to the user region of interest from the first effective data by using a matrix method;
and preprocessing the second effective data through the first preprocessing model to obtain a preprocessed eye movement signal.
Optionally, the apparatus further comprises:
a determining module (not shown in fig. 6) for determining the region of interest of the user before the preprocessing module 20 determines the second effective data corresponding to the user region of interest from the first effective data by using the matrix method.
Optionally, the determining module is specifically configured to:
under the condition that first monitoring data acquired by a first signal acquisition module for acquiring the eye movement signal show that the user's pupil is in a dilated state, determining that the current area is a region of interest of the user; and/or,
under the condition that second monitoring data acquired by a second signal acquisition module for acquiring the eye movement signal show that the number of user fixation points is greater than or equal to a preset threshold number of user fixation points, determining that the current area is a region of interest of the user; and/or,
under the condition that third monitoring data acquired by a third signal acquisition module for acquiring the eye movement signal show that the user's eye fixation duration is greater than or equal to a preset user eye fixation duration threshold, determining that the current area is a region of interest of the user.
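The three criteria above combine disjunctively (note the "and/or"). A minimal sketch of the decision follows, with illustrative threshold values; the patent does not specify numeric thresholds, so the defaults here are assumptions.

```python
def is_region_of_interest(pupil_dilated, n_fixations, dwell_s,
                          min_fixations=3, min_dwell_s=0.5):
    """Flag the current area as a user region of interest when any of the
    monitored criteria fires (threshold values are assumed, not from the patent)."""
    return bool(pupil_dilated
                or n_fixations >= min_fixations
                or dwell_s >= min_dwell_s)
```

Any single criterion suffices, which matches the "and/or" combination of the three monitoring conditions.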
Optionally, the preset classification model used by the classification module 40 for the classification processing is a long short-term memory (LSTM) convolutional neural network model.
Fig. 7 is a schematic structural diagram of another two-dimensional cursor control device according to an embodiment of the present disclosure. The two-dimensional cursor control device comprises a signal acquisition module, a signal processing module, a feature extraction module, a classification module 40, a coordinate identification module and a control module. The signal acquisition module consists of an eye movement signal acquisition module and an electroencephalogram signal acquisition module, respectively used for acquiring the eye movement signal and the electroencephalogram signal. The eye movement signals and/or electroencephalogram signals collected by the signal acquisition module are transmitted to the signal processing module and the feature extraction module for processing to obtain the corresponding eye movement signal feature vectors and electroencephalogram signal feature vectors, which then enter the classification module 40 and the coordinate identification module to obtain a two-dimensional cursor control instruction. The two-dimensional cursor control device provided by the embodiment of the disclosure performs instruction transmission and communication and control of external equipment through the control module.
It should be noted that fig. 7 refers to the same modules as those in fig. 6, please refer to the same or related descriptions in fig. 6, and the description thereof is not repeated here.
It should be noted that, when the two-dimensional cursor control device provided in the foregoing embodiment executes the two-dimensional cursor control method, only the division of the functional modules is illustrated, and in practical applications, the functions may be distributed to different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules, so as to complete all or part of the functions described above. In addition, the two-dimensional cursor control device provided in the above embodiment and the two-dimensional cursor control method embodiment belong to the same concept, and the detailed implementation process is shown in the two-dimensional cursor control method embodiment, which is not described herein again.
In the embodiment of the present application, brain-computer interface technology is combined with eye tracking technology and virtual reality technology; that is, a pseudo-random-sequence-coded stimulation paradigm is used to induce code-modulated visual evoked potential (cVEP) signals in the brain. A Gaussian process regression model is established by fusing the feature values of the eye movement signals and the cVEP signals, the target position is predicted, and finally the controlled movement of the two-dimensional cursor is realized.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a read-only memory or a random access memory.
The above disclosure describes only preferred embodiments of the present application and is not intended to limit the scope of the present application; the present application is therefore not limited thereto, and all equivalent variations and modifications made thereto fall within its scope.

Claims (8)

1. A two-dimensional cursor control method, the method comprising:
acquiring eye movement signals and brain electrical signals acquired by a signal acquisition module;
performing first preprocessing on the eye movement signal to obtain a preprocessed eye movement signal, and performing second preprocessing on the electroencephalogram signal to obtain a preprocessed electroencephalogram signal;
performing feature extraction on the preprocessed eye movement signal to obtain a first extraction result at least comprising eye movement signal features; performing feature extraction on the preprocessed electroencephalogram signal to obtain a second extraction result at least comprising electroencephalogram signal features;
inputting the first extraction result and the second extraction result into a preset classification model for classification processing to obtain corresponding classification results;
performing regression prediction on the classification result according to a Gaussian process regression model to predict the target position of the two-dimensional cursor, wherein the Gaussian process regression model is used for predicting the target position of the two-dimensional cursor;
and controlling the two-dimensional cursor to move to the target position.
2. The method according to claim 1, wherein before the inputting the first extraction result and the second extraction result into a preset classification model for classification processing, the method further comprises:
reading the eye movement signal characteristics,
wherein the eye movement signal characteristic comprises at least one of:
the user pupil characteristics, the user fixation point number characteristics and the user eye fixation duration characteristics.
3. The method according to claim 1, wherein before the inputting the first extraction result and the second extraction result into a preset classification model for classification processing, the method further comprises:
the characteristics of the brain electrical signals are read,
wherein the electroencephalogram signal characteristics include at least one of:
and the frequency characteristic corresponding to the electroencephalogram signal and the phase characteristic corresponding to the electroencephalogram signal.
4. The method of claim 1, wherein the first pre-processing the eye movement signal to obtain a pre-processed eye movement signal comprises:
carrying out data smoothing processing and/or denoising and filtering processing on the eye image to obtain eye to-be-processed data containing the eye movement signal;
removing data of an invalid region from the eye to-be-processed data to obtain first valid data corresponding to a valid region;
determining second effective data corresponding to the user region of interest from the first effective data by using a matrix method;
and preprocessing the second effective data through a first preprocessing model to obtain a preprocessed eye movement signal.
5. The method according to claim 4, wherein before the determining second valid data corresponding to the user region of interest from the first valid data by using a matrix method, the method further comprises:
determining the user region of interest.
6. The method of claim 5, wherein the determining a user region of interest comprises:
under the condition that first monitoring data acquired by a first signal acquisition module for acquiring the eye movement signal show that the user's pupil is in a dilated state, determining that the current area is a user region of interest; and/or,
under the condition that second monitoring data acquired by a second signal acquisition module for acquiring the eye movement signal show that the number of user fixation points is greater than or equal to a preset threshold number of user fixation points, determining that the current area is a user region of interest; and/or,
under the condition that third monitoring data acquired by a third signal acquisition module for acquiring the eye movement signal show that the user's eye fixation duration is greater than or equal to a preset user eye fixation duration threshold, determining that the current area is a user region of interest.
7. The method according to any one of claims 1 to 6,
the preset classification model is a long short-term memory (LSTM) convolutional neural network model.
8. A two-dimensional cursor control device, the device comprising:
the acquisition module is used for acquiring the eye movement signals and the brain electrical signals acquired by the signal acquisition module;
the preprocessing module is used for performing first preprocessing on the eye movement signal acquired by the acquisition module to obtain a preprocessed eye movement signal and performing second preprocessing on the electroencephalogram signal to obtain a preprocessed electroencephalogram signal;
the feature extraction module is used for performing feature extraction on the preprocessed eye movement signal obtained by the preprocessing module to obtain a first extraction result at least comprising eye movement signal features; performing feature extraction on the preprocessed electroencephalogram signal obtained by the preprocessing module to obtain a second extraction result at least comprising electroencephalogram signal features;
the classification module is used for inputting the first extraction result and the second extraction result extracted by the feature extraction module into a preset classification model for classification processing to obtain corresponding classification results;
the prediction module is used for performing regression prediction on the classification result obtained by the classification module according to a Gaussian process regression model to predict the target position of the two-dimensional cursor, wherein the Gaussian process regression model is used for predicting the target position of the two-dimensional cursor;
and the control module is used for controlling the two-dimensional cursor to move to the target position predicted by the prediction module.
CN202010700861.6A 2020-07-20 2020-07-20 Two-dimensional cursor control method and device Pending CN111966218A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010700861.6A CN111966218A (en) 2020-07-20 2020-07-20 Two-dimensional cursor control method and device


Publications (1)

Publication Number Publication Date
CN111966218A true CN111966218A (en) 2020-11-20

Family

ID=73361778

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010700861.6A Pending CN111966218A (en) 2020-07-20 2020-07-20 Two-dimensional cursor control method and device

Country Status (1)

Country Link
CN (1) CN111966218A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103150023A (en) * 2013-04-01 2013-06-12 北京理工大学 System and method for cursor control based on brain-computer interface
CN106933353A (en) * 2017-02-15 2017-07-07 南昌大学 Two-dimensional cursor motion control system and method based on motor imagery and code-modulated VEP
CN109255309A (en) * 2018-08-28 2019-01-22 中国人民解放军战略支援部队信息工程大学 Electroencephalogram and eye movement fusion method and device for remote sensing image target detection

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113069125A (en) * 2021-03-18 2021-07-06 上海趣立信息科技有限公司 Head-mounted equipment control system, method and medium based on brain wave and eye movement tracking
CN113325956A (en) * 2021-06-29 2021-08-31 华南理工大学 Eye movement control system based on neural network and implementation method
CN114265527A (en) * 2021-12-20 2022-04-01 中国农业银行股份有限公司 Method, device, medium and electronic equipment for predicting mouse click position
CN114265527B (en) * 2021-12-20 2024-02-20 中国农业银行股份有限公司 Method, device, medium and electronic equipment for predicting click position of mouse

Similar Documents

Publication Publication Date Title
CN108446020B (en) Motor imagery idea control method fusing visual effect and deep learning and application
CN111966218A (en) Two-dimensional cursor control method and device
Cecotti et al. Best practice for single-trial detection of event-related potentials: Application to brain-computer interfaces
Valeriani et al. Enhancement of group perception via a collaborative brain–computer interface
CN110555468A (en) Electroencephalogram signal identification method and system combining recurrence plot and CNN
CN111317468A (en) Electroencephalogram signal classification method and device, computer equipment and storage medium
CN109255309B (en) Electroencephalogram and eye movement fusion method and device for remote sensing image target detection
CN109766845B (en) Electroencephalogram signal classification method, device, equipment and medium
Cecotti et al. Optimization of single-trial detection of event-related potentials through artificial trials
Sajda et al. Brain-Computer Interfaces [from the guest editors]
WO2024114480A1 (en) Visual stimulation method, brain-computer training method, and brain-computer training system
KR20190030611A (en) Method for integrated signal processing of bci system
Shelepin et al. Masking and detection of hidden signals in dynamic images
CN112162634A (en) Digital input brain-computer interface system based on SEEG signal
Christoforou et al. Second-order bilinear discriminant analysis
Astolfi et al. Estimate of causality between independent cortical spatial patterns during movement volition in spinal cord injured patients
Li et al. Comparative study of EEG motor imagery classification based on DSCNN and ELM
Attallah Multi-tasks biometric system for personal identification
CN113435234A (en) Driver visual saliency region prediction method based on bimodal video EEG data
CN114246594B (en) Electroencephalogram signal processing method, background electroencephalogram prediction model training method and device
CN116524380A (en) Target detection method based on brain-computer signal fusion
Lei et al. Common spatial pattern ensemble classifier and its application in brain-computer interface
CN116048266A (en) Brain-computer interface system integrating camera-based vision tracking technology
Cecotti et al. Suboptimal sensor subset evaluation in a p300 brain-computer interface
More et al. Using motor imagery and deep learning for brain-computer interface in video games

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination