CN111651046A - Gesture intention recognition system without hand action - Google Patents


Info

Publication number
CN111651046A
CN111651046A (application CN202010507113.6A)
Authority
CN
China
Prior art keywords
action
electromyographic
motion
intention
signal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010507113.6A
Other languages
Chinese (zh)
Inventor
于景瑶
陈宏源
樊澹宁
余江波
耿梓航
于洋
盛鑫军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Jiaotong University
Original Assignee
Shanghai Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Jiaotong University filed Critical Shanghai Jiaotong University
Priority to CN202010507113.6A
Publication of CN111651046A
Pending legal-status Critical Current


Classifications

    • G PHYSICS; G06 COMPUTING, CALCULATING OR COUNTING; G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F2203/011 Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
    • G06F2218/00 Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/02 Preprocessing; G06F2218/04 Denoising
    • G06F2218/12 Classification; Matching

Abstract

The invention provides a gesture intention recognition system that requires no hand motion, comprising: a micro-array modular wearable electromyographic (EMG) armband, an EMG signal decomposition module, a motor unit action potential train feature extraction and motion intention analysis module, and a control terminal module. The gesture intention recognition steps are: collecting EMG signals, preprocessing the signals, decomposing the EMG signals, extracting motor unit action potential train features, building a model that associates the feature information with motion intention, and terminal control. The beneficial effects of the invention are: the motion intention of the user's hand can be recognized without the hand performing any specific movement, and the modular, portable EMG acquisition scheme greatly reduces acquisition cost and improves the convenience of signal collection.

Description

Gesture intention recognition system without hand action
Technical Field
The invention relates to the field of bioelectric signal control, and in particular to a gesture intention recognition technique and system.
Background
Surface electromyography (sEMG) records the electrophysiological activity of the motor neurons that drive muscle fiber contraction; the weak electrical signal is conducted through human tissue to the skin surface. Surface EMG signals reflect human motion and behavioral intention. By decoding surface EMG signals, a mapping between the EMG signal and movement intention can be established, enabling further analysis of human motion. State-of-the-art myoelectric control interfaces have decoded 10-20 discrete motion patterns in experiments on able-bodied subjects, with accuracy above 95% under laboratory conditions, and can meet the real-time requirements of prosthesis control. However, such intent decoding based on pattern recognition or machine learning is easily disturbed by external factors (e.g. muscle fatigue, electrode shift, or the state of the skin-electrode interface), typically manifesting as interface performance that degrades over time (within and across days).
Disclosure of Invention
Addressing the shortcomings of existing gesture recognition technology in recognizing gestures without hand motion, the invention obtains the discharge information of motor units under different motion intentions by decomposing the collected EMG signals, thereby analyzing and characterizing human motion intention. Moreover, the modular, portable EMG acquisition scheme adopted by the invention greatly reduces acquisition cost and improves the convenience of signal collection.
We have found that the motor unit action potential train (MUAPt) propagates along the muscle fibers and forms an electric field in the tissue; the surface EMG signal is in fact the potential difference detected by electrodes placed in this field. As the neural signal transmitted to the muscle, the MUAPt does not undergo the volume-conduction process, so the influence of temporal drift and of changes in the physiological state of the residual limb is weakened. Moreover, as the control signal of the human motor system, the MUAPt can in theory fully reflect human motion intention. Through decomposition (inverse solution), the surface EMG signal can be separated into a number of MUAPts; by extracting and classifying features of the MUAPt information, the discharge behavior of motor units under different motion intentions can be analyzed and human motion intention inferred, with better robustness.
Motion-free gesture recognition means obtaining the user's gesture intention by recording and decoding peripheral neurophysiological information from the limb, without any actual movement and without physical inertial sensing devices. By collecting surface EMG signals from the arm and decoding them, a person's gesture can be predicted even when no motion occurs, enabling covert, password-style human-computer interaction.
These characteristics make the system particularly suitable for scenarios in which hand movement is restricted (e.g. while driving) or the user deliberately avoids attracting attention (e.g. covert communication), and it can serve special-purpose, military, and related fields.
A gesture intention recognition system requiring no hand motion comprises: a micro-array modular wearable EMG armband, worn on the user's arm to collect multi-channel EMG signals from a local region of the arm; an EMG signal decomposition module, which analyzes and processes the collected EMG signals and decomposes them into a series of motor unit action potential trains (MUAPts); a MUAPt feature extraction and motion intention analysis module, which obtains motor unit discharge information from the decomposition module, extracts feature information related to motion intention, and builds a model associating the features with motion intention; and a control terminal module, which takes the motion intention produced by the analysis module and correspondingly controls the controlled object.
Furthermore, the micro-array modular wearable EMG armband comprises micro-array EMG acquisition electrodes, an EMG signal processing module, and a signal transmission module; it collects the EMG signal from the user's arm, preprocesses the signal, and transmits the processed signal to the EMG signal decomposition module.
Furthermore, the armband collects surface EMG signals through a plurality of electrodes arranged in an array and transmits them over multiple channels; the EMG signal processing module amplifies, filters, and denoises the collected signals to improve signal quality, and converts the analog signals into digital signals; the signal transmission module supports both wired and wireless transmission and delivers the conditioned signal to the EMG signal decomposition module.
Further, the MUAPt feature extraction and motion intention analysis module comprises a MUAPt feature extraction module and a motion intention recognition module; the feature extraction module extracts feature information of the MUAPts to characterize their properties; and the motion intention recognition module builds, from the extracted feature quantities, a model associating them with the user's motion intention.
A gesture intention recognition method requiring no hand motion comprises the following steps:
S1, collecting EMG signals;
S2, preprocessing the signals;
S3, decomposing the surface EMG signals;
S4, extracting motor unit action potential train features;
S5, building a model associating the feature information with motion intention;
S6, terminal control.
Further, in step S2 the raw signal is amplified, noise-filtered, A/D converted, and digitally sampled.
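As a concrete illustration of step S2, the sketch below applies a typical sEMG conditioning chain in Python. The 20-450 Hz pass band, the 50 Hz mains notch, and the 1 kHz sampling rate are assumptions for illustration only; the patent does not specify these values.

```python
import numpy as np
from scipy import signal

fs = 1000  # sampling rate in Hz (assumed; the patent does not specify)

def preprocess_emg(raw, fs=fs):
    """Step S2 sketch: band-pass 20-450 Hz, then a 50 Hz mains notch.
    Amplification and A/D conversion happen in hardware upstream."""
    b, a = signal.butter(4, [20, 450], btype="bandpass", fs=fs)
    x = signal.filtfilt(b, a, raw)
    bn, an = signal.iirnotch(50.0, Q=30.0, fs=fs)
    return signal.filtfilt(bn, an, x)

t = np.arange(0, 1, 1 / fs)
# 100 Hz tone standing in for EMG content, plus 50 Hz mains interference
raw = np.sin(2 * np.pi * 100 * t) + np.sin(2 * np.pi * 50 * t)
clean = preprocess_emg(raw)
```

Zero-phase filtering (`filtfilt`) is used so that spike timing, which the later decomposition steps depend on, is not shifted by the filter.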
Further, in step S3, decomposing the surface EMG signal into motor unit action potential trains comprises the following steps:
S31, spike detection;
S32, motor unit action potential train classification;
S33, template generation;
S34, template localization.
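Steps S31-S34 can be illustrated with a deliberately simplified toy in Python: threshold-based spike detection, amplitude-based grouping into averaged templates, and cross-correlation for template localization. Real surface EMG decomposition algorithms are far more sophisticated; every function name and parameter here is a hypothetical stand-in.

```python
import numpy as np

def detect_spikes(emg, thresh_sd=3.0, refractory=20):
    """S31: threshold crossings, keeping one peak per refractory window."""
    thr = thresh_sd * np.std(emg)
    peaks = []
    for i in np.flatnonzero(emg > thr):
        if not peaks or i - peaks[-1] > refractory:
            peaks.append(int(i))
    return np.array(peaks, dtype=int)

def build_templates(emg, peaks, half=10, n_units=2):
    """S32 + S33: group spike waveforms (here simply by peak amplitude)
    and average each group into a discharge-waveform template."""
    waves = np.stack([emg[p - half:p + half] for p in peaks
                      if half <= p < len(emg) - half])
    order = np.argsort(waves.max(axis=1))
    return [waves[g].mean(axis=0) for g in np.array_split(order, n_units)]

def locate(emg, template):
    """S34: slide the template along the signal; return the best-match offset."""
    return int(np.argmax(np.correlate(emg, template, mode="valid")))

# Synthetic signal with two identical discharges at samples 100 and 300
emg = np.zeros(500)
emg[100] = emg[300] = 5.0
peaks = detect_spikes(emg)
templates = build_templates(emg, peaks)
```

On this synthetic trace, `locate` returns the offset at which the template's peak (sample 10 within the 20-sample window) aligns with the first discharge.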
Further, in step S4, the discharge frequency of each motor unit action potential train is extracted as the feature quantity.
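A minimal sketch of the discharge-frequency feature of step S4, assuming spike (discharge) times in seconds have already been produced by the decomposition; the windows and times below are invented for illustration:

```python
import numpy as np

def discharge_rate(spike_times_s, window_s):
    """Mean firing rate (discharges per second) of one MUAPt over a window."""
    return len(spike_times_s) / window_s

def rate_vector(trains, window_s):
    """One rate per decomposed motor unit: the feature vector fed to step S5."""
    return np.array([discharge_rate(t, window_s) for t in trains])

# Two hypothetical motor units observed over a 2-second window
trains = [[0.10, 0.35, 0.60, 0.85, 1.10], [0.20, 0.70, 1.20, 1.70]]
feats = rate_vector(trains, 2.0)
print(feats)  # rates of 2.5 and 2.0 discharges per second
```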
Further, in step S5, the motion-free gesture intention is recognized by associating the motor unit action potential train features extracted at a given time with the motion-free gesture intended at that time.
Further, in step S6, the terminal device is controlled by the recognized motion-free gesture intention; the terminal control may include, but is not limited to: visualization of motion-free gestures, and covert or password-style human-computer interaction.
The beneficial effects of this application are: the system recognizes the motion intention of the user's hand (i.e. gestures) without the hand performing any specific movement, and the modular, portable EMG acquisition scheme greatly reduces acquisition cost and improves the convenience of signal collection.
Drawings
FIG. 1 is a block diagram of the architecture of a preferred embodiment of the present application;
FIG. 2 is an overall flow chart of a preferred embodiment of the present application;
FIG. 3 is a structural diagram of the EMG acquisition system according to a preferred embodiment of the present application;
FIG. 4 is a flowchart of EMG signal decomposition according to a preferred embodiment of the present application;
FIG. 5 is a flowchart of motor unit action potential train feature extraction and motion intention analysis according to a preferred embodiment of the present application;
FIG. 6 is a diagram illustrating an exemplary application scenario of a preferred embodiment of the present application.
Detailed Description
The preferred embodiments of the present application are described below with reference to the accompanying drawings so that the technical content is clearly understood. The present application may be embodied in many different forms, and its scope is not limited to the embodiments set forth herein.
As shown in fig. 1, the structure of an embodiment of the present invention includes four parts: the micro-array modular wearable EMG armband 100, the EMG signal decomposition module 200, the motor unit action potential train (MUAPt) feature extraction and motion intention analysis module 300, and the control terminal 400.
As shown in fig. 2, the overall process of an embodiment of the present invention includes the following steps:
s1, collecting electromyographic signals. Arm ring 100 is first worn and appropriately adjusted. The user wears the data acquisition arm ring 100 on the arm, and adjusts the wearing position and the tightness, so that the electrode is in good contact with the skin, and data can be acquired subsequently. After the adjustment is finished, the device is started, and in some occasions as shown in fig. 6, gesture intentions without specific actions are shown, the hand is tried to move under the condition that the hand is limited and cannot move, and the arm ring collects surface electromyographic signals in real time. The data contains information of the user's intended action.
And S2, preprocessing the signals. The collected EMG data are amplified, filtered, and denoised, converted into digital signals by an analog-to-digital converter, and sent through the transmission device to the EMG signal decomposition module 200 running on a host computer.
And S3, decomposing the surface EMG signals. In this step, the EMG decomposition algorithm detects signal peaks in the input digital signal so as to separate the different motor unit action potential trains; waveform-matching templates are then generated for the different classes of trains, and the discharge times of the different motor units within the EMG signal are obtained by template localization.
And S4, extracting MUAPt features. In this step, the MUAPt feature extraction and motion intention analysis module 300 extracts feature information of the MUAPts from the data produced by the decomposition module.
And S5, building the model associating feature information with motion intention. The motion intention analysis module includes a training set for motion-free gesture recognition. The user first trains in a prescribed training mode; during this training phase, a classifier is trained from the extracted features and the corresponding gesture intentions. During subsequent real-time intention analysis, the EMG signal is decomposed in real time into MUAPts, their feature information is extracted and fed into the trained classifier, and the motion intention is recognized.
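The training and real-time phases of S5 can be sketched as follows. The patent does not name a classifier; linear discriminant analysis (LDA), a common choice in myoelectric control, is assumed here, and the gesture labels and discharge-rate values are invented for illustration.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Hypothetical training set: discharge-rate feature vectors (4 motor units)
# recorded during the training phase for two intended gestures.
X_rest = rng.normal([5.0, 5.0, 5.0, 5.0], 0.5, size=(40, 4))
X_grasp = rng.normal([12.0, 9.0, 11.0, 8.0], 0.5, size=(40, 4))
X = np.vstack([X_rest, X_grasp])
y = np.array(["rest"] * 40 + ["grasp"] * 40)

clf = LinearDiscriminantAnalysis().fit(X, y)  # training phase

# Real-time phase: a freshly decomposed feature vector is classified
new_feat = np.array([[11.8, 9.2, 10.9, 8.1]])
print(clf.predict(new_feat))  # expected: ['grasp']
```

Because the training-phase labels pair each feature vector with the gesture intended at that moment, the fitted model is exactly the "correlation model of feature information and motion intention" that S5 describes.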
And S6, terminal control. The control terminal 400 generates commands for the controlled device based on the motion intention obtained in S5; terminal control may include: visualization of motion-free gestures, and covert or password-style human-computer interaction.
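The overall S1-S6 flow can be sketched end-to-end as a toy pipeline on synthetic data. Every function name here is hypothetical, and the crude threshold detector merely stands in for the real decomposition algorithm:

```python
import numpy as np

def acquire(fs=1000, n=2000, seed=0):
    """S1 stand-in for the armband: synthetic single-channel sEMG with
    periodic 'motor unit' discharges injected into Gaussian noise."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)
    x[::200] += 8.0
    return x, fs

def preprocess(x):
    """S2 stand-in: remove the DC offset (filtering shown elsewhere)."""
    return x - x.mean()

def decompose(x, k=4.0):
    """S3 stand-in: crude spike detection by robust threshold crossing."""
    return np.flatnonzero(x > k * np.median(np.abs(x)))

def extract_rate(spikes, n, fs):
    """S4: overall discharge rate in spikes per second."""
    return len(spikes) * fs / n

x, fs = acquire()
x = preprocess(x)
spikes = decompose(x)
rate = extract_rate(spikes, len(x), fs)
# S5 would feed such features to a trained classifier; S6 would map the
# predicted intention to a terminal command.
```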
As shown in fig. 3, in the EMG acquisition system of an embodiment of the present invention, the micro-array modular wearable EMG armband 100 collects the EMG signal; the acquired multi-channel surface EMG provides the data source for the subsequent EMG decomposition module. The armband is worn on the arm so that the electrodes contact the skin surface to collect signals. In one implementation example, the armband 100 may include micro-array acquisition electrodes, an EMG signal processing module, a wireless/wired signal transmission module, and a power supply module. Preferably, the armband uses a plurality of micro-array electrodes to acquire multiple channels, and the surface electrodes are made of a highly conductive material to maintain good conductivity between skin and electrode and thus ensure the quality of the collected signal. Preferably, the EMG signal processing module preprocesses the acquired signal (amplification, filtering, denoising) to improve signal quality, and converts the analog signal into a digital signal. Preferably, the wireless/wired signal transmission module supports wired or wireless data transmission and delivers the preprocessed sEMG data to the processing program on the host computer. Preferably, the power supply module consists of a set of voltage and power conversion chips, including a voltage conversion module that converts the 7.4 V supplied by the lithium battery into 5 V, 3.3 V, and -3.3 V, providing the normal working voltages for each part of the system.
In one embodiment, as shown in fig. 4, the EMG signal decomposition module 200 runs on the host computer and contains a motor unit action potential train decomposition algorithm. Using the surface EMG data collected by the armband 100, the module 200 performs peak detection on the EMG signal to obtain the different classes of motor unit action potential trains; it then classifies the trains to obtain the discharge waveform templates corresponding to each class; finally, it localizes the MUAPts contained in the EMG signal according to the templates, yielding the corresponding discharge times.
As shown in fig. 5, in a specific embodiment, the MUAPt feature extraction and motion intention analysis module 300 may comprise two sub-modules: MUAPt feature extraction and motion intention analysis. The feature extraction sub-module extracts feature information from the data produced by the decomposition module 200 (for example, the discharge frequency of each MUAPt within the collected EMG band) and feeds the classified feature information into the motion intention recognition sub-module. The gestures recognized by the analysis module can be user-defined; a model associating the MUAPts with motion intention is built from the signals corresponding to those gestures. The gestures and the corresponding MUAPt feature quantities are input to the classifier as a training set according to the user's needs. After training, the intention recognition sub-module uses the classifier to identify the feature information fed into it and obtain the corresponding motion intention.
After receiving the MUAPt features and motion intention information from the motion intention analysis module 300, the control terminal 400 generates the corresponding command and sends it to the controlled device.
The foregoing is a detailed description of the preferred embodiments of the present application. It should be understood that those skilled in the art can make numerous modifications and variations according to the concept of the present application without creative effort. Therefore, technical solutions that those skilled in the art can obtain through logical analysis, reasoning, or limited experiments based on the concept of the present application shall fall within the scope of protection defined by the claims.

Claims (10)

1. A gesture intention recognition system requiring no hand motion, comprising: a micro-array modular wearable electromyographic (EMG) armband, worn on the user's arm to collect multi-channel EMG signals from a local region of the arm; an EMG signal decomposition module, which analyzes and processes the collected EMG signals and decomposes them into a series of motor unit action potential trains; a motor unit action potential train feature extraction and motion intention analysis module, which obtains motor unit discharge information from the decomposition module, extracts feature information related to motion intention, and builds a model associating the features with motion intention; and a control terminal module, which takes the motion intention produced by the analysis module and correspondingly controls the controlled object.
2. The gesture intention recognition system according to claim 1, wherein the micro-array modular wearable EMG armband comprises micro-array EMG acquisition electrodes, an EMG signal processing module, and a signal transmission module, and is configured to collect the EMG signal of the user's arm, preprocess the signal, and transmit the processed signal to the EMG signal decomposition module.
3. The gesture intention recognition system according to claim 2, wherein the armband collects surface EMG signals through a plurality of electrodes arranged in an array and transmits them over multiple channels; the EMG signal processing module amplifies, filters, and denoises the collected signals to improve signal quality and converts the analog signals into digital signals; and the signal transmission module supports both wired and wireless transmission and delivers the conditioned signal to the EMG signal decomposition module.
4. The gesture intention recognition system according to claim 1, wherein the motor unit action potential train feature extraction and motion intention analysis module comprises a motor unit action potential train feature extraction module and a motion intention recognition module; the feature extraction module extracts feature information of the motor unit action potential trains to characterize their properties; and the motion intention recognition module builds, from the extracted feature quantities, a model associating them with the user's motion intention.
5. A gesture intention recognition method requiring no hand motion, comprising the following steps:
S1, collecting electromyographic signals;
S2, preprocessing the signals;
S3, decomposing the electromyographic signals;
S4, extracting motor unit action potential train features;
S5, building a model associating the feature information with motion intention;
S6, terminal control.
6. The gesture intention recognition method according to claim 5, wherein in step S2 the electromyographic signal is amplified, noise-filtered, A/D converted, and digitally sampled.
7. The gesture intention recognition method according to claim 5, wherein in step S3 the electromyographic signal is decomposed into motor unit action potential trains by the following steps:
S31, spike detection;
S32, motor unit action potential train classification;
S33, template generation;
S34, template localization.
8. The gesture intention recognition method according to claim 7, wherein in step S4 the discharge frequency of the motor unit action potential trains is extracted as the feature quantity.
9. The gesture intention recognition method according to claim 8, wherein in step S5 the motion-free gesture intention is recognized by associating the motor unit action potential train features extracted at a given time with the motion-free gesture intended at that time.
10. The gesture intention recognition method according to claim 9, wherein in step S6 the terminal device is controlled by the gesture intention recognized in step S5, including: visualization of motion-free gestures, and covert or password-style human-computer interaction.
CN202010507113.6A 2020-06-05 2020-06-05 Gesture intention recognition system without hand action Pending CN111651046A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010507113.6A CN111651046A (en) 2020-06-05 2020-06-05 Gesture intention recognition system without hand action

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010507113.6A CN111651046A (en) 2020-06-05 2020-06-05 Gesture intention recognition system without hand action

Publications (1)

Publication Number Publication Date
CN111651046A true CN111651046A (en) 2020-09-11

Family

ID=72343464

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010507113.6A Pending CN111651046A (en) 2020-06-05 2020-06-05 Gesture intention recognition system without hand action

Country Status (1)

Country Link
CN (1) CN111651046A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113730190A (en) * 2021-09-18 2021-12-03 上海交通大学 Upper limb rehabilitation robot system with three-dimensional space motion

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103735263A (en) * 2013-11-18 2014-04-23 浙江大学 Array surface myoelectric image collector and collecting method
CN104899594A (en) * 2014-03-06 2015-09-09 中国科学院沈阳自动化研究所 Hand action identification method based on surface electromyography decomposition
CN106980367A (en) * 2017-02-27 2017-07-25 浙江工业大学 A kind of gesture identification method based on myoelectricity topographic map
CN107273798A (en) * 2017-05-11 2017-10-20 华南理工大学 A kind of gesture identification method based on surface electromyogram signal
CN108958472A (en) * 2018-05-17 2018-12-07 北京邮电大学 A kind of method and device of gesture control suitcase
CN109343704A (en) * 2018-09-12 2019-02-15 南京航空航天大学 A kind of healing robot hand online actions identifying system based on LABVIEW platform
CN109765823A (en) * 2019-01-21 2019-05-17 吉林大学 Ground crawler-type unmanned vehicle control method based on arm electromyography signal

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103735263A (en) * 2013-11-18 2014-04-23 浙江大学 Array surface myoelectric image collector and collecting method
CN104899594A (en) * 2014-03-06 2015-09-09 中国科学院沈阳自动化研究所 Hand action identification method based on surface electromyography decomposition
CN106980367A (en) * 2017-02-27 2017-07-25 浙江工业大学 A kind of gesture identification method based on myoelectricity topographic map
CN107273798A (en) * 2017-05-11 2017-10-20 华南理工大学 A kind of gesture identification method based on surface electromyogram signal
CN108958472A (en) * 2018-05-17 2018-12-07 北京邮电大学 A kind of method and device of gesture control suitcase
CN109343704A (en) * 2018-09-12 2019-02-15 南京航空航天大学 A kind of healing robot hand online actions identifying system based on LABVIEW platform
CN109765823A (en) * 2019-01-21 2019-05-17 吉林大学 Ground crawler-type unmanned vehicle control method based on arm electromyography signal

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
熊安斌 et al.: "Hand motion recognition method based on single-channel sEMG decomposition", Journal of Mechanical Engineering *
罗万国: "Research on a decomposition method for surface EMG motor unit action potential trains based on empirical templates", China Master's Theses Full-text Database, Medicine and Health Sciences *


Similar Documents

Publication Publication Date Title
CN100594858C (en) Electric artificial hand combined controlled by brain electricity and muscle electricity and control method
CN101987048B (en) Artificial limb control method and system thereof
CN110495893B (en) System and method for multi-level dynamic fusion recognition of continuous brain and muscle electricity of motor intention
CN111317600B (en) Artificial limb control method, device, system, equipment and storage medium
CN102499797B (en) Artificial limb control method and system
CN104548347A (en) Pure idea nerve muscle electrical stimulation control and nerve function evaluation system
CN105361880A (en) Muscle movement event recognition system and method
He et al. Spatial information enhances myoelectric control performance with only two channels
Hamedi et al. Surface electromyography-based facial expression recognition in Bi-polar configuration
CN107957783A (en) A kind of Multimode Intelligent control system and method based on brain electricity with myoelectric information
CN104978035A (en) Brain computer interface system evoking P300 based on somatosensory electrical stimulation and implementation method thereof
CN113143676B (en) Control method of external limb finger based on brain-muscle-electricity cooperation
CN112732090B (en) Muscle cooperation-based user-independent real-time gesture recognition method
CN104267807A (en) Hand action mechanomyography based man-machine interaction method and interaction system
Li et al. Wireless sEMG-based identification in a virtual reality environment
Guo et al. Towards semi-supervised myoelectric finger motion recognition based on spatial motor units activation
CN111651046A (en) Gesture intention recognition system without hand action
CN201227336Y (en) Electric artificial hand controlled by brain electricity and muscle electricity
CN106843509B (en) Brain-computer interface system
CN113082448A (en) Virtual immersion type autism children treatment system based on electroencephalogram signal and eye movement instrument
CN110705656A (en) Facial action recognition method based on EEG sensor
CN114936574A (en) High-flexibility manipulator system based on BCI and implementation method thereof
CN114504730A (en) Portable brain-controlled hand electrical stimulation rehabilitation system based on deep learning
CN114098768A (en) Cross-individual surface electromyographic signal gesture recognition method based on dynamic threshold and easy TL
Rada et al. Recognition of Upper Limb Movements Based on Hybrid EEG and EMG Signals for Human-Robot Interaction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200911)