CN113312994A - Gesture classification recognition method and application thereof - Google Patents

Gesture classification recognition method and application thereof

Info

Publication number
CN113312994A
CN113312994A (application CN202110542132.7A)
Authority
CN
China
Prior art keywords
gesture
recognition method
gesture classification
classification recognition
classification
Prior art date
Legal status
Pending
Application number
CN202110542132.7A
Other languages
Chinese (zh)
Inventor
郭伟钰
杨永魁
陈瑞
陈超
辛锦瀚
王峥
Current Assignee
Shenzhen Institute of Advanced Technology of CAS
Original Assignee
Shenzhen Institute of Advanced Technology of CAS
Priority date: 2021-05-18
Filing date: 2021-05-18
Publication date: 2021-08-27
Application filed by Shenzhen Institute of Advanced Technology of CAS
Priority to CN202110542132.7A
Publication of CN113312994A
Priority to PCT/CN2021/138067 (published as WO2022242133A1)
Legal status: Pending (current)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G06F18/253 Fusion techniques of extracted features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application belongs to the technical field of data classification, and in particular relates to a gesture classification and recognition method and its application. Current gesture classification and recognition algorithms based on surface electromyographic (sEMG) signals suffer from low recognition accuracy, and their model training is prone to overfitting and underfitting, vanishing gradients, poor robustness and long training times. The application provides a gesture classification and recognition method comprising the following steps: acquiring a surface electromyographic signal; extracting features from the sEMG signal to obtain a gesture feature sequence and a gesture type; and inputting the gesture feature sequence and gesture type into a gated recurrent unit (GRU) neural network for training to obtain a classification model, which is then used to classify and recognize gestures. The accuracy of the predicted classification is thereby improved.

Description

Gesture classification recognition method and application thereof
Technical Field
The application belongs to the technical field of data classification, and in particular relates to a gesture classification and recognition method and its application.
Background
With the development of science and technology, new human-computer interaction methods are attracting increasing attention from researchers. Gestures are a natural and intuitive means of interaction in everyday life and an important perception channel in human-computer interaction. Recognizing which gesture types can interact with a computer is of great significance for current research on prosthesis control.
Compared with human-computer interaction based on computer vision, current research that uses surface electromyography (sEMG) effectively avoids the influence of physical factors such as illumination. sEMG signals are important bioelectrical signals generated when a person's muscles move; electrodes record the signal changes at the muscle surface. Different gestures correspond to different muscle movements and therefore produce different sEMG signals, which gives sEMG great application value for gesture classification. Gesture classification and recognition based on sEMG signals has become a research hotspot in prosthesis control and rehabilitation training, and the recognition of gesture actions is also expected to be widely applied in fields such as sports medicine and clinical muscle diagnosis.
The sEMG signal is a time-series signal. Compared with classical machine learning models and convolutional neural networks, a recurrent neural network (RNN) is a good tool for sequence problems because it memorizes and learns from the sequential structure of the signal. However, because the recurrence of a plain RNN is only a simple linear relationship, it suffers from the long-term dependency problem: over many iterations of the computation, the repeated multiplication of coefficients makes them smaller and smaller, which indirectly causes information from distant time steps to be lost.
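To make the preceding point concrete, a brief worked form of this vanishing-gradient argument is given here as an illustrative sketch (it is not part of the original filing). For a vanilla RNN with hidden state h_t = tanh(W_h h_{t-1} + W_x x_t), backpropagating a loss from step T to an earlier step k multiplies one Jacobian per intermediate step:

```latex
% Illustrative sketch of the long-term dependency problem; not text from the patent.
\frac{\partial \mathcal{L}}{\partial h_k}
  = \frac{\partial \mathcal{L}}{\partial h_T}
    \prod_{t=k+1}^{T} \frac{\partial h_t}{\partial h_{t-1}},
\qquad
\frac{\partial h_t}{\partial h_{t-1}} = \operatorname{diag}\!\bigl(\tanh'(\cdot)\bigr)\, W_h ,
\qquad
\Bigl\lVert \prod_{t=k+1}^{T} \frac{\partial h_t}{\partial h_{t-1}} \Bigr\rVert
  \le \bigl(\gamma\,\lVert W_h \rVert\bigr)^{T-k},
```

where gamma bounds the derivative of the activation. When this factor is below one it shrinks geometrically with T-k, so information from distant time steps contributes almost nothing to the weight updates; when it exceeds one the gradient can instead explode.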
However, some current gesture classification and recognition algorithms based on sEMG signals have low recognition accuracy, and the model training process can also suffer from overfitting and underfitting, vanishing gradients, poor robustness and long training times.
Disclosure of Invention
1. Technical problem to be solved
The technical problem addressed is that current gesture classification and recognition algorithms based on sEMG signals have low recognition accuracy, and their model training can also suffer from overfitting and underfitting, vanishing gradients, poor robustness and long training times.
2. Technical scheme
In order to achieve the above object, the present application provides a gesture classification and recognition method comprising the following steps: acquiring a surface electromyographic (sEMG) signal; extracting features from the sEMG signal to obtain a gesture feature sequence and a gesture type; and inputting the gesture feature sequence and gesture type into a gated recurrent unit (GRU) neural network for training to obtain a classification model, which is then used to classify and recognize gestures.
In another embodiment provided by the present application, acquiring the sEMG signal comprises wiping the experimental equipment and the subject's skin surface with alcohol, placing the electromyography electrodes on the subject's skin, and acquiring in real time the muscle signal changes produced by different arm gestures.
In another embodiment provided by the present application, the number of electrodes is 8, and the 8 electrodes are distributed at equal intervals on the forearm.
In another embodiment provided by the present application, the sEMG signal is transmitted through a wireless device to an intelligent terminal for data processing.
In another embodiment provided by the present application, processing the sEMG signal comprises converting the sEMG signal into a digital signal and extracting feature values from the digital signal.
In another embodiment provided by the present application, the sEMG signal is input to the intelligent terminal using a sliding-window method.
In another embodiment provided by the present application, the sampling frequency of the sEMG signal is 2 kHz, the width of the sliding window is 100 ms, and the sliding step is 0.5 ms.
In another embodiment provided by the present application, feature extraction is performed by calculating the root mean square (RMS) value.
In another embodiment provided by the present application, the gated recurrent unit (GRU) neural network includes a fully connected layer.
The present application also provides the use of the gesture classification and recognition method in a human-computer interaction system.
3. Advantageous effects
Compared with the prior art, the gesture classification and recognition method provided by the application and its use have the following beneficial effects:
The gesture classification and recognition method provided by the application is a surface electromyography (sEMG) based method built on a GRU model.
In the gesture classification and recognition method, the collected electrode signals are processed, and the resulting gesture feature sequences and gesture types are input into a gated recurrent unit (GRU) neural network for training to obtain a classification model, thereby achieving gesture classification and recognition. Compared with previous machine learning classification algorithms, the method greatly alleviates the overfitting and underfitting problems that arise during gradient computation, improves robustness, and at the same time greatly reduces the amount of computation and the waste of resources.
The gesture classification and recognition scheme provided by the application is an algorithm based on a gated recurrent unit (GRU) neural network, which improves the accuracy of the predicted classification.
In the gesture classification and recognition method, the GRU is an improved form of the recurrent neural network: it effectively avoids the long-term dependency problem of the RNN in data processing and can overcome problems such as vanishing and exploding gradients from which long-sequence signals suffer in a traditional training stage.
In the gesture classification and recognition method provided by the application, the structure of the GRU allows the network to adaptively capture dependencies from large numbers of data sequences without discarding information from the early part of a sequence. The GRU is similar to the long short-term memory (LSTM) network but removes the separate cell state and passes information directly through the hidden state. Compared with the LSTM variant of the recurrent neural network, the GRU contains only a reset gate and an update gate, which greatly simplifies the computation.
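For reference, the standard GRU cell behind this description can be written explicitly. The equations below are a sketch in one common formulation (gate-placement conventions differ slightly between implementations); the patent itself does not spell them out:

```latex
% Standard GRU update: r_t = reset gate, z_t = update gate, \tilde{h}_t = candidate state.
\begin{aligned}
r_t &= \sigma\bigl(W_r x_t + U_r h_{t-1} + b_r\bigr) \\
z_t &= \sigma\bigl(W_z x_t + U_z h_{t-1} + b_z\bigr) \\
\tilde{h}_t &= \tanh\bigl(W_h x_t + U_h (r_t \odot h_{t-1}) + b_h\bigr) \\
h_t &= (1 - z_t) \odot \tilde{h}_t + z_t \odot h_{t-1}
\end{aligned}
```

The update gate z_t decides how much of the previous hidden state h_{t-1} is carried forward, which is how early parts of a sequence are preserved, while the reset gate r_t controls how much of the past enters the candidate state; with only these two gates the cell is lighter than the three-gate LSTM.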
The gesture classification and recognition method provided by the application has high accuracy and strong robustness.
Drawings
FIG. 1 is a schematic illustration of the experimental set-up of the present application;
FIG. 2 is a schematic flow chart of a gesture classification recognition method according to the present application;
FIG. 3 is a diagram illustrating comparison of the results of the gesture classification recognition method according to the present application.
Detailed Description
Hereinafter, specific embodiments of the present application will be described in detail with reference to the accompanying drawings, so that those skilled in the art can practise the application in light of this description. Features from different embodiments may be combined to yield new embodiments, or certain features of the described embodiments may be replaced to yield further preferred embodiments, without departing from the principles of the present application.
Referring to FIGS. 1 to 3, the present application provides a gesture classification and recognition method comprising the following steps: acquiring a surface electromyographic (sEMG) signal; extracting features from the sEMG signal to obtain a gesture feature sequence and a gesture type; and inputting the gesture feature sequence and gesture type into a gated recurrent unit (GRU) neural network for training to obtain a classification model, which is then used to classify and recognize gestures.
Further, acquiring the sEMG signal comprises wiping the experimental equipment and the subject's skin surface with alcohol, placing the electromyography electrodes on the subject's skin, and acquiring in real time the muscle signal changes produced by different arm gestures.
Further, the number of electrodes is 8, and the 8 electrodes are distributed at equal intervals on the forearm.
Further, the sEMG signal is transmitted through a wireless device to an intelligent terminal for data processing. The intelligent terminal is a computer, a tablet computer, a mobile phone or other electronic equipment capable of performing data processing.
Further, processing the sEMG signal comprises converting the sEMG signal into a digital signal and extracting feature values from the digital signal.
Further, the sEMG signal is input to the intelligent terminal using a sliding-window method.
Further, the sampling frequency of the sEMG signal is 2 kHz, the width of the sliding window is 100 ms, and the sliding step is 0.5 ms. This greatly increases the amount of data, effectively avoids overfitting during training, and also speeds up data processing.
Further, feature extraction is performed by calculating the root mean square (RMS) value, which better describes the amplitude variation of the electromyographic signal.
Further, the gated recurrent unit (GRU) neural network includes a fully connected layer.
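To make this architecture concrete, here is a minimal PyTorch sketch of a GRU followed by a fully connected classification layer (PyTorch is the framework named in the example below); the class name, hidden size and number of gesture classes are illustrative assumptions rather than values disclosed in the patent.

```python
import torch
import torch.nn as nn

class GRUClassifier(nn.Module):
    """GRU encoder followed by a fully connected classification layer (illustrative sketch)."""

    def __init__(self, input_size: int = 8, hidden_size: int = 64, num_classes: int = 6):
        super().__init__()
        self.gru = nn.GRU(input_size, hidden_size, batch_first=True)
        self.fc = nn.Linear(hidden_size, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, 8) sequences of per-window RMS feature vectors
        _, h_n = self.gru(x)      # final hidden state: (1, batch, hidden_size)
        return self.fc(h_n[-1])   # class logits: (batch, num_classes)
```

A batch of RMS feature sequences of shape (batch, seq_len, 8), one channel per electrode, yields one logit vector per sequence; the class with the largest logit is taken as the predicted gesture.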
The present application also provides the use of the above gesture classification and recognition method in a human-computer interaction system.
Examples
The application provides a GRU-model-based gesture classification method built on surface electromyogram signal acquisition; the specific implementation steps are as follows:
step 1: surface electromyogram signals of different gestures of a tested person are collected
The experimental device and the subject's skin surface are wiped with alcohol to avoid noise interference caused by the skin epidermis and grease. The electrodes for acquiring the electromyographic signals are placed on the subject's skin, with 8 electrodes distributed equidistantly on the forearm, and the muscle signal changes caused by different arm gestures are acquired in real time; the experimental setup is shown in FIG. 1. During the experiment, several kinds of gestures are collected. In addition, to reduce errors caused by subject muscle fatigue during acquisition, each gesture is repeated 6 times, each gesture action lasts 5 s, and a 5 s rest is taken between repetitions. The collected sEMG signals are transmitted to a computer via a wireless device.
Step 2: processing of surface electromyographic signals
(a) Converting the transmitted analog signal into a digital signal;
(b) In order to obtain more experimental data, the experiment adopts a sliding-window method: each window contains 100 ms of data and the window slides backwards by 0.5 ms each time, so consecutive windows overlap by 99.5 ms. Since the sampling frequency of the experimental device is 2 kHz, the sliding-window width is set to 100 ms and the sliding step to 0.5 ms. In addition, the data are shuffled before training, which accelerates the convergence of the model.
(c) Feature values are extracted by calculating the root mean square (RMS) value of each window, as sketched after this list.
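The following is a minimal sketch of the sliding-window segmentation in (b) and the RMS extraction in (c), assuming the recording is stored as a NumPy array of shape (num_samples, 8) for the 8 electrode channels; the function and variable names are illustrative, not taken from the patent.

```python
import numpy as np

FS = 2000                        # sampling frequency: 2 kHz
WIN = int(FS * 0.100)            # 100 ms window -> 200 samples
STEP = max(1, int(FS * 0.0005))  # 0.5 ms step   -> 1 sample

def sliding_windows(semg: np.ndarray) -> np.ndarray:
    """Cut an (num_samples, 8) sEMG recording into overlapping windows."""
    starts = range(0, semg.shape[0] - WIN + 1, STEP)
    return np.stack([semg[s:s + WIN] for s in starts])  # (num_windows, WIN, 8)

def rms_features(windows: np.ndarray) -> np.ndarray:
    """Root mean square per window and per channel -> (num_windows, 8)."""
    return np.sqrt(np.mean(windows.astype(np.float64) ** 2, axis=1))
```

With a 0.5 ms step, consecutive 100 ms windows overlap by 99.5 ms, which is what multiplies the amount of training data as described in (b).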
Step 3: Classification and recognition of gesture actions
The feature-extracted gesture signals and their corresponding gesture categories are input into the designed GRU neural network, and the neural network model obtained through training serves as the gesture classifier.
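A minimal training sketch under the same assumptions, reusing the hypothetical GRUClassifier module and RMS feature sequences sketched earlier; the optimizer, learning rate, batch size and number of epochs are illustrative choices, not parameters disclosed in the patent.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

def train_classifier(X: torch.Tensor, y: torch.Tensor, num_classes: int,
                     epochs: int = 30) -> nn.Module:
    """X: (num_sequences, seq_len, 8) RMS feature sequences; y: gesture class indices."""
    model = GRUClassifier(input_size=X.shape[-1], num_classes=num_classes)
    loader = DataLoader(TensorDataset(X, y), batch_size=64, shuffle=True)  # shuffling, as in step 2(b)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    criterion = nn.CrossEntropyLoss()

    model.train()
    for _ in range(epochs):
        for batch_x, batch_y in loader:
            optimizer.zero_grad()
            loss = criterion(model(batch_x), batch_y)
            loss.backward()
            optimizer.step()
    return model  # the trained model serves as the gesture classifier
```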
Compared with traditional machine-learning recognition and classification methods such as decision trees, random forests, support vector machines (SVM), K-nearest neighbours (KNN) and naive Bayes (NB), the present method is more robust to different arm postures and achieves higher gesture classification accuracy, so it can serve more complex human-computer interaction scenarios.
The feasibility of the application was verified through experiments. The experiments were implemented with the PyTorch framework, and the performance of the method was verified on electromyographic signals collected from 8 subjects. P1, P2 and P3 denote, respectively, the arm resting on the table, the arm at 45 degrees to the table top, and the arm held horizontally in the air. Only the front portion of the data in the P1 arm posture was used for training, while the back portions of the data in the P1, P2 and P3 postures were used for testing. The experimental results (see FIG. 3) show that the accuracy of the GRU-based gesture classification method provided by the present application reaches 0.96, higher than that of the other methods based on traditional machine learning. Moreover, for the P2 and P3 arm postures, the drop in gesture classification accuracy of the method is the smallest, i.e. its robustness is the strongest.
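A short sketch of the evaluation protocol described above, computing accuracy separately for each arm posture on held-out test data; the posture split and variable names are illustrative, not code from the original experiment.

```python
import torch

@torch.no_grad()
def accuracy(model: torch.nn.Module, X: torch.Tensor, y: torch.Tensor) -> float:
    """Fraction of test sequences whose predicted gesture matches the label."""
    model.eval()
    return (model(X).argmax(dim=1) == y).float().mean().item()

# Hypothetical usage: test_sets maps posture name -> held-out (X_test, y_test) tensors.
# for posture, (X_test, y_test) in test_sets.items():   # e.g. {"P1": ..., "P2": ..., "P3": ...}
#     print(posture, accuracy(model, X_test, y_test))
```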
Although the present application has been described above with reference to specific embodiments, those skilled in the art will recognize that many changes may be made in the configuration and details of the present application within the principles and scope of the present application. The scope of protection of the application is determined by the appended claims, and all changes that come within the meaning and range of equivalency of the technical features are intended to be embraced therein.

Claims (10)

1. A gesture classification and recognition method, characterized by comprising the following steps:
acquiring a surface electromyographic (sEMG) signal; extracting features from the sEMG signal to obtain a gesture feature sequence and a gesture type; and inputting the gesture feature sequence and gesture type into a gated recurrent unit (GRU) neural network for training to obtain a classification model, and using the classification model to classify and recognize gestures.
2. The gesture classification and recognition method according to claim 1, characterized in that acquiring the sEMG signal comprises wiping the experimental equipment and the subject's skin surface with alcohol, placing the electromyography electrodes on the subject's skin, and acquiring in real time the muscle signal changes produced by different arm gestures.
3. The gesture classification and recognition method according to claim 2, characterized in that the number of electrodes is 8, and the 8 electrodes are distributed at equal intervals on the forearm.
4. The gesture classification and recognition method according to claim 2, characterized in that the sEMG signal is transmitted through a wireless device to an intelligent terminal for data processing.
5. The gesture classification and recognition method according to claim 1, characterized in that processing the sEMG signal comprises converting the sEMG signal into a digital signal and extracting feature values from the digital signal.
6. The gesture classification and recognition method according to claim 4, characterized in that the sEMG signal is input to the intelligent terminal using a sliding-window method.
7. The gesture classification and recognition method according to claim 6, characterized in that the sampling frequency of the sEMG signal is 2 kHz, the width of the sliding window is 100 ms, and the sliding step is 0.5 ms.
8. The gesture classification and recognition method according to claim 5, characterized in that feature extraction is performed by calculating the root mean square (RMS) value.
9. The gesture classification and recognition method according to any one of claims 1 to 8, characterized in that the gated recurrent unit (GRU) neural network includes a fully connected layer.
10. Application of a gesture classification and recognition method, characterized in that the gesture classification and recognition method according to any one of claims 1 to 9 is applied to a human-computer interaction system.
CN202110542132.7A 2021-05-18 2021-05-18 Gesture classification recognition method and application thereof Pending CN113312994A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110542132.7A CN113312994A (en) 2021-05-18 2021-05-18 Gesture classification recognition method and application thereof
PCT/CN2021/138067 WO2022242133A1 (en) 2021-05-18 2021-12-14 Gesture classification and recognition method and application thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110542132.7A CN113312994A (en) 2021-05-18 2021-05-18 Gesture classification recognition method and application thereof

Publications (1)

Publication Number Publication Date
CN113312994A 2021-08-27

Family

ID=77373401

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110542132.7A Pending CN113312994A (en) 2021-05-18 2021-05-18 Gesture classification recognition method and application thereof

Country Status (2)

Country Link
CN (1) CN113312994A (en)
WO (1) WO2022242133A1 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160091965A1 (en) * 2014-09-30 2016-03-31 Microsoft Corporation Natural motion-based control via wearable and mobile devices
US10671908B2 (en) * 2016-11-23 2020-06-02 Microsoft Technology Licensing, Llc Differential recurrent neural network
CN110537922B (en) * 2019-09-09 2020-09-04 北京航空航天大学 Human body walking process lower limb movement identification method and system based on deep learning
CN112783327B (en) * 2021-01-29 2022-08-30 中国科学院计算技术研究所 Method and system for gesture recognition based on surface electromyogram signals
CN113312994A (en) * 2021-05-18 2021-08-27 中国科学院深圳先进技术研究院 Gesture classification recognition method and application thereof

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150346833A1 (en) * 2014-06-03 2015-12-03 Beijing TransBorder Information Technology Co., Ltd. Gesture recognition system and gesture recognition method
CN108388348A (en) * 2018-03-19 2018-08-10 浙江大学 A kind of electromyography signal gesture identification method based on deep learning and attention mechanism
US20210064132A1 (en) * 2019-09-04 2021-03-04 Facebook Technologies, Llc Systems, methods, and interfaces for performing inputs based on neuromuscular control
CN110598676A (en) * 2019-09-25 2019-12-20 南京邮电大学 Deep learning gesture electromyographic signal identification method based on confidence score model
CN111209885A (en) * 2020-01-13 2020-05-29 腾讯科技(深圳)有限公司 Gesture information processing method and device, electronic equipment and storage medium
CN111553307A (en) * 2020-05-08 2020-08-18 中国科学院合肥物质科学研究院 Gesture recognition system fusing bioelectrical impedance information and myoelectric information
CN111898526A (en) * 2020-07-29 2020-11-06 南京邮电大学 Myoelectric gesture recognition method based on multi-stream convolution neural network

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022242133A1 (en) * 2021-05-18 2022-11-24 中国科学院深圳先进技术研究院 Gesture classification and recognition method and application thereof
CN114159079A (en) * 2021-11-18 2022-03-11 中国科学院合肥物质科学研究院 Multi-type muscle fatigue detection method based on feature extraction and GRU deep learning model
CN114159079B (en) * 2021-11-18 2023-05-02 中国科学院合肥物质科学研究院 Multi-type muscle fatigue detection method based on feature extraction and GRU deep learning model

Also Published As

Publication number Publication date
WO2022242133A1 (en) 2022-11-24

Similar Documents

Publication Publication Date Title
CN111209885B (en) Gesture information processing method and device, electronic equipment and storage medium
Bao et al. A CNN-LSTM hybrid model for wrist kinematics estimation using surface electromyography
CN111553307B (en) Gesture recognition system fusing bioelectrical impedance information and myoelectric information
CN110555468A (en) Electroencephalogram signal identification method and system combining recursion graph and CNN
CN106383579A (en) EMG and FSR-based refined gesture recognition system and method
CN111657941B (en) Electrode correction and myoelectric pattern recognition method based on muscle core activation region
CN113312994A (en) Gesture classification recognition method and application thereof
CN103440498A (en) Surface electromyogram signal identification method based on LDA algorithm
CN112732090B (en) Muscle cooperation-based user-independent real-time gesture recognition method
Yang et al. A novel deep learning scheme for motor imagery EEG decoding based on spatial representation fusion
CN113111831A (en) Gesture recognition technology based on multi-mode information fusion
Yao et al. Multi-feature gait recognition with DNN based on sEMG signals
CN114548165B (en) Myoelectricity mode classification method capable of crossing users
Zhang et al. Classification of Finger Movements for Prosthesis Control with Surface Electromyography.
Yang et al. Hand motion recognition based on GA optimized SVM using sEMG signals
CN111783719A (en) Myoelectric control method and device
CN106845348B (en) Gesture recognition method based on arm surface electromyographic signals
Wu et al. A wireless surface EMG acquisition and gesture recognition system
CN110604578A (en) Human hand and hand motion recognition method based on SEMG
AlOmari et al. Novel hybrid soft computing pattern recognition system SVM–GAPSO for classification of eight different hand motions
Wang et al. Research on the key technologies of motor imagery EEG signal based on deep learning
Fu et al. Identification of finger movements from forearm surface EMG using an augmented probabilistic neural network
CN110738093A (en) Classification method based on improved small world echo state network electromyography
CN116755547A (en) Surface electromyographic signal gesture recognition system based on light convolutional neural network
CN114983446A (en) Finger multi-joint continuous motion estimation method based on electromyographic signals

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination