CN112315488A - Human motion state identification method based on electromyographic signals


Info

Publication number
CN112315488A
CN112315488A
Authority
CN
China
Prior art keywords
electromyographic
gray level
matrix
state
signals
Prior art date
Legal status
Pending
Application number
CN202011324782.6A
Other languages
Chinese (zh)
Inventor
徐兆红
许留凯
田俊
郏云涛
何方剑
裘焱枫
张克勤
Current Assignee
Ningbo Industrial Internet Research Institute Co ltd
Original Assignee
Ningbo Industrial Internet Research Institute Co ltd
Priority date
Filing date
Publication date
Application filed by Ningbo Industrial Internet Research Institute Co ltd
Priority to CN202011324782.6A
Publication of CN112315488A
Legal status: Pending

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1118: Determining activity level
    • A61B 5/72: Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7235: Details of waveform analysis
    • A61B 5/725: Details of waveform analysis using specific filters therefor, e.g. Kalman or adaptive filters
    • A61B 5/7264: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Physiology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Psychiatry (AREA)
  • Signal Processing (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Dentistry (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

The invention discloses a human motion state recognition method based on electromyographic signals. Electromyographic signals are collected and preprocessed to obtain time-window electromyographic time-series signals; all time-window signals are converted into electromyographic gray-level images; and a convolutional neural network identifies the current human motion state. The method has the advantages that converting the raw electromyographic signals into dedicated electromyographic gray-level images for feature extraction prevents useful information from being lost and allows the features to be expressed completely; the time window enables real-time acquisition and analysis; the Butterworth filter is computationally efficient; and the convolutional neural network accurately extracts features from the gray-level images and classifies them, so that the current human motion state is identified. Feature extraction does not depend on experience, classification accuracy is high, and the accuracy of human motion state identification is effectively improved.

Description

Human motion state identification method based on electromyographic signals
Technical Field
The invention relates to a motion state identification method, in particular to a human motion state identification method based on an electromyographic signal.
Background
The lower limb assistance exoskeleton robot is a wearable man-machine integrated mechanical device. It combines human intelligence with the physical power of the robot and belongs to the class of human-robot cooperative robots. When a wearer has a subjective movement intention but declining or lost movement ability, the exoskeleton can assist the rotation of the human joints; alternatively, the wearer can rely entirely on the drive of the lower limb assistance exoskeleton robot for body movement, for example in assistance for the elderly and disabled or in rehabilitation training.
The myoelectric signal is a very weak bioelectric signal on the surface of the human body. It is a physiological parameter reflecting muscle morphology, physiological function and state changes, is easily influenced by the surrounding environment, and is commonly used for recognizing movement intention and controlling the lower limb assistance exoskeleton robot.
Existing myoelectric motion recognition methods process the myoelectric signal with time-domain or frequency-domain signal processing, obtain a feature vector of the myoelectric signal, and then classify the motion with classifiers such as support vector machines, neural networks and Bayes classifiers. These methods place low performance requirements on embedded hardware and run efficiently, but when features are extracted from the signal, useful information can be lost and only partial features are obtained, so the signal cannot be expressed completely. In addition, feature extraction relies on experience, and different signal processing methods extract different features, which affects the accuracy of motion state classification.
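As a point of reference only, the following sketch illustrates the kind of hand-crafted time-domain features the conventional approach described above relies on; the particular features (mean absolute value, root mean square, waveform length, zero crossings) are common examples and are not taken from this patent.

```python
import numpy as np

def time_domain_features(window: np.ndarray) -> np.ndarray:
    """Hand-crafted time-domain features for one EMG analysis window."""
    mav = np.mean(np.abs(window))                                   # mean absolute value
    rms = np.sqrt(np.mean(window ** 2))                             # root mean square
    wl = np.sum(np.abs(np.diff(window)))                            # waveform length
    zc = np.sum(np.signbit(window[:-1]) != np.signbit(window[1:]))  # zero crossings
    return np.array([mav, rms, wl, zc])
```

A feature vector like this would then be fed to a support vector machine or similar classifier; this is the step the invention replaces with a gray-level image and a convolutional neural network.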
Disclosure of Invention
The technical problem to be solved by the invention is to provide a human motion state identification method based on electromyographic signals that extracts complete feature information without relying on experience and effectively improves the accuracy of motion state identification.
The technical scheme adopted by the invention for solving the technical problems is as follows: a human motion state identification method based on electromyographic signals comprises the following steps:
(1) Collecting electromyographic signals: a plurality of myoelectric electrode pads of a myoelectric sensor are attached to designated muscle positions on a person's thighs and shanks; the myoelectric sensor collects the myoelectric signals of the person's walking state, stair-ascending state, stair-descending state and stooping state at a preset acquisition frequency and sends them to a signal processing module;
(2) Preprocessing the electromyographic signals: the signal processing module filters the electromyographic signals received within one acquisition period with a Butterworth filter to obtain filtered electromyographic signals, and then slides a window of preset length over all the filtered signals in sequence to intercept them, obtaining one group of time-window electromyographic time-series signals for each time window;
(3) Converting all time-window electromyographic time-series signals into electromyographic gray-level images: for each group of time-window signals, the signal processing module obtains the corresponding amplitude sequence consisting of the amplitude of every electromyographic sample and arranges all its elements in order into an M × N electromyographic matrix, where M is the number of rows and N the number of columns; each electromyographic matrix is normalized to obtain a normalized electromyographic matrix, and an electromyographic gray-level image with M rows and N columns of pixels is constructed for each matrix, the gray value of each pixel being the value of the element of the normalized electromyographic matrix with the same row and column indices;
(4) Identifying the current human motion state with a convolutional neural network: the signal processing module inputs each electromyographic gray-level image into the convolutional neural network in real time and classifies it with softmax as the classifier, obtaining the probability that the image belongs to each of the walking, stair-ascending, stair-descending and stooping states; the state corresponding to the largest probability is taken as the current human motion state, completing the identification.
The Butterworth filter adopted in step (2) is

$$H(\omega)=\frac{1}{\sqrt{1+\left(\frac{\omega}{\omega_c}\right)^{2s}}}$$

where ω denotes the preset acquisition frequency, ω_c denotes the cut-off frequency, H(ω) denotes the amplitude-frequency response, and s denotes the order of the Butterworth filter.
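For illustration, a minimal filtering sketch is given below, assuming Python with NumPy and SciPy. The patent only gives the analog Butterworth magnitude response above; the digital zero-phase band-pass realization (scipy.signal.butter with filtfilt), using the order 2 and the 20 Hz / 450 Hz cut-offs stated later, is an implementation choice of ours, not something the patent specifies.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000.0               # preset acquisition frequency (Hz)
low, high = 20.0, 450.0   # lower and upper cut-off frequencies (Hz)

# Second-order Butterworth band-pass filter (digital realization, assumed)
b, a = butter(N=2, Wn=[low, high], btype="bandpass", fs=fs)

def filter_emg(raw_emg: np.ndarray) -> np.ndarray:
    """Band-pass filter one acquisition period of raw EMG samples."""
    return filtfilt(b, a, raw_emg)   # zero-phase filtering (our assumption)
```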
The preset acquisition frequency is 1000 Hz.
The order s of the Butterworth filter is 2.
The Butterworth filter is a band-pass filter.
The upper cut-off frequency is 450 Hz and the lower cut-off frequency is 20 Hz.
The specific normalization method in step (3) is as follows: the value of the element in the i-th row and j-th column of each electromyographic matrix is denoted VEMG(i, j), with 1 ≤ i ≤ M and 1 ≤ j ≤ N; normalizing the electromyographic matrix gives the normalized electromyographic signal amplitude VEMG_after(i, j),

$$VEMG_{after}(i,j)=\frac{VEMG(i,j)-\min(VEMG)}{\max(VEMG)-\min(VEMG)}$$

where min(VEMG) denotes the minimum element value in the electromyographic matrix and max(VEMG) denotes the maximum element value in the electromyographic matrix.
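Step (3) can be sketched in a few lines of Python, assuming NumPy. The 12 × 20 shape comes from the embodiment below, and scaling the normalized values to 8-bit gray levels is our assumption; the patent only states that the pixel gray value equals the normalized element value.

```python
import numpy as np

def window_to_gray_image(window: np.ndarray, rows: int = 12, cols: int = 20) -> np.ndarray:
    """Turn one time-window EMG amplitude sequence into an M x N gray-level image."""
    vemg = window.reshape(rows, cols)                   # M x N electromyographic matrix
    vmin, vmax = vemg.min(), vemg.max()
    vemg_after = (vemg - vmin) / (vmax - vmin)          # min-max normalized matrix in [0, 1]
    return np.round(vemg_after * 255).astype(np.uint8)  # optional 8-bit gray-level image
```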
The convolutional neural network in step (4) specifically operates as follows:
A. A convolution operation is performed on the electromyographic gray-level image to extract the feature value C_{x,y}(Θ),

$$C_{x,y}(\Theta)=\sum_{m=1}^{r}\sum_{n=1}^{r}\Theta_{x+m-1,\,y+n-1}\,P_{m,n}$$

where x is the abscissa of the pixel, y is the ordinate of the pixel, m is the row variable, n is the column variable, Θ_{x+m-1, y+n-1} is the gray value of the pixel in row x+m-1 and column y+n-1, P_{m,n} is the Gaussian convolution weight of the pixel in row m and column n, and r is the number of rows of the Gaussian convolution kernel;
B. The feature values are reduced with a root-mean-square pooling strategy to obtain the degraded feature matrix.
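The following NumPy sketch illustrates steps A and B under stated assumptions: a single valid convolution with an r × r Gaussian kernel followed by root-mean-square pooling. The kernel size r = 3, the Gaussian width sigma, and the non-overlapping 2 × 2 pooling windows are our choices for illustration; the patent does not fix them (the embodiment instead describes a pooling domain of radius e).

```python
import numpy as np

def gaussian_kernel(r: int = 3, sigma: float = 1.0) -> np.ndarray:
    """r x r Gaussian convolution weights P_{m,n}, normalized to sum to 1."""
    ax = np.arange(r) - (r - 1) / 2.0
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
    return k / k.sum()

def convolve_valid(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Feature values C_{x,y}: sum of gray values times Gaussian weights."""
    r = kernel.shape[0]
    h, w = image.shape
    out = np.empty((h - r + 1, w - r + 1))
    for x in range(out.shape[0]):
        for y in range(out.shape[1]):
            out[x, y] = np.sum(image[x:x + r, y:y + r] * kernel)
    return out

def rms_pool(feature: np.ndarray, size: int = 2) -> np.ndarray:
    """Root-mean-square pooling over non-overlapping size x size blocks."""
    h, w = feature.shape[0] // size, feature.shape[1] // size
    pooled = np.empty((h, w))
    for p in range(h):
        for q in range(w):
            block = feature[p * size:(p + 1) * size, q * size:(q + 1) * size]
            pooled[p, q] = np.sqrt(np.mean(block ** 2))
    return pooled
```

For a 12 × 20 gray-level image, the valid 3 × 3 convolution yields a 10 × 18 feature map and the 2 × 2 RMS pooling reduces it to 5 × 9.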
The probabilities with which the electromyographic gray-level image is classified into the walking, stair-ascending, stair-descending and stooping states in step (4) are obtained as follows. The probability that the electromyographic gray-level image is classified into the walking state is defined as d_1,

$$d_1=\frac{e^{Z_1}}{e^{Z_1}+e^{Z_2}+e^{Z_3}+e^{Z_4}}$$

the probability that it is classified into the stair-ascending state is defined as d_2,

$$d_2=\frac{e^{Z_2}}{e^{Z_1}+e^{Z_2}+e^{Z_3}+e^{Z_4}}$$

the probability that it is classified into the stair-descending state is defined as d_3,

$$d_3=\frac{e^{Z_3}}{e^{Z_1}+e^{Z_2}+e^{Z_3}+e^{Z_4}}$$

and the probability that it is classified into the stooping state is defined as d_4,

$$d_4=\frac{e^{Z_4}}{e^{Z_1}+e^{Z_2}+e^{Z_3}+e^{Z_4}}$$

where Z_1, Z_2, Z_3 and Z_4 denote the first, second, third and fourth eigenvalues of the degraded feature matrix.
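A minimal sketch of this softmax step, assuming Python with NumPy. How the four scores Z_1..Z_4 are derived from the degraded feature matrix is left open in the description (it only says they are obtained by a matrix operation), so the example simply takes them as a given input vector.

```python
import numpy as np

STATES = ["walking", "ascending stairs", "descending stairs", "stooping"]

def classify(z: np.ndarray) -> str:
    """z: the four values Z1..Z4 derived from the degraded feature matrix."""
    e = np.exp(z - z.max())     # subtract the maximum for numerical stability
    d = e / e.sum()             # d_k = exp(Z_k) / sum_j exp(Z_j)
    return STATES[int(np.argmax(d))]

# The state with the largest probability is the recognized motion state.
print(classify(np.array([0.3, 2.1, -0.5, 0.7])))   # -> "ascending stairs"
```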
Compared with the prior art, the invention has the following advantages. The myoelectric electrode pads of a myoelectric sensor are first attached to designated muscle positions on a person's thighs and shanks; the sensor collects the myoelectric signals of the walking, stair-ascending, stair-descending and stooping states at a preset acquisition frequency and sends them to the signal processing module; the signal processing module preprocesses the collected signals into time-window electromyographic time-series signals, converts each time window into an electromyographic matrix and then into an electromyographic gray-level image, and finally uses a convolutional neural network to identify the current human motion state. Converting the raw electromyographic signals into dedicated gray-level images for feature extraction prevents useful information from being lost and allows the features to be expressed completely; the time window enables real-time acquisition and analysis; the Butterworth filter is computationally efficient; and the convolutional neural network accurately extracts features from the gray-level images and classifies them, so that the current human motion state is identified. Feature extraction does not depend on experience, classification accuracy is high, and the accuracy of human motion state identification is effectively improved.
Drawings
FIG. 1 is a schematic flow diagram of the present invention;
FIG. 2 is a schematic diagram of the time-window electromyographic time-series signals obtained by intercepting a section of the filtered electromyographic signal with a time window;
FIG. 3 is a schematic diagram of the conversion of a group of time-window electromyographic time-series signals into an electromyographic gray-level image according to the present invention.
Detailed Description
The invention is described in further detail below with reference to the accompanying drawings and embodiment.
As shown in fig. 1, a method for recognizing a human motion state based on an electromyographic signal includes the following steps:
(1) Collecting electromyographic signals: a plurality of myoelectric electrode pads of a myoelectric sensor are attached to designated muscle positions on a person's thighs and shanks; the myoelectric sensor collects the myoelectric signals of the person's walking state, stair-ascending state, stair-descending state and stooping state at a preset acquisition frequency and sends them to a signal processing module; the signal processing module processes the collected electromyographic signals and identifies the current motion state of the human body;
(2) Preprocessing the electromyographic signals: the signal processing module filters the electromyographic signals received within one acquisition period with a Butterworth filter to obtain filtered electromyographic signals, and then slides a window of preset length over all the filtered signals in sequence to intercept them, obtaining one group of time-window electromyographic time-series signals for each time window;
the employed Butterworth filter is
Figure BDA0002792448030000041
Where ω is expressed as a preset acquisition frequency, ωcDenotes the cut-off frequency, H (ω) denotes the amplitude frequency, s denotes the order of the butterworth filter; the preset acquisition frequency is 1000Hz, the order s of the Butterworth filter is 2 orders, the Butterworth filter is a band-pass filter, the upper cut-off frequency of the cut-off frequency is 450Hz, and the lower cut-off frequency is 20 Hz;
(3) The signal processing module converts all time-window electromyographic time-series signals into electromyographic gray-level images: for each group of time-window signals, the corresponding amplitude sequence consisting of the amplitude of every electromyographic sample is obtained, and all its elements are arranged in order into an M × N electromyographic matrix, where M is the number of rows and N the number of columns; each electromyographic matrix is normalized to obtain a normalized electromyographic matrix, and an electromyographic gray-level image with M rows and N columns of pixels is constructed for each matrix, the gray value of each pixel being the value of the element of the normalized electromyographic matrix with the same row and column indices;
the specific normalization processing method comprises the following steps: the value of the element in the ith row and the jth column of each electromyography matrix is denoted as VEMG (i,j) i is more than or equal to 1 and less than or equal to M, j is more than or equal to 1 and less than or equal to N, and the normalized electromyographic signal amplitude VEMG is obtained after the electromyographic matrix is normalizedafter(i,j),
Figure BDA0002792448030000042
Wherein min (VEMG) represents the minimum element value in the electromyographic matrix, and max (VEMG) represents the maximum element value in the electromyographic matrix;
As shown in FIG. 2 and FIG. 3, the acquisition frequency is preset to 1000 Hz, the time window length is preset to 240 ms, and one acquisition period is set to 4 s, so 4000 electromyographic samples are acquired for 4 s of a movement state and 3760 time windows are obtained after filtering and time-window interception; the signal processing module converts each group of time-window electromyographic time-series signals containing 240 samples into a 12 × 20 electromyographic matrix, which is then converted into an electromyographic gray-level image of 12 × 20 pixels; the electromyographic gray-level images are analyzed once every 200 ms, one image at a time;
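The numbers of this embodiment can be checked with a short NumPy sketch; the sample-by-sample stride of the sliding window is our reading of "slides ... in sequence", and the random signal is only a stand-in for one filtered acquisition period.

```python
import numpy as np

fs, period_s, win_len = 1000, 4, 240           # 1000 Hz, 4 s period, 240-sample (240 ms) window
emg = np.random.randn(fs * period_s)           # stand-in for one filtered EMG period (4000 samples)

# Slide the window one sample at a time over the filtered signal.
windows = [emg[i:i + win_len] for i in range(len(emg) - win_len)]
print(len(windows))                            # 3760 time windows, as stated above

images = [w.reshape(12, 20) for w in windows]  # one 12 x 20 electromyographic matrix per window
```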
(4) The signal processing module identifies the current human motion state with a convolutional neural network: each electromyographic gray-level image is input into the convolutional neural network in real time and classified with softmax as the classifier, obtaining the probability that the image belongs to each of the walking, stair-ascending, stair-descending and stooping states; the state corresponding to the largest probability is taken as the current human motion state, completing the identification;
The convolutional neural network specifically operates as follows:
A. A convolution operation is performed on the electromyographic gray-level image to extract the feature value C_{x,y}(Θ),

$$C_{x,y}(\Theta)=\sum_{m=1}^{r}\sum_{n=1}^{r}\Theta_{x+m-1,\,y+n-1}\,P_{m,n}$$

where x is the abscissa of the pixel, y is the ordinate of the pixel, m is the row variable, n is the column variable, Θ_{x+m-1, y+n-1} is the gray value of the pixel in row x+m-1 and column y+n-1, P_{m,n} is the Gaussian convolution weight of the pixel in row m and column n, and r is the number of rows of the Gaussian convolution kernel;
B. The feature values are reduced with a root-mean-square pooling strategy to obtain the degraded feature matrix S_{p,q}(Θ),

$$S_{p,q}(\Theta)=\sqrt{\frac{1}{(2e+1)^{2}}\sum_{x=p-e}^{p+e}\sum_{y=q-e}^{q+e}\Theta_{x,y}^{2}}$$

where p is the row variable, q is the column variable, e is the pooling domain radius, and Θ_{x,y} is the gray value of the pixel in row x and column y;
The probabilities with which the electromyographic gray-level image is classified into the walking, stair-ascending, stair-descending and stooping states are obtained as follows. The probability that the electromyographic gray-level image is classified into the walking state is defined as d_1,

$$d_1=\frac{e^{Z_1}}{e^{Z_1}+e^{Z_2}+e^{Z_3}+e^{Z_4}}$$

the probability that it is classified into the stair-ascending state is defined as d_2,

$$d_2=\frac{e^{Z_2}}{e^{Z_1}+e^{Z_2}+e^{Z_3}+e^{Z_4}}$$

the probability that it is classified into the stair-descending state is defined as d_3,

$$d_3=\frac{e^{Z_3}}{e^{Z_1}+e^{Z_2}+e^{Z_3}+e^{Z_4}}$$

and the probability that it is classified into the stooping state is defined as d_4,

$$d_4=\frac{e^{Z_4}}{e^{Z_1}+e^{Z_2}+e^{Z_3}+e^{Z_4}}$$

where Z_1, Z_2, Z_3 and Z_4 denote the first, second, third and fourth eigenvalues of the degraded feature matrix; the eigenvalues are obtained by a matrix operation on the degraded feature matrix.

Claims (9)

1. A human motion state identification method based on electromyographic signals, characterized by comprising the following steps:
(1) collecting electromyographic signals: a plurality of myoelectric electrode pads of a myoelectric sensor are attached to designated muscle positions on a person's thighs and shanks; the myoelectric sensor collects the myoelectric signals of the person's walking state, stair-ascending state, stair-descending state and stooping state at a preset acquisition frequency and sends them to a signal processing module;
(2) preprocessing the electromyographic signals: the signal processing module filters the electromyographic signals received within one acquisition period with a Butterworth filter to obtain filtered electromyographic signals, and then slides a window of preset length over all the filtered signals in sequence to intercept them, obtaining one group of time-window electromyographic time-series signals for each time window;
(3) converting all time-window electromyographic time-series signals into electromyographic gray-level images: for each group of time-window signals, the signal processing module obtains the corresponding amplitude sequence consisting of the amplitude of every electromyographic sample and arranges all its elements in order into an M × N electromyographic matrix, where M is the number of rows and N the number of columns; each electromyographic matrix is normalized to obtain a normalized electromyographic matrix, and an electromyographic gray-level image with M rows and N columns of pixels is constructed for each matrix, the gray value of each pixel being the value of the element of the normalized electromyographic matrix with the same row and column indices;
(4) identifying the current human motion state with a convolutional neural network: the signal processing module inputs each electromyographic gray-level image into the convolutional neural network in real time and classifies it with softmax as the classifier, obtaining the probability that the image belongs to each of the walking, stair-ascending, stair-descending and stooping states; the state corresponding to the largest probability is taken as the current human motion state, completing the identification.
2. The method according to claim 1, wherein the Butterworth filter used in step (2) is

$$H(\omega)=\frac{1}{\sqrt{1+\left(\frac{\omega}{\omega_c}\right)^{2s}}}$$

where ω denotes the preset acquisition frequency, ω_c denotes the cut-off frequency, H(ω) denotes the amplitude-frequency response, and s denotes the order of the Butterworth filter.
3. The method for recognizing human motion state based on electromyographic signals according to claim 1 or 2, wherein the preset collection frequency is 1000 Hz.
4. The method as claimed in claim 2, wherein the order s of the Butterworth filter is 2.
5. The method according to claim 2, wherein the Butterworth filter is a band pass filter.
6. The method according to claim 2, wherein the upper cut-off frequency is 450 Hz and the lower cut-off frequency is 20 Hz.
7. The method for recognizing human motion state based on electromyographic signals according to claim 1, wherein the normalization in step (3) is as follows: the value of the element in the i-th row and j-th column of each electromyographic matrix is denoted VEMG(i, j), with 1 ≤ i ≤ M and 1 ≤ j ≤ N; normalizing the electromyographic matrix gives the normalized electromyographic signal amplitude VEMG_after(i, j),

$$VEMG_{after}(i,j)=\frac{VEMG(i,j)-\min(VEMG)}{\max(VEMG)-\min(VEMG)}$$

where min(VEMG) denotes the minimum element value in the electromyographic matrix and max(VEMG) denotes the maximum element value in the electromyographic matrix.
8. The method for recognizing human motion state based on electromyographic signals according to claim 1, wherein the convolutional neural network in step (4) specifically operates as follows:
A. a convolution operation is performed on the electromyographic gray-level image to extract the feature value C_{x,y}(Θ),

$$C_{x,y}(\Theta)=\sum_{m=1}^{r}\sum_{n=1}^{r}\Theta_{x+m-1,\,y+n-1}\,P_{m,n}$$

where x is the abscissa of the pixel, y is the ordinate of the pixel, m is the row variable, n is the column variable, Θ_{x+m-1, y+n-1} is the gray value of the pixel in row x+m-1 and column y+n-1, P_{m,n} is the Gaussian convolution weight of the pixel in row m and column n, and r is the number of rows of the Gaussian convolution kernel;
B. the feature values are reduced with a root-mean-square pooling strategy to obtain the degraded feature matrix.
9. The method for recognizing the motion state of the human body based on the electromyographic signal according to claim 1, wherein the probability that the electromyographic gray-level image is classified into each of the walking, stair-ascending, stair-descending and stooping states in step (4) is obtained as follows: the probability that the electromyographic gray-level image is classified into the walking state is defined as d_1,

$$d_1=\frac{e^{Z_1}}{e^{Z_1}+e^{Z_2}+e^{Z_3}+e^{Z_4}}$$

the probability that it is classified into the stair-ascending state is defined as d_2,

$$d_2=\frac{e^{Z_2}}{e^{Z_1}+e^{Z_2}+e^{Z_3}+e^{Z_4}}$$

the probability that it is classified into the stair-descending state is defined as d_3,

$$d_3=\frac{e^{Z_3}}{e^{Z_1}+e^{Z_2}+e^{Z_3}+e^{Z_4}}$$

and the probability that it is classified into the stooping state is defined as d_4,

$$d_4=\frac{e^{Z_4}}{e^{Z_1}+e^{Z_2}+e^{Z_3}+e^{Z_4}}$$

where Z_1, Z_2, Z_3 and Z_4 denote the first, second, third and fourth eigenvalues of the degraded feature matrix.
Priority Applications (1)

CN202011324782.6A, filed 2020-11-23 (priority date 2020-11-23): Human motion state identification method based on electromyographic signals (status: Pending)

Publications (1)

CN112315488A, published 2021-02-05

Family

ID=74321087

Country Status (1)

CN: CN112315488A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140198944A1 (en) * 2013-01-14 2014-07-17 Qualcomm Incorporated Use of emg for subtle gesture recognition on surfaces
CN106980367A (en) * 2017-02-27 2017-07-25 浙江工业大学 A kind of gesture identification method based on myoelectricity topographic map
CN109498362A (en) * 2018-09-10 2019-03-22 南京航空航天大学 A kind of hemiplegic patient's hand movement function device for healing and training and model training method
CN110464348A (en) * 2019-07-10 2019-11-19 深圳市智能机器人研究院 The continuous amount of exercise recognition methods of joint of lower extremity and system based on electromyography signal


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20210205)