CN116019467A - Feature extraction method of facial nerve electromyographic signals - Google Patents


Info

Publication number
CN116019467A
CN116019467A (application CN202310310179.XA)
Authority
CN
China
Prior art keywords
signal
average energy
probability
processing
electromyographic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202310310179.XA
Other languages
Chinese (zh)
Inventor
田倩倩
李洁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Affiliated Hospital of Weifang Medical University
Original Assignee
Affiliated Hospital of Weifang Medical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Affiliated Hospital of Weifang Medical University filed Critical Affiliated Hospital of Weifang Medical University
Priority to CN202310310179.XA priority Critical patent/CN116019467A/en
Publication of CN116019467A publication Critical patent/CN116019467A/en
Withdrawn legal-status Critical Current

Landscapes

  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

The invention discloses a feature extraction method for facial nerve electromyographic signals, belonging to the technical field of electrical signal data processing. The processing signals are subjected to a validity audit that screens the facial nerve electromyographic signals, so that signals outside the recognition range can be obtained and marked promptly and efficiently, and their influence on subsequent feature extraction and recognition processing can be eliminated. A pre-constructed standard template, together with pre-computed sample average energies and sample total average energies, provides data support in different dimensions for the type analysis and oscillation stability analysis of subsequently acquired processing signals. The invention solves the technical problem that existing schemes perform no validity verification for different purposes at different stages of facial nerve electromyographic signal feature extraction, which degrades the overall effect of the feature extraction and subsequent processing analysis.

Description

Feature extraction method of facial nerve electromyographic signals
Technical Field
The invention relates to the technical field of electric signal data processing, in particular to a characteristic extraction method of facial nerve electromyographic signals.
Background
An electromyographic signal (EMG) is the superposition in time and space of the motor unit action potentials (MUAPs) of numerous muscle fibres. The surface electromyographic signal (SEMG) is the combined effect, at the skin surface, of the electrical activity of the superficial muscles and nerve trunks, and reflects neuromuscular activity to some extent.
Existing feature extraction schemes for facial nerve electromyographic signals have the defect that no validity analysis is performed on the signals after early-stage collection and preprocessing to judge whether an electromyographic signal is valid, so the processing of invalid electromyographic signals degrades the overall gesture recognition effect; likewise, no validity analysis is performed during the gesture type analysis, so the accuracy of obtaining the corresponding gesture action type from the similarity probability analysis is poor.
Disclosure of Invention
The invention aims to provide a characteristic extraction method of a facial nerve electromyographic signal, which is used for solving the technical problem that the prior scheme does not implement validity verification of different purposes in different periods of facial nerve electromyographic signal characteristic extraction, so that the overall effect of the facial nerve electromyographic signal characteristic extraction and subsequent processing analysis is poor.
The aim of the invention can be achieved by the following technical scheme:
a feature extraction method of facial nerve electromyographic signals comprises the following steps:
collecting a facial nerve electromyographic signal of a target through SEMG electrodes, and preprocessing the signal to obtain a processing signal, the preprocessing comprising high-pass filtering, high-gain amplification and low-pass filtering; performing a validity audit on the processing signals, and associating the processing signals that pass the audit with valid tags;
counting, according to the valid tags, the length of the multichannel SEMG signal before the current processing signal; acquiring, according to that length, the average energy of the surface electromyographic signal of the processing signal before the current sampling point; and integrating the average energies corresponding to the processing signals in a monitoring set to obtain the total average energy corresponding to the monitoring set;
matching and verifying the average energies of the processing signals against a pre-constructed standard template to obtain a corresponding selected action type, and marking the action corresponding to the selected action type as the matching action;
and evaluating the stability of the oscillation condition of the matching action, and dynamically prompting the oscillation of the corresponding action according to the evaluation result.
Preferably, the step of performing a validity audit on the processing signal comprises:
matching the frequency of the processing signal against a pre-constructed standard frequency range; if the frequency falls within the standard frequency range, judging the corresponding electromyographic signal to be valid and generating a valid tag; if it does not, judging the corresponding electromyographic signal to be invalid and generating an invalid tag.
Preferably, the length $N$ of the multichannel SEMG signal before the current processing signal is counted according to the valid tag, and the average energy $E_i$ of the surface electromyographic signal of the processing signal before the current sampling point is obtained according to that length; the average energies corresponding to the processing signals in the monitoring set are integrated to obtain the total average energy $E$ corresponding to the monitoring set. The average energy $E_i$ is calculated as

$$E_i = \frac{1}{N}\sum_{n=0}^{N-1} x_i^2(n)$$

where $E_i$ is the average energy of the $i$-th channel of processing signal $x$ over the $N$ samples before the current sampling point, $N$ is the length of the multichannel SEMG signal, $x_i(n)$ is the $n$-th signal sampling point of the $i$-th channel in the signal segment, $i = 1, 2, 3, 4, 5$, and $n$ takes values in $[0, N-1]$.

The total average energy $E$ is calculated as

$$E = \frac{1}{M}\sum_{i=1}^{M} E_i$$

where $M$ is the total number of channels.
Preferably, a plurality of sample actions are trained in advance, the corresponding energy features are extracted, and a standard template is established by arranging and combining the features in the order of extraction. When training the sample actions, each sample action is repeated $P$ times, where $P$ is a positive integer, and the corresponding sample average energy $Y_i(k)$, together with the sample total average energy $Y(k)$ for the sample action type, is obtained through the sample energy calculation formula:

$$Y_i(k) = \frac{1}{N}\sum_{n=0}^{N-1} \left(x_i^{(k)}(n)\right)^2$$

where $Y_i(k)$ is the average energy of the $i$-th channel for the $k$-th class of action, $N$ is the length of the template signal, and $x_i^{(k)}(n)$ is the $n$-th signal sampling point of the $i$-th channel in the template signal segment.

The sample total average energy $Y(k)$ is calculated as

$$Y(k) = \frac{1}{M}\sum_{i=1}^{M} Y_i(k)$$

where $M$ is the total number of channels.
Preferably, the average energies corresponding to the processing signals in the monitoring set are compared in turn with the sample average energies corresponding to the standard template of each action to obtain the corresponding similarity probabilities $p(k)$ (the calculation formula is given in the original only as an image and is not reproduced here); the similarity probabilities are arranged in descending order, the first-ranked similarity probability is marked as the selected probability, and a validity analysis is carried out.
Preferably, when the validity analysis is performed, if the selected probability is not smaller than the standard probability, the selected probability is judged to be valid, the corresponding action type is set as the selected action type, and the variance corresponding to the selected probability is acquired and marked as $\sigma_k^2$; if the selected probability is smaller than the standard probability, the selected probability is judged to be invalid, the corresponding processing signal is marked as a pre-training signal, and the pre-training signal is stored in a pre-training database.
Preferably, the step of evaluating the stability of the oscillation condition of the matching action comprises:
combining the similarity probability variance $\sigma_k^2$ corresponding to the processing signal with the total average energy $E$ and the sample total average energy $Y(k)$, and obtaining through calculation the stability probability $p(x\mid k)$ of the processing signal $x$ for the $k$-th gesture action; the stability probability is calculated as

$$p(x\mid k) = \frac{1}{\sqrt{c_1\,\pi\,\sigma_k^2}}\exp\!\left(-\frac{\left(E - Y(k)\right)^2}{c_2\,\sigma_k^2}\right)$$

where $c_1$ and $c_2$ are constant coefficients and $c_1 = c_2 \neq 0$;
and evaluating the stability of the oscillation condition of the selected action type according to the stability probability to obtain a gesture recognition result comprising a start tag and an end tag.
Preferably, when the stability evaluation is performed on the oscillation condition of the selected action type according to the stability probability, the stability probability is matched against a stability start threshold and a stability end threshold, the stability start threshold being greater than the stability end threshold; if the stability probability is greater than the stability start threshold, it is judged that the $k$-th class of action has started and a start tag is generated; if the stability probability is smaller than the stability end threshold, it is judged that the $k$-th class of action has ended and an end tag is generated.
Compared with the prior art, the invention has the beneficial effects that:
the invention screens the facial nerve electromyographic signals by checking the validity of the processing signals, can timely and efficiently acquire and mark the facial nerve electromyographic signals which do not accord with the recognition range, can eliminate the influence of the facial nerve electromyographic signals which do not accord with the recognition range during the subsequent feature extraction and recognition processing, avoids the influence of the processing of invalid electromyographic signals on the overall gesture recognition effect, and can effectively improve the overall efficiency of the feature extraction and the processing of the electromyographic signals.
Through a pre-constructed standard template and pre-computed sample average energies and sample total average energies, the invention provides data support in different dimensions for the type analysis and oscillation stability analysis of subsequently acquired processing signals. The average energies corresponding to the processing signals are compared with the sample average energies corresponding to the standard template of each action to obtain the corresponding similarity probabilities, and the gesture type corresponding to each processing signal is obtained from its similarity probability, so that the processing and analysis of facial electromyographic signal features can be realized efficiently and rapidly; at the same time, performing a validity analysis on the similarity probabilities improves the accuracy of facial nerve electromyographic signal feature extraction and analysis, and provides reliable data support for the optimization and updating of subsequent samples.
Drawings
The invention is further described below with reference to the accompanying drawings.
Fig. 1 is a flow chart of a method for extracting characteristics of facial nerve electromyographic signals.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
As shown in fig. 1, the present invention is a method for extracting characteristics of facial nerve electromyographic signals, comprising:
collecting a facial nerve electromyographic signal of a target through SEMG electrodes, and preprocessing the signal to obtain a processing signal; the preprocessing comprises high-pass filtering, high-gain amplification and low-pass filtering, and since such preprocessing of facial nerve electromyographic signals is a conventional technique, its specific steps are not repeated here; a validity audit is performed on the processing signals, and the processing signals that pass the audit are associated with valid tags;
the existing surface electromyographic signal acquisition system comprises a series of components such as an amplifying circuit, a filter circuit, a digital/analog conversion, a communication interface, a computing device and the like besides an electromyographic electrode;
furthermore, the step of performing a validity audit on the processed signal includes:
matching the frequency of the processing signal against a pre-constructed standard frequency range; if the frequency falls within the standard frequency range, judging the corresponding electromyographic signal to be valid and generating a valid tag; if it does not, judging the corresponding electromyographic signal to be invalid, generating an invalid tag, stopping subsequent feature extraction and recognition processing of that processing signal according to the invalid tag, and issuing a prompt; the standard frequency range is constructed from the signal frequencies corresponding to a plurality of pre-trained sample actions;
the surface electromyographic signal SEMG is mainly used for acquiring electromyographic signals through a detection electrode arranged on the skin surface, and the electromyographic detection electrode structure is an existing conventional detection electrode structure;
it should be noted that the purpose of the validity audit of the processing signals is to screen the facial nerve electromyographic signals: signals outside the recognition range can be obtained and marked promptly and efficiently, their influence on subsequent feature extraction and recognition processing can be eliminated, the degradation of the overall gesture recognition effect caused by processing invalid electromyographic signals is avoided, and the overall efficiency of electromyographic signal feature extraction and processing is effectively improved;
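As a minimal sketch of the validity audit above, the frequency check can be written as follows. The 20–500 Hz band and the function name are illustrative assumptions; the patent derives the standard frequency range from pre-trained sample actions rather than fixing it in advance.

```python
def audit_signal(dominant_freq_hz, band=(20.0, 500.0)):
    """Return the 'valid' or 'invalid' tag for a processing signal,
    depending on whether its dominant frequency lies inside the
    (assumed) standard frequency range."""
    lo, hi = band
    return "valid" if lo <= dominant_freq_hz <= hi else "invalid"
```

Signals tagged "invalid" would then be excluded from the later feature extraction and recognition steps, as the text describes.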
counting, according to the valid tags, the length of the multichannel SEMG signal before the current processing signal; acquiring, according to that length, the average energy of the surface electromyographic signal of the processing signal before the current sampling point; and integrating the average energies corresponding to the processing signals in a monitoring set to obtain the total average energy corresponding to the monitoring set, the monitoring set being the set of all multichannel surface electromyographic signals;
according to the length of the multichannel SEMG signal before the current processing signal is counted by the effective label, the average energy of the surface electromyographic signal of the processing signal before the current sampling point is obtained according to the length
Figure SMS_18
And integrating the average energy corresponding to the processing signals in the monitoring set to obtain total average energy E corresponding to the monitoring set; wherein the average energy
Figure SMS_19
The energy calculation formula of (2) is:
Figure SMS_20
in (1) the->
Figure SMS_21
For processing the average energy of the individual channels of signal x with length N before the current sampling point, N is the length of the multi-channel SEMG signal, < >>
Figure SMS_22
For processing the nth signal sample point in the ith channel in the signal segment, i=1, 2,3,4,5; n has a value of [0, N-1 ]]The method comprises the steps of carrying out a first treatment on the surface of the The signal sampling point can be a muscle fiber of the upper arm traction finger movement;
the calculation formula of the total average energy E is as follows:
Figure SMS_23
wherein M is the total number of all channels;
when different gestures are made, the contraction states of the muscles at different parts of the human forearm differ, and the energy of the electromyographic signal acquired by each channel's electrode differs accordingly, so the average energy of each channel's surface electromyographic signal can serve as a feature of the gesture action in SEMG recognition; the total average energy provides data support for the subsequent evaluation of the oscillation condition of the matched action.
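The per-channel average energy and the total average energy described above can be sketched in a few lines of Python (function names are illustrative, not taken from the patent):

```python
def channel_average_energy(samples):
    """E_i = (1/N) * sum of x_i(n)^2 over the N samples of one channel."""
    n = len(samples)
    return sum(s * s for s in samples) / n

def total_average_energy(channels):
    """E = (1/M) * sum of the per-channel average energies E_i,
    where M is the number of channels."""
    m = len(channels)
    return sum(channel_average_energy(c) for c in channels) / m
```

A five-channel recording would be passed in as a list of five sample lists; the resulting vector of $E_i$ values is the gesture feature, and $E$ feeds the later stability evaluation.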
Matching and verifying the average energy of the processing signals with a pre-constructed standard template to obtain a corresponding selected action type, and marking the action corresponding to the selected action type as a matching action;
a plurality of sample actions are trained in advance, the corresponding energy features are extracted, and a standard template is established in the order of extraction; the sample actions may, for example, number five: palm stretching, fist clenching, wrist extension, wrist flexion and little-finger stretching;
when training the sample actions, each sample action is repeated $P$ times, where $P$ is a positive integer, and the corresponding sample average energy $Y_i(k)$, together with the sample total average energy $Y(k)$ for the sample action type, is obtained through the sample energy calculation formula:

$$Y_i(k) = \frac{1}{N}\sum_{n=0}^{N-1} \left(x_i^{(k)}(n)\right)^2$$

where $Y_i(k)$ is the average energy of the $i$-th channel for the $k$-th class of action, $N$ is the length of the template signal, and $x_i^{(k)}(n)$ is the $n$-th signal sampling point of the $i$-th channel in the template signal segment.

The sample total average energy $Y(k)$ is calculated as

$$Y(k) = \frac{1}{M}\sum_{i=1}^{M} Y_i(k)$$

where $M$ is the total number of channels, the same $M$ as in the calculation of the total average energy $E$.
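Template construction can be sketched as follows. The patent only says each sample action is recorded $P$ times; averaging the per-repetition energies over those $P$ recordings is an assumption about how the repetitions are combined, and the function names are illustrative.

```python
def sample_average_energy(repetitions):
    """Y_i(k) for one channel of one action class: the per-repetition
    average energy, averaged over the P repeated recordings
    (the P-averaging is an assumed reading of the text)."""
    per_rep = [sum(s * s for s in rep) / len(rep) for rep in repetitions]
    return sum(per_rep) / len(per_rep)

def sample_total_average_energy(channel_energies):
    """Y(k): mean of the per-channel sample average energies Y_i(k)."""
    return sum(channel_energies) / len(channel_energies)
```

Repeating each action also yields the spread of the energy values, which is where the variance used later in the stability evaluation can come from.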
in addition, the average energies corresponding to the processing signals in the monitoring set are compared in turn with the sample average energies corresponding to the standard template of each action to obtain the corresponding similarity probabilities $p(k)$ (the calculation formula is given in the original only as an image and is not reproduced here); the similarity probabilities are arranged in descending order, the first-ranked similarity probability is marked as the selected probability, and a validity analysis is carried out: if the selected probability is not smaller than the standard probability, the selected probability is judged to be valid, the corresponding action type is set as the selected action type, and the variance corresponding to the selected probability is acquired and marked as $\sigma_k^2$; the standard probability may be obtained from existing recognition standard requirements;
if the selected probability is smaller than the standard probability, the selected probability is judged to be invalid, the corresponding processing signal is marked as a pre-training signal, and the pre-training signal is stored in a pre-training database; colloquially, a pre-training signal corresponds to a gesture that matches no sample action and fails the recognition standard, and therefore requires subsequent sample training and verification;
the purpose of performing the validity analysis on the similarity probabilities is to further improve the accuracy of facial nerve electromyographic signal feature extraction and analysis, while also providing reliable data support for the optimization and updating of subsequent samples;
in the embodiment of the invention, the data support of different dimensions can be provided for the type analysis and the oscillation stability analysis of the subsequently acquired processing signals through the pre-constructed standard template and the pre-calculated and acquired sample average energy and sample total average energy;
in addition, the average energies corresponding to the processing signals are compared with the sample average energies corresponding to the standard template of each action to obtain the corresponding similarity probabilities, and the gesture type corresponding to each processing signal is obtained from its similarity probability, so that the processing and analysis of surface electromyographic signal features can be realized efficiently and rapidly.
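Since the similarity formula survives only as an image, the sketch below assumes a Gaussian similarity between the observed channel energy and the template mean and variance — one plausible reading consistent with the variance $\sigma_k^2$ used later. The function names and the 0.5 standard probability are illustrative.

```python
import math

def similarity(energy, template_mean, template_var):
    """Assumed Gaussian-kernel similarity between an observed average
    energy and a template action's (mean, variance)."""
    return math.exp(-((energy - template_mean) ** 2) / (2.0 * template_var))

def select_action(energy, templates, standard_prob=0.5):
    """Rank actions by similarity (descending); return (action, prob)
    if the top-ranked 'selected probability' meets the standard
    probability, else None (signal goes to the pre-training database)."""
    ranked = sorted(
        ((similarity(energy, m, v), k) for k, (m, v) in templates.items()),
        reverse=True,
    )
    best_prob, best_action = ranked[0]
    if best_prob >= standard_prob:
        return best_action, best_prob
    return None
```

A `None` result corresponds to the pre-training branch: the gesture resembles no trained sample closely enough and is stored for later retraining.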
stability evaluation is performed on the oscillation condition of the matching action, and the oscillation of the corresponding action is dynamically prompted according to the evaluation result; the specific steps are as follows:
the similarity probability variance $\sigma_k^2$ corresponding to the processing signal is combined with the total average energy $E$ and the sample total average energy $Y(k)$, and the stability probability $p(x\mid k)$ of the processing signal $x$ for the $k$-th gesture action is obtained through calculation; the stability probability is calculated as

$$p(x\mid k) = \frac{1}{\sqrt{c_1\,\pi\,\sigma_k^2}}\exp\!\left(-\frac{\left(E - Y(k)\right)^2}{c_2\,\sigma_k^2}\right)$$

where $c_1$ and $c_2$ are constant coefficients, $c_1 = c_2 \neq 0$, and the values of $c_1$ and $c_2$ may both be 2, in which case $p(x\mid k)$ is a Gaussian density with mean $Y(k)$ and variance $\sigma_k^2$;
stability evaluation is performed on the oscillation condition of the selected action type according to the stability probability to obtain a gesture recognition result comprising a start tag and an end tag;
when the stability evaluation is performed, the stability probability is matched against a stability start threshold and a stability end threshold respectively;
the stability start threshold is greater than the stability end threshold; the specific values of the two thresholds are obtained by training on the sample stability probability values observed when a plurality of samples change action; for example, the stability start threshold may be 1.1 and the stability end threshold 0.1;
if the stability probability is greater than the stability start threshold, it is judged that the $k$-th class of action has started and a start tag is generated; if the stability probability is smaller than the stability end threshold, it is judged that the $k$-th class of action has ended and an end tag is generated.
In the embodiment of the invention, matching analysis of the stability probability ensures that each sampling point is judged as exactly one action, i.e. at each time point only one action class $k$ is selected according to its stability probability $p(x\mid k)$, and filtering the recognition result reduces the influence of recognition oscillation at the junction between two action changes.
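The start/end labelling with two thresholds described above is a hysteresis scheme, which can be sketched as follows. The threshold values 1.1 and 0.1 follow the example figures in the text; the function name and tuple-based label format are illustrative.

```python
def segment_actions(stability_probs, start_thr=1.1, end_thr=0.1):
    """Hysteresis labelling of a stability-probability sequence:
    emit ('start', i) when p rises above start_thr while no action is
    active, and ('end', i) when p falls below end_thr while active.
    Values between the two thresholds change nothing, which suppresses
    oscillation at the junction between two actions."""
    labels, active = [], False
    for i, p in enumerate(stability_probs):
        if not active and p > start_thr:
            labels.append(("start", i))
            active = True
        elif active and p < end_thr:
            labels.append(("end", i))
            active = False
    return labels
```

Because the start threshold exceeds the end threshold, brief dips in $p(x\mid k)$ during an action do not generate spurious end tags.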
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the essential characteristics thereof.
Finally, it should be noted that the above-mentioned embodiments are merely for illustrating the technical solution of the present invention and not for limiting the same, and although the present invention has been described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications and equivalents may be made to the technical solution of the present invention without departing from the spirit and scope of the technical solution of the present invention.

Claims (8)

1. A method for extracting characteristics of facial nerve electromyographic signals, comprising the steps of:
collecting a facial nerve electromyographic signal of a target through SEMG electrodes, and preprocessing the signal to obtain a processing signal, the preprocessing comprising high-pass filtering, high-gain amplification and low-pass filtering; performing a validity audit on the processing signals, and associating the processing signals that pass the audit with valid tags;
counting, according to the valid tags, the length of the multichannel SEMG signal before the current processing signal; acquiring, according to that length, the average energy of the surface electromyographic signal of the processing signal before the current sampling point; and integrating the average energies corresponding to the processing signals in a monitoring set to obtain the total average energy corresponding to the monitoring set;
matching and verifying the average energies of the processing signals against a pre-constructed standard template to obtain a corresponding selected action type, and marking the action corresponding to the selected action type as the matching action;
and evaluating the stability of the oscillation condition of the matching action, and dynamically prompting the oscillation of the corresponding action according to the evaluation result.
2. The method for extracting features of facial nerve electromyographic signals according to claim 1, wherein the step of performing validity verification on the processed signals comprises:
matching the frequency of the processing signal against a pre-constructed standard frequency range; if the frequency falls within the standard frequency range, judging the corresponding electromyographic signal to be valid and generating a valid tag; if it does not, judging the corresponding electromyographic signal to be invalid and generating an invalid tag.
3. The method for extracting features of a facial nerve electromyographic signal according to claim 2, wherein the length $N$ of the multichannel SEMG signal before the current processing signal is counted according to the valid tag, the average energy $E_i$ of the surface electromyographic signal of the processing signal before the current sampling point is obtained according to that length, and the average energies corresponding to the processing signals in the monitoring set are integrated to obtain the total average energy $E$ corresponding to the monitoring set; wherein the average energy $E_i$ is calculated as

$$E_i = \frac{1}{N}\sum_{n=0}^{N-1} x_i^2(n)$$

where $E_i$ is the average energy of the $i$-th channel of processing signal $x$ over the $N$ samples before the current sampling point, $N$ is the length of the multichannel SEMG signal, $x_i(n)$ is the $n$-th signal sampling point of the $i$-th channel in the signal segment, $i = 1, 2, 3, 4, 5$, and $n$ takes values in $[0, N-1]$;

the total average energy $E$ is calculated as

$$E = \frac{1}{M}\sum_{i=1}^{M} E_i$$

where $M$ is the total number of channels.
4. The method for extracting features of facial nerve electromyographic signals according to claim 3, wherein a plurality of sample actions are trained in advance, the corresponding energy features are extracted, and a standard template is established in the order of extraction; when training the sample actions, each sample action is repeated $P$ times, where $P$ is a positive integer, and the corresponding sample average energy $Y_i(k)$, together with the sample total average energy $Y(k)$ for the sample action type, is obtained through the sample energy calculation formula:

$$Y_i(k) = \frac{1}{N}\sum_{n=0}^{N-1} \left(x_i^{(k)}(n)\right)^2$$

where $Y_i(k)$ is the average energy of the $i$-th channel for the $k$-th class of action, $N$ is the length of the template signal, and $x_i^{(k)}(n)$ is the $n$-th signal sampling point of the $i$-th channel in the template signal segment;

the sample total average energy $Y(k)$ is calculated as

$$Y(k) = \frac{1}{M}\sum_{i=1}^{M} Y_i(k)$$

where $M$ is the total number of channels.
5. The method for extracting features of facial nerve electromyographic signals according to claim 4, wherein the average energies corresponding to the processing signals in the monitoring set are compared in turn with the sample average energies corresponding to the standard template of each action to obtain the corresponding similarity probabilities $p(k)$ (the calculation formula is given in the original only as an image and is not reproduced here); the similarity probabilities are arranged in descending order, the first-ranked similarity probability is marked as the selected probability, and a validity analysis is carried out.
6. The feature extraction method of facial nerve electromyographic signals according to claim 5, characterized in that, when the validity analysis is performed, if the selected probability is not smaller than the standard probability, the selected probability is judged to be valid, the corresponding action type is set as the selected action type, and the variance corresponding to the selected probability is obtained and marked as $\sigma^2$; if the selected probability is smaller than the standard probability, the selected probability is judged to be invalid, the corresponding processing signal is marked as a pre-training signal, and the pre-training signal is stored in the pre-training database.
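The validity branch of claim 6 is a threshold comparison; a minimal sketch, with the standard probability, signal representation, and database stand-in all assumed for illustration:

```python
def validity_audit(selected_prob: float, standard_prob: float,
                   signal, pretrain_db: list) -> bool:
    """Claim 6 validity analysis: accept the selected probability if it is
    not smaller than the standard probability; otherwise mark the
    processing signal as a pre-training signal and store it."""
    if selected_prob >= standard_prob:
        return True              # valid: caller sets the selected action type
    pretrain_db.append(signal)   # invalid: divert to the pre-training database
    return False

db = []
ok_a = validity_audit(0.9, 0.7, "sig-a", db)   # valid
ok_b = validity_audit(0.5, 0.7, "sig-b", db)   # invalid, stored
```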
7. The feature extraction method of facial nerve electromyographic signals according to claim 1, characterized in that the step of evaluating the stability of the oscillation condition of the matching action comprises:

combining the similarity probability variance $\sigma^2$ corresponding to the processing signal with the total average energy $E$ and the sample total average energy $Y(k)$, and obtaining by calculation the stability probability $p(x|k)$ of the processing signal $x$ for the kth class of gesture action; the calculation formula of the stability probability $p(x|k)$ [equation illegible in the source] involves constant coefficients $c_1$ and $c_2$ with $c_1 = c_2 \neq 0$;

performing stability evaluation on the oscillation condition of the selected action type according to the stability probability, so as to obtain a gesture recognition result comprising a start tag and an end tag.
8. The feature extraction method of facial nerve electromyographic signals according to claim 7, characterized in that, when stability evaluation is performed on the oscillation condition of the selected action type according to the stability probability, the stability probability is matched against a stability start threshold and a stability end threshold respectively, the stability start threshold being greater than the stability end threshold; if the stability probability is greater than the stability start threshold, it is judged that the kth class of action starts and a start tag is generated; if the stability probability is smaller than the stability end threshold, it is judged that the kth class of action ends and an end tag is generated.
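Because the start threshold exceeds the end threshold, claim 8 describes hysteresis thresholding over a stream of stability probabilities. A minimal sketch, with threshold values and names chosen for illustration:

```python
def tag_segments(stability_probs, start_thr=0.8, end_thr=0.3):
    """Hysteresis labelling per claim 8: generate a start tag when the
    stability probability rises above the start threshold, and an end tag
    when it later falls below the (lower) end threshold."""
    assert start_thr > end_thr  # claim 8: start threshold > end threshold
    tags, active = [], False
    for n, p in enumerate(stability_probs):
        if not active and p > start_thr:
            tags.append(("start", n))
            active = True
        elif active and p < end_thr:
            tags.append(("end", n))
            active = False
    return tags

probs = [0.1, 0.5, 0.9, 0.85, 0.4, 0.2, 0.1]
tags = tag_segments(probs)   # action starts at index 2, ends at index 5
```

The gap between the two thresholds prevents a probability hovering near a single cutoff from toggling spurious start/end tags.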
CN202310310179.XA 2023-03-28 2023-03-28 Feature extraction method of facial nerve electromyographic signals Withdrawn CN116019467A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310310179.XA CN116019467A (en) 2023-03-28 2023-03-28 Feature extraction method of facial nerve electromyographic signals


Publications (1)

Publication Number Publication Date
CN116019467A true CN116019467A (en) 2023-04-28

Family

ID=86074335

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310310179.XA Withdrawn CN116019467A (en) 2023-03-28 2023-03-28 Feature extraction method of facial nerve electromyographic signals

Country Status (1)

Country Link
CN (1) CN116019467A (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110618754A (en) * 2019-08-30 2019-12-27 电子科技大学 Surface electromyogram signal-based gesture recognition method and gesture recognition armband
CN112773382A (en) * 2021-01-20 2021-05-11 钛虎机器人科技(上海)有限公司 Myoelectricity sensing method and system with user self-adaption capability
CN113598759A (en) * 2021-09-13 2021-11-05 曲阜师范大学 Lower limb action recognition method and system based on myoelectric feature optimization


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
朱旭鹏; 陈香; 李云; 赵璋炎: "Surface electromyographic signal recognition method for continuous gesture actions based on empirical formulas", 北京生物医学工程 (Beijing Biomedical Engineering), no. 02, pp. 118-124 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117653320A (en) * 2024-02-02 2024-03-08 四川省肿瘤医院 Brain tumor operation monitoring equipment
CN117653320B (en) * 2024-02-02 2024-04-05 四川省肿瘤医院 Brain tumor operation monitoring equipment

Similar Documents

Publication Publication Date Title
CN110598676B (en) Deep learning gesture electromyographic signal identification method based on confidence score model
CN106803081A (en) A kind of brain electricity sorting technique based on Multi-classifers integrated
CN109299751B (en) EMD data enhancement-based SSVEP electroencephalogram classification method of convolutional neural model
CN109758145B (en) Automatic sleep staging method based on electroencephalogram causal relationship
CN109009102B (en) Electroencephalogram deep learning-based auxiliary diagnosis method and system
CN116019467A (en) Feature extraction method of facial nerve electromyographic signals
CN111460951A (en) Electrocardiosignal automatic analysis method based on deep learning
CN113486752B (en) Emotion recognition method and system based on electrocardiosignal
CN113940634B Alzheimer's disease classification diagnosis system based on high potential treatment
CN106845348B (en) Gesture recognition method based on arm surface electromyographic signals
CN106843509B (en) Brain-computer interface system
CN111481193B (en) Fall risk assessment and early warning method and system
CN108470182A (en) A kind of brain-computer interface method enhanced for asymmetric brain electrical feature with identification
CN111616680A (en) Automatic mental load identification method and system
CN110163142B (en) Real-time gesture recognition method and system
CN210697629U (en) Signal acquisition device, mobile terminal and signal analysis system
CN114098768B (en) Cross-individual surface electromyographic signal gesture recognition method based on dynamic threshold and EasyTL
CN111973184B (en) Model training data optimization method for nonideal sEMG signals
CN113057654B (en) Memory load detection and extraction system and method based on frequency coupling neural network model
CN111657860B (en) Method and system for identifying sleep stage based on counterstudy
KR20220158462A (en) EMG signal-based recognition information extraction system and EMG signal-based recognition information extraction method using the same
CN114420304A (en) Novel new crown auxiliary screening method and device based on deep learning
CN111324880A (en) Fingerprint and electrocardio characteristic double-authentication identity recognition system and method
CN114288634B (en) Body-building action self-recognition and alarm system based on electromyographic signal acquisition
CN110413125A (en) Conversion method, electronic equipment and storage medium based on brain wave

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20230428