CN105943206A - Prosthetic hand control method based on MYO armlet - Google Patents

Prosthetic hand control method based on MYO armlet

Info

Publication number
CN105943206A
CN105943206A (application CN201610379614.4A)
Authority
CN
China
Prior art keywords
electromyographic signal
myo armlet
myo
signal
armlet
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610379614.4A
Other languages
Chinese (zh)
Inventor
李传江
王朋
张崇明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Normal University
University of Shanghai for Science and Technology
Original Assignee
Shanghai Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Normal University filed Critical Shanghai Normal University
Priority to CN201610379614.4A priority Critical patent/CN105943206A/en
Publication of CN105943206A publication Critical patent/CN105943206A/en
Pending legal-status Critical Current


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F2/00 Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
    • A61F2/50 Prostheses not implantable in the body
    • A61F2/68 Operating or control means
    • A61F2/70 Operating or control means electrical
    • A61F2/72 Bioelectric control, e.g. myoelectric
    • A61F2002/704 Operating or control means electrical computer-controlled, e.g. robotic control

Abstract

The invention relates to a prosthetic hand control method based on the MYO armband. Electromyographic (EMG) signals of the arm muscles are acquired in real time through the MYO armband and read; feature values of the EMG signals are extracted; the hand motion pattern is recognized online by means of the feature values and a trained neural network model; and the motion pattern is converted into a corresponding motor movement instruction that drives the prosthetic hand to make the corresponding motion. The method for training the neural network model comprises: performing hand motions, acquiring the EMG signals of the arm muscles through the MYO armband, reading the EMG signals, extracting their feature values, and training the hand-motion neural network model on samples of the feature values. Compared with the prior art, the method is convenient to use, low in cost, high in cost-performance, and wide in application range.

Description

Prosthetic hand control method based on the MYO armband
Technical field
The invention belongs to the interdisciplinary field of computer science and rehabilitation engineering, and in particular relates to a prosthetic hand control method based on the MYO armband.
Background technology
Surveys of the disabled population show that China has up to 24.12 million patients with physical disabilities, accounting for 1.83% of the total population, among whom 2.26 million are amputees. Conservatively, more than 250,000 patients need a prosthetic hand fitted, so prosthetic hands have a huge market. At present, domestically developed myoelectric prosthetic hands are still based on a single action mode; high-end multi-action myoelectric prosthetic hands rely mainly on imports.
The electromyographic (EMG) signal is a reliable signal source for prosthetic hand control and is widely used in myoelectric prosthetic hands. The quality of the EMG sensor directly affects the accuracy of motion recognition and hence the overall performance of the prosthetic hand. High-quality sensors, such as those from Biometrics and Delsys, are expensive: a single sensor costs more than 5,000 yuan. Domestic prosthesis manufacturers mostly use self-made EMG sensors, whose signals suffer considerable interference, severely degrading motion recognition accuracy.
Chinese patent application No. 201210580705.6 discloses a myoelectric prosthesis control system comprising a myoelectric prosthesis controller and a host computer. The controller includes EMG electrodes, an EMG acquisition module, a control module, a communication module, and a motor drive module. The control module performs action-type recognition on the collected EMG data and outputs the classification result; the communication module sends the EMG data to the host computer and forwards the action classifier trained by the host computer back to the control module; the motor drive module receives the classification results from the control module and drives the motors inside the prosthesis to complete the corresponding actions. The host computer performs pattern training on the EMG data received via the communication module, obtains the action classifier, and sends it back through the communication module, thereby achieving online control of the prosthesis.
However, the sensor in that EMG acquisition module outputs an analog signal, which must be amplified, filtered, and A/D-converted by the acquisition module before features can be extracted. The acquisition module is wired to the EMG electrodes, so data transmission is prone to interference, and the data processing burden affects real-time performance.
Summary of the invention
The purpose of the present invention is to overcome the above defects of the prior art and to provide a prosthetic hand control method based on the MYO armband that is low in cost and high in action recognition rate.
The purpose of the present invention can be achieved through the following technical solution:
A prosthetic hand control method based on the MYO armband: the MYO armband collects the electromyographic (EMG) signals of the arm muscles in real time; the EMG signals are read and their feature values extracted; the feature values and a trained neural network model are used to recognize the hand motion pattern online; and the motion pattern is converted into corresponding motor movement instructions that drive the prosthetic hand to make the corresponding action. This process is online recognition;
The neural network model is trained as follows: hand motions are performed; the MYO armband collects the EMG signals of the arm muscles; the EMG signals are read and their feature values extracted; and the hand-motion neural network model is trained on samples of the feature values. This is an offline procedure carried out before online recognition.
The offline training of the neural network model and the storage of the training data and training results can be completed on a PC. The trained weight and threshold parameters are downloaded to an embedded control board; during online recognition, the board processes the EMG signals, recognizes the hand motion pattern, and controls the action of the prosthetic hand.
The EMG signals are read through the Bluetooth interface of the MYO armband.
The extraction of the feature values of the EMG signal comprises the following steps:
S1: collect the EMG signals of the arm muscles through M channels of the MYO armband and read them, where the channel number M is set between 2 and 8 according to the number of hand motion types;
S2: determine the start and end time points of the hand motion from the time-domain features of each channel's EMG signal;
S3: within the motion interval, intercept a segment of the EMG signal as a signal sequence and extract n feature values from each signal sequence, obtaining an M×n-dimensional feature vector;
S4: reduce the feature vector to k dimensions with PCA.
In step S2, the start and end time points of the hand motion are determined as follows: the mean absolute value of each channel's EMG signal is calculated and summed over the channels, and the sum is compared with a preset threshold to judge the start and end of the action; the corresponding time points are taken as the action endpoints.
In step S3, the intercepted EMG signal is the signal in the 100-200 ms after the action starting point.
The feature values include the mean absolute value MAV, zero crossings ZC, slope sign changes SSC, waveform length WL, and mean absolute value slope MAVS, each calculated as follows:
MAV_i = (1/L) Σ_{k=1}^{L} |x(k)|,  i = 1, 2, ..., M   (1)
In formula (1), x(k) is each sampled EMG value, L is the number of data points per channel, and M is the number of channels;
For consecutive samples x(k), x(k+1): if formula (2) holds, the value of ZC increases by 1;
x(k) > 0 and x(k+1) < 0, or x(k) < 0 and x(k+1) > 0   (2)
If the condition of formula (3) holds, the value of SSC increases by 1;
[x(k) - x(k-1)] × [x(k) - x(k+1)] > ε   (3)
In formula (3), ε is a given threshold greater than 0;
WL = Σ_{k=1}^{L} |Δx(k)|   (4)
In formula (4), Δx(k) = x(k) - x(k-1);
MAVS = MAV_k - MAV_{k-1}   (5)
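For illustration, formulas (1)-(5) can be implemented as follows. This is a minimal Python/NumPy sketch, not part of the patent; the function names and the default value of ε are illustrative choices.

```python
import numpy as np

def emg_features(x, eps=0.01):
    """Four time-domain features of one channel's sEMG window x[0..L-1].

    Returns (MAV, ZC, SSC, WL) per formulas (1)-(4); MAVS (formula (5))
    needs the previous window's MAV and is computed by `mavs` below.
    """
    x = np.asarray(x, dtype=float)
    mav = float(np.abs(x).mean())                        # formula (1)
    # formula (2): count sign changes between consecutive samples
    zc = int(np.sum((x[:-1] > 0) & (x[1:] < 0))
             + np.sum((x[:-1] < 0) & (x[1:] > 0)))
    # formula (3): count slope sign changes exceeding threshold eps
    mid, prev, nxt = x[1:-1], x[:-2], x[2:]
    ssc = int(np.sum((mid - prev) * (mid - nxt) > eps))
    wl = float(np.sum(np.abs(np.diff(x))))               # formula (4)
    return mav, zc, ssc, wl

def mavs(mav_curr, mav_prev):
    """Formula (5): MAV difference between adjacent analysis windows."""
    return mav_curr - mav_prev
```

Calling `emg_features` once per channel on each intercepted window yields the M×n feature values described above.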
When reading the EMG signal, data are read from the MYO armband at the armband's maximum data output frequency.
The channel number is selected according to the number of action types to be recognized. To balance recognition rate and real-time performance, the channel number M is preferably 3: three channels can accurately recognize eight common daily actions, meeting the needs of routine hand motion recognition.
Compared with the prior art, the invention has the following advantages:
(1) The MYO armband has built-in wireless digital sensors; using it to acquire the EMG signals of the arm muscles, the signals can be read directly through the Bluetooth interface. The armband is low in price and fast in signal transmission, with a high signal-to-noise ratio, little interference, good signal quality, and convenient wearing. It fits a modular design approach and greatly improves the cost-performance and application prospects of prosthetic hand control.
(2) The neural network model parameters are established in advance by offline training, and offline training and online recognition are realized on different platforms, which reduces the cost of online recognition and increases its speed.
(3) In step S2, active-segment detection uses time-domain features, which are simple to compute, so the active segment can be judged quickly, laying a solid foundation for the real-time performance of the prosthetic hand.
(4) During online recognition, dimensionality reduction of the feature values lowers the load on the classifier, improves recognition accuracy, and improves the real-time performance of prosthetic hand control.
(5) Using the mean absolute value, zero crossings, slope sign changes, waveform length, and mean absolute value slope as feature values achieves a high action recognition rate.
(6) The preferred channel number M is 3: three channels can accurately recognize eight actions, meeting the needs of routine hand motion recognition while balancing the reliability and speed of action recognition.
Brief description of the drawings
Fig. 1 is a schematic diagram of the architecture of the MYO-armband-based prosthetic hand control system of this embodiment;
Fig. 2 is a schematic flowchart of online recognition in the method of the invention;
Fig. 3 is the raw electromyogram of one channel obtained with the method of this embodiment.
Detailed description of the invention
The present invention is described in detail below with reference to the drawings and a specific embodiment. The embodiment is implemented on the premise of the technical solution of the present invention, and a detailed implementation mode and concrete operating process are given, but the protection scope of the present invention is not limited to the following embodiment.
Embodiment
A prosthetic hand control method based on the MYO armband: the MYO armband collects the electromyographic (EMG) signals of the arm muscles in real time; the EMG signals are read and their feature values extracted; the feature values and a trained neural network model are used to recognize the hand motion pattern online; and the motion pattern is converted into corresponding motor movement instructions that drive the prosthetic hand to make the corresponding action. This process is online recognition;
The neural network model is trained as follows: hand motions are performed; the MYO armband collects the EMG signals of the arm muscles; the EMG signals are read and their feature values extracted; and the hand-motion neural network model is trained on samples of the feature values. This is an offline procedure carried out before online recognition.
The offline training of the neural network model and the storage of the training samples and model parameters can be completed on a PC. The model parameters are then downloaded to an embedded control board; during online recognition, the board processes the EMG signals, recognizes the hand motion pattern, and controls the action of the prosthetic hand.
The MYO sensor is a novel armband released by the Canadian startup Thalmic Labs. It can be worn above the elbow joint of either arm and detects the bioelectrical signals produced by human muscles. It has eight channels, arranged equidistantly around the band. The signal is transmitted by a low-power Bluetooth device, with little interference and good signal quality, and the price is low (149 dollars).
To implement the prosthetic hand control method of the present invention, the MYO-armband-based prosthetic hand control system shown in Fig. 1 is used. The system consists of the MYO armband, an ARM control board, and the prosthetic hand body. The MYO armband collects the arm bioelectrical signals (sEMG signals); the ARM board recognizes the hand motion pattern and drives the finger motors; and the prosthetic hand body consists of the mechanical parts and the motors that drive finger motion. The operating principle is: the sEMG signals collected by the MYO armband are transferred to the ARM board over the Bluetooth interface; the action endpoints are judged, features are extracted, and the pattern recognition algorithm yields the action pattern type; finally, the action pattern is converted into corresponding motor movement instructions that drive the prosthetic hand fingers to perform the corresponding action.
The core and key technology is recognizing the action pattern from the sEMG signal on the ARM board. The processing flow is shown in Fig. 2; the principle of each step is explained in detail below.
Motion recognition is divided into two processes: offline training and online recognition. Offline training is performed only when the prosthesis is fitted: each action is trained repeatedly and the training data saved; the data are processed into feature value samples; the training algorithm then yields the motion recognition model parameters for this amputee, which are used for online recognition. Because the stages before feature extraction are identical in online recognition and offline training, only the online recognition algorithm is elaborated below.
As shown in Fig. 2, the online recognition process comprises the following steps:
S1: Read the EMG (sEMG) data of each channel
The EMG signals of the arm muscles are collected through M channels of the MYO armband and read, where the channel number M is set between 2 and 8 according to the number of hand motion types.
S2: Active-segment detection
That is, determine the start and end time points of the hand motion from the time-domain features of each channel's EMG signal.
For the sEMG signals of the M selected channels, first calculate the time-domain feature mean absolute value (MAV) of each channel:
MAV_i = (1/L) Σ_{k=1}^{L} |x(k)|,  i = 1, 2, ..., M   (1)
where x(k) is each sampled sEMG value.
The MAVs of all channels are summed and compared with a preset threshold to judge the action endpoints. Since the EMG characteristics of subjects differ, each subject's threshold is determined in the offline training stage, where a suitable value is obtained by analyzing the data samples collected for training.
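The threshold test just described can be sketched as follows. This is an illustrative Python/NumPy sketch; the sliding-window length and the buffer layout are assumptions, since the patent does not fix them.

```python
import numpy as np

def detect_onset(emg, threshold, win=20):
    """Return the first sample index at which the channel-summed MAV of a
    sliding window exceeds `threshold`, or None if no activity is found.

    emg       : array of shape (n_samples, n_channels)
    threshold : per-subject value found during offline training
    win       : window length in samples (assumed; not fixed by the patent)
    """
    emg = np.asarray(emg, dtype=float)
    for start in range(emg.shape[0] - win + 1):
        window = emg[start:start + win]
        # per-channel MAV of the window, summed over channels
        summed_mav = np.mean(np.abs(window), axis=0).sum()
        if summed_mav > threshold:
            return start
    return None
```

The same test, run on the signal after onset, marks the action end point when the summed MAV falls back below the threshold.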
S3: Feature extraction
After the active segment is detected, the sEMG data in the 100-200 ms after the action starting point are intercepted from each channel; the number of data points per channel is denoted L. The extracted feature values are: mean absolute value (MAV), zero crossings (ZC), slope sign changes (SSC), waveform length (WL), and mean absolute value slope (MAVS).
(1) Mean absolute value (MAV)
This feature is given by formula (1) above; the MAV features of the M channels' sEMG signals are denoted MAV_1, MAV_2, ..., MAV_M.
(2) Zero crossings (ZC)
For consecutive samples x(k), x(k+1): if the condition of formula (2) holds, the value of ZC increases by 1.
x(k) > 0 and x(k+1) < 0, or x(k) < 0 and x(k+1) > 0   (2)
(3) Slope sign changes (SSC)
If the condition of formula (3) holds, the value of SSC increases by 1.
[x(k) - x(k-1)] × [x(k) - x(k+1)] > ε   (3)
where ε is a given threshold greater than 0.
(4) Waveform length (WL)
This feature is the cumulative length of the waveform over the L data points, calculated by formula (4).
WL = Σ_{k=1}^{L} |Δx(k)|   (4)
where Δx(k) = x(k) - x(k-1).
(5) Mean absolute value slope (MAVS)
This feature is the difference of the MAV features of two adjacent analysis windows:
MAVS = MAV_k - MAV_{k-1}   (5)
S4: PCA dimensionality reduction
For the M selected channels, suppose the above feature extraction yields n feature values in total, i.e., a feature vector of dimension 1 × n. If the dimensionality-reduction matrix obtained by the PCA algorithm during training is U, of dimension n × k, then during online recognition each extracted feature vector is multiplied by U, reducing its dimension from n to k (1 × k).
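Step S4 can be sketched as follows. This is an illustrative Python/NumPy sketch: U is learned offline from the training feature vectors via SVD and then applied online; the mean-centering detail is an assumption, since the patent only specifies multiplication by U.

```python
import numpy as np

def fit_pca(samples, k):
    """Offline: learn the n x k projection matrix U from training feature
    vectors (rows of `samples`), keeping the k leading principal axes."""
    samples = np.asarray(samples, dtype=float)
    mean = samples.mean(axis=0)
    # principal axes via SVD of the centered training data
    _, _, vt = np.linalg.svd(samples - mean, full_matrices=False)
    return mean, vt[:k].T          # U has shape (n, k)

def project(feature_vec, mean, U):
    """Online: reduce one 1 x n feature vector to 1 x k, as in step S4."""
    return (np.asarray(feature_vec, dtype=float) - mean) @ U
```

In deployment, `mean` and `U` would be computed on the PC during offline training and stored on the embedded board alongside the network weights.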
S5: Motion recognition
A 3-layer BP neural network is constructed. The number of input neurons equals the dimension k of the reduced feature vector, and the number of output neurons equals the number of action classes C. The number of hidden neurons H has little effect on performance; it can initially be set to 10, or computed from the empirical formula (6), and then adjusted according to the training accuracy.
H = √(C·K)   (6)
Each action is performed 100 times; the extracted features form the training sample set, and offline training yields the weight coefficients w1, b1, w2, b2, where w1, b1 are the weights and thresholds from the input layer to the hidden layer, and w2, b2 are the weights and thresholds from the hidden layer to the output layer.
During online recognition, while the muscle is active, a segment of data is read each time, a set of features is extracted and reduced with PCA, and the result is input to the neural network model; the action pattern corresponding to the output neuron with the maximum value is the current hand action type. EMG acquisition, active-segment detection, feature extraction, PCA reduction, and pattern classification during online recognition are all realized on the embedded control board.
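The online classification step can be sketched as follows. This is an illustrative Python/NumPy sketch; a sigmoid hidden-layer activation is an assumption, since the patent does not name the activation function.

```python
import numpy as np

def classify(features_k, w1, b1, w2, b2):
    """Forward pass of the 3-layer BP network: k inputs -> H hidden
    neurons (sigmoid, assumed) -> C outputs; the output neuron with the
    maximum value names the recognized action.

    w1: (k, H), b1: (H,), w2: (H, C), b2: (C,) -- all trained offline
    and written into the embedded board.
    """
    hidden = 1.0 / (1.0 + np.exp(-(features_k @ w1 + b1)))  # sigmoid layer
    out = hidden @ w2 + b2                                  # output layer
    return int(np.argmax(out))   # index of the recognized action pattern
```

The returned index selects the motor movement instruction sent to the prosthetic hand.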
Embodiments of the present invention are described taking as an example the recognition of eight actions commonly used in prosthetic hand control: (1) wrist flexion (WF), (2) wrist extension (WE), (3) hand close (HC), (4) hand open (HO), (5) spherical grasping (SG), (6) cylindrical grasping (CG), (7) tripodal precision grasping (TPG), and (8) key grasping (KG) (see Fig. 3). Since hand motion is caused by forearm muscle contraction, the MYO armband is worn near the forearm elbow joint (arm horizontal, palm down, with the armband's logo LED facing away from the palm).
Because the maximum data output frequency of the MYO armband is 200 Hz, the data sampling rate is set to 200 Hz. The EMG signals of all 8 channels are read each time; the curve of one channel is shown in Fig. 3.
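At the 200 Hz rate, the 100-200 ms analysis window of step S3 corresponds to 20-40 samples per channel; the arithmetic is simply:

```python
FS = 200  # MYO armband maximum data output frequency, in Hz

def window_samples(ms, fs=FS):
    """Number of samples in an analysis window of `ms` milliseconds."""
    return ms * fs // 1000
```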
Experimental method and results:
Considering the cost and ease of installation of a practical prosthesis controller, in the experiments the EMG data of M (M = 3) channels were collected from each of five healthy subjects. The 3-channel sEMG signals collected with the MYO underwent preprocessing, feature extraction, pattern recognition, and prosthesis motor control. To improve the real-time performance of the prosthesis, feature extraction uses 5 time-domain features: mean absolute value (MAV), zero crossings (ZC), slope sign changes (SSC), waveform length (WL), and mean absolute value slope (MAVS). The feature data thus collected are 15-dimensional (M × 5). In the online recognition system, they are reduced to 12 dimensions by PCA and then fed into the classification model for training. The classifier is a three-layer neural network: the number of input-layer neurons is determined by the feature dimension after reduction, i.e., 12; the number of output-layer neurons is determined by the action classes, here 8; and the number of hidden-layer nodes is calculated by formula (6) as 10.
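The layer sizes quoted above can be checked directly, assuming formula (6) is H = √(C·K), which reproduces the stated hidden-layer size of 10:

```python
import math

M, FEATURES = 3, 5           # channels x time-domain features per channel
n = M * FEATURES             # raw feature dimension (15)
K = 12                       # dimension after PCA reduction
C = 8                        # number of action classes
H = round(math.sqrt(C * K))  # hidden neurons, assuming formula (6) = sqrt(C*K)
```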
The weights and threshold coefficients of the BP neural network obtained by the above simulation were written into the system to realize online action recognition. In the test, each action was performed 100 times, and the system's recognition output was observed after the gesture stabilized. The experimental results are shown in Table 1. In the same way, 100 groups of data were collected from each of the remaining 4 subjects, and the trained weights and thresholds were rewritten into the system. Online recognition experiments were then carried out: each gesture was performed 100 times, and the overall recognition rate was 100% in every case. This is because each person's online recognition weights and thresholds were obtained from his or her own sEMG signals and therefore match that person's sEMG characteristics.
Table 1. Online recognition results for the eight actions
The experimental results show that with PCA-12 the overall recognition rate of the present invention for the eight actions reaches 100% while meeting the real-time requirement, well satisfying disabled users' requirements for prosthetic hand control.

Claims (7)

1. A prosthetic hand control method based on the MYO armband, characterized in that the MYO armband collects the electromyographic (EMG) signals of the arm muscles in real time; the EMG signals are read and their feature values extracted; the feature values and a trained neural network model are used to recognize the hand motion pattern online; and the motion pattern is converted into corresponding motor movement instructions that drive the prosthetic hand to make the corresponding action;
the neural network model is trained as follows: hand motions are performed; the MYO armband collects the EMG signals of the arm muscles; the EMG signals are read and their feature values extracted; and the hand-motion neural network model is trained on samples of the feature values.
2. The prosthetic hand control method based on the MYO armband according to claim 1, characterized in that the EMG signals are read through the Bluetooth interface of the MYO armband.
3. The prosthetic hand control method based on the MYO armband according to claim 1, characterized in that the extraction of the feature values of the EMG signal comprises the following steps:
S1: collect the EMG signals of the arm muscles through M channels of the MYO armband and read them, where the channel number M is set between 2 and 8;
S2: determine the start and end time points of the hand motion from the time-domain features of each channel's EMG signal;
S3: within the motion interval, intercept a segment of the EMG signal as a signal sequence and extract n feature values from each signal sequence, obtaining an M×n-dimensional feature vector;
S4: reduce the feature vector to k dimensions with PCA.
4. The prosthetic hand control method based on the MYO armband according to claim 3, characterized in that in step S2 the start and end time points of the hand motion are determined as follows: the mean absolute value of each channel's EMG signal is calculated and summed over the channels, and the sum is compared with a preset threshold to judge the start and end of the action; the corresponding time points are taken as the action endpoints.
5. The prosthetic hand control method based on the MYO armband according to claim 3, characterized in that in step S3 the intercepted EMG signal is the signal in the 100-200 ms after the action starting point.
6. The prosthetic hand control method based on the MYO armband according to claim 3, characterized in that the feature values include the mean absolute value MAV, zero crossings ZC, slope sign changes SSC, waveform length WL, and mean absolute value slope MAVS, each calculated as follows:
MAV_i = (1/L) Σ_{k=1}^{L} |x(k)|,  i = 1, 2, ..., M   (1)
In formula (1), x(k) is each sampled EMG value, L is the number of data points per channel, and M is the number of channels;
For consecutive samples x(k), x(k+1): if formula (2) holds, the value of ZC increases by 1;
x(k) > 0 and x(k+1) < 0, or x(k) < 0 and x(k+1) > 0   (2)
If the condition of formula (3) holds, the value of SSC increases by 1;
[x(k) - x(k-1)] × [x(k) - x(k+1)] > ε   (3)
In formula (3), ε is a given threshold greater than 0;
WL = Σ_{k=1}^{L} |Δx(k)|   (4)
In formula (4), Δx(k) = x(k) - x(k-1);
MAVS = MAV_k - MAV_{k-1}   (5).
7. The prosthetic hand control method based on the MYO armband according to claim 1, characterized in that when reading the EMG signal, data are read from the MYO armband at the armband's maximum data output frequency.
CN201610379614.4A 2016-06-01 2016-06-01 Prosthetic hand control method based on MYO armlet Pending CN105943206A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610379614.4A CN105943206A (en) 2016-06-01 2016-06-01 Prosthetic hand control method based on MYO armlet

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610379614.4A CN105943206A (en) 2016-06-01 2016-06-01 Prosthetic hand control method based on MYO armlet

Publications (1)

Publication Number Publication Date
CN105943206A true CN105943206A (en) 2016-09-21

Family

ID=56907327

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610379614.4A Pending CN105943206A (en) 2016-06-01 2016-06-01 Prosthetic hand control method based on MYO armlet

Country Status (1)

Country Link
CN (1) CN105943206A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102499797A (en) * 2011-10-25 2012-06-20 Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences Artificial limb control method and system
CN102622605A (en) * 2012-02-17 2012-08-01 Guodian Science and Technology Research Institute Surface electromyogram signal feature extraction and action pattern recognition method
CN103677289A (en) * 2013-12-09 2014-03-26 Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences Intelligent interactive glove and interaction method
US20140128992A1 (en) * 2012-11-08 2014-05-08 The University Of Akron Biomimetic controller for increased dexterity prosthesis
CN103815991A (en) * 2014-03-06 2014-05-28 Harbin Institute of Technology Dual-channel operation-sensing virtual prosthetic hand training system and method
US20150112448A1 (en) * 2013-10-23 2015-04-23 Kurt W. Scott Hybrid Prosthetic Hand
CN105014676A (en) * 2015-07-03 2015-11-04 Zhejiang University Robot motion control method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
JIA Yutao et al.: "A Survey of Feature Extraction Methods for Electromyographic Signals", Chinese Journal of Electron Devices *
BU Feng et al.: "An ARM-based Controller for a Myoelectric Prosthetic Hand", Journal of Shanghai University *

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106530926B (en) * 2016-11-29 2019-03-05 Southeast University Virtual prosthetic hand training platform and training method based on Myo armband and gaze tracking
CN106530926A (en) * 2016-11-29 2017-03-22 Southeast University Virtual prosthetic hand training platform and training method based on Myo armband and gaze tracking
CN107049570A (en) * 2017-03-13 2017-08-18 Rizhao Ruobilin Robot Technology Co., Ltd. Manipulator control system
CN106890038A (en) * 2017-03-21 2017-06-27 Shanghai Normal University Prosthetic hand control system and control method based on MYO armlets
CN107378944A (en) * 2017-06-20 2017-11-24 Southeast University Multi-dimensional surface electromyographic signal prosthetic hand control method based on PCA
US10959863B2 (en) 2017-06-20 2021-03-30 Southeast University Multi-dimensional surface electromyogram signal prosthetic hand control method based on principal component analysis
WO2018233435A1 (en) * 2017-06-20 2018-12-27 Southeast University Artificial hand control method based on multi-dimensional surface electromyographic signals and principal component analysis
CN107518896B (en) * 2017-07-12 2019-07-30 Institute of Computing Technology, Chinese Academy of Sciences Myoelectric armband wearing-position prediction method and system
CN107518896A (en) * 2017-07-12 2017-12-29 Institute of Computing Technology, Chinese Academy of Sciences Myoelectric armband wearing-position prediction method and system
CN107870583A (en) * 2017-11-10 2018-04-03 National Research Center for Rehabilitation Technical Aids Artificial limb control method, device and storage medium
CN108703824A (en) * 2018-03-15 2018-10-26 HIT Robot (Hefei) International Innovation Research Institute Bionic hand control system and control method based on a myoelectric wristband
CN108983973B (en) * 2018-07-03 2021-01-26 Southeast University Control method of humanoid dexterous myoelectric prosthetic hand based on gesture recognition
CN108983973A (en) * 2018-07-03 2018-12-11 Southeast University Humanoid dexterous myoelectric prosthetic hand control method based on gesture recognition
CN109446972A (en) * 2018-10-24 2019-03-08 Zhongshan Institute, University of Electronic Science and Technology of China Gait recognition model establishing method, recognition method and device based on electromyographic signals
CN109446972B (en) * 2018-10-24 2021-08-31 Zhongshan Institute, University of Electronic Science and Technology of China Gait recognition model establishing method, recognition method and device based on electromyographic signals
CN111522435A (en) * 2020-02-21 2020-08-11 Zhejiang University of Technology Mechanical arm interaction method based on surface electromyographic signals
US10779740B1 2020-02-28 2020-09-22 Center For Quantitative Cytometry System and method to enhance self-recovery of nerve-muscle damage
WO2022001771A1 (en) * 2020-06-29 2022-01-06 JD Technology Information Technology Co., Ltd. Artificial limb control method, device and system, and storage medium
WO2022012364A1 (en) * 2020-07-15 2022-01-20 JD Technology Information Technology Co., Ltd. Electromyographic signal processing method and apparatus, and exoskeleton robot control method and apparatus
CN112057212A (en) * 2020-08-03 2020-12-11 Guilin University of Electronic Technology Artificial limb system based on deep learning
CN112773382A (en) * 2021-01-20 2021-05-11 Taihu Robot Technology (Shanghai) Co., Ltd. Myoelectric sensing method and system with user-adaptive capability

Similar Documents

Publication Publication Date Title
CN105943206A (en) Prosthetic hand control method based on MYO armlet
CN104107134B (en) Upper-limb training method and system based on EMG feedback
US10959863B2 (en) Multi-dimensional surface electromyogram signal prosthetic hand control method based on principal component analysis
CN109222969A (en) Wearable human upper-limb muscle-movement fatigue detection and training system based on fusion
CN104382595B (en) Upper-limb rehabilitation system and method based on myoelectric signals and virtual-reality interaction technology
CN104207793B (en) Grip function assessment and training system
CN102622605B (en) Surface electromyogram signal feature extraction and action pattern recognition method
CN108983973B (en) Control method of humanoid smart myoelectric artificial hand based on gesture recognition
CN110179643A (en) Neck rehabilitation training system and training method based on a ring sensor
CN109875565A (en) Automatic evaluation method for upper-limb motor function after stroke based on deep learning
CN102499797B (en) Artificial limb control method and system
CN106821680A (en) Upper-limb rehabilitation exoskeleton control method based on lower-limb gait
CN106420124B (en) Method for a myoelectrically controlled virtual robotic prosthetic-hand simulation system
CN107397649A (en) Upper-limb exoskeleton rehabilitation robot control method based on a radial basis function neural network
CN202288542U (en) Artificial limb control device
CN108958474A (en) Multi-sensor data fusion method for action recognition based on error weighting
CN107822629A (en) Method for detecting the surface electromyographic axis of a limb
CN109009586A (en) Continuous EMG decoding method for the natural human-machine driving angle of a prosthetic-hand wrist joint
Wang et al. A portable artificial robotic hand controlled by EMG signal using ANN classifier
CN105563495A (en) Mechanical arm system and method controlled on the basis of refined motor-imagery electroencephalogram signals
Yang et al. Experimental study of an EMG-controlled 5-DOF anthropomorphic prosthetic hand for motion restoration
CN108634957 (en) Quantitative evaluation method for hand-function rehabilitation based on the finger adduction-abduction movement of the human hand
CN109765996A (en) Gesture detection system and method insensitive to wearing-position deviation, based on an FMG armband
CN106890038A (en) Prosthetic hand control system and its control method based on MYO armlets
Yu et al. Attenuating the impact of limb position on surface EMG pattern recognition using a mixed-LDA classifier

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 2016-09-21
