CN106890038A - Prosthetic hand control system and its control method based on MYO armlets - Google Patents


Info

Publication number
CN106890038A
CN106890038A
Authority
CN
China
Prior art keywords: grip, prosthetic hand, value, MYO, armband
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710168821.XA
Other languages
Chinese (zh)
Inventor
李传江
任见
程璐璐
朱燕飞
张崇明
王朋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Normal University
University of Shanghai for Science and Technology
Original Assignee
Shanghai Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Shanghai Normal University filed Critical Shanghai Normal University
Priority to CN201710168821.XA
Publication of CN106890038A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F 2/00 Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
    • A61F 2/50 Prostheses not implantable in the body
    • A61F 2/54 Artificial arms or hands or parts thereof
    • A61F 2/58 Elbows; Wrists; Other joints; Hands
    • A61F 2/583 Hands; Wrist joints
    • A61F 2/586 Fingers
    • A61F 2/68 Operating or control means
    • A61F 2/70 Operating or control means electrical
    • A61F 2/72 Bioelectric control, e.g. myoelectric
    • A61F 2/74 Operating or control means fluid, i.e. hydraulic or pneumatic
    • A61F 2/741 Operating or control means fluid using powered actuators, e.g. stepper motors or solenoids
    • A61F 2/76 Means for assembling, fitting or testing prostheses, e.g. for measuring or balancing, e.g. alignment means
    • A61F 2002/7615 Measuring means
    • A61F 2002/7635 Measuring means for measuring force, pressure or mechanical tension
    • A61F 2002/7695 Means for testing non-implantable prostheses


Abstract

The present invention, a prosthetic hand control system based on the MYO armband and its control method, relates to the field of prosthetic hand control technology. The prosthetic hand control system based on the MYO armband includes: a signal acquisition module 1, an STM32 module 2, a fuzzy controller module 3, a prosthetic hand module 4, a grip-force feedback module 5, and a PC that carries out offline training. The control method includes: step A, offline training; step B, online recognition and grip-force prediction; step C, grip-force feedback. The technical contribution of the invention is a patient-centered, user-friendly online control of grip force: a cost-effective MYO sensor, which patients readily accept and which is simple and convenient to wear, collects sEMG data from the patient's residual arm. With the sEMG data as input and the grip-force value as output, a BP neural network models the nonlinear relation between the two. The prediction performance is good, providing a technical and material basis for the rehabilitation of patients with physical disabilities.

Description

Prosthetic hand control system and control method based on the MYO armband
Technical field
The present invention relates to the field of prosthetic hand control technology, and specifically to a prosthetic hand control system based on the MYO armband and its control method.
Background art
According to China's Second National Sample Survey on Disability, about 24.12 million people in China currently have physical disabilities, roughly 1.83% of the total population, including 2.26 million amputees. A conservative estimate puts the number of patients who need a prosthetic hand at more than 250,000, so prosthetic hands have huge market potential.
In current research, however, most scholars concentrate on distinguishing finger grasp patterns or stop at theoretical analyses of grip-force control; applications to actual prosthetic hands are rare. In addition, existing prosthetic hand control lacks flexibility and comfort of operation, and many patients are reluctant to use it. The Otto Bock SensorHand Speed and the Touch Bionics i-LIMB prosthetic hands are controlled by myoelectric signals; although pattern-recognition methods can distinguish the motion states of individual fingers, they only provide "switch" signals and cannot realize real-time active control of finger position and grip force, which reduces the physiological sensitivity the disabled user experiences. Commercial prosthetic hands use proportional control, determining the grip-force magnitude directly from the sEMG amplitude, with limited precision. Ahmet Erdemir, Scott McLean and others used Hill-type models that consider muscle and bone dynamics to build explicit models relating force, muscle length, contraction velocity and action potential; but because of the complexity of human physiology it is difficult to build an accurate model, so their application is restricted. Claudio Castellini and Patrick van der Smagt recorded sEMG data with 10-channel myoelectric electrodes (Otto Bock) and grip-force data with a six-axis force-torque sensor, and explored the recognition accuracy attainable with various regression methods; however, increasing the number of electrodes causes discomfort to the wearer.
To date, research papers on prosthetic hand patterns have emerged in an endless stream, and many pattern-recognition problems for prosthetic hands are basically solved; yet patient-centered online control of grip force is still missing.
Summary of the invention
The object of the invention is to overcome the gaps and deficiencies of the prior art by providing a prosthetic hand control system based on the MYO armband and its control method.
The idea of the invention: use the highly cost-effective MYO sensor to collect the patient's sEMG signals and, combined with a vibrotactile sensor, let the patient achieve accurate grasping through the steps of offline training, online recognition and grip-force prediction, and grip-force feedback.
Operating principle: the sEMG signals collected by the MYO armband are transferred over a Bluetooth interface to an STM32 control board; action-endpoint detection, time-domain feature extraction and the sEMG-force regression model then yield the grip-force value; finally, the fuzzy controller converts the grip-force information into the corresponding motor motion commands, driving the prosthetic fingers to perform the corresponding action with the corresponding grip force. The actual grip-force level of the prosthetic hand is fed back to the patient's residual arm through the vibrotactile sensor, so that the patient perceives the actual grip force of the prosthetic hand, improving the comfort of operation.
The core and key technology of the invention is the prediction, realized on the STM32 control board, of the grip-force value from the sEMG signals.
A prosthetic hand control system based on the MYO armband of the present invention includes: a signal acquisition module, an STM32 module, a fuzzy controller module, a prosthetic hand module, a grip-force feedback module and a PC that carries out offline training.
The signal acquisition module includes a MYO armband, worn on the upper part of the patient's residual arm, for collecting sEMG data, and a six-axis force sensor for collecting grip-force data from a normal hand.
The STM32 module reads and processes the sEMG data.
The fuzzy controller module receives the signal sent by the STM32 module and issues the commands that drive the prosthetic hand to grasp.
The prosthetic hand module consists of the prosthetic hand body, the motors that drive the finger motion, and FSR force-sensor patches mounted on the finger pads of the prosthetic hand.
The grip-force feedback module, worn on the patient's residual arm, consists of a vibrotactile sensor; it receives the grip-force signal of the FSR force sensors and feeds it back to the residual arm, so the patient can perceive the grip-force magnitude.
The control method of the prosthetic hand control system based on the MYO armband of the present invention includes: step A, offline training; step B, online recognition and grip-force prediction; step C, grip-force feedback.
Step A, offline training, includes: A1, pattern-classification training, and A2, sEMG-force regression model training.
A1, pattern-classification training, includes:
A11: divide the grip-force values into different levels and, for the grip-force values of each level, collect the corresponding sEMG data from a normal subject; the grip-force data of the normal hand are collected with the six-axis force sensor;
A12: extract the time-domain features: mean absolute value MAV, root-mean-square value RMS, standard deviation SD and waveform length WL;
A13: after PCA dimensionality reduction of the time-domain feature matrix, carry out pattern-classification training with a BP neural network, and save the trained weights and thresholds.
A2, sEMG-force regression model training, includes:
A21: build the training samples from the dimension-reduced time-domain feature matrix together with the grip-force data;
A22: train an sEMG-force regression model for each grip-force level separately;
A23: save the parameters of each trained model.
Step B, online recognition and grip-force prediction, includes:
B1: collect the sEMG signals of the patient's arm muscles in real time through the MYO armband;
B2: read the sEMG data through the STM32 module, detect the action endpoints, extract the time-domain features and apply PCA dimensionality reduction;
B3: identify the corresponding grip-force level according to the result of the pattern-classification training of step A1;
B4: predict the hand's grip-force value according to the result of the sEMG-force regression training of step A2, and use this value as the setpoint of the grip-force control;
B5: control the speed and direction of the finger-drive motors through the fuzzy controller module, regulating the grip force of the prosthetic hand to the given level.
Step C, grip-force feedback, includes: use the different frequencies and amplitudes of the vibrotactile sensor as the grip-force feedback signal, so that the amputee can perceive the level of the actual grip force.
Compared with the prior art, the present invention collects the patient's residual-arm sEMG data with the highly cost-effective MYO sensor; the MYO armband is simple and convenient to wear, and patients accept it more easily. The invention takes the sEMG data as input and the grip-force value as output and models the nonlinear relation between the two with a BP neural network; the method needs no complicated modeling process, and its prediction performance is good.
Brief description of the drawings
Fig. 1 is the structural block diagram of the prosthetic hand control system based on the MYO armband of the present invention;
Fig. 2 is the flow chart of the prosthetic hand control method of the invention;
Fig. 3 shows a raw surface-EMG signal of one channel obtained in this embodiment and the corresponding grip-force signal.
Specific embodiment
The invention is further described below with reference to the drawings and an embodiment:
A prosthetic hand control system based on the MYO armband (shown in Fig. 1) includes: a signal acquisition module 1, an STM32 module 2, a fuzzy controller module 3, a prosthetic hand module 4, a grip-force feedback module 5 and a PC that carries out offline training.
The signal acquisition module 1 includes a MYO armband 11, worn on the upper part of the patient's residual arm, for collecting sEMG data, and a six-axis force sensor 12 for collecting grip-force data from a normal hand.
The STM32 module 2 reads and processes the sEMG data.
The fuzzy controller module 3 receives the signal sent by the STM32 module 2 and issues the commands that drive the prosthetic hand to grasp.
The prosthetic hand module 4 consists of the prosthetic hand body, the motors that drive the finger motion, and FSR force-sensor patches mounted on the finger pads of the prosthetic hand.
The grip-force feedback module 5, worn on the patient's residual arm, consists of a vibrotactile sensor; it receives the grip-force signal of the FSR force sensors and feeds it back to the residual arm, so the patient can perceive the grip-force magnitude.
The MYO armband 11 is a MYO sensor, a novel armband released by the Canadian start-up Thalmic Labs; it can be worn on either arm near the elbow joint and detects the bioelectric signals produced by the human muscles.
The collected arm sEMG data are read through the Bluetooth interface of the MYO armband 11; the acquisition frequency is its maximum data-output frequency of 200 Hz.
The six-axis force sensor 12, produced by Anhui Aili Intelligent Technology Co., Ltd., is read by the PC and collects the grip-force data of the normal hand at an acquisition frequency of 20 Hz. The number of channels is selected according to the 8 grip-force levels to be recognized; to balance recognition rate and real-time performance while meeting everyday grasping needs, the channel number M is set to 3, and 3 channels can accurately recognize the 8 grip-force levels.
The control method (shown in Fig. 2) of the prosthetic hand control system based on the MYO armband of the present invention includes:
step A, offline training; step B, online recognition and grip-force prediction; step C, grip-force feedback.
Step A, offline training, includes: A1, pattern-classification training, and A2, sEMG-force regression model training.
A1, pattern-classification training, includes:
A11: divide the grip-force values into different levels and, for the grip-force values of each level, collect the corresponding sEMG data from a normal subject; the grip-force data of the normal hand are collected with the six-axis force sensor 12;
A12: extract the time-domain features: mean absolute value MAV, root-mean-square value RMS, standard deviation SD and waveform length WL;
A13: after PCA dimensionality reduction of the time-domain feature matrix, carry out pattern-classification training with a BP neural network, and save the trained weights and thresholds.
A2, sEMG-force regression model training, includes:
A21: build the training samples from the dimension-reduced time-domain feature matrix together with the grip-force data;
A22: train an sEMG-force regression model for each grip-force level separately;
A23: save the parameters of each trained model.
Step B, online recognition and grip-force prediction, includes:
B1: collect the sEMG signals of the patient's arm muscles in real time through the MYO armband 11;
B2: read the sEMG data through the STM32 module 2, detect the action onset, extract the time-domain features and apply PCA dimensionality reduction;
B3: identify the corresponding grip-force level according to the result of the pattern-classification training of step A1;
B4: predict the hand's grip-force value according to the result of the sEMG-force regression training of step A2, and use this value as the setpoint of the grip-force control;
B5: control the speed and direction of the finger-drive motors through the fuzzy controller module 3, regulating the grip force of the prosthetic hand to the given level.
Step C, grip-force feedback, includes: use the different frequencies and amplitudes of the vibrotactile sensor as the grip-force feedback signal, so that the amputee can perceive the level of the actual grip force.
Further, in step A the offline training process, the training data and the training results are completed and stored on the PC, and the parameters obtained by training are then downloaded to the STM32 module 2.
The time-domain features of A12 are computed per channel as follows, where x(i) is the i-th sampled sEMG value, N is the number of data points per channel, μ is the mean of the N data points, and k = 1, …, M with M the number of channels:
MAV_k = (1/N) · Σ_{i=1}^{N} |x(i)|
RMS_k = sqrt( (1/N) · Σ_{i=1}^{N} x(i)^2 )
SD_k = sqrt( (1/(N−1)) · Σ_{i=1}^{N} (x(i) − μ)^2 )
WL_k = Σ_{i=1}^{N−1} |x(i+1) − x(i)|
The offline training above is carried out only when the prosthetic hand is fitted: the grasp action of each level is repeated as required and the training data are saved; the data are processed into time-domain feature samples, and the training algorithm then yields the parameters of the sEMG-force regression models, which are used for online recognition. Because step A (offline training) and step B (online recognition and grip-force prediction) are identical up to the time-domain feature processing, only online recognition and grip-force prediction are elaborated below.
In step B, online recognition and grip-force prediction, the STM32 module 2 reads and processes the sEMG signals and recognizes the grip-force value of each level.
The fuzzy controller module 3 controls the direction and speed of the prosthetic finger-drive motors from the error, and the change of error, between the theoretical grip-force value predicted by the STM32 module 2 and the actual grip-force value of the prosthetic hand, thereby achieving grip-force control.
The extraction of time-domain features and PCA dimensionality reduction in B2 comprises the following steps:
B21: read the M channels of sEMG data from the MYO armband 11;
B22: determine the start and end times of the hand action from the time-domain feature MAV of each sEMG channel;
B23: within the start and end times of the hand action, successively intercept a segment of each sEMG channel as a one-dimensional signal sequence and extract n time-domain features from each sequence, obtaining an M × n time-domain feature matrix;
B24: reduce the dimension of the time-domain feature matrix with PCA, obtaining an M × k dimension-reduced time-domain feature matrix.
The fuzzy controller module 3 of B5 uses an online-computed fuzzy control algorithm, with the error and the change of error as inputs and the speed and direction of the motors as outputs.
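The patent does not disclose the rule base or membership functions of this fuzzy controller. As a rough sketch of the structure it describes — error and error change in, a signed motor command (direction plus speed) out — a minimal Mamdani-style step might look as follows; the fuzzy sets, rule table and centroids below are illustrative assumptions, not taken from the patent:

```python
def fuzzy_motor_command(error, d_error):
    """One Mamdani-style fuzzy control step.  Inputs: grip-force error and
    its change, normalised to [-1, 1].  Output: signed command in [-1, 1];
    the sign gives the motor direction and the magnitude the speed."""
    # Triangular sets N/Z/P over the normalised universe (assumed shapes).
    sets = {"N": (-2, -1, 0), "Z": (-1, 0, 1), "P": (0, 1, 2)}
    centroid = {"N": -1.0, "Z": 0.0, "P": 1.0}
    # Illustrative PD-like rule table: (error label, d_error label) -> output.
    rules = {("N", "N"): "N", ("N", "Z"): "N", ("N", "P"): "Z",
             ("Z", "N"): "N", ("Z", "Z"): "Z", ("Z", "P"): "P",
             ("P", "N"): "Z", ("P", "Z"): "P", ("P", "P"): "P"}

    def tri(x, a, b, c):
        # Triangular membership degree of x in the set (a, b, c).
        return max(0.0, min((x - a) / (b - a), (c - x) / (c - b)))

    num = den = 0.0
    for (le, ld), lo in rules.items():
        w = min(tri(error, *sets[le]), tri(d_error, *sets[ld]))  # AND = min
        num += w * centroid[lo]
        den += w
    return num / den if den else 0.0  # weighted-centroid defuzzification
```

A zero error with zero error change yields a zero command (motor holds), while a full positive error drives the motor at full speed in the closing direction under these assumed rules.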
The motor in the vibrotactile sensor produces 4 vibration frequencies, each with two different amplitudes; the combinations produce vibration signals of 8 different frequency-amplitude pairs, corresponding to the 8 grip-force levels.
The actual grip force of the prosthetic hand is measured by the FSR force-sensor patches mounted on the finger pads in the prosthetic hand module 4; the vibrotactile sensor converts it into the vibration signal of the corresponding level and transmits it to the patient's residual arm, so that the patient perceives the actual grip-force level of the prosthetic hand and adjusts the grip force accordingly, realizing accurate grasping of the prosthetic hand.
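The 4-frequencies-times-2-amplitudes coding of the 8 levels can be sketched as a small lookup; the concrete frequency and amplitude values below are assumptions for illustration only, since the patent does not specify them:

```python
# Assumed values; the patent only specifies 4 frequencies x 2 amplitudes.
FREQS = (50, 100, 150, 200)   # vibration frequencies in Hz (assumed)
AMPS = (0.5, 1.0)             # relative vibration amplitudes (assumed)

def vibration_code(level):
    """Map grip-force level 1..8 to a distinct (frequency, amplitude) pair."""
    if not 1 <= level <= 8:
        raise ValueError("level must be 1..8")
    i = level - 1
    return FREQS[i // 2], AMPS[i % 2]  # 4 x 2 = 8 distinct combinations
```

Each of the eight levels thus gets a unique frequency-amplitude pair the wearer can learn to distinguish.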
In step B22, the start and end times of the hand action are determined as follows:
compute the mean absolute value of each sEMG channel and sum over the channels; compare the sum with a preset threshold to judge the start and stop of the action, and take the corresponding time points as the action endpoints.
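The summed-MAV thresholding described above can be sketched as follows; the window length and threshold value are assumptions for illustration (at the 200 Hz sampling rate mentioned later, a 40-sample window spans 200 ms):

```python
import numpy as np

def detect_action(semg, threshold, win=40):
    """Sliding-window action detection: for each window, sum the per-channel
    mean absolute values (MAV) and compare the sum with a preset threshold.
    `semg` has shape (channels, samples).  Returns a boolean array marking
    which windows are judged active; the first/last active windows give the
    action onset/offset."""
    n_ch, n = semg.shape
    n_win = n // win
    active = np.zeros(n_win, dtype=bool)
    for w in range(n_win):
        seg = semg[:, w * win:(w + 1) * win]
        mav_sum = np.mean(np.abs(seg), axis=1).sum()  # sum of channel MAVs
        active[w] = mav_sum > threshold
    return active
```

In practice the threshold would be calibrated per wearer so that rest-state muscle noise stays below it.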
Embodiment of the invention
B1, collecting the sEMG signals
The sEMG signals of the arm muscles are acquired through the M channels of the MYO armband 11, and the sEMG data are read.
B21, reading the sEMG signals
The M channels of sEMG data of the MYO armband 11 are read through the STM32 module 2.
B22, detection of the hand-action endpoints
For the collected sEMG signals, first compute the time-domain mean absolute value MAV of each channel, where x(i) is the i-th sampled sEMG value and k = 1, …, M with M the number of channels:
MAV_k = (1/N) · Σ_{i=1}^{N} |x(i)|    (1)
The MAV values of all channels are summed and compared with the preset threshold to judge the action endpoints.
B23, extraction of the time-domain features
After the active segment is detected, the sEMG data of each channel within 100-200 ms after the action onset are intercepted; the number of data points per channel is denoted N. The extracted time-domain features are the mean absolute value MAV, the root-mean-square value RMS, the standard deviation SD and the waveform length WL, computed per channel k = 1, …, M (x(i) is the i-th sampled sEMG value, N the number of data points per channel, μ the mean of the N data points):
(1) mean absolute value (MAV): given by formula (1) above; the MAV features of the M sEMG channels are denoted MAV_1, MAV_2, …, MAV_M;
(2) root-mean-square value (RMS): RMS_k = sqrt( (1/N) · Σ_{i=1}^{N} x(i)^2 );
(3) standard deviation (SD): SD_k = sqrt( (1/(N−1)) · Σ_{i=1}^{N} (x(i) − μ)^2 );
(4) waveform length (WL): WL_k = Σ_{i=1}^{N−1} |x(i+1) − x(i)|.
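The four time-domain features can be computed with a short sketch; `time_domain_features` and `feature_matrix` are hypothetical helper names, and stacking the per-channel features into one row vector matches the 12-dimensional (3 channels × 4 features) matrix used later in the embodiment:

```python
import numpy as np

def time_domain_features(channel):
    """MAV, RMS, SD and WL of one sEMG channel (1-D array of N samples)."""
    x = np.asarray(channel, dtype=float)
    n = x.size
    mav = np.mean(np.abs(x))                             # mean absolute value
    rms = np.sqrt(np.mean(x ** 2))                       # root-mean-square
    sd = np.sqrt(np.sum((x - x.mean()) ** 2) / (n - 1))  # sample std. dev.
    wl = np.sum(np.abs(np.diff(x)))                      # waveform length
    return mav, rms, sd, wl

def feature_matrix(semg):
    """Stack the 4 features of each of the M channels into one row of
    4*M values (e.g. 12 values for M = 3)."""
    return np.concatenate([time_domain_features(ch) for ch in semg])
```

The row-vector layout is one reasonable reading of the patent's 1 × n feature matrix; the original does not fix the ordering of channels versus features.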
B24, PCA dimensionality reduction of the time-domain feature matrix
For the M selected sEMG channels, let the total number of features obtained by the time-domain extraction above be n, so the assembled feature matrix has dimension 1 × n. If the projection matrix obtained by the PCA algorithm during training is U, of dimension n × k, then each feature matrix extracted during online recognition becomes, after multiplication by U, a 1 × k matrix; that is, the original n dimensions are reduced to k dimensions.
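The train-then-project split described above — fit U offline, multiply online rows by it — can be sketched as follows; the function names are hypothetical, and mean-centering before projection is a standard PCA detail the patent does not spell out:

```python
import numpy as np

def fit_pca(X, k):
    """Learn the PCA projection from training feature rows X (samples x n):
    returns the feature mean and the n x k matrix U whose columns are the
    top-k principal directions (right singular vectors of the centred data)."""
    mu = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, vt[:k].T  # U has shape n x k

def reduce_features(row, U, mu):
    """Project one 1 x n online feature row down to 1 x k."""
    return (np.asarray(row) - mu) @ U
```

During online recognition only `reduce_features` runs on the STM32-side data; `fit_pca` belongs to the offline-training PC.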
A1, pattern-classification training
Construct a 3-layer BP neural network. The number of input neurons is the dimension K of the time-domain feature matrix after PCA reduction, and the number of output neurons is the number C of grip-force levels. The number H of hidden-layer neurons has little effect on performance; it can first be set to 10 or calculated from empirical formula (5), and then adjusted during training according to the training accuracy.
The grasp action corresponding to each grip-force level is performed 200 times; the extracted, PCA-reduced time-domain features form the training sample set. Training the BP neural network model yields the classification weights and thresholds w1, b1, w2, b2, where w1, b1 are the weights and thresholds from the input layer to the hidden layer and w2, b2 those from the hidden layer to the output layer.
The implementation of the invention is illustrated by the recognition of eight grip-force levels for prosthetic hand grip-force control.
The eight grip-force levels are: (1) 0-2 N, (2) 2-4 N, (3) 4-6 N, (4) 6-8 N, (5) 8-10 N, (6) 10-12 N, (7) 12-14 N, (8) 14-16 N.
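Mapping a measured force to one of these eight 2 N-wide levels is a simple binning; the helper name, the boundary convention (a boundary value goes to the upper level) and the clamping outside [0, 16] N are assumptions the patent does not specify:

```python
def grip_level(force_newtons):
    """Map a measured grip force to the eight 2 N-wide levels
    (1: 0-2 N, ..., 8: 14-16 N).  Boundary values go to the upper level
    and forces outside [0, 16] N are clamped (assumed conventions)."""
    f = min(max(force_newtons, 0.0), 16.0)
    return min(int(f // 2) + 1, 8)
```

This is the level index that the classification network's output neuron is trained to reproduce.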
Hand movement is caused by contraction of the forearm muscles, so the MYO armband 11 is worn on the forearm near the elbow joint (arm horizontal, palm facing down, with the "Logo LED" of the MYO armband 11 on the side opposite the palm).
Because the maximum data-output frequency of the MYO armband 11 is 200 Hz, the present invention sets the data sampling rate to 200 Hz and reads M (M = 3) channels of sEMG signals each time. Fig. 3 shows the force signal of a 6-8 N grip and one channel of the corresponding sEMG signals.
Considering the cost and ease of installation of a practical prosthetic hand controller, the experiment studied sEMG data collected over 8 channels from two healthy subjects. This included collecting the 8 grip-force levels with the six-axis force sensor 12 and the corresponding 3 channels of sEMG signals with the MYO armband 11, and then carrying out hand-action endpoint detection, time-domain feature extraction, pattern classification, grip-force prediction, prosthetic motor control and grip-force feedback. To improve the real-time performance of prosthetic grasping, time-domain features are used; the 4 extracted features are the mean absolute value (MAV), root-mean-square value (RMS), standard deviation (SD) and waveform length (WL). The time-domain feature matrix collected this way has 12 dimensions (3 × 4); it is reduced to 9 dimensions with PCA and then fed into the pattern-classification model for training.
In the present invention, the model is a three-layer neural network. The number of input-layer nodes is determined by the dimension of the reduced time-domain feature matrix, i.e., 9; the number of output-layer neurons is determined by the number of grip-force levels, here 8; the number of hidden-layer nodes can be calculated by formula (5) and is 8.
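A minimal sketch of the 9-8-8 classifier's forward pass, using the w1, b1, w2, b2 naming from the text, is shown below. The sigmoid activations are an assumption (the patent does not state the activation functions), and formula (5) for sizing the hidden layer is not reproduced here:

```python
import numpy as np

def bp_forward(x, w1, b1, w2, b2):
    """Forward pass of the three-layer 9-8-8 BP classifier.

    x  : (9,) reduced feature vector
    w1 : (8, 9) input-to-hidden weights, b1 : (8,) hidden thresholds
    w2 : (8, 8) hidden-to-output weights, b2 : (8,) output thresholds
    Returns the predicted grip level 1-8 (argmax output neuron).
    """
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    hidden = sigmoid(w1 @ x + b1)
    out = sigmoid(w2 @ hidden + b2)
    return int(np.argmax(out)) + 1
```

In deployment, the trained w1, b1, w2, b2 would be written into the STM32 firmware and only this forward pass would run online.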
The weights and thresholds of the BP neural network obtained from the above offline simulation are written into the system to realize online grip-force level recognition. In the experiments, the grasping motion for each grip-force level is performed 200 times, and the recognition output of the system is observed after the gesture motion stabilizes. In the same way, 200 groups of data are collected from another subject, and the weights and thresholds obtained from that training are written into the system for the subsequent online recognition experiments.
A2. sEMG-force regression model training
A three-layer BP network is constructed with 9 input neurons, 1 output neuron, and a number of hidden nodes given by formula (5), here 3. The PCA-reduced time-domain feature matrix from step A1 and the grip-force value data together form the training sample set, and the parameters of the sEMG-force regression model for each grip-force level are obtained by offline training.
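A hypothetical sketch of assembling the per-level training sets for the regression models (one sEMG-force model per grip-force level, as in steps A21-A22); the function name and data layout are assumptions:

```python
import numpy as np

def build_training_sets(features, forces, levels):
    """Group (feature vector, force value) pairs by grip level so that a
    separate sEMG-force regression model can be trained per level.

    features : (samples, 9) PCA-reduced feature matrix
    forces   : (samples,) measured grip forces in newtons
    levels   : (samples,) grip level labels 1-8
    Returns {level: (feature_subset, force_subset)}.
    """
    sets = {}
    for lvl in np.unique(levels):
        mask = levels == lvl
        sets[int(lvl)] = (features[mask], forces[mask])
    return sets
```

Each per-level subset would then be fed to its own 9-3-1 BP regression network for offline training.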
The parameters of each per-level sEMG-force regression model obtained from the above simulation are written into the system to realize online grip-force prediction. In the same way, the parameters trained on another subject's data are written into the system and the online grip-force prediction performance is observed.
B3. Grip-force level recognition
The MYO armband 11 collects the sEMG signals of the patient's residual arm; the STM32 module 2 reads the sEMG data and processes it by the method of step A1 to obtain the 9-dimensional time-domain feature matrix. This matrix is fed into the neural network model trained in step A1, and the grip-force level corresponding to the output neuron with the largest value is taken as the current grip-force level of the hand.
In online recognition, sEMG acquisition, hand-motion endpoint detection, time-domain feature extraction, PCA dimensionality reduction, grip-force level recognition, and grip-force prediction are all performed by the STM32 module 2.
B4. Grip-force prediction
The BP neural network parameters for each grip-force level obtained by simulation are written into the system. According to the level identified in step B3, the reduced time-domain feature matrix is fed into the sEMG-force regression model of the corresponding level to predict the grip-force value.
B5. Grip-force control
The fuzzy controller module 3 compares the theoretical grip-force value predicted by the STM32 module 2 with the actual grip-force value of the prosthetic hand, and from the resulting error and error change controls the direction and speed of the prosthetic hand's finger drive motor, thereby controlling the grip force. Denote the error by e and the error change by e'. The fuzzy inference rules are:
(1) if e is positive and large and e' is large, or e is positive and large and e' is small, the motor rotates forward at high speed;
(2) if e is positive and small and e' is large, the motor rotates forward at medium speed;
(3) if e is positive and small and e' is small, the motor rotates forward at low speed;
(4) if e is close to 0 and e' is large, or e is close to 0 and e' is small, the motor does not rotate;
(5) if e is negative and small and e' is small, the motor rotates in reverse at low speed;
(6) if e is negative and small and e' is large, the motor rotates in reverse at medium speed;
(7) if e is negative and large and e' is large, or e is negative and large and e' is small, the motor rotates in reverse at high speed.
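The seven rules above can be approximated by the crisp decision table below. This is a simplification for illustration only: the actual module uses fuzzy membership functions and online fuzzy inference, and the thresholds and dead band here are assumptions:

```python
def motor_command(e, de, big_e=2.0, big_de=1.0, dead=0.2):
    """Crisp approximation of fuzzy rules (1)-(7).

    e  : grip-force error (setpoint minus measured force)
    de : error change
    Returns (direction, speed); threshold values are illustrative assumptions.
    """
    if abs(e) <= dead:
        return ("stop", "none")                   # rule (4): e close to 0
    direction = "forward" if e > 0 else "reverse"
    if abs(e) >= big_e:
        speed = "high"                            # rules (1) and (7): |e| large
    elif abs(de) >= big_de:
        speed = "medium"                          # rules (2) and (6)
    else:
        speed = "low"                             # rules (3) and (5)
    return (direction, speed)
```

Note that when |e| is large the speed is high regardless of e', matching the "or" clauses of rules (1) and (7).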
C. Grip-force feedback
Grip-force feedback uses a vibrotactile sensor. The motor in the sensor can produce 4 vibration frequencies, each with two different amplitudes, giving 8 combinations that represent the 8 grip-force levels. The actual grip force of the prosthetic hand is measured by the FSR force sensor on the finger pad, converted into the signal of the corresponding grip-force level, and sent to the vibration feedback device. The vibrotactile sensor generates a vibration signal whose frequency and amplitude correspond to the level and transmits it to the patient's residual arm, so the patient perceives the actual grip-force level of the prosthetic hand and adjusts the grip force accordingly, realizing accurate grasping. The vibration feedback device improves grasping accuracy and the operating experience.
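For illustration, the 4-frequency-by-2-amplitude encoding can be expressed as a lookup. The specific frequency and amplitude values below are assumptions (the patent does not give numeric values); only the 4 × 2 = 8 combination structure comes from the text:

```python
def vibration_code(level: int):
    """Encode a grip level 1-8 as a (frequency, amplitude) pair.

    Four frequencies x two amplitudes give eight distinct codes; the
    frequency values in Hz and amplitude labels are illustrative only.
    """
    freqs = (60, 120, 180, 240)   # assumed vibration frequencies, Hz
    amps = ("low", "high")        # two amplitudes per frequency
    if not 1 <= level <= 8:
        raise ValueError("grip level must be 1-8")
    idx = level - 1
    return (freqs[idx // 2], amps[idx % 2])
```

The eight codes are pairwise distinct, so each grip-force level maps to a unique tactile sensation on the residual arm.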
In summary, the technical contribution of the invention is patient-centered, humanized online control of grip force, providing a material technical basis for the rehabilitation of physically disabled patients.
The following advances are achieved:
(1) The sEMG signals of the arm muscles are acquired by the MYO armband and read through its Bluetooth interface, which greatly improves the cost-effectiveness and application prospects of prosthesis control.
(2) A neural network model library is established by offline training in advance; offline training and online recognition are carried out on different platforms, which reduces the cost and increases the speed of online recognition.
(3) In step S2, active-segment detection is performed with time-domain feature values; these are simple to compute and allow the active segment to be determined quickly, laying a solid foundation for the real-time performance of the prosthetic hand.
(4) During online recognition, PCA dimensionality reduction of the time-domain feature matrix lowers the processor load, improves recognition accuracy, and improves the real-time performance of prosthetic hand control.
(5) Using the time-domain mean absolute value, root mean square, standard deviation, and waveform length as feature values yields a higher action recognition rate.
(6) Dividing grip-force values into levels improves the recognition rate and grasping accuracy.
(7) The channel number M is chosen as 3; 3 channels can accurately recognize the 8 grip-force levels, meeting daily grasping needs while balancing the reliability and speed of grip-force recognition.
(8) The vibrotactile sensor feeds the actual grip force of the prosthetic hand back to the patient's residual arm, realizing closed-loop control; the patient, as part of the closed loop, improves the operating experience.

Claims (13)

1. A prosthetic hand control system based on a MYO armband, characterized by comprising: a signal acquisition module (1), an STM32 module (2), a fuzzy controller module (3), a prosthetic hand module (4), a grip-force feedback module (5), and a PC for offline training;
the signal acquisition module (1) comprises a MYO armband (11) worn on the upper part of the patient's residual arm for collecting sEMG data, and a six-axis force sensor (12) for collecting grip-force value data of a normal hand;
the STM32 module (2) is used for reading and processing the sEMG data;
the fuzzy controller module (3) receives the signal transmitted by the STM32 module (2) and sends commands to drive the prosthetic hand to grasp;
the prosthetic hand module (4) is composed of the prosthetic hand body, motors driving the finger motion, and FSR force-sensor patches on the finger pads of the prosthetic hand;
the grip-force feedback module (5), worn on the patient's residual arm and composed of a vibrotactile sensor, receives the grip-force value signal of the FSR force sensor and feeds it back to the patient's residual arm, so that the patient can perceive the magnitude of the grip force.
2. The prosthetic hand control system based on a MYO armband as claimed in claim 1, characterized in that the collected arm sEMG data are read through the Bluetooth interface of the MYO armband (11);
the grip-force value data of the normal hand collected by the six-axis force sensor (12) are read by the PC; the channel number is selected according to the 8 grip-force levels to be recognized; to balance recognition rate and real-time performance and meet daily grasping needs, the channel number M is 3, and 3 channels can accurately recognize the 8 grip-force levels.
3. A control method of the prosthetic hand control system based on a MYO armband, characterized by comprising: step A, offline training; step B, online recognition and grip-force prediction; and step C, grip-force feedback.
4. The control method of the prosthetic hand control system based on a MYO armband as claimed in claim 3, characterized in that step A, offline training, comprises: A1, pattern classification training, and A2, sEMG-force regression model training;
the A1 pattern classification training comprises:
A11: dividing the grip force into different levels and collecting the sEMG data of a normal person corresponding to the grip-force values of each level, the grip-force value data of the normal hand being collected by the six-axis force sensor (12);
A12: extracting the time-domain feature values: mean absolute value MAV, root mean square RMS, standard deviation SD, and waveform length WL;
A13: after PCA dimensionality reduction of the time-domain feature matrix, performing pattern classification training with the BP neural network and saving the trained weights and thresholds;
the A2 sEMG-force regression model training comprises:
A21: building the training samples from the reduced time-domain feature matrix together with the grip-force value data;
A22: training the sEMG-force regression model of each grip-force level separately;
A23: saving the parameters of each trained model.
5. The control method of the prosthetic hand control system based on a MYO armband as claimed in claim 3, characterized in that step B, online recognition and grip-force prediction, comprises:
B1: collecting the sEMG signals of the patient's arm muscles in real time through the MYO armband (11);
B2: reading the sEMG data through the STM32 module (2), detecting the hand-motion endpoints, extracting the time-domain feature values, and performing PCA dimensionality reduction;
B3: recognizing the corresponding grip-force level according to the result of the pattern classification training of step A1;
B4: predicting the grip-force value of the hand according to the result of the sEMG-force regression model training of step A2, and using that value as the setpoint signal of the grip-force control;
B5: controlling the speed and direction of the finger drive motors through the fuzzy controller module (3) to control the grip force of the prosthetic hand to the given level.
6. The control method of the prosthetic hand control system based on a MYO armband as claimed in claim 3, characterized in that step C, grip-force feedback, comprises: using the different frequencies and amplitudes of the vibrotactile sensor as the grip-force feedback signal, so that the amputee can perceive the level of the actual grip force.
7. The control method of the prosthetic hand control system based on a MYO armband as claimed in claim 4, characterized in that step A, offline training, is completed on the PC, which stores the training process, training data, and training results, and the trained parameters are then downloaded to the STM32 module (2).
8. The control method of the prosthetic hand control system based on a MYO armband as claimed in claim 4, characterized in that the time-domain feature values of A12 are calculated as follows:
$$\mathrm{MAV}_k = \frac{1}{N}\sum_{i=1}^{N}\lvert x(i)\rvert \qquad (1)$$
$$\mathrm{RMS}_k = \sqrt{\frac{1}{N}\sum_{i=1}^{N} x(i)^2} \qquad (2)$$
$$\mathrm{SD}_k = \sqrt{\frac{1}{N-1}\sum_{i=1}^{N}\bigl(x(i)-\mu\bigr)^2} \qquad (3)$$
$$\mathrm{WL}_k = \sum_{i=1}^{N-1}\lvert x(i+1)-x(i)\rvert \qquad (4)$$
where x(i) is the sEMG signal sample at each sampling instant, N is the number of data points per channel, μ is the mean of the N data points, and k = 1, ..., M, where M is the channel number.
9. The control method of the prosthetic hand control system based on a MYO armband as claimed in claim 5, characterized in that in step B, online recognition and grip-force prediction, the sEMG signals are read and processed and the grip-force value of each level is recognized by the STM32 module (2);
the speed and direction of the motors on the prosthetic hand are controlled by the fuzzy controller module (3) to realize the grasping action of the prosthetic hand.
10. The control method of the prosthetic hand control system based on a MYO armband as claimed in claim 5, characterized in that B2, extracting the time-domain feature values and performing PCA dimensionality reduction, comprises the following steps:
B21: reading the M channels of sEMG data of the MYO armband (11);
B22: determining the start and end time points of the hand motion according to the time-domain feature value MAV of each channel's sEMG signal;
B23: successively intercepting a segment of the sEMG signal within the hand-motion start-end interval as a one-dimensional signal sequence, extracting n time-domain feature values from each sequence, and obtaining an M × n time-domain feature matrix;
B24: reducing the time-domain feature matrix with PCA to obtain an M × k reduced time-domain feature matrix.
11. The control method of the prosthetic hand control system based on a MYO armband as claimed in claim 5, characterized in that the B5 fuzzy controller module (3) uses an online-computed fuzzy control algorithm with the error and error change as inputs and the motor speed and direction as outputs.
12. The control method of the prosthetic hand control system based on a MYO armband as claimed in claim 6, characterized in that the motor in the vibrotactile sensor produces 4 vibration frequencies, each with two different amplitudes, the combinations producing 8 vibration signals of different frequency and amplitude corresponding to the 8 grip-force levels;
the actual grip force of the prosthetic hand, measured by the FSR force-sensor patch arranged on the finger pad of the prosthetic hand in the prosthetic hand module (4), is converted into the vibration signal of the corresponding level by the vibrotactile sensor and transmitted to the patient's residual arm, so that the patient perceives the actual grip-force level of the prosthetic hand and adjusts the grip force accordingly, realizing accurate grasping by the prosthetic hand.
13. The control method of the prosthetic hand control system based on a MYO armband as claimed in claim 10, characterized in that in step B22, the start and end time points of the hand motion are determined as follows:
the mean absolute values of the sEMG signals of all channels are computed and summed, and the sum is compared with a preset threshold to determine the start and end of the motion, the corresponding time points being taken as the motion endpoints.
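As an informal illustration (not part of the claims), the threshold test described above can be sketched as follows; the window layout and threshold value are assumptions:

```python
import numpy as np

def motion_active(window: np.ndarray, threshold: float) -> bool:
    """Onset/offset test of the method above: sum the per-channel MAV of a
    (channels, N) sEMG window and compare it with a preset threshold.

    A transition of this flag from False to True marks the motion start
    point; True to False marks the end point."""
    mav_sum = float(np.sum(np.mean(np.abs(window), axis=1)))
    return mav_sum > threshold
```

Scanning consecutive windows and recording where this flag changes yields the start and end time points of the hand motion.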
CN201710168821.XA 2017-03-21 2017-03-21 Prosthetic hand control system and its control method based on MYO armlets Pending CN106890038A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710168821.XA CN106890038A (en) 2017-03-21 2017-03-21 Prosthetic hand control system and its control method based on MYO armlets


Publications (1)

Publication Number Publication Date
CN106890038A true CN106890038A (en) 2017-06-27

Family

ID=59194087

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710168821.XA Pending CN106890038A (en) 2017-03-21 2017-03-21 Prosthetic hand control system and its control method based on MYO armlets

Country Status (1)

Country Link
CN (1) CN106890038A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109646242A (en) * 2017-10-12 2019-04-19 上海师范大学 A kind of Table top type upper limb rehabilitation robot and its application method
CN110859687A (en) * 2018-08-28 2020-03-06 深圳先进技术研究院 Vibration arm ring, tactile sense detection system, vibration control method, and tactile sense detection method
CN113171214A (en) * 2021-05-27 2021-07-27 山东大学 Multi-path feedback myoelectricity control prosthetic hand based on self-adaptive enhanced classifier and method
CN113952091A (en) * 2021-12-06 2022-01-21 福州大学 Multi-sensor fusion type artificial hand gripping force feedback control method
CN114115531A (en) * 2021-11-11 2022-03-01 合肥工业大学 End-to-end sign language identification method based on attention mechanism

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102228399A (en) * 2011-06-17 2011-11-02 东南大学 Myoelectricity artificial hand grip strength controlling method based on fuzzy controller
CN102379760A (en) * 2011-08-25 2012-03-21 东南大学 Myoelectric prosthetic hand force tactile feedback method and tactile feedback myoelectric prosthetic hand system
CN103519924A (en) * 2013-10-22 2014-01-22 深圳先进技术研究院 Intelligent artificial hand system
CN105943206A (en) * 2016-06-01 2016-09-21 上海师范大学 Prosthetic hand control method based on MYO armlet


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Zhu Gaoke: "Research on synchronous myoelectric decoding of grasp pattern and force and myoelectric control of prosthetic hands", China Masters' Theses Full-text Database, Information Science and Technology Series *
Liang Fuxin: "Research on hand motion pattern recognition and grasp force prediction based on surface EMG signals", China Masters' Theses Full-text Database, Medicine and Health Sciences Series *


Similar Documents

Publication Publication Date Title
CN106890038A (en) Prosthetic hand control system and its control method based on MYO armlets
CN110653817B (en) Exoskeleton robot power-assisted control system and method based on neural network
CN104107134B (en) Upper limbs training method and system based on EMG feedback
CN105943206A (en) Prosthetic hand control method based on MYO armlet
US10959863B2 (en) Multi-dimensional surface electromyogram signal prosthetic hand control method based on principal component analysis
Huang et al. Development of a myoelectric discrimination system for a multi-degree prosthetic hand
CN102622605B (en) Surface electromyogram signal feature extraction and action pattern recognition method
CN110179643A (en) A kind of neck rehabilitation training system and training method based on annulus sensor
CN106980367A (en) A kind of gesture identification method based on myoelectricity topographic map
CN106109174A (en) A kind of healing robot control method based on myoelectric feedback impedance self-adaptive
CN108983973A (en) A kind of humanoid dexterous myoelectric prosthetic hand control method based on gesture identification
CN202288542U (en) Artificial limb control device
CN108958474A (en) A kind of action recognition multi-sensor data fusion method based on Error weight
Du et al. EMG-based continuous motion decoding of upper limb with spiking neural network
KR100994408B1 (en) Method and device for deducting pinch force, method and device for discriminating muscle to deduct pinch force
Lin et al. A normalisation approach improves the performance of inter-subject sEMG-based hand gesture recognition with a ConvNet
CN113952091A (en) Multi-sensor fusion type artificial hand gripping force feedback control method
Zhang et al. The design of a hemiplegic upper limb rehabilitation training system based on surface EMG signals
CN105813011B (en) Robot device based on cell-phone customer terminal perceives signal detecting method when participating in the cintest
Wang et al. EMG signal classification for myoelectric teleoperating a dexterous robot hand
Sakr et al. Regressing force-myographic signals collected by an armband to estimate torque exerted by the wrist: A preliminary investigation
Guo et al. A novel fuzzy neural network-based rehabilitation stage classifying method for the upper limb rehabilitation robotic system
Hasan et al. Hand Gesture Recognition Based on Surface Electromyogram Signal (sEMG) with Muscular Contraction Level and Real Time Implementation on An Artificial Prosthetic Wrist Using Artificial Neural Network (ANN)
Cui et al. sEMG-based prediction of human lower extremity movements by using a dynamic recurrent neural network
CN111897415A (en) Virtual artificial hand compliance visual control method based on electromyographic signals and variable stiffness control

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20170627