CN104997582B - Device and method for controlling intelligent artificial limb based on eye and lower jaw electromyographic signals - Google Patents

Device and method for controlling intelligent artificial limb based on eye and lower jaw electromyographic signals

Info

Publication number
CN104997582B
CN104997582B
Authority
CN
China
Prior art keywords
module
eye
lower jaw
electromyographic signal
signal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510457634.4A
Other languages
Chinese (zh)
Other versions
CN104997582A (en
Inventor
白殿春
杨俊友
张守先
横井浩史
孙柏青
姜银来
苏笑滢
杨光
张家晋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yidong (Shenzhen) Photoelectric Technology Co., Ltd
Original Assignee
Shenyang University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenyang University of Technology filed Critical Shenyang University of Technology
Priority to CN201510457634.4A priority Critical patent/CN104997582B/en
Publication of CN104997582A publication Critical patent/CN104997582A/en
Application granted granted Critical
Publication of CN104997582B publication Critical patent/CN104997582B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

A device for controlling an intelligent artificial limb based on eye and lower-jaw electromyographic (EMG) signals comprises an eye/lower-jaw surface EMG signal acquisition module, an EMG signal processing module, an eye/lower-jaw movement feature extraction module, a feature matching module, and a prosthesis control module. The acquisition module is connected to the signal processing module, the signal processing module to the feature extraction module, the feature extraction module to the feature matching module, and the feature matching module to the prosthesis control module. The range of usable movements is large, the signals are easy to acquire, and the precision is high. Based on detailed EMG analysis results, a low-complexity prosthesis is driven to complete higher-level tasks. The relationship between the signals and the different movement patterns is found through a variety of methods, making the signals a reliable control source for prostheses. Accurate control of the prosthesis can thus be better realized, better meeting users' needs.

Description

Device and method for controlling an intelligent artificial limb based on eye and lower-jaw electromyographic signals
Technical field
The present invention relates to an artificial limb that imitates human hand movement, and more particularly to a device and method for controlling an intelligent artificial limb based on eye and lower-jaw electromyographic signals.
Background technology
As one of the body's vital organs, the human hand plays a very important role. In daily life it not only performs every kind of working activity but also occupies a key position when communicating with other people or exploring unknown objects. The hand's very fine structure lets it carry out a wide variety of complex functions, and long-term study and training make those complex functions achievable. These abilities are the result of the long mutual promotion, influence and evolution between the human brain and the hand, which is why the brain devotes extra attention when the hand completes a given movement pattern.
The bionic arm, a new generation of high-performance prosthesis developed in recent decades, is a frontier research topic of robotics and biomedical engineering. The intelligence of a bionic arm is mainly reflected in the multi-finger coordination of the prosthetic hand during complex actions, the automatic control of grip force when grasping objects, and its self-adaptive compliance to object shape. A bionic arm can freely control the prosthetic hand to complete various actions according to the intentions issued by the user's brain; when grasping an object it can perceive the object's surface and automatically adjust the parameters of the prosthetic-hand system so that the hand completes flexible, reliable actions.
Intelligent bionic arms currently rely mainly on brain-signal control, but brain control signals suffer from strong interference, unstable acquisition, and large errors.
The content of the invention
Object of the invention: the present invention provides a device and method for controlling an intelligent artificial limb based on eye and lower-jaw electromyographic signals, aiming to overcome the strong interference, unstable signal acquisition, and large errors of existing prosthetic hands.
Technical scheme: the present invention is achieved through the following technical solution:
The device for controlling an intelligent artificial limb based on eye and lower-jaw electromyographic signals is characterized in that it comprises an eye/lower-jaw surface EMG signal acquisition module, an EMG signal processing module, an eye/lower-jaw movement feature extraction module, a feature matching module, and a prosthesis control module. The acquisition module is connected to the signal processing module, the signal processing module to the feature extraction module, the feature extraction module to the feature matching module, and the feature matching module to the prosthesis control module.
The prosthesis control module comprises a four-finger motor, a thumb motor, and a wrist-joint motor, each connected to the feature matching module. Through mechanical transmissions, the four-finger motor drives the four fingers of the prosthetic hand, the thumb motor drives its thumb, and the wrist-joint motor drives its wrist joint.
In the control method implemented with the above device, the eye/lower-jaw surface EMG acquisition module collects the EMG signals produced by eye blinks or by opening and closing the lower jaw and sends them to the signal processing module. The feature extraction module passes the processed EMG signals to the feature pre-storage module; through the feature matching module, motor control of the prosthetic hand is realized. Eye-blink and jaw opening/closing events are then combined to produce a variety of distinct features, completing the control functions of the prosthetic hand.
The eye/lower-jaw surface EMG signal acquisition module obtains the surface EMG signals from the skin over the eyes and lower jaw and passes them to the EMG processing module. It mainly comprises three parts: acquisition-parameter setting, acquisition-time recording, and action guidance. Guided by the parameter settings and action prompts, it detects blink and jaw opening/closing action data and records the eye-closure time and jaw-action duration.
The eye/lower-jaw movement feature extraction module mainly consists of an eye-blink signal recognition unit and a jaw opening/closing signal recognition unit. It discriminates the corresponding action commands from the various features extracted for the different movement patterns; the motion features of the signal are extracted by time-domain, frequency-domain, time-frequency, nonlinear-entropy, and chaos analysis methods.
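The time-domain analysis named above is usually implemented with a small set of classic window features. As a minimal sketch (the function name and the zero-crossing noise threshold are illustrative assumptions, not from the patent), the four most common ones are mean absolute value, root-mean-square, zero crossings, and waveform length:

```python
import numpy as np

def time_domain_features(x, zc_threshold=0.01):
    """Classic time-domain EMG features for one analysis window."""
    x = np.asarray(x, dtype=float)
    mav = np.mean(np.abs(x))                      # mean absolute value
    rms = np.sqrt(np.mean(x ** 2))                # root mean square
    # zero crossings: sign changes whose amplitude step exceeds a noise threshold
    zc = np.sum((x[:-1] * x[1:] < 0) & (np.abs(np.diff(x)) > zc_threshold))
    wl = np.sum(np.abs(np.diff(x)))               # waveform length
    return {"MAV": mav, "RMS": rms, "ZC": int(zc), "WL": wl}

# Synthetic 10-cycle sinusoidal window, 20 samples per cycle
window = np.sin(2 * np.pi * np.arange(200) / 20)
feats = time_domain_features(window)
```

Each blink or jaw pattern produces a characteristic combination of these values, which downstream matching can discriminate.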
The steps of the method are as follows:
(1) The eye/lower-jaw modules are affixed at the eyes or the lower jaw; the surface EMG signals of the skin over the eyes and lower jaw are obtained and passed to the EMG processing module;
(2) The EMG processing module passes the signal to the eye/lower-jaw movement feature extraction module for feature extraction;
(3) The pattern matching module compares the incoming feature signal against stored patterns and sends the corresponding preset action command to the prosthesis control module; through the process of recognizing, confirming, and matching the action, it judges whether the extracted feature satisfies the expected action of the corresponding eye or lower-jaw movement feature vector, and if so, control passes to the prosthesis control module;
(4) The prosthesis control module drives the five-finger and wrist motors of the prosthetic hand to perform the corresponding commanded action.
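The four steps above can be sketched as a simple acquire-process-extract-match-control pipeline. This is an illustrative skeleton only: the stage functions, the mean-absolute-value feature, the threshold, and the command strings are all assumptions, not the patent's implementation.

```python
def acquire(raw_emg):
    # step 1: surface EMG from the eye/jaw electrodes (here: an already-sampled list)
    return list(raw_emg)

def process(signal):
    # step 2: crude conditioning stand-in -- remove the DC offset
    mean = sum(signal) / len(signal)
    return [s - mean for s in signal]

def extract_features(signal):
    # step 3: a single illustrative feature, mean absolute value
    return {"mav": sum(abs(s) for s in signal) / len(signal)}

def match(features, threshold=0.5):
    # step 4a: compare against a stored template; returns a command or None
    return "clench_fist" if features["mav"] > threshold else None

def control(command):
    # step 4b: forward the matched command to the five-finger/wrist motors
    return f"motors <- {command}" if command else "motors idle"

cmd = control(match(extract_features(process(acquire([0.0, 1.2, -1.1, 0.9])))))
```

The chained call mirrors the module connections of the device: each module's output is the next module's input.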
The eye/lower-jaw surface EMG signals are classified into six states for signal feature recognition: continuous left-eye blinking, continuous right-eye blinking, simultaneous continuous blinking of both eyes, opening the mouth, clenching the teeth, and sticking out the tongue.
When collecting the EMG signals for the six patterns, the user is asked to move as follows:
1) Clench fist: the left eye blinks five times in a row; the prosthetic hand changes from the naturally relaxed palm state to the clenched-fist state;
2) Extend fist: the right eye blinks five times in a row; the prosthetic hand changes from the naturally relaxed palm state to the extended-fist state;
3) Stretch palm: both eyes blink five times in a row; the prosthetic hand changes from the naturally relaxed palm state to the open-palm state;
4) Arm internal rotation: the mouth is held open for about 3 seconds; the prosthetic hand changes from the naturally relaxed palm state to the arm-internal-rotation state;
5) Arm external rotation: with the mouth closed, the teeth are clenched for 3 seconds; the prosthetic hand changes from the naturally relaxed palm state to the arm-external-rotation state;
6) Tongue out: the tongue is stuck out from its natural position five times in a row; the prosthetic hand returns to the naturally relaxed palm state.
EMG signal processing module: effective activity segments are extracted from the surface EMG signal by square moving-average detection; to reduce the effect of noise on the signal, Butterworth band-pass and band-stop digital filters are chosen for filtering.
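The two operations named here (Butterworth band-pass/band-stop filtering and square moving-average activity detection) can be sketched as follows. The sampling rate, filter orders, cut-off frequencies, window length, and threshold are illustrative assumptions; the patent does not specify them.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000.0  # assumed sampling rate in Hz

# Band-pass keeps a typical surface-EMG band; band-stop notches 50 Hz mains hum.
bp_b, bp_a = butter(4, [20 / (fs / 2), 450 / (fs / 2)], btype="bandpass")
bs_b, bs_a = butter(2, [48 / (fs / 2), 52 / (fs / 2)], btype="bandstop")

def preprocess(x):
    """Butterworth band-pass then band-stop, applied zero-phase."""
    return filtfilt(bs_b, bs_a, filtfilt(bp_b, bp_a, x))

def active_segments(x, win=50, rel_thresh=0.2):
    """Square moving-average detector: True where the windowed mean power
    exceeds a fraction of its maximum over the trace."""
    power = np.convolve(x ** 2, np.ones(win) / win, mode="same")
    return power > rel_thresh * power.max()

# Synthetic trace: low-level noise everywhere, a 100 Hz burst from 0.4 s to 0.6 s.
rng = np.random.default_rng(0)
t = np.arange(0, 1, 1 / fs)
x = 0.02 * rng.standard_normal(t.size)
x[400:600] += np.sin(2 * np.pi * 100 * t[400:600])

mask = active_segments(preprocess(x))  # True only around the burst
```

Only the burst region survives the detector, which is exactly the "effective activity segment" the module forwards to feature extraction.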
Eye/lower-jaw movement feature extraction module: features are extracted with the wavelet transform, and an artificial neural network performs pattern recognition on the subject's actions.
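A minimal sketch of wavelet-based feature extraction, under stated assumptions: the patent does not name a wavelet, so a hand-rolled single-step Haar DWT stands in for it, relative subband energies stand in for the feature vector, and a nearest-centroid rule stands in for the artificial neural network classifier.

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform."""
    x = np.asarray(x, dtype=float)
    if x.size % 2:                                # pad to an even length
        x = np.append(x, x[-1])
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)     # low-pass (approximation)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)     # high-pass (detail)
    return approx, detail

def wavelet_energy_features(x, levels=3):
    """Relative energy per subband over `levels` decomposition levels."""
    feats, current = [], x
    for _ in range(levels):
        current, detail = haar_dwt(current)
        feats.append(np.sum(detail ** 2))
    feats.append(np.sum(current ** 2))            # final approximation energy
    feats = np.asarray(feats)
    return feats / feats.sum()

def nearest_centroid(feature_vec, centroids):
    """Stand-in classifier: label of the closest stored class centroid."""
    dists = {label: np.linalg.norm(feature_vec - c) for label, c in centroids.items()}
    return min(dists, key=dists.get)
```

A slowly varying signal concentrates its energy in the final approximation band, while a rapidly alternating one concentrates it in the first detail band, so the energy vector separates slow jaw actions from fast blink bursts.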
Advantages and effects: the present invention provides a device and method for controlling an intelligent artificial limb based on eye and lower-jaw electromyographic signals. The range of usable movements is large, the signals are easy to acquire, and the precision is high. Based on detailed EMG analysis results, a low-complexity prosthesis is driven to complete higher-level tasks; the relationship between the signals and the different patterns is found through a variety of methods, making the signals a reliable control source for prostheses. Accurate control of the prosthetic hand can thus be better realized, better meeting users' needs.
Description of the drawings:
Fig. 1 is a schematic diagram of the control principle of the intelligent prosthetic-hand system controlled by eye or lower-jaw surface EMG signals;
Fig. 2 shows the control steps of the intelligent prosthetic hand controlled by eye or lower-jaw surface EMG signals;
Fig. 3 is a flowchart of one preset action.
Specific embodiment: the present invention is described further below with reference to the accompanying drawings:
As shown in Fig. 1, the present invention provides a device for controlling an intelligent artificial limb based on eye and lower-jaw electromyographic signals. It comprises an eye/lower-jaw surface EMG signal acquisition module, an EMG signal processing module, an eye/lower-jaw movement feature extraction module, a feature matching module, and a prosthesis control module; the acquisition module is connected to the signal processing module, the signal processing module to the feature extraction module, the feature extraction module to the feature matching module, and the feature matching module to the prosthesis control module.
The prosthesis control module comprises a four-finger motor, a thumb motor, and a wrist-joint motor connected to the feature matching module. Through mechanical transmissions, the four-finger motor drives the four fingers of the prosthetic hand, the thumb motor drives its thumb, and the wrist-joint motor drives its wrist joint.
In the control method based on eye and lower-jaw EMG signals, the eye/lower-jaw surface EMG acquisition module collects the EMG signals produced by eye blinks or by opening and closing the lower jaw and sends them to the signal processing module. The feature extraction module passes the processed EMG signals to the feature pre-storage module; through the feature matching module, motor control of the prosthetic hand is realized. Eye-blink and jaw opening/closing events are then combined to produce a variety of distinct features, completing the control functions of the prosthetic hand.
The eye/lower-jaw surface EMG acquisition module obtains the surface EMG signals from the skin over the eyes and lower jaw and passes them to the EMG processing module. It mainly comprises three parts: acquisition-parameter setting, acquisition-time recording, and action guidance. Guided by the parameter settings and action prompts, it detects blink and jaw opening/closing action data and records the eye-closure time and jaw-action duration.
The eye/lower-jaw movement feature extraction module mainly consists of an eye-blink signal recognition unit and a jaw opening/closing signal recognition unit, and discriminates the corresponding action commands from the various features extracted for the different movement patterns; here the motion features of the signal are extracted by the wavelet-transform method.
The steps of the method are as follows:
(1) The eye/lower-jaw modules are affixed at the eyes or the lower jaw; the surface EMG signals of the skin over the eyes and lower jaw are obtained and passed to the EMG processing module;
(2) The EMG processing module passes the signal to the eye/lower-jaw movement feature extraction module for feature extraction;
(3) The pattern matching module compares the incoming feature signal against stored patterns and sends the corresponding preset action command to the prosthesis control module; through the process of recognizing, confirming, and matching the action, it judges whether the extracted feature satisfies the expected action of the corresponding eye or lower-jaw movement feature vector, and if so, control passes to the prosthesis control module;
(4) The prosthesis control module drives the five-finger and wrist motors of the prosthetic hand to perform the corresponding commanded action.
The eye/lower-jaw surface EMG signals are classified into six states for signal feature recognition: continuous left-eye blinking, continuous right-eye blinking, simultaneous continuous blinking of both eyes, opening the mouth, clenching the teeth, and sticking out the tongue.
The eye/lower-jaw surface EMG signals are the signal source for controlling the prosthesis to execute commanded actions.
The prosthetic hand is required to move in the patterns of clenching the fist, extending the fist, stretching the palm, arm internal rotation, and arm external rotation.
When collecting the EMG signals for the six patterns, the user is asked to move as follows:
1) Clench fist: the left eye blinks five times in a row; the prosthetic hand changes from the naturally relaxed palm state to the clenched-fist state;
2) Extend fist: the right eye blinks five times in a row; the prosthetic hand changes from the naturally relaxed palm state to the extended-fist state;
3) Stretch palm: both eyes blink five times in a row; the prosthetic hand changes from the naturally relaxed palm state to the open-palm state;
4) Arm internal rotation: the mouth is held open for about 3 seconds; the prosthetic hand changes from the naturally relaxed palm state to the arm-internal-rotation state;
5) Arm external rotation: with the mouth closed, the teeth are clenched for 3 seconds; the prosthetic hand changes from the naturally relaxed palm state to the arm-external-rotation state;
6) Tongue out: the tongue is stuck out from its natural position five times in a row; the prosthetic hand returns to the naturally relaxed palm state.
The mapping is summarized in Table 1:
Table 1

Eye or lower-jaw movement feature vector | Corresponding prosthetic-hand motion pattern
---------------------------------------- | --------------------------------------------
Left eye blinks continuously (5 times)   | Clench fist
Right eye blinks continuously (5 times)  | Extend fist
Both eyes blink continuously (5 times)   | Stretch palm
Mouth opened (about 3 s)                 | Arm internal rotation
Teeth clenched (3 s)                     | Arm external rotation
Tongue stuck out (5 times)               | Relax naturally
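Table 1 is a direct lookup from a matched feature-vector label to a motor command, so it can be expressed as a dictionary. The label and command strings here are illustrative names, not identifiers from the patent:

```python
# Feature-vector label -> prosthetic-hand motion pattern (Table 1).
# "5t" = five consecutive repetitions; "3s" = held for about three seconds.
ACTION_MAP = {
    "left_eye_blink_5t":  "clench_fist",
    "right_eye_blink_5t": "extend_fist",
    "both_eyes_blink_5t": "stretch_palm",
    "open_mouth_3s":      "arm_internal_rotation",
    "clench_teeth_3s":    "arm_external_rotation",
    "tongue_out_5t":      "relax_naturally",
}

def command_for(feature_label):
    """Return the motor command for a matched feature vector, or None."""
    return ACTION_MAP.get(feature_label)
```

Returning None for an unmatched label mirrors step (3) of the method: only a feature that satisfies an expected action passes control to the prosthesis control module.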

Claims (9)

1. A device for controlling an intelligent artificial limb based on eye and lower-jaw electromyographic signals, characterized in that it comprises an eye signal acquisition module, a lower-jaw surface EMG signal acquisition module, an EMG signal processing module, an eye movement feature extraction module, a lower-jaw movement feature extraction module, a feature matching module, and a prosthesis control module; the eye signal acquisition module and the lower-jaw surface EMG signal acquisition module are connected to the EMG signal processing module, the EMG signal processing module is connected to the eye movement and lower-jaw movement feature extraction modules, those feature extraction modules are connected to the feature matching module, and the feature matching module is connected to the prosthesis control module.
2. The device for controlling an intelligent artificial limb based on eye and lower-jaw electromyographic signals according to claim 1, characterized in that the prosthesis control module comprises a four-finger motor, a thumb motor, and a wrist-joint motor connected to the feature matching module; through mechanical transmissions, the four-finger motor drives the four fingers of the prosthetic hand, the thumb motor drives its thumb, and the wrist-joint motor drives its wrist joint.
3. A control method implemented with the device for controlling an intelligent artificial limb based on eye and lower-jaw electromyographic signals according to claim 1, characterized in that the eye signal acquisition module and the lower-jaw surface EMG acquisition module collect the EMG signals produced by eye blinks and by opening and closing the lower jaw and send them to the signal processing module; the feature extraction modules pass the processed EMG signals to the feature pre-storage module; through the feature matching module, motor control of the prosthetic hand is realized; eye-blink and jaw opening/closing events are then combined to produce a variety of distinct features, completing the control functions of the prosthetic hand.
4. The control method based on eye and lower-jaw electromyographic signals according to claim 3, characterized in that the eye signal acquisition module and the lower-jaw surface EMG acquisition module obtain the surface EMG signals from the skin over the eyes and lower jaw and pass them to the EMG processing module; this module mainly comprises three parts: acquisition-parameter setting, acquisition-time recording, and action guidance; guided by the parameter settings and action prompts, it detects blink and jaw opening/closing action data and records the eye-closure time and jaw-action duration.
5. The control method based on eye and lower-jaw electromyographic signals according to claim 3, characterized in that the eye movement and lower-jaw movement feature extraction modules mainly consist of an eye-blink signal recognition unit and a jaw opening/closing signal recognition unit, and discriminate the corresponding action commands from the various features extracted for the different movement patterns; the motion features of the signal are extracted by time-domain, frequency-domain, time-frequency, nonlinear-entropy, and chaos analysis methods, or by the wavelet-transform feature extraction method.
6. The control method based on eye and lower-jaw electromyographic signals according to claim 5, characterized in that the method comprises the following steps:
(1) The eye movement feature extraction module and the lower-jaw movement feature extraction module are affixed at the eyes and the lower jaw respectively; they obtain the surface EMG signals of the skin over the eyes and lower jaw and pass them to the EMG processing module;
(2) The EMG processing module passes the signal to the eye movement and lower-jaw movement feature extraction modules for feature extraction;
(3) The pattern matching module compares the incoming feature signal against stored patterns and sends the corresponding preset action command to the prosthesis control module; through the process of recognizing, confirming, and matching the action, it judges whether the extracted feature satisfies the expected action of the corresponding eye and lower-jaw movement feature vector, and if so, control passes to the prosthesis control module;
(4) The prosthesis control module drives the five-finger and wrist motors of the prosthetic hand to perform the corresponding commanded action.
7. The control method based on eye and lower-jaw electromyographic signals according to claim 6, characterized in that the eye and lower-jaw surface EMG signals are classified into six states for signal feature recognition: continuous left-eye blinking, continuous right-eye blinking, simultaneous continuous blinking of both eyes, opening the mouth, clenching the teeth, and sticking out the tongue.
8. The control method based on eye and lower-jaw electromyographic signals according to claim 7, characterized in that, when collecting the EMG signals for the six patterns, the user is asked to move as follows:
1) Clench fist: the left eye blinks five times in a row; the prosthetic hand changes from the naturally relaxed palm state to the clenched-fist state;
2) Extend fist: the right eye blinks five times in a row; the prosthetic hand changes from the naturally relaxed palm state to the extended-fist state;
3) Stretch palm: both eyes blink five times in a row; the prosthetic hand changes from the naturally relaxed palm state to the open-palm state;
4) Arm internal rotation: the mouth is held open for about 3 seconds; the prosthetic hand changes from the naturally relaxed palm state to the arm-internal-rotation state;
5) Arm external rotation: with the mouth closed, the teeth are clenched for 3 seconds; the prosthetic hand changes from the naturally relaxed palm state to the arm-external-rotation state;
6) Tongue out: the tongue is stuck out from its natural position five times in a row; the prosthetic hand returns to the naturally relaxed palm state.
9. The control method based on eye and lower-jaw electromyographic signals according to claim 7, characterized in that:
the EMG signal processing module extracts effective activity segments from the surface EMG signal by square moving-average detection and, to reduce the effect of noise on the signal, chooses Butterworth band-pass and band-stop digital filters for filtering;
the eye movement and lower-jaw movement feature extraction modules use the wavelet-transform feature extraction method, and an artificial neural network performs pattern recognition on the subject's actions.
CN201510457634.4A 2015-07-30 2015-07-30 Device and method for controlling intelligent artificial limb based on eye and lower jaw electromyographic signals Active CN104997582B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510457634.4A CN104997582B (en) 2015-07-30 2015-07-30 Device and method for controlling intelligent artificial limb based on eye and lower jaw electromyographic signals

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510457634.4A CN104997582B (en) 2015-07-30 2015-07-30 Device and method for controlling intelligent artificial limb based on eye and lower jaw electromyographic signals

Publications (2)

Publication Number Publication Date
CN104997582A CN104997582A (en) 2015-10-28
CN104997582B true CN104997582B (en) 2017-03-22

Family

ID=54370641

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510457634.4A Active CN104997582B (en) 2015-07-30 2015-07-30 Device and method for controlling intelligent artificial limb based on eye and lower jaw electromyographic signals

Country Status (1)

Country Link
CN (1) CN104997582B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106067178B (en) * 2016-05-30 2018-11-23 杭州电子科技大学 A kind of continuous estimation method of hand joint movement based on muscle synergistic activation model
CN107260420B (en) * 2017-07-03 2018-11-23 南京邮电大学 Intelligent wheel chair human-computer interactive control system and method based on eye motion recognition
CN108646726A (en) * 2018-04-03 2018-10-12 山东农业大学 The wheelchair control system of wheelchair control method and combination voice based on brain wave
CN112057251B (en) * 2020-09-21 2022-10-25 山西白求恩医院(山西医学科学院) Electric intelligent wheelchair controlled by eye-gaze and lip action signals and control method
CN112241971A (en) * 2020-09-30 2021-01-19 天津大学 Method for measuring motion prediction capability by using entropy and eye movement data

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101317794A (en) * 2008-03-11 2008-12-10 清华大学 Myoelectric control ability detecting and training method for hand-prosthesis with multiple fingers and multiple degrees of freedom
US7593769B1 (en) * 2006-02-14 2009-09-22 Iq Biolabs, Inc. Surface electromyography index
CN101598973A (en) * 2009-06-26 2009-12-09 安徽大学 Man-machine interactive system based on electro-ocular signal
CN101947152A (en) * 2010-09-11 2011-01-19 山东科技大学 Electroencephalogram-voice control system and working method of humanoid artificial limb
CN101987048A (en) * 2009-08-03 2011-03-23 深圳先进技术研究院 Artificial limb control method and system thereof
CN202161439U (en) * 2011-07-21 2012-03-14 山东科技大学 Control system capable of controlling movement of upper artificial limbs through eye movement signals
CN103892945A (en) * 2012-12-27 2014-07-02 中国科学院深圳先进技术研究院 Myoelectric prosthesis control system
CN104010089A (en) * 2014-06-18 2014-08-27 中南大学 Mobile phone dialing method and system based on blinking electromyographic signal detection
CN204909749U (en) * 2015-07-30 2015-12-30 沈阳工业大学 Control device for an intelligent artificial limb based on eye and lower-jaw electromyographic signals

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140277583A1 (en) * 2013-03-15 2014-09-18 The Florida International University Board Of Trustees Fitting system for a neural enabled limb prosthesis system
WO2014179704A1 (en) * 2013-05-02 2014-11-06 Vanderbilt University Coordinated control for an arm prosthesis

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7593769B1 (en) * 2006-02-14 2009-09-22 Iq Biolabs, Inc. Surface electromyography index
CN101317794A (en) * 2008-03-11 2008-12-10 清华大学 Myoelectric control ability detecting and training method for hand-prosthesis with multiple fingers and multiple degrees of freedom
CN101598973A (en) * 2009-06-26 2009-12-09 安徽大学 Man-machine interactive system based on electro-ocular signal
CN101987048A (en) * 2009-08-03 2011-03-23 深圳先进技术研究院 Artificial limb control method and system thereof
CN101947152A (en) * 2010-09-11 2011-01-19 山东科技大学 Electroencephalogram-voice control system and working method of humanoid artificial limb
CN202161439U (en) * 2011-07-21 2012-03-14 山东科技大学 Control system capable of controlling movement of upper artificial limbs through eye movement signals
CN103892945A (en) * 2012-12-27 2014-07-02 中国科学院深圳先进技术研究院 Myoelectric prosthesis control system
CN104010089A (en) * 2014-06-18 2014-08-27 中南大学 Mobile phone dialing method and system based on blinking electromyographic signal detection
CN204909749U (en) * 2015-07-30 2015-12-30 沈阳工业大学 Control device for an intelligent artificial limb based on eye and lower-jaw electromyographic signals

Also Published As

Publication number Publication date
CN104997582A (en) 2015-10-28

Similar Documents

Publication Publication Date Title
CN104997582B (en) Device and method for controlling intelligent artificial limb based on eye and lower jaw electromyographic signals
Cipriani et al. Online myoelectric control of a dexterous hand prosthesis by transradial amputees
CN107273798A (en) A kind of gesture identification method based on surface electromyogram signal
CN105563495B (en) Arm-and-hand system and method based on refinement motion imagination EEG signals control
Mohamed et al. Single-trial EEG discrimination between wrist and finger movement imagery and execution in a sensorimotor BCI
CN105012057B (en) Intelligent artificial limb based on double-arm electromyogram and attitude information acquisition and motion classifying method
CN102499797B (en) Artificial limb control method and system
CN107957783A (en) A kind of Multimode Intelligent control system and method based on brain electricity with myoelectric information
CN104997581B (en) Artificial hand control method and apparatus for driving EEG signals on the basis of facial expressions
Kakoty et al. Recognition of grasp types through principal components of DWT based EMG features
CN110400619A (en) A kind of healing hand function training method based on surface electromyogram signal
CN111584031B (en) Brain-controlled intelligent limb rehabilitation system based on portable electroencephalogram acquisition equipment and application
Ruhunage et al. EMG signal controlled transhumerai prosthetic with EEG-SSVEP based approch for hand open/close
CN114822761A (en) Wrist rehabilitation training system based on muscle cooperation and variable stiffness impedance control
CN101371804A (en) On-line recognizing method of hand gesture mode established based on sEMG
Li et al. Increasing the robustness against force variation in EMG motion classification by common spatial patterns
CN114897012A (en) Intelligent prosthetic arm control method based on vital machine interface
Narayan et al. Pattern recognition of sEMG signals using DWT based feature and SVM Classifier
Li et al. Wireless sEMG-based identification in a virtual reality environment
CN114021604A (en) Motion imagery training system based on real-time feedback of 3D virtual reality technology
CN115482907A (en) Active rehabilitation system combining electroencephalogram and myoelectricity and rehabilitation training method
Park et al. EEG-based gait state and gait intention recognition using spatio-spectral convolutional neural network
CN204909749U (en) Control device for an intelligent artificial limb based on eye and lower-jaw electromyographic signals
WO2022099807A1 (en) Robot natural control method based on electromyographic signal and error electroencephalographic potential
CN113730190A (en) Upper limb rehabilitation robot system with three-dimensional space motion

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20211116

Address after: 518100 Room 601, building A1, Tianrui Industrial Park, No. 35, Fuyuan 1st Road, Zhancheng community, Fuhai street, Bao'an District, Shenzhen, Guangdong

Patentee after: Yidong (Shenzhen) Photoelectric Technology Co., Ltd

Address before: 110870 No. 111, Shenliao West Road, Shenyang Economic and Technological Development Zone, Liaoning Province

Patentee before: Shenyang University of Technology