CN109498375A - Human motion intention recognition control device and control method - Google Patents

Human motion intention recognition control device and control method Download PDF

Info

Publication number
CN109498375A
Authority
CN
China
Prior art keywords
motion information
exoskeleton robot
human
trigger action
lower limb
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811404555.7A
Other languages
Chinese (zh)
Other versions
CN109498375B (en)
Inventor
侯磊
周玉凯
赵恩盛
邱静
程洪
黄璞玉
陈晨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China
Priority to CN201811404555.7A priority Critical patent/CN109498375B/en
Publication of CN109498375A publication Critical patent/CN109498375A/en
Application granted granted Critical
Publication of CN109498375B publication Critical patent/CN109498375B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H3/00Appliances for aiding patients or disabled persons to walk about
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H3/00Appliances for aiding patients or disabled persons to walk about
    • A61H2003/005Appliances for aiding patients or disabled persons to walk about with knee, leg or stump rests
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61HPHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H3/00Appliances for aiding patients or disabled persons to walk about
    • A61H2003/007Appliances for aiding patients or disabled persons to walk about secured to the patient, e.g. with belts
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/011Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns

Abstract

The present invention relates to a human motion intention recognition control device, comprising: a motion information acquisition module, strapped to the wearer's arm, which acquires and transmits human motion information in real time; and a lower-limb walking-assist exoskeleton robot, worn on the lower body of a paraplegic, hemiplegic, or amputee patient, which is communicatively connected to the motion information acquisition module, receives the data the module sends, analyzes the data, and then controls the execution of the corresponding action. The invention has the advantage that its algorithm improves the accuracy of the system, capturing the control signal precisely while effectively reducing the probability of false triggering of the exoskeleton robot. It can adapt to any exoskeleton wearer and greatly shortens the training period a patient needs to wear the exoskeleton robot.

Description

Human motion intention recognition control device and control method
Technical field
The present invention relates to the field of exoskeleton robot control, and in particular to a human motion intention recognition control device and control method.
Background technique
An exoskeleton robot is a wearable human-machine-integrated mechanical device that combines technologies from artificial intelligence, machine learning, bionics, ergonomics, and other fields. Its main function is to extend or enhance human physiological capability, providing assistance, protection, and support for the body; it is currently applied mainly in medical rehabilitation and military fields. For example, a paraplegic patient can stand again with a walking-assist exoskeleton robot, which can drive the body to walk and support rehabilitation training, slowing the atrophy of the lower-limb muscles.
Current lower-limb assistive exoskeleton robots are mostly commanded by buttons: standing up, walking, and stopping are all triggered through key-like devices. The well-known foreign ReWalk assistive exoskeleton, for example, is controlled by buttons on the left and right crutches; two buttons command the exoskeleton to stand up, walk, and sit down. Most exoskeleton robots currently studied in China use a similar control scheme. Although button control is safe, every action requires a key press, so the exoskeleton's walking speed can hardly reach that of an ordinary person. Moreover, because the walking-assist exoskeleton is button-controlled, an unavoidable problem arises: as exoskeleton functions increase, the number of buttons inevitably grows, making the control scheme more complex. Since a lower-limb assistive exoskeleton must serve both rehabilitation training for paralyzed patients and the demands of daily life, the upper limbs carry a heavy burden: besides maintaining the balance of the walking-assist exoskeleton, the hands must also press buttons to command its movements. To prevent accidental presses, each hand generally controls one button, so at most three actions (stand, sit, walk) can be triggered by key combinations. When new functions are added to a walking-assist exoskeleton, this button-based human-machine interaction no longer suits the wearer, and a new control scheme is needed.
The patent with publication number CN 104188675A describes a walking-assist exoskeleton robot system and control method; in terms of human-machine interaction it only achieves controllability, without improving the interaction mode, and cannot solve the problem of the limited control scheme of a multi-functional walking-assist exoskeleton robot. The patent with publication number CN 107832686A describes a lower-limb motion pattern recognition method fusing surface electromyography (sEMG) and acceleration signals: the acquired lower-limb sEMG and acceleration signals are separated algorithmically to recognize human intention. For patients with complete paraplegia, however, the lower-limb muscles have essentially no strength, so no usable data can be obtained; moreover, the sEMG signals used in that patent are strongly affected by sweating and by movement, so the data are very unstable during motion and the algorithm is not very practical.
Summary of the invention
The purpose of the present invention is to overcome the shortcomings of the prior art by providing a human motion intention recognition control device and control method that solve the defects of the prior art: a human motion intention recognition system that can adapt to any walking-assist exoskeleton robot, making the interaction between the wearer and the walking-assist exoskeleton robot more natural.
The purpose of the present invention is achieved through the following technical solution. A human motion intention recognition control device comprises:
a motion information acquisition module, strapped to the wearer's arm, which acquires and transmits human motion information in real time;
a lower-limb walking-assist exoskeleton robot, worn on the lower body of a paraplegic, hemiplegic, or amputee patient, communicatively connected to the motion information acquisition module; it receives the data sent by the module, analyzes the data, and then controls the execution of the corresponding action.
The motion information acquisition module comprises an elastic band, an attitude sensor chip, a wireless transmission chip, a power module, a switch, and a housing. The attitude sensor chip, the wireless transmission chip, the power module, and the switch are all integrated inside the housing, with the switch exposed externally; the housing is strapped to the wearer's arm with the elastic band and secured with a hook-and-loop fastener.
A human motion intention recognition control method comprises the following steps:
S1: the motion information acquisition module acquires human motion information in real time and sends it to the lower-limb walking-assist exoskeleton robot;
S2: the main control chip in the lower-limb walking-assist exoskeleton robot analyzes the received motion information and then triggers the exoskeleton to execute the corresponding action.
Step S2 specifically comprises:
S21: running a clustering algorithm on the acquired motion information and the actions the wearer intends to trigger;
S22: computing the Gaussian distance between the acquired motion information and each center matrix, and, based on the result and the wearer's current state, deciding the next trigger action.
Before the control method can recognize the wearer's motion intention and trigger the lower-limb walking-assist exoskeleton robot to execute the corresponding action, the center matrices of all trigger actions must be established, specifically comprising the following steps:
judging whether the enrollment of the set trigger actions has finished;
if enrollment has finished, the establishment of all trigger-action center matrices is complete;
if enrollment has not finished, the exoskeleton's main control chip computes the center matrix of the current action, adds it to the center-matrix list, and then reacquires and enrolls motion information until all center matrices are established.
Before an action is triggered, it must be judged whether the similarity between the input motion information and the trigger action of the corresponding center matrix meets the triggering requirement, comprising the following steps:
if the triggering requirement is met, the action corresponding to that center matrix is triggered;
if it is not met, motion information data continue to be acquired.
The enrollment of trigger actions specifically proceeds as follows:
set the lower-limb walking-assist exoskeleton robot to debugging mode;
input each trigger action one by one, holding each for a certain time;
after an action is input successfully, the center matrix corresponding to that action is established, and the next action is input;
after all actions are input successfully, the establishment of all trigger-action center matrices is complete.
The invention has the following advantages:
1. Existing lower-limb assistive exoskeleton robots are controlled by buttons on the crutches; a personalized triggering scheme of this kind has never been used before.
2. Compared with button control, this control scheme, which recognizes human intention from a learned model, is more natural. Button control lengthens the exoskeleton control cycle and prevents the walking speed from being increased; intention-based control effectively solves this problem and changes the human-machine interaction at its root, from the wearer actively adapting to the exoskeleton robot to the exoskeleton robot actively adapting to different wearers.
3. The algorithm improves the accuracy of the system, capturing the control signal precisely while effectively reducing the probability of false triggering of the exoskeleton robot.
4. The system adapts to any exoskeleton wearer and greatly shortens the training period a patient needs to wear the exoskeleton robot.
Detailed description of the invention
Fig. 1 is a structural schematic diagram of the device as worn;
Fig. 2 is the internal block diagram of the human motion information acquisition module;
Fig. 3 is the flowchart for establishing the center matrices;
Fig. 4 is the trigger-decision flowchart;
In the figures: 1, elastic band; 2, attitude sensor chip; 3, wireless transmission chip; 4, power module; 5, switch; 6, housing; 7, lower-limb walking-assist exoskeleton robot.
Specific embodiment
The present invention will be further described below with reference to the accompanying drawings, but the protection scope of the present invention is not limited to the following description.
As shown in Fig. 1 and Fig. 2, a human motion intention recognition control device comprises:
a motion information acquisition module, strapped to the wearer's arm, which acquires and transmits human motion information in real time;
a lower-limb walking-assist exoskeleton robot, worn on the lower body of a paraplegic, hemiplegic, or amputee patient, communicatively connected to the motion information acquisition module; it receives the data sent by the module, analyzes the data, and then controls the execution of the corresponding action.
The motion information acquisition module comprises an elastic band 1, an attitude sensor chip 2, a wireless transmission chip 3, a power module 4, a switch 5, and a housing 6. The attitude sensor chip 2, the wireless transmission chip 3, the power module 4, and the switch 5 are all integrated inside the housing 6, with the switch 5 exposed externally; the housing 6 is strapped to the wearer's arm with the elastic band 1 and secured with a hook-and-loop fastener.
Preferably, the attitude sensor chip 2 filters out noise with digital filtering to improve the precision of the acquired data. With an integrated attitude solver and a dynamic Kalman filtering algorithm, it can output human motion information even in dynamic conditions, with an attitude-measurement precision of 0.05 degrees. The attitude sensor chip 2 is built mainly around a nine-axis IMU (gyroscope) and outputs three-axis acceleration, angular velocity, and angle at a selectable baud rate of 9600 or 115200. Compared with a six-axis IMU, a nine-axis IMU adds three axes of magnetometer data, which effectively solves the z-axis drift of six-axis IMUs (as an attitude sensor is used over time, errors accumulate, and drift is most severe on the z-axis, especially for six-axis IMUs). The attitude sensor chip 2 acquires the current human motion information and, with dynamic Kalman filtering, makes the data more accurate; the data are then passed through the wireless transmission chip 3 to the exoskeleton robot controller, where the motion information is parsed and fed to the classification algorithm to trigger the exoskeleton's various commands. The wireless transmission chip 3 is an NRF24L01, which has low power consumption, long range, and a small footprint and is easy to integrate; its operating voltage is 1.9 to 3.6 V, and the nine-axis IMU operates in the same range, so the power module 4 uses a 3.7 V lithium-polymer battery with a capacity of 1000 mAh, only about the size of an adult's thumb. The complete device weighs less than 120 g, is very easy to wear, and causes the wearer no discomfort.
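As a minimal sketch of how the readings from the two arm-worn modules could be packed into one motion sample for the later clustering step (the `ImuReading` type and the 18-dimensional layout are assumptions for illustration, not the patent's actual data format):

```python
from dataclasses import dataclass

@dataclass
class ImuReading:
    acc: tuple    # (ax, ay, az) in g
    gyro: tuple   # (wx, wy, wz) in deg/s
    angle: tuple  # (roll, pitch, yaw) in degrees

def feature_vector(left: ImuReading, right: ImuReading) -> list:
    """Concatenate both arms' acceleration, angular-velocity, and angle
    readings into one 18-dimensional motion sample."""
    return [*left.acc, *left.gyro, *left.angle,
            *right.acc, *right.gyro, *right.angle]
```

Each such vector is one point in the feature space described in the working-principle section; the clustering step then groups the vectors produced while a trigger action is held.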
A human motion intention recognition control method comprises the following steps:
S1: the motion information acquisition module acquires human motion information in real time and sends it to the lower-limb walking-assist exoskeleton robot;
S2: the main control chip in the lower-limb walking-assist exoskeleton robot analyzes the received motion information and then triggers the exoskeleton to execute the corresponding action.
Step S2 specifically comprises:
S21: running a clustering algorithm on the acquired motion information and the actions the wearer intends to trigger;
Preferably, a DBSCAN clustering algorithm is used. DBSCAN groups together all points that are density-reachable (points within a core object's neighborhood are directly density-reachable from it) or density-connected (two points are density-connected if both are density-reachable from a common object), and weeds out the many outlying data points. This strengthens the accuracy of the clustering result and effectively removes the interference data caused by wearer mis-operation. Because the actions that different wearers want to trigger are not unique, the final clustering result varies from person to person; in this way personalized customization is achieved, the established model can adapt to any wearer, and each wearer can design trigger gestures according to their own needs.
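The clustering step above can be sketched as follows. This is a minimal illustration over 6-dimensional motion samples; the pure-NumPy DBSCAN and the `eps`/`min_pts` values are illustrative stand-ins, not the patent's actual implementation or parameters:

```python
import numpy as np

def dbscan(X, eps, min_pts):
    """Minimal DBSCAN: returns one label per point, -1 marking outliers."""
    n = len(X)
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    neighbors = [np.flatnonzero(dist[i] <= eps) for i in range(n)]
    labels = np.full(n, -1)
    cluster = 0
    for i in range(n):
        if labels[i] != -1 or len(neighbors[i]) < min_pts:
            continue  # already assigned, or not a core point
        labels[i] = cluster
        queue = list(neighbors[i])
        while queue:  # grow the cluster by density-reachability
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster
                if len(neighbors[j]) >= min_pts:  # j is also a core point
                    queue.extend(neighbors[j])
        cluster += 1
    return labels

def cluster_center(samples, eps=0.5, min_pts=5):
    """Cluster one action's enrollment samples and return the mean of the
    largest dense cluster as that action's center matrix; outliers caused
    by mis-operation are discarded."""
    samples = np.asarray(samples, dtype=float)
    labels = dbscan(samples, eps, min_pts)
    kept = labels[labels != -1]
    if kept.size == 0:
        raise ValueError("no dense cluster found; re-enroll the action")
    biggest = np.bincount(kept).argmax()
    return samples[labels == biggest].mean(axis=0)
```

The returned mean vector plays the role of the "center matrix" for one trigger action; the outlier rejection is what removes the mis-operation interference the text describes.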
S22: computing the Gaussian distance between the acquired motion information and each center matrix, and, based on the result and the wearer's current state, deciding the next trigger action.
Before the control method can recognize the wearer's motion intention and trigger the lower-limb walking-assist exoskeleton robot to execute the corresponding action, the center matrices of all trigger actions must be established, using the Gaussian distance as the judgment basis, specifically comprising the following steps:
judging whether the enrollment of the set trigger actions has finished;
if enrollment has finished, the establishment of all trigger-action center matrices is complete;
if enrollment has not finished, the exoskeleton's main control chip computes the center matrix of the current action, adds it to the center-matrix list, and then reacquires and enrolls motion information until all center matrices are established.
As shown in Fig. 3, before an action is triggered it must be judged whether the similarity between the input motion information and the trigger action of the corresponding center matrix meets the triggering requirement, comprising the following steps:
if the triggering requirement is met, the action corresponding to that center matrix is triggered;
if it is not met, the data are regarded as noise and motion information continues to be acquired.
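The trigger decision just described, matching a new motion sample against each center matrix with a Gaussian function of the distance and rejecting poor matches as noise, might be sketched like this; the `sigma` and `threshold` values are illustrative assumptions, not values from the patent:

```python
import numpy as np

def gaussian_similarity(x, center, sigma=1.0):
    """Gaussian kernel of the distance between a motion sample and a
    trigger action's center matrix; 1.0 at the center, falling toward 0."""
    d2 = np.sum((np.asarray(x, float) - np.asarray(center, float)) ** 2)
    return float(np.exp(-d2 / (2.0 * sigma ** 2)))

def decide_trigger(x, centers, threshold=0.6):
    """Return the best-matching trigger action, or None if no center
    matrix is similar enough (the sample is treated as noise)."""
    best, best_sim = None, 0.0
    for name, center in centers.items():
        sim = gaussian_similarity(x, center)
        if sim > best_sim:
            best, best_sim = name, sim
    return best if best_sim >= threshold else None
```

A sample far from every center returns `None`, which corresponds to the "continue acquiring motion information" branch of the flowchart.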
As shown in Fig. 4, the enrollment of trigger actions specifically proceeds as follows:
Set the lower-limb walking-assist exoskeleton robot to debugging mode; in this mode the exoskeleton robot 7 can record and analyze the data input by the human intention recognition module.
Input each personalized trigger action one by one: wearing the intention recognition module, the wearer makes the corresponding trigger action, such as standing up or walking. Since walking has two arm postures, left hand forward with the right hand back, or right hand forward with the left hand back, two different actions must be input, each held for 3 to 5 s.
The data from this period are parsed and processed with the DBSCAN clustering algorithm in the algorithm flow. If processing succeeds, the exoskeleton gives an audio prompt indicating that the current action was input successfully; the center matrix corresponding to the action is then established, and the next action is input.
After all actions are input successfully, the main control chip completes the establishment of all trigger-action center matrices.
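The enrollment flow of Fig. 4 can be sketched as a loop; `read_samples`, `beep`, and `fit_center` are hypothetical stand-ins for the real IMU interface, the audio prompt, and the clustering step, injected so the sketch stays hardware-independent:

```python
def enroll_actions(action_names, read_samples, beep, fit_center):
    """Enroll each personalized trigger action in debugging mode:
    collect samples while the wearer holds the pose, fit a center,
    and retry the same action if the data are too noisy."""
    centers = {}
    for name in action_names:
        while name not in centers:
            samples = read_samples(seconds=4)  # wearer holds the pose 3-5 s
            try:
                centers[name] = fit_center(samples)
                beep()                         # audio prompt: input succeeded
            except ValueError:
                continue                       # data too noisy: re-enroll
    return centers
```

On a failed fit the same action is simply re-acquired, matching the "reacquire and enroll motion information" branch of the method.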
Preferably, there are two human motion information acquisition modules in total, worn on the user's left and right upper arms; since they are secured with hook-and-loop fasteners, they are very convenient to put on. Once the modules are worn, the wearer can design personal trigger actions according to habit, for example which posture triggers standing up, sitting down, or walking.
The working principle of the invention is as follows. After the attitude sensor chip 2 powers on, each axis carries an initial acceleration component and angle component; as position and orientation change, the acceleration and angle on each axis change accordingly, and together they form a feature vector of the human motion state. Since different movements performed while wearing the lower-limb walking-assist exoskeleton robot produce different feature vectors, this feature vector can serve as a measure of the human motion state. The motion information acquisition module is first worn on the body, and the wearer then performs the movement corresponding to the function to be triggered; for example, to make the walking-assist exoskeleton stand up or walk, the wearer makes the corresponding trigger action. The attitude sensor chip passes the collected data to the exoskeleton robot's processor; different trigger actions yield different data, and the clustering algorithm extracts the cluster center corresponding to each gesture. This is the machine-learning step of the system: the learned cluster centers are extracted and a Markov state-transition matrix is built from them, which greatly lowers the probability of accidental triggering. From the state-transition matrix and the real-time data of the attitude sensor chip 2, the human motion intention can be judged.
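The Markov state-transition gating mentioned above might look like the following sketch. The states and transition probabilities here are invented for illustration and are not taken from the patent; the point is that a recognized gesture is accepted only if the transition from the current state is plausible, which suppresses accidental triggers:

```python
import numpy as np

STATES = ["sit", "stand", "walk"]
T = np.array([          # T[i, j]: probability of moving from state i to j
    [0.7, 0.3, 0.0],    # from sit: may stand up, cannot start walking
    [0.2, 0.5, 0.3],    # from stand: may sit, stay standing, or walk
    [0.0, 0.4, 0.6],    # from walk: may stop (stand) or keep walking
])

def allowed(current: str, proposed: str) -> bool:
    """Accept a proposed trigger action only if its transition probability
    from the wearer's current state is non-zero."""
    return T[STATES.index(current), STATES.index(proposed)] > 0.0
```

With such a gate, a "walk" gesture misrecognized while the wearer is seated is rejected outright, since that transition has zero probability.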
The above is only a preferred embodiment of the present invention. It should be understood that the present invention is not limited to the forms disclosed herein, which should not be regarded as excluding other embodiments; the invention may be used in various other combinations, modifications, and environments, and may be altered within the scope contemplated herein through the above teachings or through the skill or knowledge of the related art. Modifications and changes made by those skilled in the art that do not depart from the spirit and scope of the present invention shall fall within the protection scope of the appended claims.

Claims (7)

1. A human motion intention recognition control device, characterized in that it comprises:
a motion information acquisition module, strapped to the wearer's arm, which acquires and transmits human motion information in real time;
a lower-limb walking-assist exoskeleton robot, worn on the lower body of a paraplegic, hemiplegic, or amputee patient, communicatively connected to the motion information acquisition module; it receives the data sent by the module, analyzes the data, and then controls the execution of the corresponding action.
2. The human motion intention recognition control device according to claim 1, characterized in that the motion information acquisition module comprises an elastic band (1), an attitude sensor chip (2), a wireless transmission chip (3), a power module (4), a switch (5), and a housing (6); the attitude sensor chip (2), the wireless transmission chip (3), the power module (4), and the switch (5) are all integrated inside the housing (6), with the switch (5) exposed externally; the housing (6) is strapped to the wearer's arm with the elastic band (1) and secured with a hook-and-loop fastener.
3. A human motion intention recognition control method, characterized in that it comprises the following steps:
S1: a motion information acquisition module acquires human motion information in real time and sends it to a lower-limb walking-assist exoskeleton robot;
S2: the main control chip in the lower-limb walking-assist exoskeleton robot analyzes the received motion information and then triggers the exoskeleton to execute the corresponding action.
4. The human motion intention recognition control method according to claim 3, characterized in that step S2 specifically comprises:
S21: running a clustering algorithm on the acquired motion information and the actions the wearer intends to trigger;
S22: computing the Gaussian distance between the acquired motion information and each center matrix, and, based on the result and the wearer's current state, deciding the next trigger action.
5. The human motion intention recognition control method according to claim 4, characterized in that, before the control method can recognize the wearer's motion intention and trigger the lower-limb walking-assist exoskeleton robot to execute the corresponding action, the center matrices of all trigger actions must be established, specifically comprising the following steps:
judging whether the enrollment of the set trigger actions has finished;
if enrollment has finished, the establishment of all trigger-action center matrices is complete;
if enrollment has not finished, the exoskeleton's main control chip computes the center matrix of the current action, adds it to the center-matrix list, and then reacquires and enrolls motion information until all center matrices are established.
6. The human motion intention recognition control method according to claim 4, characterized in that, before an action is triggered, it must be judged whether the similarity between the input motion information and the trigger action of the corresponding center matrix meets the triggering requirement, comprising the following steps:
if the triggering requirement is met, the action corresponding to that center matrix is triggered;
if it is not met, motion information data continue to be acquired.
7. The human motion intention recognition control method according to claim 5, characterized in that the enrollment of trigger actions specifically proceeds as follows:
set the lower-limb walking-assist exoskeleton robot to debugging mode;
input each trigger action one by one, holding each for a certain time;
after an action is input successfully, the center matrix corresponding to that action is established, and the next action is input;
after all actions are input successfully, the establishment of all trigger-action center matrices is complete.
CN201811404555.7A 2018-11-23 2018-11-23 Human motion intention recognition control device and control method Active CN109498375B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811404555.7A CN109498375B (en) 2018-11-23 2018-11-23 Human motion intention recognition control device and control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811404555.7A CN109498375B (en) 2018-11-23 2018-11-23 Human motion intention recognition control device and control method

Publications (2)

Publication Number Publication Date
CN109498375A true CN109498375A (en) 2019-03-22
CN109498375B CN109498375B (en) 2020-12-25

Family

ID=65750036

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811404555.7A Active CN109498375B (en) 2018-11-23 2018-11-23 Human motion intention recognition control device and control method

Country Status (1)

Country Link
CN (1) CN109498375B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110405736A * 2019-08-07 2019-11-05 广东博智林机器人有限公司 Walking-aid control method and system, exoskeleton robot and terminal
CN110834329A (en) * 2019-10-16 2020-02-25 深圳市迈步机器人科技有限公司 Exoskeleton control method and device
CN110974641A (en) * 2019-12-24 2020-04-10 中南民族大学 Intelligent walking stick system integrating machine learning and Internet of things technology for blind people
CN112315747A (en) * 2019-11-20 2021-02-05 河南水滴智能技术有限公司 Novel exoskeleton control scheme

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140207037A1 (en) * 2002-11-25 2014-07-24 Robert W. Horst Intention-based therapy device and method
CN1877340A (en) * 2005-06-09 2006-12-13 Sony Corporation Activity recognition apparatus, method and program
CN103153356A (en) * 2010-09-17 2013-06-12 Ekso Bionics Human machine interface for human exoskeleton
CN105142581A (en) * 2013-03-14 2015-12-09 Ekso Bionics Machine to human interfaces for communication from a lower extremity orthotic
CN103876756A (en) * 2014-04-18 2014-06-25 Nanjing Institute of Technology Gait pattern recognition method and system for a lower limb power-assisted exoskeleton robot
CN104188675A (en) * 2014-09-24 2014-12-10 Harbin Institute of Technology Exoskeleton robot system with human motion detecting function and control method of robot system
CN104523403A (en) * 2014-11-05 2015-04-22 Tao Yuhong Method for judging lower-limb movement intentions of exoskeleton walking aid robot wearer
CN106095267A (en) * 2016-06-01 2016-11-09 Huizhou Desay SV Automotive Co., Ltd. Vehicle-mounted device control method and system based on user intention recognition
CN107538469A (en) * 2016-06-29 2018-01-05 Shenzhen Kuang-Chi Hezhong Technology Co., Ltd. Method and device for motion intention recognition and power assistance for human upper limbs
CN108283569A (en) * 2017-12-27 2018-07-17 Beijing Research Institute of Precise Mechatronics and Controls Exoskeleton robot control system and control method
CN108888473A (en) * 2018-05-22 2018-11-27 Harbin Institute of Technology Lower limb joint motion reproduction method based on a wearable walking-assist exoskeleton
CN108764120A (en) * 2018-05-24 2018-11-06 Hangzhou Normal University Standard human action evaluation method

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110405736A (en) * 2019-08-07 2019-11-05 Guangdong Bozhilin Robot Co., Ltd. Walking aid control method and system, exoskeleton robot and terminal
CN110405736B (en) * 2019-08-07 2021-01-26 Guangdong Bozhilin Robot Co., Ltd. Walking aid control method and system, exoskeleton robot and terminal
CN110834329A (en) * 2019-10-16 2020-02-25 Shenzhen MileBot Robotics Co., Ltd. Exoskeleton control method and device
CN110834329B (en) * 2019-10-16 2021-02-09 Shenzhen MileBot Robotics Co., Ltd. Exoskeleton control method and device
CN112315747A (en) * 2019-11-20 2021-02-05 Henan Shuidi Intelligent Technology Co., Ltd. Novel exoskeleton control scheme
CN112315747B (en) * 2019-11-20 2023-08-22 Henan Shuidi Intelligent Technology Co., Ltd. Design method of exoskeleton equipment
CN110974641A (en) * 2019-12-24 2020-04-10 South-Central Minzu University Intelligent walking stick system for the blind integrating machine learning and Internet of Things technology

Also Published As

Publication number Publication date
CN109498375B (en) 2020-12-25

Similar Documents

Publication Publication Date Title
CN109498375A (en) Human motion intention recognition control device and control method
Liu et al. Development of an environment-aware locomotion mode recognition system for powered lower limb prostheses
Zhang et al. Static and dynamic human arm/hand gesture capturing and recognition via multiinformation fusion of flexible strain sensors
CN104571837B (en) Method and system for realizing human-computer interaction
US20170119553A1 (en) A haptic feedback device
CN108478189A (en) Human exoskeleton mechanical arm control system and method based on EEG signals
CN106530926A (en) Virtual hand prosthesis training platform and training method thereof based on Myo armband and eye tracking
Wu et al. Gait phase classification for a lower limb exoskeleton system based on a graph convolutional network model
CN106774926B (en) Virtual reality interactive glove system and virtual reality system
Yang et al. Experimental study of an EMG-controlled 5-DOF anthropomorphic prosthetic hand for motion restoration
CN106215380A (en) Limb rehabilitation training system
CN105014676A (en) Robot motion control method
Tang et al. Wearable supernumerary robotic limb system using a hybrid control approach based on motor imagery and object detection
Luo et al. Research of intent recognition in rehabilitation robots: a systematic review
CN110251372A (en) Walk-aiding exoskeleton gait adjusting method based on intelligent crutch
Chapman et al. A wearable, open-source, lightweight forcemyography armband: on intuitive, robust muscle-machine interfaces
Sabuj et al. Human robot interaction using sensor based hand gestures for assisting disable people
Law et al. A cap as interface for wheelchair control
Zhou et al. A survey of the development of wearable devices
Vishal et al. Sign language to speech conversion
CN204748634U (en) Motion control system of robot
CN106843483A (en) Virtual reality device and control method thereof
Long et al. Review of human-exoskeleton control strategy for lower limb rehabilitation exoskeleton
Zhang et al. Integrating intention-based systems in human-robot interaction: a scoping review of sensors, algorithms, and trust
CN109343702A (en) Portable VR motion input method based on a smart ring and nodes

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant