CN109498375B - Human motion intention recognition control device and control method - Google Patents

Human motion intention recognition control device and control method

Info

Publication number
CN109498375B
CN109498375B (application CN201811404555.7A)
Authority
CN
China
Prior art keywords
motion information
action
exoskeleton robot
lower limb
trigger
Prior art date
Legal status
Active
Application number
CN201811404555.7A
Other languages
Chinese (zh)
Other versions
CN109498375A (en)
Inventor
侯磊
周玉凯
赵恩盛
邱静
程洪
黄璞玉
陈晨
Current Assignee
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China filed Critical University of Electronic Science and Technology of China
Priority to CN201811404555.7A priority Critical patent/CN109498375B/en
Publication of CN109498375A publication Critical patent/CN109498375A/en
Application granted granted Critical
Publication of CN109498375B publication Critical patent/CN109498375B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H - PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H3/00 - Appliances for aiding patients or disabled persons to walk about
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H - PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H3/00 - Appliances for aiding patients or disabled persons to walk about
    • A61H2003/005 - Appliances for aiding patients or disabled persons to walk about with knee, leg or stump rests
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H - PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H3/00 - Appliances for aiding patients or disabled persons to walk about
    • A61H2003/007 - Appliances for aiding patients or disabled persons to walk about secured to the patient, e.g. with belts
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 - Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 - Indexing scheme relating to G06F3/01
    • G06F2203/011 - Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns

Abstract

The invention relates to a human motion intention recognition control device comprising: a motion information acquisition module, bound on the arms of the wearer, which collects and transmits human motion information in real time; and a lower limb walking-aid exoskeleton robot, worn on the lower body of a paraplegic, hemiplegic, or amputee patient, which is in communication connection with the motion information acquisition module, receives the data it transmits, analyzes and processes the data, and then controls execution of the corresponding action. The advantages of the invention are that the recognition algorithm improves system accuracy, capturing control signals precisely while effectively reducing the probability of false triggering of the exoskeleton robot; the device can be adapted to any exoskeleton wearer; and the training period for a patient learning to wear the exoskeleton is greatly shortened.

Description

Human motion intention recognition control device and control method
Technical Field
The invention relates to the field of exoskeleton robot control, in particular to a human motion intention recognition control device and a control method.
Background
An exoskeleton robot is a wearable human-machine integrated mechanical device that combines technologies from multiple fields, including artificial intelligence, robotics, bionics, and ergonomics. Its main function is to extend or enhance human physiological capability, providing assistance, protection, and support to the body; it is currently applied mainly in the medical rehabilitation and military fields. For example, a paraplegic patient can stand again with the help of a paraplegia walking-aid exoskeleton robot, which drives the body to walk for rehabilitation training and slows the atrophy of the lower limb muscles.
Control instructions for existing lower limb walking-aid exoskeleton robots are basically issued through keys: actions such as standing, walking, and stopping are triggered through button-like devices. The well-known foreign ReWalk exoskeleton walking-aid robot, for example, is controlled by buttons on the left and right crutches, through which standing up, walking, and sitting down are commanded; the exoskeleton robots currently researched in China use similar control schemes. Although key control is safe, the exoskeleton can hardly match the pace of a normal person, because every action must be triggered by a key press. Moreover, as functions are added to the exoskeleton robot, the number of keys inevitably grows and the control scheme becomes complicated. Since the lower limb walking-aid exoskeleton serves both rehabilitation training and the needs of daily life, the upper limbs already carry a heavy load in keeping the wearer balanced, yet the hands must also press the keys that control the exoskeleton's movement. To prevent accidental presses, each hand basically operates one button, so at most three actions, standing up, sitting down, and walking, can be triggered by this arrangement. Consequently, when new functions are added to a paraplegia walking-aid exoskeleton robot, the button-based human-machine interaction mode no longer suits the wearer, and a new control mode is needed.
Patent publication CN 104188675 A describes a paraplegia walking-aid exoskeleton robot system and its control method; from the viewpoint of human-machine interaction control, however, that patent does not improve the interaction mode. It merely achieves controllability and cannot solve the control problem of a multifunctional paraplegia walking-aid exoskeleton robot. Patent publication CN 107832686 A describes a lower limb movement pattern recognition method fusing surface electromyography (sEMG) and acceleration signals: the acquired sEMG and acceleration signals of the lower limbs are separated by an algorithm to recognize human intention. For a patient with complete paraplegia, however, the lower limb muscles produce essentially no force, so no usable data can be acquired; furthermore, sEMG signals are strongly affected by sweating and movement and are very unstable while the body is in motion, so the practicability of that method is limited.
Disclosure of Invention
The aim of the invention is to overcome the above defects of the prior art by providing a human motion intention recognition control device and control method: a recognition system that can be adapted to any paraplegia walking-aid exoskeleton robot, making the interaction between the wearer and the exoskeleton more natural.
The purpose of the invention is realized by the following technical scheme: a human motion intention recognition control device, comprising:
motion information acquisition module: the device is bound on the arms of the human body, so that the motion information of the human body is collected and sent in real time;
lower limb walking-aid exoskeleton robot: worn on the lower body of a paraplegic, hemiplegic, or amputee patient; in communication connection with the motion information acquisition module; receives the data sent by the module, analyzes and processes it, and then controls execution of the corresponding action.
The motion information acquisition module comprises an elastic band, an attitude instrument chip, a wireless transmission chip, a power supply module, a switch, and a box; the attitude instrument chip, wireless transmission chip, power supply module, and switch are integrated into the box, and the switch is mounted externally on the elastic band; the box is bound on the arm through the elastic band and fixed with a hook-and-loop strap.
A human motion intention recognition control method comprises the following steps:
S1, the motion information acquisition module acquires motion information of the human body in real time and sends it to the lower limb walking-aid exoskeleton robot;
S2, the main control chip in the lower limb walking-aid exoskeleton robot analyzes and processes the received motion information, and then triggers and controls the lower limb walking-aid exoskeleton robot to execute the corresponding action.
The specific content of step S2 is as follows:
S21, a clustering algorithm operation is performed on the collected motion information and the actions the wearer intends to trigger;
S22, the Gaussian distance between the collected motion information and the center matrix is calculated, and the next trigger action to be judged is determined from the calculation result and the current state of the wearer.
Before the control method can recognize the human motion intention and control the triggering of the lower limb walking-aid exoskeleton robot to execute the corresponding action, a center matrix must be established for every trigger action, as follows:
judging whether entry of all the set trigger actions is finished;
if entry is finished, the establishment of all trigger-action center matrices is complete;
if entry is not finished, the main control chip of the lower limb walking-aid exoskeleton robot computes the center matrix of the current action, adds the not-yet-established center matrix to the center matrix list, and then acquires and records motion information again until the establishment of the center matrices is finished.
Before an action is triggered, it must be judged whether the similarity between the input motion information and the trigger action of the corresponding center matrix meets the trigger requirement, as follows:
if the trigger requirement is met, the corresponding action of the corresponding center matrix is triggered;
if the trigger requirement is not met, motion information data must continue to be acquired.
The specific steps for entering the set trigger actions are as follows:
setting the lower limb walking-aid exoskeleton robot to enter a debugging mode;
entering each trigger action one by one, holding each trigger action for a certain time;
after an entry succeeds, the center matrix corresponding to that action is established, and the next action is entered;
after all actions have been entered successfully, the establishment of all trigger-action center matrices is complete.
The invention has the following advantages:
1. Existing lower limb walking-aid exoskeleton robots are controlled through keys on the crutches; a personalized, customizable trigger mode like the one proposed here has never been used.
2. Compared with key control, a control mode that recognizes human intention through a learned model is more natural. Key control lengthens the control cycle of the exoskeleton robot and prevents its walking speed from increasing; intention-based control effectively solves this problem by changing the human-machine interaction mode itself, so that instead of the wearer actively adapting to the exoskeleton, the exoskeleton actively adapts to different wearers.
3. The accuracy of the system is improved through the algorithm, and the probability of false triggering of the exoskeleton robot is effectively reduced while the control signal can be accurately captured.
4. All exoskeleton robot wearers can be adapted, and the training period of the exoskeleton robot worn by the patient is greatly shortened.
Drawings
FIG. 1 is a schematic view of the device of the present invention in a worn configuration;
FIG. 2 is a block diagram of the internal components of the human motion information collection module;
FIG. 3 is a flow chart of center matrix creation;
FIG. 4 is a flowchart of a trigger determination;
In the figure: elastic band 1, attitude instrument chip 2, wireless transmission chip 3, power supply module 4, switch 5, box 6, and lower limb walking-aid exoskeleton robot 7.
Detailed Description
The invention will be further described with reference to the accompanying drawings, but the scope of the invention is not limited to the following.
As shown in fig. 1 and 2, a human motion intention recognition control apparatus includes:
motion information acquisition module: the device is bound on the arms of the human body, so that the motion information of the human body is collected and sent in real time;
lower limb walking-aid exoskeleton robot: worn on the lower body of a paraplegic, hemiplegic, or amputee patient; in communication connection with the motion information acquisition module; receives the data sent by the module, analyzes and processes it, and then controls execution of the corresponding action.
The motion information acquisition module comprises an elastic band 1, an attitude instrument chip 2, a wireless transmission chip 3, a power supply module 4, a switch 5, and a box 6; the attitude instrument chip 2, wireless transmission chip 3, power supply module 4, and switch 5 are integrated into the box 6, and the switch 5 is mounted externally on the elastic band 1; the box 6 is bound on the arm through the elastic band 1 and fixed with a hook-and-loop strap.
Preferably, the attitude instrument chip 2 filters noise through digital filtering to improve acquisition accuracy, and its internal integrated attitude solver, combined with a dynamic Kalman filtering algorithm, can output human motion information in a dynamic environment with an attitude measurement accuracy of 0.05 degrees. The chip is built around a nine-axis gyroscope module and outputs tri-axial acceleration, angular velocity, and angle; the serial baud rate can be set to 9600 or 115200. Compared with a six-axis module, the nine-axis module additionally provides tri-axial geomagnetic data, which effectively solves the zero-drift problem of the six-axis module's z axis (errors accumulate with the attitude instrument's service time, and the z axis is affected most severely). The attitude chip acquires the current human motion information, made more accurate by the dynamic Kalman filtering algorithm, and transmits it through the wireless transmission chip 3 to the exoskeleton robot controller, where a classification algorithm analyzes the motion information so that the various exoskeleton commands can be triggered. The wireless transmission chip 3 is an NRF24L01, which offers low power consumption, long range, and a small, easily integrated package; its operating voltage of 1.9-3.6 V also covers the nine-axis module, so the power supply module 4 is a 3.7 V polymer lithium battery with a capacity of 1000 mAh, about the size of an adult's thumb. The whole set of equipment weighs less than 120 grams, is very light and easy to wear, and causes the wearer no discomfort.
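The data path described above can be sketched in a few lines. This is a minimal illustration, not the patent's firmware: `make_feature_vector` and `smooth` are hypothetical helper names, and a simple moving average stands in for the chip's digital filtering stage (the actual device uses a dynamic Kalman filter).

```python
import numpy as np

def make_feature_vector(accel, angles):
    """Concatenate tri-axial acceleration and tri-axial angle into the
    feature vector describing the current human motion state."""
    return np.concatenate([np.asarray(accel, dtype=float),
                           np.asarray(angles, dtype=float)])

def smooth(frames, window=5):
    """Moving-average filter over a sequence of feature vectors, a toy
    stand-in for the attitude chip's digital filtering stage."""
    frames = np.asarray(frames, dtype=float)
    kernel = np.ones(window) / window
    # filter each of the six channels independently
    return np.vstack([np.convolve(frames[:, i], kernel, mode="valid")
                      for i in range(frames.shape[1])]).T
```

A stream of such six-dimensional vectors is what the wireless chip would forward to the exoskeleton controller for classification.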
A human motion intention recognition control method comprises the following steps:
S1, the motion information acquisition module acquires motion information of the human body in real time and sends it to the lower limb walking-aid exoskeleton robot;
S2, the main control chip in the lower limb walking-aid exoskeleton robot analyzes and processes the received motion information, and then triggers and controls the lower limb walking-aid exoskeleton robot to execute the corresponding action.
The specific content of step S2 is as follows:
S21, a clustering algorithm operation is performed on the collected motion information and the actions the wearer intends to trigger;
preferably, by adopting a dbss clustering algorithm, the dbss clustering algorithm can cluster points with reachable density (if an object p is in a core object q, then p is directly reachable from q) and connected density (if the object q exists, the objects p1 and p2 are both called as p1 and p2 are connected with the number density of the field objects from q when the number density of the field objects is reachable), so that a large amount of clustered data can be removed, the accuracy of clustering results can be enhanced, and interference data caused by misoperation of a wearer can be effectively removed. Because the actions which different wearers want to trigger are not unique, the final clustering result is different from person to person, so that personalized customization is realized, the established model can be adapted to any wearer, and the triggering posture of the wearer is designed according to the requirements of the wearer.
S22, the Gaussian distance between the collected motion information and the center matrix is calculated, and the next trigger action to be judged is determined from the calculation result and the current state of the wearer.
Before the control method can recognize the human motion intention and control the triggering of the lower limb walking-aid exoskeleton robot to execute the corresponding action, a center matrix must be established for every trigger action, with the Gaussian distance as the judgment basis, as follows:
judging whether entry of all the set trigger actions is finished;
if entry is finished, the establishment of all trigger-action center matrices is complete;
if entry is not finished, the main control chip of the lower limb walking-aid exoskeleton robot computes the center matrix of the current action, adds the not-yet-established center matrix to the center matrix list, and then acquires and records motion information again until the establishment of the center matrices is finished.
As shown in fig. 3, before an action is triggered, it must be judged whether the similarity between the input motion information and the trigger action of the corresponding center matrix meets the trigger requirement, as follows:
if the trigger requirement is met, the corresponding action of the corresponding center matrix is triggered;
if the trigger requirement is not met, the data is treated as noise and motion information data must continue to be acquired.
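The similarity test above might look like the following sketch, in which a Gaussian kernel of the distance to each stored center serves as the similarity score and an assumed threshold decides whether to fire the action or keep sampling. All names, `sigma`, and the threshold value are illustrative assumptions, not values from the patent.

```python
import numpy as np

def gaussian_similarity(x, center, sigma=1.0):
    """Gaussian-kernel similarity between a live feature vector and a
    stored action center: 1.0 at the center, decaying with distance."""
    d2 = float(np.sum((np.asarray(x, dtype=float)
                       - np.asarray(center, dtype=float)) ** 2))
    return float(np.exp(-d2 / (2.0 * sigma ** 2)))

def match_trigger(x, centers, threshold=0.6, sigma=1.0):
    """Score the input against every enrolled center; fire the
    best-matching action only if it clears the threshold, otherwise
    treat the sample as noise and keep acquiring data (returns None)."""
    scores = {name: gaussian_similarity(x, c, sigma)
              for name, c in centers.items()}
    name, score = max(scores.items(), key=lambda kv: kv[1])
    return name if score >= threshold else None
```

Because the score decays smoothly with distance, the threshold directly trades off sensitivity against false triggering.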
As shown in fig. 4, the specific steps for entering the set trigger actions are as follows:
setting the lower limb walking-aid exoskeleton robot to enter a debugging mode, in which the lower limb walking-aid exoskeleton robot 7 records and analyzes the data entered through the human intention recognition module;
entering each personalized trigger action one by one: wearing the intention recognition module, the wearer performs the corresponding trigger action, such as the trigger action for standing or walking; when walking, either the left hand is in front and the right hand behind, or the right hand is in front and the left hand behind, so two different actions must be entered, each held for 3 s to 5 s;
analyzing the data over this period and processing it with the DBSCAN clustering algorithm of the algorithm flow; if the data processing succeeds, the exoskeleton gives a voice prompt indicating that entry of the current action succeeded, the center matrix corresponding to that action is established, and the next action is entered;
after all actions have been entered successfully, the main control chip completes the establishment of all trigger-action center matrices.
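The entry procedure can be summarized as a small loop. Here `record_fn` and `center_fn` are hypothetical stand-ins for the recording step and the clustering step respectively, the retry limit is an assumption, and the voice prompt is reduced to a comment.

```python
def enroll_actions(action_names, record_fn, center_fn, max_retries=3):
    """Debug-mode entry loop: for each personalized trigger action,
    record the posture held for a few seconds, compute its center
    vector, and add it to the center-matrix list; re-record if the
    clustering step rejects the data as noise."""
    centers = {}
    for name in action_names:
        for _ in range(max_retries):
            samples = record_fn(name)      # wearer holds the posture 3-5 s
            center = center_fn(samples)    # e.g. a DBSCAN-style cluster center
            if center is not None:
                centers[name] = center     # voice prompt would fire here
                break
    return centers
```

The returned dictionary plays the role of the "center matrix list" that the main control chip consults at run time.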
Preferably, there are two human motion information acquisition modules, worn on the wearer's left and right upper arms respectively and fixed with hook-and-loop straps, making them convenient to wear. After wearing, the user can design personal trigger actions according to habit, such as the postures that trigger the exoskeleton commands for standing up, sitting down, and walking.
The working principle of the invention is as follows. After the attitude instrument chip 2 is powered on, an initial acceleration component and an initial angle component are produced on each axis; as position and orientation change, the acceleration and angle of each axis change correspondingly, and together they form the feature vector of the human motion state. First the motion information acquisition module is worn on the body; then the corresponding action is performed for the function to be triggered, for example, when the paraplegia walking-aid exoskeleton robot is required to stand up or walk, the wearer performs the corresponding trigger action. The attitude chip transmits the acquired data to the exoskeleton robot's processor; different trigger actions yield different data, a clustering algorithm is run on the data, and the cluster centers corresponding to the different postures are extracted. This is a machine learning process: from the learning result the cluster centers are extracted and a Markov state transition matrix is established, which can greatly reduce the probability of false touches. From the state transition matrix and the real-time input of the attitude instrument chip 2, the human motion intention can be judged.
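The false-touch suppression via a Markov state transition matrix might be gated as in the sketch below. The action set, the transition probabilities, and the threshold are invented for illustration; the patent does not publish its actual matrix.

```python
import numpy as np

ACTIONS = ["sit", "stand", "walk", "stop"]

# Assumed transition priors: row = current state, column = candidate
# next action. Zero entries mark implausible transitions (e.g. going
# straight from sitting to walking) that are vetoed outright.
TRANSITIONS = np.array([
    # sit  stand  walk  stop
    [0.2,  0.8,   0.0,  0.0],   # from sit
    [0.3,  0.1,   0.6,  0.0],   # from stand
    [0.0,  0.0,   0.7,  0.3],   # from walk
    [0.1,  0.4,   0.5,  0.0],   # from stop
])

def gated_trigger(current, candidate, similarity, threshold=0.5):
    """Fire the candidate action only if its posture similarity,
    weighted by the transition prior from the current state, clears the
    threshold; implausible transitions are suppressed as false touches."""
    i, j = ACTIONS.index(current), ACTIONS.index(candidate)
    return bool(similarity * TRANSITIONS[i, j] >= threshold)
```

Combining the posture score with the state prior is what lets a strong but contextually impossible match (e.g. a "walk" posture detected while seated) be rejected.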
The foregoing is illustrative of the preferred embodiments of the invention. It is to be understood that the invention is not limited to the precise form disclosed herein, and that various other combinations, modifications, and environments falling within the scope of the inventive concept, whether described above or apparent to those skilled in the relevant art, may be resorted to. Modifications and variations may be effected by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (2)

1. A human motion intention recognition control device is characterized in that: it includes:
motion information acquisition module: the device is bound on the arms of the human body, so that the motion information of the human body is collected and sent in real time;
the lower limb walking aid exoskeleton robot comprises: the device is worn on the lower half of the body of a paraplegic patient, a hemiplegic patient or an amputation patient, is in communication connection with the motion information acquisition module, receives data information sent by the motion information acquisition module, analyzes and processes the data information, and then controls and executes corresponding actions;
the motion information acquisition module comprises an elastic belt (1), an attitude instrument chip (2), a wireless transmission chip (3), a power supply module (4), a switch (5) and a box (6); the attitude instrument chip (2), the wireless transmission chip (3), the power supply module (4) and the switch (5) are integrated into the box (6), and the switch (5) is externally installed on the elastic belt (1); the box (6) is bound on the arm of the human body through the elastic band (1) and is fixed through a magic tape;
the control method of the human motion intention recognition control device comprises the following steps:
S1, the motion information acquisition module acquires motion information of the human body in real time and sends it to the lower limb walking-aid exoskeleton robot;
S2, the main control chip in the lower limb walking-aid exoskeleton robot analyzes and processes the received motion information and then triggers and controls the lower limb walking-aid exoskeleton robot to execute the corresponding action;
the specific content of step S2 is as follows:
s21, performing clustering algorithm operation according to the collected motion information and the action triggered by the intention of the wearer;
s22, calculating the Gaussian distance between the collected motion information and the central matrix, and waiting to judge the next trigger action according to the calculation result and the current state of the wearer;
before the control method can recognize the human motion intention and control the triggering of the lower limb walking-aid exoskeleton robot to execute the corresponding action, a center matrix must be established for every trigger action, specifically comprising:
judging whether the set input triggering action is finished or not;
if the recording is finished, finishing the establishment of all the trigger action center matrixes;
if the input is not finished, the lower limb walking assisting exoskeleton robot master control chip calculates a currently acting center matrix, adds the center matrix which is not established into a center matrix list, and then acquires and inputs the motion information again until the establishment of the center matrix is finished;
before triggering action, whether the similarity of the input motion information and the triggering action of the corresponding central matrix meets the triggering requirement needs to be judged; the method comprises the following steps:
if the triggering requirements are met, triggering corresponding actions in the corresponding central matrix;
if the trigger requirement is not met, the motion information data needs to be continuously acquired;
the specific steps of setting the logging trigger action are as follows:
setting a lower limb walking assisting exoskeleton robot to enter a debugging mode;
inputting each trigger action one by one, and keeping each trigger action for a certain time;
after the input is successful, the central matrix corresponding to the action is established, and the next action is continuously input;
and finishing the establishment of all the trigger action center matrixes after all the action input is successful.
2. A human motion intention recognition control method is characterized in that: the method comprises the following steps:
S1, the motion information acquisition module acquires motion information of the human body in real time and sends it to the lower limb walking-aid exoskeleton robot;
S2, the main control chip in the lower limb walking-aid exoskeleton robot analyzes and processes the received motion information and then triggers and controls the lower limb walking-aid exoskeleton robot to execute the corresponding action;
the specific content of step S2 is as follows:
s21, performing clustering algorithm operation according to the collected motion information and the action triggered by the intention of the wearer;
s22, calculating the Gaussian distance between the collected motion information and the central matrix, and waiting to judge the next trigger action according to the calculation result and the current state of the wearer;
before the control method can recognize the human motion intention and control the triggering of the lower limb walking-aid exoskeleton robot to execute the corresponding action, a center matrix must be established for every trigger action, specifically comprising:
judging whether the set input triggering action is finished or not;
if the recording is finished, finishing the establishment of all the trigger action center matrixes;
if the input is not finished, the lower limb walking assisting exoskeleton robot master control chip calculates a currently acting center matrix, adds the center matrix which is not established into a center matrix list, and then acquires and inputs the motion information again until the establishment of the center matrix is finished;
before an action is triggered, it must be judged whether the similarity between the entered motion information and the trigger action of the corresponding center matrix meets the trigger requirement, specifically:
if the trigger requirement is met, the action corresponding to that center matrix is triggered;
if the trigger requirement is not met, motion information data continue to be collected;
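The trigger-requirement check above can be sketched as a similarity threshold. This is an illustrative reading, not the claimed implementation; `threshold`, `sigma`, and the function names are assumptions.

```python
import numpy as np

def similarity(sample, center, sigma=1.0):
    """Gaussian-kernel similarity between a motion window and a center matrix."""
    return float(np.exp(-np.sum((sample - center) ** 2) / (2.0 * sigma ** 2)))

def maybe_trigger(sample, centers, threshold=0.9, sigma=1.0):
    """Return the action to trigger, or None if no center matrix is similar
    enough -- in which case the caller keeps collecting motion data."""
    best_name, best_sim = None, 0.0
    for name, center in centers.items():
        s = similarity(sample, center, sigma)
        if s > best_sim:
            best_name, best_sim = name, s
    return best_name if best_sim >= threshold else None
```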
the specific steps of entering the set trigger actions are as follows:
setting the lower limb walking assisting exoskeleton robot to enter a debugging mode;
entering each trigger action one by one, and holding each trigger action for a certain time;
after the entry succeeds, the center matrix corresponding to that action is established, and the next action is entered;
once all actions have been entered successfully, the establishment of all trigger-action center matrices is complete.
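The debugging-mode enrollment loop above can be sketched as follows. It is a minimal illustration under stated assumptions: the claim does not specify the clustering algorithm, so each center matrix is taken as the element-wise mean of the recorded windows (the k=1 case); `recorder`, `action_names`, and `windows_per_action` are hypothetical names.

```python
import numpy as np

def build_center_matrix(samples):
    """Cluster center of the windows recorded for one action: here simply
    the element-wise mean of the (frames, channels) sample matrices."""
    return np.mean(np.stack(samples), axis=0)

def enroll_actions(recorder, action_names, windows_per_action=20):
    """Debug-mode enrollment: record each trigger action in turn, compute
    its center matrix, and add it to the center-matrix list."""
    centers = {}
    for name in action_names:            # entry not yet complete: keep recording
        samples = [recorder(name) for _ in range(windows_per_action)]
        centers[name] = build_center_matrix(samples)
    return centers                       # entry complete: all centers established
```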
CN201811404555.7A 2018-11-23 2018-11-23 Human motion intention recognition control device and control method Active CN109498375B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811404555.7A CN109498375B (en) 2018-11-23 2018-11-23 Human motion intention recognition control device and control method


Publications (2)

Publication Number Publication Date
CN109498375A CN109498375A (en) 2019-03-22
CN109498375B true CN109498375B (en) 2020-12-25

Family

ID=65750036

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811404555.7A Active CN109498375B (en) 2018-11-23 2018-11-23 Human motion intention recognition control device and control method

Country Status (1)

Country Link
CN (1) CN109498375B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110405736B (en) * 2019-08-07 2021-01-26 广东博智林机器人有限公司 Walking aid control method and system, exoskeleton robot and terminal
CN110834329B (en) * 2019-10-16 2021-02-09 深圳市迈步机器人科技有限公司 Exoskeleton control method and device
CN112315747B (en) * 2019-11-20 2023-08-22 河南水滴智能技术有限公司 Design method of exoskeleton equipment
CN110974641A (en) * 2019-12-24 2020-04-10 中南民族大学 Intelligent walking stick system integrating machine learning and Internet of things technology for blind people

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1877340A (en) * 2005-06-09 2006-12-13 索尼株式会社 Activity recognition apparatus, method and program
CN103153356A (en) * 2010-09-17 2013-06-12 艾克索仿生技术公司 Human machine interface for human exoskeleton
CN103876756A (en) * 2014-04-18 2014-06-25 南京工程学院 Lower limb power-assisted exoskeleton robot gait pattern identification method and system
CN104188675A (en) * 2014-09-24 2014-12-10 哈尔滨工业大学 Exoskeleton robot system with human motion detecting function and control method of robot system
CN104523403A (en) * 2014-11-05 2015-04-22 陶宇虹 Method for judging lower-limb movement intentions of exoskeleton walking aid robot wearer
CN105142581A (en) * 2013-03-14 2015-12-09 埃克苏仿生公司 Machine to human interfaces for communication from a lower extremity orthotic
CN106095267A (en) * 2016-06-01 2016-11-09 惠州市德赛西威汽车电子股份有限公司 Mobile unit control method based on user view identification and system thereof
CN107538469A (en) * 2016-06-29 2018-01-05 深圳光启合众科技有限公司 The method and device of motion intention identification and power-assisted is carried out to human upper limb
CN108283569A (en) * 2017-12-27 2018-07-17 北京精密机电控制设备研究所 A kind of exoskeleton robot control system and control method
CN108764120A (en) * 2018-05-24 2018-11-06 杭州师范大学 A kind of human body specification action evaluation method
CN108888473A (en) * 2018-05-22 2018-11-27 哈尔滨工业大学 Joint of lower extremity based on wearable walk-aiding exoskeleton moves reproducing method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6966882B2 (en) * 2002-11-25 2005-11-22 Tibion Corporation Active muscle assistance device and method


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant