CN113902048A - Human motion posture recognition method and wearable exoskeleton - Google Patents

Human motion posture recognition method and wearable exoskeleton

Info

Publication number
CN113902048A
CN113902048A
Authority
CN
China
Prior art keywords
state
motion
data
human motion
human
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110792447.7A
Other languages
Chinese (zh)
Inventor
林西川
魏巍
查士佳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Maybe Intelligent Technology Suzhou Co ltd
Original Assignee
Maybe Intelligent Technology Suzhou Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Maybe Intelligent Technology Suzhou Co ltd filed Critical Maybe Intelligent Technology Suzhou Co ltd
Priority to CN202110792447.7A
Publication of CN113902048A
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/243 Classification techniques relating to the number of classes
    • G06F 18/24323 Tree-organised classifiers
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1116 Determining posture transitions
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1118 Determining activity level
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1121 Determining geometric values, e.g. centre of rotation or angular range of movement
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7235 Details of waveform analysis
    • A61B 5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7235 Details of waveform analysis
    • A61B 5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B 5/7267 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Medical Informatics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Physiology (AREA)
  • Pathology (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Psychiatry (AREA)
  • Signal Processing (AREA)
  • Geometry (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a human motion posture recognition method and a wearable exoskeleton. The recognition method comprises the following steps: collecting and transmitting human motion data; training on the received human motion data to obtain a waist state recognition model; and filtering the output of the waist state recognition model to obtain the actual motion posture of the human body. The method solves the problem of recognizing the stooping state under human-machine coupling: the state recognition algorithm identifies the motion state of the human body throughout the stooping process. Meanwhile, by introducing finite-state-machine filtering, the recognized motion state under human-machine coupling matches the motion state of the human waist well, overcoming the frequent state misjudgments that occur during motion-state switching in the prior art.

Description

Human motion posture recognition method and wearable exoskeleton
Technical Field
The invention relates to the technical field of exoskeletons, and in particular to a human motion posture recognition method and a wearable exoskeleton.
Background
As a human-machine coupled robotic device, an exoskeleton must continually detect the motion intention of the wearer and issue a control strategy according to the detected intention. Detecting human motion data and motion intention generally requires external sensing equipment to measure the interaction between human and machine. Electromyography and electroencephalography devices, which detect electrical signals at the skin, are difficult to wear, fall off easily, and are unsuitable for daily living environments; therefore, common physical sensing modes such as plantar pressure sensing, angle sensing, or force sensing are more widely used.
Unlike lower-limb walking, human stooping motion exhibits no obvious periodic signal and therefore cannot be predicted from periodic motion data, so the usual methods for discriminating lower-limb motion states are unsuitable for discriminating the stooping state under human-machine coupling. Moreover, a state discrimination algorithm based on motion thresholds does not suit all wearers, and when its output is not filtered in accordance with the rules of human motion, misjudgment easily results. It is therefore necessary to provide a further solution to the above problems.
Disclosure of Invention
The invention aims to provide a human motion posture recognition method and a wearable exoskeleton that overcome the defects in the prior art.
In order to achieve the above object, the present invention provides a human motion posture recognition method, which comprises the following steps:
collecting and transmitting human motion data;
training the received human motion data to obtain a waist state recognition model;
and filtering the output of the obtained waist state recognition model to obtain the actual motion posture of the human body.
As an improvement of the human motion posture recognition method of the present invention, the human motion data comprise at least one of: torso motion data, lower limb motion data, foot motion data, and interaction force data of lower limb motion.
As an improvement of the human motion posture recognition method of the present invention, the received human motion data is trained through a decision tree model, comprising:
A. placing the raw data of the training samples at the root of the decision tree;
B. dividing the raw data into two groups, one being training group data and the other test group data;
C. building the decision tree using the training group data, evaluating at each internal node, according to information theory, which attribute to select for further segmentation;
D. pruning the decision tree using the test group data so that each classification retains only one node; judging whether each internal node is a leaf node and, if not, establishing a new secondary branch with the new internal node as the branch root;
E. applying steps A to D recursively until all internal nodes are leaf nodes; and, after the decision tree has been classified, extracting the leaf nodes of each branch to obtain the learning rules.
As an improvement of the human motion posture recognition method of the present invention, before the received human motion data is trained, the method further comprises: preprocessing the human motion data by normalization or by interval scaling.
The expression for normalization is:
x' = (x − μ) / σ
The expression for interval scaling is:
x' = (x − min) / (max − min)
where x denotes the raw sensing data, max and min denote the maximum and minimum values of the sensing data respectively, μ is the mean of the sensing data, and σ² is its variance.
As an improvement of the human motion posture recognition method of the present invention, filtering the output of the obtained waist state recognition model comprises:
setting a plurality of sub-postures for the human stooping process;
ordering the plurality of sub-postures according to the rules of motion so that each points to its permitted successors;
and eliminating posture recognition results that do not accord with the set pointing relations.
As an improvement of the human motion posture recognition method of the present invention, the plurality of sub-postures comprise: a walking state, an upright state, a stooping state, a stoop-stop state, and a rising state.
As an improvement of the human motion posture recognition method of the present invention, ordering the plurality of sub-postures according to the rules of motion comprises:
the walking state points to the upright state;
the upright state points to the stooping state;
the stooping state points to the stoop-stop state;
the stoop-stop state points to the rising state;
the rising state points to the stoop-stop state;
the rising state points to the upright state.
As an improvement of the human motion posture recognition method of the present invention, the method is applied by a wearable exoskeleton to recognize human motion postures, and further comprises: feeding back a corresponding control strategy according to the obtained actual motion posture of the human body.
As an improvement of the human motion posture recognition method of the present invention, the control strategy comprises:
the output torque of the motion control strategy for the upright and walking states is calculated as:
τ_total = τ_DY + τ_pd + τ_sfc
wherein τ_DY is the torque calculated for the nonlinear part of the system dynamics, including the Coriolis and centrifugal forces and gravity of the system, and is used for nonlinear compensation of the system;
τ_pd = Kp·F_err + Kd·Ḟ_err is the torque calculated from the error of the system, Kp and Kd being constants;
τ_sfc = K·q̇ + f_s·sgn(q̇) is the friction compensation torque of the system; since the friction term is related to the direction of motion of the system, K is the viscous friction coefficient, q̇ is the velocity of the moving joint, and sgn(q̇) is a sign function used to compensate the static friction coefficient f_s;
the output torque of the motion control strategy for the stooping and stoop-stop states is calculated as:
τ = KP·(θ_base − θ) + KD·(θ̇_base − θ̇)
wherein KP and KD are constants, the motion control torque being adjusted according to the tracking error of the system, and θ_base is the reference target position recorded at system initialization;
the output torque of the motion control strategy for the rising state is calculated as:
τ = τ_DY + τ_PID + τ_sfc + τ_force
which consists of the nonlinear compensation torque, the PID controller error torque, the planned torque, and the friction compensation torque, the meaning of each term being as described above.
To achieve the above object, the present invention provides a wearable exoskeleton comprising:
at least one processor;
at least one memory in which a computer program is stored; when executed by the at least one processor, the computer program performs the human motion posture recognition method described above.
Compared with the prior art, the invention has the following beneficial effects: the human motion posture recognition method solves the problem of recognizing the stooping state under human-machine coupling, with the state recognition algorithm identifying the motion state of the human body throughout the stooping process. Meanwhile, by introducing finite-state-machine filtering, the recognized motion state under human-machine coupling matches the motion state of the human waist well, overcoming the frequent state misjudgments that occur during motion-state switching in the prior art.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic flow chart of a human motion posture recognition method according to an embodiment of the present invention;
FIG. 2 is a schematic view of the pointing relations among the preset sub-postures during human stooping motion;
FIG. 3 is a block diagram of the control strategy fed back for the upright and walking states;
FIG. 4 is a block diagram of the control strategy fed back for the stooping and stoop-stop states;
FIG. 5 is a block diagram of the control strategy fed back for the rising state;
FIG. 6 is a graph of the exoskeleton stooping-state recognition results.
Detailed Description
The present invention is described in detail below with reference to various embodiments, but it should be understood that these embodiments are not intended to limit the present invention, and those skilled in the art should be able to make modifications and substitutions on the functions, methods, or structures of these embodiments without departing from the scope of the present invention.
As shown in FIG. 1, an embodiment of the present invention provides a human motion posture recognition method, which comprises the following steps:
and S1, collecting and sending the human body motion data.
By collecting the human motion data, a data base is provided for the subsequent human motion posture identification.
The human motion data include, but are not limited to, torso motion data (angle, angular velocity, and angular acceleration), lower limb motion data (thigh and calf angles, angular velocities, and angular accelerations), foot motion data (plantar pressure, angle, angular velocity, and angular acceleration), interaction force data of lower limb motion (the human-machine interaction forces at the thigh and at the calf), and the like.
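For concreteness, the sketch below shows one way such a sampling frame might be organized before classification; all class and field names are illustrative assumptions, not identifiers from the patent.

```python
from dataclasses import dataclass

@dataclass
class MotionSample:
    """One sampling frame of human motion data (illustrative field names)."""
    trunk_angle: float      # torso angle
    trunk_ang_vel: float    # torso angular velocity
    trunk_ang_acc: float    # torso angular acceleration
    thigh_angle: float      # lower-limb thigh angle
    calf_angle: float       # lower-limb calf angle
    foot_pressure: float    # plantar pressure
    thigh_force: float      # human-machine interaction force at the thigh
    calf_force: float       # human-machine interaction force at the calf

    def as_features(self) -> list[float]:
        # Flatten the frame into a feature vector for the classifier.
        return [self.trunk_angle, self.trunk_ang_vel, self.trunk_ang_acc,
                self.thigh_angle, self.calf_angle, self.foot_pressure,
                self.thigh_force, self.calf_force]
```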
S2: training on the received human motion data to obtain a waist state recognition model.
The received human motion data is trained through a decision tree model as follows:
A. The raw data of the training samples are placed at the root of the decision tree.
B. The raw data are divided into two groups: training group data and test group data.
C. The decision tree is built using the training group data; at each internal node, information theory is used to evaluate which attribute to select for further segmentation, which is also called node segmentation.
D. The decision tree is pruned using the test group data so that each classification retains only one node, improving prediction capability and speed. That is, after node segmentation, each internal node is checked to determine whether it is a leaf node; if not, a new secondary branch is established with the new internal node as the branch root.
E. Steps A to D are applied recursively until all internal nodes are leaf nodes. After the decision tree has been classified, the leaf nodes of each branch are extracted to obtain the learning rules.
In this way, the motion state of the human body during stooping can be recognized with the obtained waist state recognition model.
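A minimal sketch of steps A–E, assuming scikit-learn's DecisionTreeClassifier as a stand-in for the patent's decision tree and cost-complexity pruning as the pruning mechanism; the data file names and hyperparameters are placeholders.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# X: feature vectors built from motion samples; y: labeled waist states
# (0 walking, 1 upright, 2 stooping, 3 stoop-stop, 4 rising).
X = np.load("motion_features.npy")   # placeholder file names
y = np.load("waist_states.npy")

# Step B: split the raw data into a training group and a test group.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Step C: grow the tree, selecting split attributes by an
# information-theoretic criterion (entropy / information gain).
tree = DecisionTreeClassifier(criterion="entropy")
tree.fit(X_train, y_train)

# Step D (stand-in for pruning against the test group): cost-complexity
# pruning, keeping the alpha that scores best on the held-out data.
path = tree.cost_complexity_pruning_path(X_train, y_train)
best_alpha, best_score = 0.0, -1.0
for alpha in path.ccp_alphas:
    pruned = DecisionTreeClassifier(criterion="entropy", ccp_alpha=alpha)
    pruned.fit(X_train, y_train)
    score = pruned.score(X_test, y_test)
    if score > best_score:
        best_alpha, best_score = alpha, score

# Final pruned model used as the waist state recognition model.
model = DecisionTreeClassifier(criterion="entropy", ccp_alpha=best_alpha)
model.fit(X_train, y_train)
```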
In addition, to facilitate training the received human motion data with the decision tree model, the method further includes, before training: preprocessing the human motion data by normalization or by interval scaling. Preprocessing the sensing data before training improves model precision and accelerates convergence.
Specifically, the expression for normalization is:
x' = (x − μ) / σ
and the expression for interval scaling is:
x' = (x − min) / (max − min)
where x denotes the raw sensing data, max and min denote the maximum and minimum values of the sensing data respectively, μ is the mean of the sensing data, and σ² is its variance.
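A minimal sketch of the two preprocessing modes as reconstructed above; applying them column-wise, one sensing channel per column, is an assumption.

```python
import numpy as np

def normalize(x: np.ndarray) -> np.ndarray:
    """Normalization: x' = (x - mu) / sigma, where sigma**2 is the variance."""
    mu = x.mean(axis=0)
    sigma = x.std(axis=0)
    return (x - mu) / sigma

def interval_scale(x: np.ndarray) -> np.ndarray:
    """Interval scaling: x' = (x - min) / (max - min)."""
    mn, mx = x.min(axis=0), x.max(axis=0)
    return (x - mn) / (mx - mn)
```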
S3: filtering the output of the obtained waist state recognition model to obtain the actual motion posture of the human body.
As shown in FIG. 2, in order to solve the prior-art problem of frequent state misjudgment when motion states switch, step S3 introduces finite-state-machine filtering so that the recognized motion state under human-machine coupling better matches the waist motion state of the human body.
The filtering of the output of the waist state recognition model comprises the following steps:
S31: setting a plurality of sub-postures for the human stooping process;
S32: ordering the plurality of sub-postures according to the rules of motion so that each points to its permitted successors;
S33: eliminating posture recognition results that do not accord with the set pointing relations.
Specifically, the plurality of sub-postures comprise: a walking state, an upright state, a stooping state, a stoop-stop state, and a rising state. The current motion state is recognized continuously, and the state judged at the next moment can only be one of the states the current state points to, which reduces the error rate of motion-state recognition.
The specific motion-state transition rules during human stooping are: the walking state 0 points to the upright state 1; the upright state 1 points to the stooping state 2; the stooping state 2 points to the stoop-stop state 3; the stoop-stop state 3 points to the rising state 4; the rising state 4 points to the stoop-stop state 3; and the rising state 4 points to the upright state 1. The finite-state machine thus prohibits abnormal transitions, such as a direct transition from the upright state to the rising state, protecting the safety of the wearer and of the exoskeleton device, as sketched below.
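A minimal sketch of the finite-state-machine filter under the transition rules above; allowing each state to persist (a self-loop) is an assumption, since the patent only lists the permitted switches.

```python
# Allowed successor states (0 walking, 1 upright, 2 stooping,
# 3 stoop-stop, 4 rising); self-loops let a state persist.
ALLOWED = {
    0: {0, 1},
    1: {1, 2},
    2: {2, 3},
    3: {3, 4},
    4: {4, 3, 1},
}

def fsm_filter(raw_states: list[int], start: int = 0) -> list[int]:
    """Reject classifier outputs that are not permitted successors of the
    current state; the machine holds its state until a legal transition."""
    state, filtered = start, []
    for s in raw_states:
        if s in ALLOWED[state]:
            state = s          # legal transition (or staying put)
        # an illegal prediction is discarded; the previous state is kept
        filtered.append(state)
    return filtered
```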
In addition, when the human motion posture recognition method of this embodiment is applied by a wearable exoskeleton to recognize human motion postures, the method further includes: feeding back a corresponding control strategy according to the obtained actual motion posture of the human body.
Specifically, after the corresponding human state is recognized, the corresponding control strategy is issued as follows. The output torque of the motion control strategy for the upright and walking states is calculated as:
τ_total = τ_DY + τ_pd + τ_sfc
which consists mainly of a nonlinear compensation torque, a PD controller error torque, and a friction compensation torque.
Here τ_DY is the torque calculated for the nonlinear part of the system dynamics, mainly comprising the Coriolis and centrifugal forces and gravity of the system, and is used for nonlinear compensation of the system.
τ_pd = Kp·F_err + Kd·Ḟ_err is the torque calculated from the error of the system; Kp and Kd are constants.
τ_sfc = K·q̇ + f_s·sgn(q̇) is the friction compensation torque of the system. Since the friction term is related to the direction of motion of the system, K is the viscous friction coefficient, q̇ is the velocity of the moving joint, and sgn(q̇) is a sign function used to compensate the static friction coefficient f_s.
The output torque of the motion control strategy for the stooping and stoop-stop states is calculated as:
τ = KP·(θ_base − θ) + KD·(θ̇_base − θ̇)
where KP and KD are constants, the motion control torque is adjusted according to the tracking error of the system, and θ_base is the reference target position recorded at system initialization.
The output torque of the motion control strategy for the rising state is calculated as:
τ = τ_DY + τ_PID + τ_sfc + τ_force
which consists mainly of the nonlinear compensation torque, the PID controller error torque, the planned torque, and the friction compensation torque; the meaning of each term is as described above. A sketch of these three control laws follows.
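The sketch below collects the three control laws into functions, assuming the reconstructed expressions above; the gain values are placeholders, and τ_DY, τ_PID, and τ_force are taken as inputs computed elsewhere.

```python
import numpy as np

def tau_upright_walking(tau_dy, f_err, f_err_dot, q_dot,
                        kp=1.0, kd=0.1, k_visc=0.05, f_static=0.2):
    """Upright/walking: tau_total = tau_DY + tau_pd + tau_sfc."""
    tau_pd = kp * f_err + kd * f_err_dot                   # PD torque on the error
    tau_sfc = k_visc * q_dot + f_static * np.sign(q_dot)   # viscous + static friction
    return tau_dy + tau_pd + tau_sfc

def tau_stoop(theta, theta_dot, theta_base, KP=1.0, KD=0.1):
    """Stooping/stoop-stop: PD hold toward the initial reference position
    theta_base (whose velocity is zero)."""
    return KP * (theta_base - theta) - KD * theta_dot

def tau_rising(tau_dy, tau_pid, tau_sfc, tau_force):
    """Rising: tau = tau_DY + tau_PID + tau_sfc + tau_force."""
    return tau_dy + tau_pid + tau_sfc + tau_force
```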
The control block diagram for the rising state is shown in FIG. 5, in which τ_DY and τ_sfc are consistent with the control block diagram of FIG. 4: both are feedforward compensation controls, used to compensate the dynamic model and the friction model of the system. For the interaction-force control part, a PD block and a PID block form a double closed-loop control algorithm that serves as the interaction-force planning control. The PD block, which mainly keeps the interaction force at its preset value, constitutes the interaction-force closed loop, and its output is the planned torque τ_force. After τ_force is input into the system as feedforward control, the PID block is introduced into the control system to prevent tracking errors in the interaction force and achieve a better following effect.
To verify the technical effect of the human motion posture recognition method of this embodiment, recognition results are shown in FIG. 6. Curve A shows the human-machine coupled waist motion states recognized by the decision tree model alone, without finite-state-machine filtering; frequent state misjudgments occur at motion-state switches.
Curve B shows the result after the finite-state filtering algorithm is applied: the human-machine coupled waist motion states exhibit clear periodic characteristics, and no state misjudgment occurs. The data filtered by the finite-state machine agree closely with the originally given motion states, indicating that the actual motion state under human-machine coupling identified by the algorithm matches the waist motion state of the human body well, and providing a basis for the subsequent motion-state-based control algorithm.
Based on the same technical concept, another embodiment of the present invention provides a wearable exoskeleton comprising at least one processor and at least one memory.
A computer program is stored in the at least one memory; when executed by the at least one processor, the computer program performs the human motion posture recognition method described in the above embodiments.
In addition, sensors capable of collecting the human motion data are integrated in the wearable exoskeleton. The data collected by the sensors are sent to the memory, and the processor reads the stored sensing data and executes the corresponding human motion posture recognition method.
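An illustrative processing loop tying the embodiment together; read_sample, controller, and the reuse of the ALLOWED transition table from the finite-state-machine sketch are assumptions standing in for the device's actual components.

```python
def recognition_loop(read_sample, model, controller):
    """Sense -> classify -> finite-state filter -> feed back a control strategy."""
    state = 0  # assume the wearer starts in the walking state
    while True:
        sample = read_sample()                                # S1: collect data
        raw = int(model.predict([sample.as_features()])[0])   # S2: classify
        if raw in ALLOWED[state]:                             # S3: FSM filter
            state = raw                                       # accept legal switch
        controller(state)                                     # control feedback
```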
In conclusion, the human motion posture recognition method of the invention solves the problem of recognizing the stooping state under human-machine coupling: the state recognition algorithm identifies the motion state of the human body throughout the stooping process. Meanwhile, by introducing finite-state-machine filtering, the recognized motion state under human-machine coupling matches the motion state of the human waist well, overcoming the frequent state misjudgments that occur during motion-state switching in the prior art.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.
Furthermore, it should be understood that although the present description refers to embodiments, not every embodiment may contain only a single embodiment, and such description is for clarity only, and those skilled in the art should integrate the description, and the embodiments may be combined as appropriate to form other embodiments understood by those skilled in the art.

Claims (10)

1. A human motion posture recognition method, characterized by comprising the following steps:
collecting and transmitting human motion data;
training the received human motion data to obtain a waist state recognition model;
and filtering the output of the obtained waist state recognition model to obtain the actual motion posture of the human body.
2. The human motion posture recognition method of claim 1, wherein the human motion data comprise at least one of: torso motion data, lower limb motion data, foot motion data, and interaction force data of lower limb motion.
3. The human motion posture recognition method of claim 1, wherein the received human motion data is trained through a decision tree model, comprising:
A. placing the raw data of the training samples at the root of the decision tree;
B. dividing the raw data into two groups, one being training group data and the other test group data;
C. building the decision tree using the training group data, evaluating at each internal node, according to information theory, which attribute to select for further segmentation;
D. pruning the decision tree using the test group data so that each classification retains only one node; judging whether each internal node is a leaf node and, if not, establishing a new secondary branch with the new internal node as the branch root;
E. applying steps A to D recursively until all internal nodes are leaf nodes; and, after the decision tree has been classified, extracting the leaf nodes of each branch to obtain the learning rules.
4. The human motion posture recognition method of claim 1 or 3, wherein before the received human motion data is trained, the method further comprises: preprocessing the human motion data by normalization or by interval scaling;
the expression for normalization being:
x' = (x − μ) / σ
and the expression for interval scaling being:
x' = (x − min) / (max − min)
wherein x denotes the raw sensing data, max and min denote the maximum and minimum values of the sensing data respectively, μ is the mean of the sensing data, and σ² is its variance.
5. The human motion posture recognition method of claim 1, wherein filtering the output of the obtained waist state recognition model comprises:
setting a plurality of sub-postures for the human stooping process;
ordering the plurality of sub-postures according to the rules of motion so that each points to its permitted successors;
and eliminating posture recognition results that do not accord with the set pointing relations.
6. The human motion posture recognition method of claim 5, wherein the plurality of sub-postures comprise: a walking state, an upright state, a stooping state, a stoop-stop state, and a rising state.
7. The human motion posture recognition method of claim 6, wherein ordering the plurality of sub-postures according to the rules of motion comprises:
the walking state points to the upright state;
the upright state points to the stooping state;
the stooping state points to the stoop-stop state;
the stoop-stop state points to the rising state;
the rising state points to the stoop-stop state;
and the rising state points to the upright state.
8. The human motion posture recognition method of claim 1, wherein the method is applied by a wearable exoskeleton to recognize human motion postures, and further comprises: feeding back a corresponding control strategy according to the obtained actual motion posture of the human body.
9. The human motion posture recognition method of claim 8, wherein the control strategy comprises:
the output torque of the motion control strategy for the upright and walking states is calculated as:
τ_total = τ_DY + τ_pd + τ_sfc
wherein τ_DY is the torque calculated for the nonlinear part of the system dynamics, including the Coriolis and centrifugal forces and gravity of the system, and is used for nonlinear compensation of the system;
τ_pd = Kp·F_err + Kd·Ḟ_err is the torque calculated from the error of the system, Kp and Kd being constants;
τ_sfc = K·q̇ + f_s·sgn(q̇) is the friction compensation torque of the system; since the friction term is related to the direction of motion of the system, K is the viscous friction coefficient, q̇ is the velocity of the moving joint, and sgn(q̇) is a sign function used to compensate the static friction coefficient f_s;
the output torque of the motion control strategy for the stooping and stoop-stop states is calculated as:
τ = KP·(θ_base − θ) + KD·(θ̇_base − θ̇)
wherein KP and KD are constants, the motion control torque being adjusted according to the tracking error of the system, and θ_base is the reference target position recorded at system initialization;
the output torque of the motion control strategy for the rising state is calculated as:
τ = τ_DY + τ_PID + τ_sfc + τ_force
which consists of the nonlinear compensation torque, the PID controller error torque, the planned torque, and the friction compensation torque, the meaning of each term being as described above.
10. A wearable exoskeleton, comprising:
at least one processor;
at least one memory in which a computer program is stored, the computer program, when executed by the at least one processor, performing the human motion posture recognition method of any one of claims 1 to 9.
CN202110792447.7A 2021-07-14 2021-07-14 Human motion posture recognition method and wearable exoskeleton Pending CN113902048A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110792447.7A CN113902048A (en) 2021-07-14 2021-07-14 Human motion posture recognition method and wearable exoskeleton

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110792447.7A CN113902048A (en) 2021-07-14 2021-07-14 Human motion posture recognition method and wearable exoskeleton

Publications (1)

Publication Number Publication Date
CN113902048A 2022-01-07

Family

ID=79187838

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110792447.7A Pending CN113902048A (en) 2021-07-14 2021-07-14 Human motion posture recognition method and wearable exoskeleton

Country Status (1)

Country Link
CN (1) CN113902048A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116442202A (en) * 2023-06-19 2023-07-18 贵州航天控制技术有限公司 Waist boosting equipment control method based on back posture information
CN116442202B (en) * 2023-06-19 2023-08-18 贵州航天控制技术有限公司 Waist boosting equipment control method based on back posture information

Similar Documents

Publication Publication Date Title
Begg et al. Support vector machines for automated gait classification
US10157313B1 (en) 3D gaze control of robot for navigation and object manipulation
CN110221699B (en) Eye movement behavior identification method of front-facing camera video source
CN106950841B (en) The PD-SMC bionic eye motion control method unrelated with model
Ji et al. Grasping mode analysis and adaptive impedance control for apple harvesting robotic grippers
Yanik et al. Use of kinect depth data and growing neural gas for gesture based robot control
Nomm et al. Monitoring of the human motor functions rehabilitation by neural networks based system with kinect sensor
Gallina et al. Progressive co-adaptation in human-machine interaction
Chalvatzaki et al. User-adaptive human-robot formation control for an intelligent robotic walker using augmented human state estimation and pathological gait characterization
CN113902048A (en) Human motion posture recognition method and wearable exoskeleton
Wang et al. Development of human-machine interface for teleoperation of a mobile manipulator
Schabron et al. Artificial neural network to detect human hand gestures for a robotic arm control
Dwivedi et al. A shared control framework for robotic telemanipulation combining electromyography based motion estimation and compliance control
CN104077591A (en) Intelligent and automatic computer monitoring system
Patel et al. EMG-based human machine interface control
CN113478462A (en) Method and system for controlling intention assimilation of upper limb exoskeleton robot based on surface electromyogram signal
CN111358659B (en) Robot power-assisted control method and system and lower limb rehabilitation robot
Wameed et al. Hand gestures robotic control based on computer vision
Chen et al. Hand tracking accuracy enhancement by data fusion using leap motion and myo armband
CN111897415B (en) Virtual artificial hand flexible visual control method based on electromyographic signals and variable stiffness control
Ding et al. A Deep Learning Model with a Self-Attention Mechanism for Leg Joint Angle Estimation across Varied Locomotion Modes
CN114594757A (en) Visual path planning method for cooperative robot
Moradi et al. Integrating Human Hand Gestures with Vision Based Feedback Controller to Navigate a Virtual Robotic Arm
CN109924984B (en) Robot motion control method and system based on human motion intention detection
Wang et al. Integrating sensor fusion for teleoperation control of anthropomorphic dual-arm robots

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination