CN110125909B - Multi-information fusion human body exoskeleton robot control protection system - Google Patents

Multi-information fusion human body exoskeleton robot control protection system

Info

Publication number
CN110125909B
CN110125909B (application CN201910427971.7A)
Authority
CN
China
Prior art keywords
cosθ
human body
sinθ
robot
exoskeleton robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910427971.7A
Other languages
Chinese (zh)
Other versions
CN110125909A (en)
Inventor
钱伟行
吴文宣
蒲文浩
张彤彤
赵泽宇
刘志林
顾雅婷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhenjiang Institute For Innovation And Development Nnu
Original Assignee
Zhenjiang Institute For Innovation And Development Nnu
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhenjiang Institute For Innovation And Development Nnu filed Critical Zhenjiang Institute For Innovation And Development Nnu
Priority to CN201910427971.7A priority Critical patent/CN110125909B/en
Publication of CN110125909A publication Critical patent/CN110125909A/en
Application granted granted Critical
Publication of CN110125909B publication Critical patent/CN110125909B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/08Detecting, measuring or recording devices for evaluating the respiratory organs
    • A61B5/0816Measuring devices for examining respiratory frequency
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/24Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316Modalities, i.e. specific diagnostic methods
    • A61B5/389Electromyography [EMG]
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/0006Exoskeletons, i.e. resembling a human figure
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Public Health (AREA)
  • Mechanical Engineering (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Robotics (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Physiology (AREA)
  • Pulmonology (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a multi-information fusion human exoskeleton robot control and protection system. The system comprises several groups of inertial measurement units and myoelectric sensors mounted on the four limbs of the human body, a respiration sensor mounted on the chest, a depth camera worn on the head, and a machine-learning processing computer that may be mounted at any position on the body. The valid data collected by each sensor at a given moment serve as the input of an LSTM neural network trained with the BPTT algorithm. The sum of the time the computer needs to run the machine-learning algorithm from that moment and the maximum delay with which the steering engines at each position of the robot receive a control signal is taken as a time interval, and the control signal representing the human motion intention at the moment after this interval is taken as the output. Training the neural network establishes a nonlinear function mapping, difficult to express analytically in engineering, by which the motion intention at the next moment is judged from the action at the current moment, so that the corresponding control signal can be output.

Description

Multi-information fusion human body exoskeleton robot control protection system
Technical Field
The invention belongs to the technical field of robots, and relates to a human exoskeleton robot control and protection system.
Background
Motor dysfunction such as hemiplegia caused by brain injury imposes a heavy burden on patients' families and on society. Correct and scientific rehabilitation training plays an important role in recovering and improving limb motor function. Timely and effective rehabilitation training has become an urgent need for patients with hemiplegia or paraplegia, but outdated rehabilitation equipment remains a major obstacle to their recovery. Combining rehabilitation medicine with robot technology has improved the effectiveness of rehabilitation training, guaranteed the intensity of movement training, and opened a new path for developing rehabilitation techniques. However, how to recognize the different motion intentions of the human body and take different measures accordingly, preventing over-excited actions driven by dangerous motion intentions from injuring the wearer or other people nearby, is still a problem that urgently needs to be solved in medical rehabilitation robotics.
Disclosure of Invention
To solve the technical problems described in the background, the invention aims to provide a multi-information fusion human exoskeleton robot control and protection system. The system flexibly uses several sensing modalities: it collects data from the inertial measurement units and myoelectric signal sensors mounted on the four limbs, establishes a machine-learning model using the information from a depth camera worn on the head, and identifies the different motion intentions of the human body in combination with a respiration sensor, thereby controlling the action of the exoskeleton robot and improving the reliability and safety of the exoskeleton robot system.
In order to achieve the technical purpose, the invention provides the following technical scheme:
the invention provides a multi-information fusion human exoskeleton robot control protection system, which comprises a plurality of groups of inertia measurement components and electromyographic signal sensors, which are arranged at four limbs of a human body, a plurality of groups of breathing sensors, which are arranged at the chest position of the human body, a depth camera, which is worn on the head and the visual field of a machine is coincident with the visual field of the human body, and a machine learning processing computer, which is arranged at any position of the human body; the exoskeleton robot comprises an inertia measurement component, a myoelectric signal sensor, a respiration sensor and a depth camera, wherein the inertia measurement component, the myoelectric signal sensor, the respiration sensor and the depth camera are respectively connected with a machine learning processing computer in a wired or wireless mode, and after the machine learning processing computer identifies different types of movement intentions, the exoskeleton robot performs corresponding control and protection according to actual conditions.
Further, in the system provided by the invention, the joint attitude angles and angular velocities acquired at a given moment by the inertial measurement units on each part of the body, the surface electromyographic signals acquired by the electromyographic sensors, the respiratory frequency and amplitude acquired by the respiration sensors, and the object position and depth information acquired by the depth camera are transmitted to the machine-learning processing computer as the input of a long short-term memory (LSTM) neural network trained by backpropagation through time (BPTT). The sum of the time consumed by the machine-learning algorithm and the maximum delay with which the steering engines (servos) at each position of the robot receive a control signal is taken as a time interval, and the control signals representing the human motion intention at the moment after this interval are taken as the output of the network. Training yields the nonlinear function mapping between input and output, thereby realizing anticipatory control of the exoskeleton robot.
Further, in the system provided by the invention, the long short-term memory (LSTM) network is trained by backpropagation through time (BPTT) to obtain the nonlinear function mapping between input and output, specifically as follows:

Let x be the input, s the hidden-layer state and o the output; $x_t$ is the network input at time t, $s_t$ the hidden-layer state at time t, $y_t$ the label at time t, and $z_t$ the weighted input of the output layer. Consistent with the recurrence below, U is the weight matrix from the input layer to the hidden layer, W the weight matrix carrying the hidden layer from the previous time step to the current one, and V the weight matrix from the hidden layer to the output layer. After unrolling the network in time, the prediction

$$\hat{o}_t = \mathrm{softmax}(z_t)$$

is used in place of o, with

$$s_t = \tanh(U x_t + W s_{t-1}), \qquad z_t = V s_t$$

Using the cross entropy E as the loss function,

$$E_t(y_t, \hat{o}_t) = -y_t \log \hat{o}_t, \qquad E(y, \hat{o}) = \sum_t E_t(y_t, \hat{o}_t)$$

The gradients in backpropagation are computed with the chain rule. For the network output $E_t$ we have

$$\frac{\partial E_t}{\partial V} = \frac{\partial E_t}{\partial \hat{o}_t}\,\frac{\partial \hat{o}_t}{\partial z_t}\,\frac{\partial z_t}{\partial V}$$

with

$$z_t = V s_t, \qquad s_t = \tanh(U x_t + W s_{t-1})$$

Thus the gradient of V can be obtained:

$$\frac{\partial E_t}{\partial V} = (\hat{o}_t - y_t) \otimes s_t$$

Because $s_t$ depends on W both directly and through $s_{t-1}$, differentiating with respect to W is a composite (chain-rule) derivation:

$$\frac{\partial E_t}{\partial W} = \sum_{k=0}^{t} \frac{\partial E_t}{\partial \hat{o}_t}\,\frac{\partial \hat{o}_t}{\partial s_t} \left( \prod_{j=k+1}^{t} \frac{\partial s_j}{\partial s_{j-1}} \right) \frac{\partial s_k}{\partial W}$$

Differentiating with respect to U is likewise a composite derivation:

$$\frac{\partial E_t}{\partial U} = \sum_{k=0}^{t} \frac{\partial E_t}{\partial \hat{o}_t}\,\frac{\partial \hat{o}_t}{\partial s_t} \left( \prod_{j=k+1}^{t} \frac{\partial s_j}{\partial s_{j-1}} \right) \frac{\partial s_k}{\partial U}$$

Furthermore, in the system provided by the invention, when differentiating $s_t$ with respect to W, all states from t back to 0 would have to be traversed if no limit were imposed; the sum can be truncated according to the scene and precision requirements:

$$\frac{\partial E_t}{\partial W} \approx \sum_{k=t-\tau}^{t} \frac{\partial E_t}{\partial \hat{o}_t}\,\frac{\partial \hat{o}_t}{\partial s_t} \left( \prod_{j=k+1}^{t} \frac{\partial s_j}{\partial s_{j-1}} \right) \frac{\partial s_k}{\partial W}$$

where τ is the truncation depth.
further, the multi-information fusion human body exoskeleton robot control protection system provided by the invention is an information acquisition and processing stage before a machine learning algorithm model is constructed, and the specific steps of the stage are as follows:
step 1: under different motion states of a human body, synchronously acquiring information of an inertia measurement assembly, an electromyographic signal sensor, a respiration sensor and a depth camera at the same or different frequencies;
step 2: preprocessing the human skin surface electromyographic signals collected by the electromyographic signal sensor, wherein the preprocessing comprises signal effectiveness detection, signal denoising, signal activity section strengthening and starting threshold value reasonable setting;
and step 3: the signal effectiveness detection is carried out on the inertia measurement assemblies arranged at the four limbs of the human body, then the random error real-time modeling and correction are carried out on the effective signals, and the accuracy of signal acquisition is ensured;
and 4, step 4: the method comprises the steps of carrying out signal validity detection on a respiration sensor arranged at a proper position such as the chest of a human body, and carrying out random error real-time modeling and correction on a valid signal to ensure the accuracy of signal acquisition;
and 5: initializing the exoskeleton robot system on the premise that information such as inertial measurement components, myoelectric sensors, respiratory sensors and the like of four limbs of a human body is effective, otherwise, restarting the components, and returning to the step 2;
step 6: and carrying out attitude calculation, speed calculation and position calculation on the exoskeletal robot system.
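The acquisition-and-validation loop of Steps 1-5 can be sketched as follows in Python. This is a minimal illustration, not the patented implementation: the driver objects (imu, emg, resp, camera), their read()/restart() methods, and the plausibility ranges are hypothetical placeholders.

```python
import numpy as np

def signal_valid(samples, lo, hi, tail=50):
    """Generic validity check: values inside a physical range and not flat-lined."""
    s = np.asarray(samples, dtype=float)
    in_range = np.all((s >= lo) & (s <= hi))
    not_stuck = np.ptp(s[-tail:]) > 1e-9   # a dead sensor repeats one value
    return bool(in_range and not_stuck)

def acquire_frame(imu, emg, resp, camera):
    """One pass of Steps 1-5: synchronous acquisition gated on validity."""
    frame = {
        "imu":  imu.read(),    # joint attitude angles and angular rates
        "emg":  emg.read(),    # surface EMG samples
        "resp": resp.read(),   # respiratory frequency and amplitude
        "rgbd": camera.read(), # object position and depth
    }
    ok = (signal_valid(frame["imu"], -2000.0, 2000.0) and
          signal_valid(frame["emg"], -5.0, 5.0) and
          signal_valid(frame["resp"], 0.0, 60.0))
    if not ok:                 # Step 5: restart the components and retry
        for dev in (imu, emg, resp):
            dev.restart()
        return None
    return frame               # a valid frame goes on to Step 6 (solving)
```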
Furthermore, after the different types of motion intention are recognized, the exoskeleton robot performs the corresponding control and protection according to the actual situation. Specifically: if the motion intention identified from the various sensors is normal behavior and the body has begun but not yet completed an action, the exoskeleton robot motion control system assists the body to complete the action while keeping its center of gravity stable, so that the motion intention is achieved without injury; if the motion intention is judged to be over-excited behavior that would cause a certain degree of harm to the wearer or other people nearby, the exoskeleton robot motion control system applies feedback control to the robot's velocity and acceleration, and parametric equations of each part of the robot, including displacement and acceleration parameters, are established.
Further, in the system provided by the invention, parametric equations of each part of the robot are established. Let the distance from the shoulder steering engine of the mechanical arm to the elbow steering engine be $L_1$, the distance from elbow to wrist $L_2$, and the distance from wrist to palm $L_3$; the retraction and extension ranges of $L_1$, $L_2$, $L_3$ are $\Phi_1$, $\Phi_2$, $\Phi_3$. Axes x, y, z form the base coordinate system of the robot, and the coordinates of the end effector of the mechanical arm can be expressed as:

$$x = L_1\cos\theta_{11}\cos\theta_{12} + L_2\cos\theta_{21}\cos\theta_{22} + L_3\cos\theta_{31}\cos\theta_{32}$$
$$y = L_1\cos\theta_{11}\sin\theta_{12} + L_2\cos\theta_{21}\sin\theta_{22} + L_3\cos\theta_{31}\sin\theta_{32}$$
$$z = L_1\sin\theta_{11} + L_2\sin\theta_{21} + L_3\sin\theta_{31}$$

where $\theta_{11}$ is the angle between $L_1$ and the plane $x_1y_1$; $\theta_{12}$ is the angle between the projection of $L_1$ onto the plane $x_1y_1$ and the axis $x_1$; $\theta_{21}$ is the angle between $L_2$ and the plane $x_2y_2$; $\theta_{22}$ is the angle between the projection of $L_2$ onto the plane $x_2y_2$ and the axis $x_2$; $\theta_{31}$ is the angle between $L_3$ and the plane $x_3y_3$; $\theta_{32}$ is the angle between the projection of $L_3$ onto the plane $x_3y_3$ and the axis $x_3$.

Differentiating the end position of the mechanical arm gives the velocity of the end position:

$$\dot{x} = -L_1\omega_{11}\sin\theta_{11}\cos\theta_{12} - L_1\omega_{12}\cos\theta_{11}\sin\theta_{12} - L_2\omega_{21}\sin\theta_{21}\cos\theta_{22} - L_2\omega_{22}\cos\theta_{21}\sin\theta_{22} - L_3\omega_{31}\sin\theta_{31}\cos\theta_{32} - L_3\omega_{32}\cos\theta_{31}\sin\theta_{32}$$
$$\dot{y} = -L_1\omega_{11}\sin\theta_{11}\sin\theta_{12} + L_1\omega_{12}\cos\theta_{11}\cos\theta_{12} - L_2\omega_{21}\sin\theta_{21}\sin\theta_{22} + L_2\omega_{22}\cos\theta_{21}\cos\theta_{22} - L_3\omega_{31}\sin\theta_{31}\sin\theta_{32} + L_3\omega_{32}\cos\theta_{31}\cos\theta_{32}$$
$$\dot{z} = L_1\omega_{11}\cos\theta_{11} + L_2\omega_{21}\cos\theta_{21} + L_3\omega_{31}\cos\theta_{31}$$

These equations establish the relationship between velocities in the robot base coordinate system and the velocities of each joint, and between the contact force of the hand with the outside world and the corresponding joints, helping the system determine the specific motion of the mechanical arm so that the exoskeleton robot can perform the corresponding control and protection according to the actual situation (a worked sketch of these equations follows).
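The position and velocity equations above can be evaluated directly. Below is a minimal NumPy sketch; the segment lengths and angles in the example are arbitrary illustration values, not parameters from the patent.

```python
import numpy as np

def end_position(L, theta):
    """End-effector position from the equations above.
    L = (L1, L2, L3); theta[i] = (theta_i1, theta_i2) for segment i."""
    L = np.asarray(L); th = np.asarray(theta)
    x = np.sum(L * np.cos(th[:, 0]) * np.cos(th[:, 1]))
    y = np.sum(L * np.cos(th[:, 0]) * np.sin(th[:, 1]))
    z = np.sum(L * np.sin(th[:, 0]))
    return np.array([x, y, z])

def end_velocity(L, theta, omega):
    """Time derivative of the position equations; omega[i] = (omega_i1, omega_i2)."""
    L = np.asarray(L); th = np.asarray(theta); w = np.asarray(omega)
    dx = np.sum(-L * w[:, 0] * np.sin(th[:, 0]) * np.cos(th[:, 1])
                - L * w[:, 1] * np.cos(th[:, 0]) * np.sin(th[:, 1]))
    dy = np.sum(-L * w[:, 0] * np.sin(th[:, 0]) * np.sin(th[:, 1])
                + L * w[:, 1] * np.cos(th[:, 0]) * np.cos(th[:, 1]))
    dz = np.sum(L * w[:, 0] * np.cos(th[:, 0]))
    return np.array([dx, dy, dz])

# Example: upper arm 0.30 m, forearm 0.25 m, hand 0.08 m
p = end_position([0.30, 0.25, 0.08], [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]])
v = end_velocity([0.30, 0.25, 0.08],
                 [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]],
                 [[0.5, 0.1], [0.4, 0.2], [0.3, 0.1]])
```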
Compared with the prior art, the invention adopting the technical scheme has the following technical effects:
according to the invention, surface electromyographic signals are introduced into the system to combine the movement intentions of the patient, and the human body electromyographic signals contain a large amount of physiological information related to the movement state of the human body, so that the combination and decomposition relation of movement modes is embodied, the movement intentions of limbs are indicated, and the accuracy of movement control is ensured by means of auxiliary control of a visual sensor. Meanwhile, different movement intentions of the human body can be identified through different respiratory frequency signals acquired by the respiratory sensor, so that different counter measures are taken for the different movement intentions of the human body, the overstimulation behavior of the human body is effectively avoided, and the safety and the reliability of the exoskeleton robot are enhanced. The invention integrates various sensors to identify human motion intention, utilizes a machine learning method to judge the current moment actions of four limbs of the human body, controls the robot to perform the next moment action, simultaneously protects the human body from being injured by overexcitation, and effectively improves the safety of human-computer interaction.
Drawings
Fig. 1 is a block diagram of the system of the present invention.
FIG. 2 is a diagram of the mounting locations of the sensors of the system of the present invention.
Fig. 3 is a block diagram of the machine learning architecture of the system of the present invention.
FIG. 4 is a diagram of a neural network architecture for the system of the present invention.
FIG. 5 is a block diagram of a neural network of the system of the present invention after deployment over time.
FIG. 6 is a model of the system machine learning algorithm of the present invention.
Fig. 7 is a block diagram of the structure of the system protection mechanism start judgment according to the present invention.
Detailed Description
The technical solutions provided by the present invention will be described in detail below with reference to specific examples, and it should be understood that the following specific embodiments are only illustrative of the present invention and are not intended to limit the scope of the present invention.
The exoskeleton robot motion control system combines an inertial assembly, human physiological signal sensors, computer vision and other modalities to identify human motion intentions and then control the motion of the exoskeleton robot. The inertial measurement units, physiological signal sensors and depth camera are installed at the corresponding parts of the body, and an output model of the exoskeleton robot system is constructed from the body's different motion-state signals through analysis of human kinematics and a machine-learning algorithm. Exploiting the bidirectionality of the exoskeleton robot system, the decision-making ability of the operator is combined with the work-execution ability of the robot: the different motion intentions of the body are recognized, coordinated motion of the body and the exoskeleton robot system is realized, and over-excited actions driven by dangerous motion intentions are prevented from injuring the wearer or other people nearby.
As shown in fig. 1, the invention provides a multi-information fusion human exoskeleton robot control and protection system. As shown in fig. 2, the electromyographic signal sensors and inertial measurement units are installed on the relevant parts of the body. Specifically: the electromyographic signal sensors are placed on the skin surface; sensors of medium or low precision can be adopted in practice, and the surfaces of muscle groups such as the biceps brachii and triceps brachii can be chosen as installation positions. The inertial measurement units are installed on the four limbs; units of low or medium precision, such as an NV-IMU200 device, can be adopted in practice. The respiration sensor may be placed at a suitable position on the chest, using either a contact sensor such as an HXB-1 respiration sensor or a non-contact detector such as a capacitive-sensing or continuous-wave radar scheme. The depth camera may be worn on the head or another position where the machine's field of view coincides with the wearer's. The machine-learning processing computer can be installed at any position on the body and is connected to the sensor assemblies by cable or transmits data over a wireless link.
The specific steps of the multi-information fusion human body exoskeleton robot control protection method based on the system are as follows.
1. Collecting and preprocessing electromyographic signals:
Surface electromyographic signals are the temporal and spatial superposition of muscle electrical activity at the skin surface. The electromyographic signals are collected in a wired manner: the signals are picked up by electrode patches attached to the skin, and the electrodes are connected to the acquisition device, computer and other control equipment. Preprocessing includes signal validity detection, denoising, strengthening of the activity segments and reasonable setting of the onset threshold.
Since the contact impedance between the surface electrodes and the skin is affected by many factors such as contact tightness, skin cleanliness, humidity and seasonal variation, great attention must be paid to a high common-mode rejection ratio when designing the circuit. In practice, an electromyographic acquisition circuit with high gain, high input impedance, high common-mode rejection ratio and low power consumption can be used, such as a preamplifier built around the in-phase parallel differential three-op-amp instrumentation amplifier INA128PA. Because electrode movement, 50 Hz power-line interference and the like introduce noise, a filter such as a voltage-controlled voltage-source second-order low-pass filter can be adopted (a digital-filtering sketch follows).
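The same denoising can be prototyped digitally. The sketch below is a minimal SciPy illustration, assuming a 1 kHz sampling rate and the commonly used 20-450 Hz sEMG band; these values and the onset-detection constants are assumptions, not figures from the patent.

```python
import numpy as np
from scipy.signal import butter, iirnotch, filtfilt

FS = 1000.0  # assumed EMG sampling rate, Hz

def preprocess_emg(raw):
    """Notch out 50 Hz mains interference, then band-pass the sEMG band."""
    b_n, a_n = iirnotch(w0=50.0, Q=30.0, fs=FS)            # power-line notch
    x = filtfilt(b_n, a_n, raw)
    b_bp, a_bp = butter(2, [20.0, 450.0], btype="band", fs=FS)
    return filtfilt(b_bp, a_bp, x)

def active_segments(x, win=100, k=3.0):
    """Mark activity where the moving RMS exceeds k times the resting level."""
    rms = np.sqrt(np.convolve(x ** 2, np.ones(win) / win, mode="same"))
    return rms > k * np.median(rms)    # median approximates the resting RMS
```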
2. Signal acquisition of the inertial measurement units on the four limbs:
The output signals of the gyroscopes and piezoelectric accelerometers in the inertial sensor assemblies on the four limbs are acquired, giving the angular velocity and the acceleration in a three-dimensional coordinate system.
3. Signal acquisition of the respiration sensor:
the respiratory frequency and respiratory amplitude of the human body are obtained by installing respiratory sensors at proper positions of the chest and the like of the human body.
4. Image acquisition and processing by the depth camera:
The position and depth of objects are obtained from the camera images, whose field of view coincides with the wearer's; processing the images and fusing them with the other sensor data provides a reference for the subsequent actions.
5. Detecting the effectiveness of the inertial and respiration sensors and modeling and correcting random errors in real time:
A self-tuning regulator is adopted to estimate the parameters of the system or controller online; by identifying changes in the system and its environment in real time, the parameters can be modified automatically so that the closed-loop control system reaches the expected performance index. Specifically, a linear difference equation (possibly containing a disturbance term) representing the input-output relationship can be used as the predictive mathematical model of the system, known as a controlled autoregressive moving-average (ARMA) model; its parameters are estimated online by recursive least squares, directly yielding a self-tuning regulator with minimum output variance. For real-time control of the exoskeleton robot, the current value of the model can be viewed as a weighted sum of finitely many past values plus a weighted sum of finitely many past disturbance terms, and the model whose AIC criterion function attains its minimum is the best model. In general the order of the ARMA model of a position sequence is relatively low: when establishing the model, an upper limit on the model order is set first, the orders n and m start from small values and are increased in turn, the parameters and residual variance of each candidate ARMA model are estimated and the corresponding AIC value computed, and finally the order and parameters that minimize the AIC criterion are selected as the best-fitting exoskeleton robot model (an order-selection sketch follows).
6. Initialization of the exoskeleton robot system:
The exoskeleton robot system is initialized on the premise that the information from the inertial measurement units, myoelectric sensors and respiration sensors on the four limbs is valid: after the system starts, under static conditions, horizontal self-alignment using the accelerometer data yields the roll, pitch and yaw angles of each component in the computer, which must coincide with the angles of the specified reference frame; otherwise each component is restarted and the process returns to step 2.
7. Pose resolving and speed resolving of the exoskeleton robot system:
because the output is the pose and speed of the robot relative to the inertial space, the data obtained by the gyroscope and the accelerometer needs to be converted into a specified reference frame through calculation.
(1) Pose resolving:
Pose resolving involves transformations between coordinate systems; the mappings can be divided into translation mappings, rotation mappings and general mappings. The solution for the general mapping is given below. Let the coordinate system of inertial space be {B} and the reference coordinate system be {A}. In the most general case the origins of {B} and {A} do not coincide but are separated by a vector offset; the vector locating the origin of {B} is denoted $^{A}P_{BORG}$, and the rotation of {B} relative to {A} is described by the rotation matrix $^{A}_{B}R$. Given $^{B}P$, first transform $^{B}P$ into an intermediate coordinate system that has the same attitude as {A} and whose origin coincides with that of {B}, giving $^{A}_{B}R\,^{B}P$, and then add the origin offset:

$$^{A}P = \,^{A}_{B}R\,^{B}P + \,^{A}P_{BORG}$$

(2) Speed resolving:
Let the coordinate system of inertial space be {B} and the reference coordinate system be {A}. Considering only the linear velocity,

$$^{A}V_{Q} = \,^{A}V_{BORG} + \,^{A}_{B}R\,^{B}V_{Q}$$

which applies only when the relative orientation of {B} and {A} remains unchanged. Considering only the angular velocity,

$$^{A}V_{Q} = \,^{A}_{B}R\,^{B}V_{Q} + \,^{A}\Omega_{B} \times \,^{A}_{B}R\,^{B}Q$$

In summary, when linear and angular velocity are present simultaneously,

$$^{A}V_{Q} = \,^{A}V_{BORG} + \,^{A}_{B}R\,^{B}V_{Q} + \,^{A}\Omega_{B} \times \,^{A}_{B}R\,^{B}Q$$
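These frame mappings are direct matrix operations. The following NumPy sketch illustrates them; the 30-degree rotation about z and the offsets in the example are arbitrary values for demonstration.

```python
import numpy as np

def rot_z(a):
    """Rotation matrix about z by angle a, one simple attitude of {B} in {A}."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def map_point(R_ab, p_borg_a, p_b):
    """General mapping:  A_P = A_B_R @ B_P + A_P_BORG."""
    return R_ab @ p_b + p_borg_a

def map_velocity(R_ab, v_borg_a, omega_b_a, q_b, v_q_b):
    """Combined case:  A_V_Q = A_V_BORG + A_B_R @ B_V_Q + A_Omega_B x (A_B_R @ B_Q)."""
    return v_borg_a + R_ab @ v_q_b + np.cross(omega_b_a, R_ab @ q_b)

# Example: {B} rotated 30 degrees about z, origin offset 0.1 m along x of {A}
R = rot_z(np.deg2rad(30.0))
p_a = map_point(R, np.array([0.1, 0.0, 0.0]), np.array([0.0, 0.2, 0.0]))
v_a = map_velocity(R, np.zeros(3), np.array([0.0, 0.0, 0.5]),
                   np.array([0.0, 0.2, 0.0]), np.array([0.01, 0.0, 0.0]))
```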
8. Constructing the machine-learning algorithm model:
With the human body in different motion states such as walking or jumping, and in different emotional states such as excitement or calm, the information from the electromyographic sensors and inertial measurement units on the four limbs and from the respiration sensor on the chest is collected synchronously at the same or different frequencies and preprocessed. As shown in fig. 3, the processed information, combined with the images collected by the depth camera, serves as the input of an LSTM neural network trained with the BPTT algorithm. The sum of the time consumed by the machine-learning algorithm and the maximum delay with which the steering engines at each position of the robot receive control signals is taken as a time interval, and the control signal representing the human motion intention at the next moment is transmitted to the machine-learning processing computer as the output of the LSTM network.
FIG. 4 shows the structure of the LSTM neural network, where x is the input, s the hidden-layer state and o the output; $x_t$ is the network input at time t, $s_t$ the hidden-layer state at time t, $y_t$ the label at time t, and $z_t$ the weighted input of the output layer. Consistent with the recurrence below, U is the weight matrix from the input layer to the hidden layer, W the weight matrix carrying the hidden layer from the previous time step to the current one, and V the weight matrix from the hidden layer to the output layer. After unrolling in time, as shown in FIG. 5, the prediction

$$\hat{o}_t = \mathrm{softmax}(z_t)$$

is used in place of o, with

$$s_t = \tanh(U x_t + W s_{t-1}), \qquad z_t = V s_t$$

Using the cross entropy E as the loss function,

$$E_t(y_t, \hat{o}_t) = -y_t \log \hat{o}_t, \qquad E(y, \hat{o}) = \sum_t E_t(y_t, \hat{o}_t)$$

The backpropagated gradients are computed with the chain rule. Taking the network output $E_3$ as an example,

$$\frac{\partial E_3}{\partial V} = \frac{\partial E_3}{\partial \hat{o}_3}\,\frac{\partial \hat{o}_3}{\partial z_3}\,\frac{\partial z_3}{\partial V}$$

with

$$z_3 = V s_3, \qquad s_3 = \tanh(U x_3 + W s_2)$$

Thus the gradient of V can be obtained:

$$\frac{\partial E_3}{\partial V} = (\hat{o}_3 - y_3) \otimes s_3$$

Unlike the gradient of V, $s_t$ is a function of W and U and contains $s_{t-1}$, which cannot simply be treated as a constant during differentiation; if no limit is imposed, all states from t back to 0 must be traversed, and in practice truncation is generally performed according to the scene and precision requirements:

$$\frac{\partial E_t}{\partial W} \approx \sum_{k=t-\tau}^{t} \frac{\partial E_t}{\partial \hat{o}_t}\,\frac{\partial \hat{o}_t}{\partial s_t} \left( \prod_{j=k+1}^{t} \frac{\partial s_j}{\partial s_{j-1}} \right) \frac{\partial s_k}{\partial W}$$

where $s_3$ differentiated with respect to W is a composite (chain-rule) derivative:

$$\frac{\partial E_3}{\partial W} = \sum_{k=0}^{3} \frac{\partial E_3}{\partial \hat{o}_3}\,\frac{\partial \hat{o}_3}{\partial s_3} \left( \prod_{j=k+1}^{3} \frac{\partial s_j}{\partial s_{j-1}} \right) \frac{\partial s_k}{\partial W}$$

The gradient of U is similar:

$$\frac{\partial E_3}{\partial U} = \sum_{k=0}^{3} \frac{\partial E_3}{\partial \hat{o}_3}\,\frac{\partial \hat{o}_3}{\partial s_3} \left( \prod_{j=k+1}^{3} \frac{\partial s_j}{\partial s_{j-1}} \right) \frac{\partial s_k}{\partial U}$$
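The truncated BPTT recursion derived above can be written out numerically. The sketch below is a minimal NumPy illustration of a simple (non-gated) recurrent cell with softmax output and cross-entropy loss; the dimensions in the smoke test are arbitrary.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def bptt_gradients(xs, ys, U, W, V, trunc=4):
    """s_t = tanh(U x_t + W s_{t-1}),  o_t = softmax(V s_t); ys are one-hot.
    Returns dE/dU, dE/dW, dE/dV with the backtrack truncated to `trunc` steps."""
    T, H = len(xs), W.shape[0]
    s = np.zeros((T + 1, H))            # s[-1] doubles as the zero initial state
    o = []
    for t in range(T):                  # forward pass
        s[t] = np.tanh(U @ xs[t] + W @ s[t - 1])
        o.append(softmax(V @ s[t]))
    dU, dW, dV = np.zeros_like(U), np.zeros_like(W), np.zeros_like(V)
    for t in range(T):                  # backward pass, per-step loss E_t
        dz = o[t] - ys[t]               # dE_t/dz_t for softmax + cross entropy
        dV += np.outer(dz, s[t])
        dh = V.T @ dz                   # gradient flowing into s_t
        for k in range(t, max(-1, t - trunc), -1):   # truncated backtrack
            dpre = dh * (1.0 - s[k] ** 2)            # through tanh at step k
            dU += np.outer(dpre, xs[k])
            dW += np.outer(dpre, s[k - 1])
            dh = W.T @ dpre             # continue toward s_{k-1}
    return dU, dW, dV

# Smoke test: 3 input features, 5 hidden units, 2 output classes, 6 time steps
rng = np.random.default_rng(0)
U = rng.normal(scale=0.1, size=(5, 3))
W = rng.normal(scale=0.1, size=(5, 5))
V = rng.normal(scale=0.1, size=(2, 5))
xs = rng.normal(size=(6, 3))
ys = np.eye(2)[rng.integers(0, 2, size=6)]
dU, dW, dV = bptt_gradients(xs, ys, U, W, V)
```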
The BPTT algorithm is in fact a simple variant of the BP algorithm: the forward pass differs from that of a BP network only in that the hidden-layer information of the previous time step is added, and the backward pass propagates the accumulated residual back from the final time step.
Although simple RNNs can in theory maintain dependencies between states over long time intervals, in practice they learn only short-term dependencies. This is the "long-term dependency" problem, and RNNs with LSTM units are needed to alleviate the vanishing-gradient problem; an RNN built from LSTM units is now usually called simply an LSTM. The LSTM unit introduces a gating mechanism, controlling the information flowing through the unit with forget, input and output gates. The vanishing gradient of the simple RNN stems from the multiplicative relationship between error terms; carrying out the derivation for the LSTM shows that this multiplicative relationship becomes additive, so the vanishing gradient is mitigated.
The invention summarizes the regularities of the sensor signals together with relevant information from human kinematics and psychology. As shown in fig. 6, by constructing an LSTM neural network, the input data at the current moment are mapped to the control signals of the probable human motion intention after the maximum delay with which the steering engines at each position of the robot receive control signals, and these are transmitted to the machine-learning processing computer as the output for the current moment. Training yields the nonlinear function mapping between input and output that is difficult to express analytically in engineering, thereby realizing anticipatory control of the exoskeleton robot (a minimal model sketch follows).
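A compact PyTorch version of such an intention model might look as follows. This is a sketch under assumed dimensions: the feature width (32), window length (50), hidden size (64), number of control channels (12) and the regression loss are all illustrative choices, not values from the patent.

```python
import torch
import torch.nn as nn

class IntentLSTM(nn.Module):
    """Maps a window of fused sensor features (IMU, EMG, respiration,
    depth-camera features) to control signals one time interval ahead."""
    def __init__(self, n_features=32, n_hidden=64, n_controls=12):
        super().__init__()
        self.lstm = nn.LSTM(n_features, n_hidden, batch_first=True)
        self.head = nn.Linear(n_hidden, n_controls)

    def forward(self, x):                  # x: (batch, time, n_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])    # command after the time interval

model = IntentLSTM()
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()                     # regression on servo commands

# One hypothetical training step on a batch of sensor windows
x = torch.randn(8, 50, 32)                 # fused, preprocessed features
y = torch.randn(8, 12)                     # control signals one interval later
optim.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()                            # autograd performs BPTT
optim.step()
```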
9. Feedback system of the protection mode:
As shown in fig. 7, according to the constructed machine-learning algorithm model, the safety mode is activated if the output motion intention for the next moment indicates that the emotion is excited and that, at the current velocity and acceleration, the target of the expected action would suffer a blow or other danger. Feedback control of the exoskeleton robot is established with the kinematic formulas:

Let the distance from the shoulder steering engine of the mechanical arm to the elbow steering engine be $L_1$, the distance from elbow to wrist $L_2$, and the distance from wrist to palm $L_3$; the retraction and extension ranges of $L_1$, $L_2$, $L_3$ are $\Phi_1$, $\Phi_2$, $\Phi_3$. Axes x, y, z form the base coordinate system of the robot. The coordinates of the end effector of the mechanical arm can be expressed as:

$$x = L_1\cos\theta_{11}\cos\theta_{12} + L_2\cos\theta_{21}\cos\theta_{22} + L_3\cos\theta_{31}\cos\theta_{32}$$
$$y = L_1\cos\theta_{11}\sin\theta_{12} + L_2\cos\theta_{21}\sin\theta_{22} + L_3\cos\theta_{31}\sin\theta_{32}$$
$$z = L_1\sin\theta_{11} + L_2\sin\theta_{21} + L_3\sin\theta_{31}$$

where $\theta_{11}$ is the angle between $L_1$ and the plane $x_1y_1$; $\theta_{12}$ is the angle between the projection of $L_1$ onto the plane $x_1y_1$ and the axis $x_1$; $\theta_{21}$ is the angle between $L_2$ and the plane $x_2y_2$; $\theta_{22}$ is the angle between the projection of $L_2$ onto the plane $x_2y_2$ and the axis $x_2$; $\theta_{31}$ is the angle between $L_3$ and the plane $x_3y_3$; $\theta_{32}$ is the angle between the projection of $L_3$ onto the plane $x_3y_3$ and the axis $x_3$.

Differentiating the end position of the mechanical arm gives the velocity of the end position:

$$\dot{x} = -L_1\omega_{11}\sin\theta_{11}\cos\theta_{12} - L_1\omega_{12}\cos\theta_{11}\sin\theta_{12} - L_2\omega_{21}\sin\theta_{21}\cos\theta_{22} - L_2\omega_{22}\cos\theta_{21}\sin\theta_{22} - L_3\omega_{31}\sin\theta_{31}\cos\theta_{32} - L_3\omega_{32}\cos\theta_{31}\sin\theta_{32}$$
$$\dot{y} = -L_1\omega_{11}\sin\theta_{11}\sin\theta_{12} + L_1\omega_{12}\cos\theta_{11}\cos\theta_{12} - L_2\omega_{21}\sin\theta_{21}\sin\theta_{22} + L_2\omega_{22}\cos\theta_{21}\cos\theta_{22} - L_3\omega_{31}\sin\theta_{31}\sin\theta_{32} + L_3\omega_{32}\cos\theta_{31}\cos\theta_{32}$$
$$\dot{z} = L_1\omega_{11}\cos\theta_{11} + L_2\omega_{21}\cos\theta_{21} + L_3\omega_{31}\cos\theta_{31}$$

These equations establish the relationship between velocities in the robot base coordinate system and the velocities of each joint, and between the contact force of the hand with the outside world and the corresponding joints, helping the system determine the specific motion of the mechanical arm and facilitating control and protection (see the kinematics sketch given earlier).

From these equations, physical quantities such as the damping force and motor speed required to resist the motion are obtained. New displacement, velocity and acceleration information is acquired continuously during this process, and the resisting acceleration is adjusted through algorithms of human kinematics and motor drive, realizing timely and effective feedback control. The resisting acceleration finally applied by the exoskeleton robot stops the limb at half the displacement of the motion target, so that, while keeping the body's center of gravity stable, over-excited actions are prevented and the wearer and other people nearby are better protected (a minimal sketch of this protective stop follows).
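The protective stop can be reduced to one kinematic relation: to bring a limb moving at speed v to rest within a stopping distance d, a constant deceleration of v²/(2d) is required, with d set to half the remaining displacement toward the motion target. The sketch below illustrates this; the treatment as one-dimensional constant deceleration and the example numbers are simplifying assumptions.

```python
def protective_deceleration(v, x, x_target):
    """Deceleration that stops the limb at half the remaining displacement.
    v: current limb speed (m/s); x: current position (m); x_target: predicted
    impact point (m). From v^2 = 2*a*d with d = (x_target - x) / 2."""
    d = (x_target - x) / 2.0
    if d <= 0.0 or v <= 0.0:
        return 0.0                   # past the safe point or already at rest
    return v * v / (2.0 * d)         # required deceleration magnitude

# Example: limb moving at 1.2 m/s, 0.4 m from the predicted impact point
a_stop = protective_deceleration(v=1.2, x=0.0, x_target=0.4)   # 3.6 m/s^2
```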
The detailed description is only for illustrating the technical idea of the present invention, and the scope of the present invention is not limited thereby, and any modifications based on the technical idea of the present invention are within the scope of the present invention.

Claims (5)

1. A multi-information fusion human exoskeleton robot control and protection system, characterized in that: the system comprises several groups of inertial measurement units and electromyographic signal sensors arranged on the four limbs of the human body, several respiration sensors arranged at the chest, a depth camera worn on the head so that the machine's field of view coincides with the wearer's, and a machine-learning processing computer arranged at any position on the body; the inertial measurement units, electromyographic signal sensors, respiration sensors and depth camera are each connected to the machine-learning processing computer by wire or wirelessly, and after the machine-learning processing computer identifies the different types of motion intention, the exoskeleton robot performs the corresponding control and protection according to the actual situation;

the specific process is as follows: the joint attitude angles and angular velocities acquired at a given moment by the inertial measurement units on each part of the body, the surface electromyographic signals acquired by the electromyographic sensors, the respiratory frequency and amplitude acquired by the respiration sensors, and the object position and depth information acquired by the depth camera are transmitted to the machine-learning processing computer as the input of a long short-term memory (LSTM) neural network trained by backpropagation through time (BPTT); the sum of the time consumed by the machine-learning algorithm and the maximum delay with which the steering engines at each position of the robot receive a control signal is taken as a time interval, the control signals representing the human motion intention at the moment after this interval are taken as the output of the network, and training yields the nonlinear function mapping between input and output, thereby realizing anticipatory control of the exoskeleton robot;

the long short-term memory neural network LSTM based on the backpropagation-through-time algorithm BPTT is trained to obtain the nonlinear function mapping between input and output, specifically as follows:

let x be the input, s the hidden-layer state and o the output; $x_t$ is the network input at time t, $s_t$ the hidden-layer state at time t, $y_t$ the label at time t, and $z_t$ the weighted input of the output layer; consistent with the recurrence below, U is the weight matrix from the input layer to the hidden layer, W the weight matrix carrying the hidden layer from the previous time step to the current one, and V the weight matrix from the hidden layer to the output layer; after unrolling the network in time, the prediction

$$\hat{o}_t = \mathrm{softmax}(z_t)$$

is used in place of o, with

$$s_t = \tanh(U x_t + W s_{t-1}), \qquad z_t = V s_t$$

using the cross entropy E as the loss function,

$$E_t(y_t, \hat{o}_t) = -y_t \log \hat{o}_t, \qquad E(y, \hat{o}) = \sum_t E_t(y_t, \hat{o}_t)$$

the gradients in backpropagation are computed with the chain rule; for the network output $E_t$,

$$\frac{\partial E_t}{\partial V} = \frac{\partial E_t}{\partial \hat{o}_t}\,\frac{\partial \hat{o}_t}{\partial z_t}\,\frac{\partial z_t}{\partial V}$$

with

$$z_t = V s_t, \qquad s_t = \tanh(U x_t + W s_{t-1})$$

thus the gradient of V can be obtained:

$$\frac{\partial E_t}{\partial V} = (\hat{o}_t - y_t) \otimes s_t$$

where $s_t$ differentiated with respect to W is a composite (chain-rule) derivative,

$$\frac{\partial E_t}{\partial W} = \sum_{k=0}^{t} \frac{\partial E_t}{\partial \hat{o}_t}\,\frac{\partial \hat{o}_t}{\partial s_t} \left( \prod_{j=k+1}^{t} \frac{\partial s_j}{\partial s_{j-1}} \right) \frac{\partial s_k}{\partial W}$$

and the derivative with respect to U is likewise composite:

$$\frac{\partial E_t}{\partial U} = \sum_{k=0}^{t} \frac{\partial E_t}{\partial \hat{o}_t}\,\frac{\partial \hat{o}_t}{\partial s_t} \left( \prod_{j=k+1}^{t} \frac{\partial s_j}{\partial s_{j-1}} \right) \frac{\partial s_k}{\partial U}$$
2. The multi-information fusion human exoskeleton robot control and protection system according to claim 1, characterized in that: when differentiating $s_t$ with respect to W, all states from t back to 0 would have to be traversed if no limit were imposed; the sum can be truncated according to the scene and precision requirements:

$$\frac{\partial E_t}{\partial W} \approx \sum_{k=t-\tau}^{t} \frac{\partial E_t}{\partial \hat{o}_t}\,\frac{\partial \hat{o}_t}{\partial s_t} \left( \prod_{j=k+1}^{t} \frac{\partial s_j}{\partial s_{j-1}} \right) \frac{\partial s_k}{\partial W}$$

where τ is the truncation depth.
3. The multi-information fusion human exoskeleton robot control and protection system according to claim 1, characterized in that: an information acquisition and processing stage precedes the construction of the machine-learning algorithm model, with the following specific steps:

Step 1: with the human body in different motion states, synchronously acquire the information of the inertial measurement units, electromyographic signal sensors, respiration sensor and depth camera at the same or different frequencies;

Step 2: preprocess the surface electromyographic signals collected from the skin, including signal validity detection, denoising, strengthening of the signal activity segments and reasonable setting of the onset threshold;

Step 3: perform signal validity detection on the inertial measurement units mounted on the four limbs, then model and correct the random errors of the valid signals in real time to guarantee acquisition accuracy;

Step 4: perform signal validity detection on the respiration sensor mounted at a suitable position such as the chest, and model and correct the random errors of the valid signal in real time to guarantee acquisition accuracy;

Step 5: initialize the exoskeleton robot system on the premise that the information from the inertial measurement units, myoelectric sensors and respiration sensors on the four limbs is valid; otherwise restart the components and return to Step 2;

Step 6: carry out attitude, velocity and position solving for the exoskeleton robot system.
4. The multi-information fusion human exoskeleton robot control and protection system according to claim 1, characterized in that: after the different types of motion intention are recognized, the exoskeleton robot performs the corresponding control and protection according to the actual situation, specifically: if the motion intention identified from the various sensors is normal behavior and the body has begun but not yet completed an action, the exoskeleton robot motion control system assists the body to complete the action while keeping its center of gravity stable, so that the motion intention is achieved without injury; if the motion intention is judged to be over-excited behavior that would cause a certain degree of harm to the wearer or other people nearby, the exoskeleton robot motion control system applies feedback control to the robot's velocity and acceleration, and parametric equations of each part of the robot, including displacement and acceleration parameters, are established.
5. The multi-information fusion human exoskeleton robot control and protection system according to claim 4, characterized in that: parametric equations of each part of the robot are established; let the distance from the shoulder steering engine of the mechanical arm to the elbow steering engine be $L_1$, the distance from elbow to wrist $L_2$, and the distance from wrist to palm $L_3$; the retraction and extension ranges of $L_1$, $L_2$, $L_3$ are $\Phi_1$, $\Phi_2$, $\Phi_3$; axes x, y, z form the base coordinate system of the robot, and the coordinates of the end effector of the mechanical arm can be expressed as:

$$x = L_1\cos\theta_{11}\cos\theta_{12} + L_2\cos\theta_{21}\cos\theta_{22} + L_3\cos\theta_{31}\cos\theta_{32}$$
$$y = L_1\cos\theta_{11}\sin\theta_{12} + L_2\cos\theta_{21}\sin\theta_{22} + L_3\cos\theta_{31}\sin\theta_{32}$$
$$z = L_1\sin\theta_{11} + L_2\sin\theta_{21} + L_3\sin\theta_{31}$$

where $\theta_{11}$ is the angle between $L_1$ and the plane $x_1y_1$; $\theta_{12}$ is the angle between the projection of $L_1$ onto the plane $x_1y_1$ and the axis $x_1$; $\theta_{21}$ is the angle between $L_2$ and the plane $x_2y_2$; $\theta_{22}$ is the angle between the projection of $L_2$ onto the plane $x_2y_2$ and the axis $x_2$; $\theta_{31}$ is the angle between $L_3$ and the plane $x_3y_3$; $\theta_{32}$ is the angle between the projection of $L_3$ onto the plane $x_3y_3$ and the axis $x_3$;

differentiating the end position of the mechanical arm gives the velocity of the end position:

$$\dot{x} = -L_1\omega_{11}\sin\theta_{11}\cos\theta_{12} - L_1\omega_{12}\cos\theta_{11}\sin\theta_{12} - L_2\omega_{21}\sin\theta_{21}\cos\theta_{22} - L_2\omega_{22}\cos\theta_{21}\sin\theta_{22} - L_3\omega_{31}\sin\theta_{31}\cos\theta_{32} - L_3\omega_{32}\cos\theta_{31}\sin\theta_{32}$$
$$\dot{y} = -L_1\omega_{11}\sin\theta_{11}\sin\theta_{12} + L_1\omega_{12}\cos\theta_{11}\cos\theta_{12} - L_2\omega_{21}\sin\theta_{21}\sin\theta_{22} + L_2\omega_{22}\cos\theta_{21}\cos\theta_{22} - L_3\omega_{31}\sin\theta_{31}\sin\theta_{32} + L_3\omega_{32}\cos\theta_{31}\cos\theta_{32}$$
$$\dot{z} = L_1\omega_{11}\cos\theta_{11} + L_2\omega_{21}\cos\theta_{21} + L_3\omega_{31}\cos\theta_{31}$$

these equations establish the relationship between velocities in the robot base coordinate system and the velocities of each joint, and between the contact force of the hand with the outside world and the corresponding joints, helping the system determine the specific motion of the mechanical arm so that the exoskeleton robot can perform the corresponding control and protection according to the actual situation.
CN201910427971.7A 2019-05-22 2019-05-22 Multi-information fusion human body exoskeleton robot control protection system Active CN110125909B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910427971.7A CN110125909B (en) 2019-05-22 2019-05-22 Multi-information fusion human body exoskeleton robot control protection system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910427971.7A CN110125909B (en) 2019-05-22 2019-05-22 Multi-information fusion human body exoskeleton robot control protection system

Publications (2)

Publication Number Publication Date
CN110125909A CN110125909A (en) 2019-08-16
CN110125909B true CN110125909B (en) 2022-04-22

Family

ID=67572186

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910427971.7A Active CN110125909B (en) 2019-05-22 2019-05-22 Multi-information fusion human body exoskeleton robot control protection system

Country Status (1)

Country Link
CN (1) CN110125909B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112706158B (en) * 2019-10-25 2022-05-06 中国科学院沈阳自动化研究所 Industrial man-machine interaction system and method based on vision and inertial navigation positioning
CN110916970B (en) * 2019-11-18 2021-09-21 南京伟思医疗科技股份有限公司 Device and method for realizing cooperative motion of weight-reducing vehicle and lower limb robot through communication
CN111652155A (en) * 2020-06-04 2020-09-11 北京航空航天大学 Human body movement intention identification method and system
CN112621714A (en) * 2020-12-02 2021-04-09 上海微电机研究所(中国电子科技集团公司第二十一研究所) Upper limb exoskeleton robot control method and device based on LSTM neural network
CN113459102B (en) * 2021-07-09 2022-07-05 郑州大学 Human upper limb intention identification method based on projection reconstruction
CN114366559A (en) * 2021-12-31 2022-04-19 华南理工大学 Multi-mode sensing system for lower limb rehabilitation robot

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106539573A (en) * 2016-11-25 2017-03-29 惠州市德赛工业研究院有限公司 A kind of Intelligent bracelet and the bracelet based reminding method based on user preference
CN108670244A (en) * 2018-05-29 2018-10-19 浙江大学 A kind of wearable physiology of flexible combination formula and psychological condition monitoring device
CN108875601A (en) * 2018-05-31 2018-11-23 郑州云海信息技术有限公司 Action identification method and LSTM neural network training method and relevant apparatus
CN109394476A (en) * 2018-12-06 2019-03-01 上海神添实业有限公司 The automatic intention assessment of brain flesh information and upper limb intelligent control method and system
CN109528450A (en) * 2019-01-24 2019-03-29 郑州大学 A kind of exoskeleton rehabilitation robot of motion intention identification
CN109549821A (en) * 2018-12-30 2019-04-02 南京航空航天大学 The exoskeleton robot assisted control system and method merged based on electromyography signal and inertial navigation signal

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015095611A2 (en) * 2013-12-18 2015-06-25 University Of Florida Research Foundation, Inc. Closed-loop hybrid orthotic system for rehabilitation and functional mobility assistance

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106539573A (en) * 2016-11-25 2017-03-29 惠州市德赛工业研究院有限公司 A kind of Intelligent bracelet and the bracelet based reminding method based on user preference
CN108670244A (en) * 2018-05-29 2018-10-19 浙江大学 A kind of wearable physiology of flexible combination formula and psychological condition monitoring device
CN108875601A (en) * 2018-05-31 2018-11-23 郑州云海信息技术有限公司 Action identification method and LSTM neural network training method and relevant apparatus
CN109394476A (en) * 2018-12-06 2019-03-01 上海神添实业有限公司 The automatic intention assessment of brain flesh information and upper limb intelligent control method and system
CN109549821A (en) * 2018-12-30 2019-04-02 南京航空航天大学 The exoskeleton robot assisted control system and method merged based on electromyography signal and inertial navigation signal
CN109528450A (en) * 2019-01-24 2019-03-29 郑州大学 A kind of exoskeleton rehabilitation robot of motion intention identification

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Research on the Cloud-Brain Architecture and Learning Algorithms of Exoskeleton Robots; Fan Min; China Master's Theses Full-text Database, Information Science and Technology; 2018-09-15; full text *
A Deep Model for Human Activity Recognition Using Motion Sensors; Teng Qianli et al.; Journal of Xi'an Jiaotong University; 2018-08; Vol. 52, No. 8; full text *

Also Published As

Publication number Publication date
CN110125909A (en) 2019-08-16

Similar Documents

Publication Publication Date Title
CN110125909B (en) Multi-information fusion human body exoskeleton robot control protection system
Wilson et al. Formulation of a new gradient descent MARG orientation algorithm: Case study on robot teleoperation
Vogel et al. EMG-based teleoperation and manipulation with the DLR LWR-III
Wei et al. Synergy-based control of assistive lower-limb exoskeletons by skill transfer
US9278453B2 (en) Biosleeve human-machine interface
Cifuentes et al. Multimodal human–robot interaction for walker-assisted gait
Artemiadis et al. EMG-based teleoperation of a robot arm in planar catching movements using ARMAX model and trajectory monitoring techniques
CN111773027A (en) Flexibly-driven hand function rehabilitation robot control system and control method
Kiguchi et al. Motion estimation based on EMG and EEG signals to control wearable robots
Li et al. Active human-following control of an exoskeleton robot with body weight support
Dwivedi et al. A shared control framework for robotic telemanipulation combining electromyography based motion estimation and compliance control
CN113043248A (en) Transportation and assembly whole-body exoskeleton system based on multi-source sensor and control method
Xiao Proportional myoelectric and compensating control of a cable-conduit mechanism-driven upper limb exoskeleton
Yahya et al. Accurate shoulder joint angle estimation using single RGB camera for rehabilitation
El-Gohary et al. Joint angle tracking with inertial sensors
Wang et al. Prediction of contralateral lower-limb joint angles using vibroarthrography and surface electromyography signals in time-series network
Szczurek et al. Enhanced Human-Robot Interface with Operator Physiological Parameters Monitoring and 3D Mixed Reality
He et al. An sEMG based adaptive method for human-exoskeleton collaboration in variable walking environments
Morón et al. EMG-based hand gesture control system for robotics
Kim et al. Real-time motion artifact detection and removal for ambulatory BCI
Tham et al. Biomechanical ambulatory assessment of 3D knee angle using novel inertial sensor-based technique
Masters et al. Real-time arm tracking for HMI applications
Ruiz et al. Exoskeleton-based robotic platform applied in biomechanical modelling of the human upper limb
Passon et al. Hybrid inertial-robotic motion tracking for posture biofeedback in upper limb rehabilitation
Spasojević et al. Kinect-based application for progress monitoring of the stroke patients

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant