CN113925742A - Control method and control system of target-driven upper limb exoskeleton rehabilitation robot - Google Patents


Info

Publication number: CN113925742A (application CN202111220948.4A)
Authority
CN
China
Prior art keywords
target
patient
target object
robot
plc
Legal status: Granted
Application number
CN202111220948.4A
Other languages
Chinese (zh)
Other versions: CN113925742B
Inventor
瞿畅
张啸天
周建萍
张文波
Current Assignee
Nantong University
Original Assignee
Nantong University
Application filed by Nantong University
Priority to CN202111220948.4A
Publication of CN113925742A
Application granted
Publication of CN113925742B
Legal status: Active

Classifications

    • A61H 1/0274 — Stretching or bending or torsioning apparatus for exercising the upper limbs
    • A61H 1/0277 — Elbow
    • A61H 1/0281 — Shoulder
    • G06T 7/70 — Determining position or orientation of objects or cameras
    • H04L 69/26 — Special purpose or proprietary protocols or architectures
    • A61H 2201/1207 — Driving means with electric or magnetic drive
    • A61H 2201/1635 — Physical interface with patient: hand or arm, e.g. handle
    • A61H 2201/1638 — Holding means therefor
    • A61H 2201/165 — Wearable interfaces
    • A61H 2201/5007 — Control means thereof, computer controlled
    • A61H 2205/06 — Devices for specific parts of the body: arms
    • A61H 2205/062 — Shoulders

Abstract

The invention relates to the technical field of medical rehabilitation robots, and in particular to a control method and control system for a target-driven upper limb exoskeleton rehabilitation robot. A training target to be grasped is placed within the patient's reach; the patient gazes at the target and generates a grasping intention, and the doctor issues a system start instruction on the upper computer. The system tracks and locates the target in real time with an Azure Kinect, acquires the target's spatial coordinates, and sends them to the upper computer. The target coordinates are converted from the Azure Kinect coordinate system to the robot coordinate system through coordinate conversion, and the motion parameters of each joint of the robot body are calculated through the robot's forward and inverse kinematics and trajectory planning. Communicating with the PLC via the Snap7 protocol, the system drives the exoskeleton rehabilitation robot body to move the affected limb to grasp the target, realizing the patient's subjective movement intention and providing visual and tactile feedback. The invention improves the patient's subjective participation, fuses the patient's subjective movement consciousness with objectively acquired sensory information, and better realizes interaction between patient and robot.

Description

Control method and control system of target-driven upper limb exoskeleton rehabilitation robot
Technical Field
The invention relates to the technical field of medical rehabilitation robots, in particular to a control method and a control system of a target-driven upper limb exoskeleton rehabilitation robot.
Background
Stroke (cerebral apoplexy) is the leading cause of death and disability among adults in China. It is a persistent focal neurological deficit of the cerebral hemisphere or brain stem caused by acute cerebrovascular circulatory disturbance, and is characterized by high incidence, high disability rate, high mortality and high recurrence rate. Stroke patients usually suffer from upper limb dysfunction; repetitive stimulation training of a certain intensity and task-specific training are important means of upper limb functional rehabilitation. An upper limb rehabilitation robot does not fatigue and can deliver quantified, individualized therapy: on one hand it can provide high-dose, high-repetition exercise training, and on the other hand it records objective, immediate training and assessment data. Robot-assisted rehabilitation has therefore become an effective approach to limb function rehabilitation after stroke.
At present, upper limb rehabilitation robots are mainly of the exoskeleton type or the end-effector (terminal traction) type. The exoskeleton type has the advantages of accurate motion and high reliability, and therefore a broad market application prospect. However, it has the following disadvantages:
1. Existing control technology for upper limb exoskeleton rehabilitation robots is mainly passive: a fixed set of motion patterns is established in advance, so individualization cannot be realized. Research shows that the higher the patient's subjective participation during rehabilitation, the better the therapeutic effect; purely passive rehabilitation not only yields a poor therapeutic effect, but its monotonous training also easily provokes resistance in the patient.
2. To reflect the patient's active participation, upper limb exoskeleton rehabilitation robots mainly use physiological signals such as electromyography (EMG) and electroencephalography (EEG) to control the robot's motion in real time. Because these signals are easily misjudged, the rehabilitation robot may execute wrong motions during training, so the training effect cannot be guaranteed and the patient may even be injured. Moreover, the patient must wear the corresponding sensing equipment, which compromises comfort.
Disclosure of Invention
To address these problems, the invention provides a control method and control system for a target-driven upper limb exoskeleton rehabilitation robot. On the premise that the patient generates a subjective movement intention, the patient performs target tracking and grasping training with the assistance of the exoskeleton robot. This improves the patient's subjective participation, fuses the patient's subjective movement consciousness with objectively acquired sensory information, and better realizes interaction between patient and robot.
In order to achieve the purpose, the technical scheme adopted by the invention is as follows:
a control method of a target-driven upper limb exoskeleton rehabilitation robot comprises the following steps:
step one, a doctor inputs the patient's upper limb length, the rotation speed of each exoskeleton joint and the rotation angle range of each joint into the upper computer according to the patient's actual condition;
step two, the doctor places a training target within the patient's reach, asks the patient to gaze at the target and generate a grasping intention for it, and issues a system start instruction on the upper computer;
step three, the Azure Kinect acquires a depth map and a color map of the target area in real time, tracks and locates the target in real time, and records the target's spatial coordinates at a static moment;
step four, the target's spatial coordinates are converted from the Azure Kinect coordinate system to the robot coordinate system through coordinate conversion;
step five, the motion parameters of each exoskeleton joint are calculated through the robot's forward and inverse kinematics and trajectory planning;
step six, the upper computer and the PLC are connected within the same network segment over TCP/IP, and data are read and written through the Snap7 protocol; once the upper computer communicates with the PLC successfully and reads the angle-sensor values from the PLC analog input module, the next step is carried out;
step seven, the upper computer sends the required rotation angle and speed of each joint to the PLC; the PLC converts these values into a high-speed pulse train (PTO) and sends it to the servo motor driver, which drives the motor to rotate through the corresponding angle at the rated speed; meanwhile the angle-sensor values are read in real time and returned to the upper computer in a multithreaded manner;
step eight, the upper limb exoskeleton rehabilitation robot assists the patient to grasp the target, and returns to the initial position after grasping is completed.
Preferably, the specific process of step three is as follows:
(1) extracting a target object by utilizing point cloud segmentation and coordinate mapping according to the three-dimensional point cloud data of the target object;
(2) establishing a target object color-shape model fused with depth information in an HSV color space;
(3) carrying out depth and color filtering on the RGB image acquired by the Azure Kinect in real time;
(4) performing connected domain segmentation on the target object and the background with similar colors by using the depth variance to obtain a potential target;
(5) target tracking and positioning are completed using the target shape deviation rate, the histogram Bhattacharyya distance, and the depth mean of the target centroid region.
Preferably, the specific process of step five is as follows:
(1) reading a pre-established robot D-H parameter table, simplifying an upper limb exoskeleton rehabilitation mechanical arm into a connecting rod model, establishing a coordinate system of each connecting rod, solving a kinematics positive solution, and solving a spatial coordinate position corresponding to the hand of a patient when each joint of the robot rotates by a corresponding angle;
(2) solving inverse kinematics through a geometric method, and solving the angle of rotation required by each joint when the rehabilitation robot assists the patient to move the hand to the position according to the position of the target object in the space;
(3) carrying out forward solution again on the angle obtained by the inverse solution, and entering the next step when the space coordinate obtained by the second forward solution is consistent with the coordinate of the target object;
(4) planning a robot track, reading a joint angle value measured by an angle sensor and a joint rotation angle range preset by a doctor, and moving when the joint rotation angle solved in the steps (1), (2) and (3) is in the range; otherwise the out-of-range joint rotation angle is modified to the maximum value within the allowed range.
The invention also provides a control system for the target-driven upper limb exoskeleton rehabilitation robot, whose hardware comprises: an Azure Kinect sensor, a main control computer, a router, a PLC, a PLC analog input module, a servo motor driver, a servo motor, an angle sensor, and a three-degree-of-freedom upper limb exoskeleton mechanical body.
The Azure Kinect sensor is connected to the main control computer through USB 3.0. The PLC and the PLC analog input module are installed together; a network cable socket is provided on the PLC, which is connected to a LAN port of the router through a network cable. The IP addresses of the PLC, the router and the main control computer are set within the same network segment, realizing wireless communication between the upper computer and the PLC;
the angle sensor is installed on a corresponding joint of the upper limb exoskeleton mechanical body with three degrees of freedom, and is connected with the PLC analog input module through a signal wire.
Preferably, the three-degree-of-freedom upper limb exoskeleton mechanical body comprises three degrees of freedom of shoulder abduction/adduction, forward flexion/backward extension and elbow flexion/extension.
With this technical scheme, the control system lets the patient perform target tracking and grasping training with the assistance of the exoskeleton robot on the premise of generating a subjective movement intention, which improves the patient's subjective participation, fuses the patient's subjective movement consciousness with objectively acquired sensory information, and better realizes interaction between patient and robot. Moreover, the diversity of training targets in this control method can raise the patient's interest in and enthusiasm for training.
The invention has the beneficial effects that:
1. With the assistance of the exoskeleton rehabilitation robot, the patient can perform grasping training on a target anywhere in the upper limb's range of motion through the cooperative action of the shoulder and elbow joints.
2. Compared with the traditional passive rehabilitation training mode, the target-driven rehabilitation training method can fully stimulate the patient's subjective movement intention and achieve a better rehabilitation treatment effect.
3. Compared with current electromyographic-signal and brain-wave technologies, the target tracking and positioning method is more mature and stable, requires no cumbersome wearable equipment, is more comfortable, and offers patients better humanistic care.
Drawings
FIG. 1 is a flow chart of a control method of the present invention;
FIG. 2 is a schematic diagram of the hardware architecture of the control system of the present invention;
FIG. 3 is a block diagram of the control system of the present invention;
FIG. 4 is a D-H linkage coordinate system of the robotic arm of the present invention;
FIG. 5 is a geometric inverse kinematics analysis of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings, so that those skilled in the art can better understand the advantages and features of the present invention, and thus the scope of the present invention is more clearly defined. The embodiments described herein are only a few embodiments of the present invention, rather than all embodiments, and all other embodiments that can be derived by one of ordinary skill in the art without inventive faculty based on the embodiments described herein are intended to fall within the scope of the present invention.
Referring to fig. 1 to 5, a method for controlling a target-driven upper limb exoskeleton rehabilitation robot includes the steps of:
firstly, a doctor inputs the length of an upper limb, the rotating speed of each joint of an exoskeleton and the rotating angle range of each joint in an upper computer according to the actual condition of a patient;
secondly, a doctor places a training target in a touch range of a patient, requires the patient to watch the target object and generates a grabbing intention for the target object, and issues a system starting instruction by the doctor on an upper computer;
step three, acquiring a depth map and a color map of a target object area in real time by the Azure Kinect, tracking and positioning the target object in real time, and recording a space coordinate of the target object at a static moment, wherein the specific process comprises the following steps:
(1) extracting a target object by utilizing point cloud segmentation and coordinate mapping according to the three-dimensional point cloud data of the target object;
(2) establishing a target object color-shape model fused with depth information in an HSV color space;
(3) carrying out depth and color filtering on the RGB image acquired by the Azure Kinect in real time;
(4) performing connected domain segmentation on the target object and the background with similar colors by using the depth variance to obtain a potential target;
(5) target tracking and positioning are completed using the target shape deviation rate, the histogram Bhattacharyya distance, and the depth mean of the target centroid region.
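Sub-steps (3) and (4) above can be sketched as a minimal NumPy filter. The HSV and depth thresholds below are illustrative assumptions — the patent does not give concrete ranges:

```python
import numpy as np

def hsv_depth_mask(hsv, depth, h_range, s_min, v_min, d_range):
    """Sub-step (3): keep pixels whose HSV colour and depth both fall
    inside the expected target ranges (ranges are assumptions)."""
    h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]
    color_ok = (h >= h_range[0]) & (h <= h_range[1]) & (s >= s_min) & (v >= v_min)
    depth_ok = (depth >= d_range[0]) & (depth <= d_range[1])
    return color_ok & depth_ok

def is_potential_target(depth, mask, max_var=100.0):
    """Sub-step (4): a colour-matched region counts as a potential target
    only if its depth values are consistent (low depth variance),
    which separates it from a similarly coloured background."""
    vals = depth[mask]
    return vals.size > 0 and float(np.var(vals)) <= max_var
```

In a real pipeline the mask would feed a connected-component pass per candidate region; here the variance test is applied to the whole mask for brevity.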
Step four, converting the space coordinate of the target object from the Azure Kinect coordinate system to the robot coordinate system through coordinate conversion, and the specific process is as follows:
For a target spatial coordinate point P = (a, b, c) in the Azure Kinect coordinate system, the corresponding point Q = (x, y, z) in the robot coordinate system is obtained by rotating P about the axes of the robot coordinate system and translating it along them:

$$\begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 & d \\ 0 & 1 & 0 & e \\ 0 & 0 & 1 & f \\ 0 & 0 & 0 & 1 \end{bmatrix} R_z(\gamma)\, R_y(\beta)\, R_x(\alpha) \begin{bmatrix} a \\ b \\ c \\ 1 \end{bmatrix}$$

where α, β and γ are the angles of rotation of P about the X, Y and Z axes of the robot coordinate system, and d, e and f are the distances translated by point P along the X, Y and Z axes of the robot coordinate system.
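The coordinate conversion can be sketched in NumPy. The X→Y→Z rotation order and the parameter names are assumptions, since the text only names the angles and offsets:

```python
import numpy as np

def kinect_to_robot(p, alpha, beta, gamma, d, e, f):
    """Map a point from the Azure Kinect frame into the robot frame by
    rotating about the robot X, Y, Z axes (in that assumed order) and
    then translating by (d, e, f)."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    rx = np.array([[1, 0, 0], [0, ca, -sa], [0, sa, ca]])
    ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    rz = np.array([[cg, -sg, 0], [sg, cg, 0], [0, 0, 1]])
    # Rotate first, then translate, matching the homogeneous form above.
    return rz @ ry @ rx @ np.asarray(p, dtype=float) + np.array([d, e, f], dtype=float)
```

In practice the six extrinsic parameters would come from a one-time calibration between the Azure Kinect and the robot base.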
Step five, calculating the motion parameters of each joint of the exoskeleton through forward and backward solution and trajectory planning of the robot, and specifically comprising the following steps:
(1) reading a pre-established robot D-H parameter table, simplifying an upper limb exoskeleton rehabilitation mechanical arm into a connecting rod model, establishing a coordinate system of each connecting rod, solving a kinematics positive solution, and solving a spatial coordinate position corresponding to the hand of a patient when each joint of the robot rotates by a corresponding angle;
As shown in fig. 4, the link coordinate systems are established for the three-degree-of-freedom upper limb exoskeleton rehabilitation robot in its initial state (the rotation angle of each axis is zero). The shoulder link, the upper arm and the forearm are simplified into link 1, link 2 and link 3 respectively; coordinate system {n} corresponds to link n, the X axis of each coordinate system is aligned with the length direction of its link, the Z axis coincides with the joint's rotation or translation axis, and the Y axis direction is given by the right-hand rule.
The change in relative position of link n with respect to link n-1 is described by a rotation-translation transformation matrix built from the D-H link parameters. The transformation matrix of link 3 relative to the base coordinate system is then:
$${}^{0}T_{3} = {}^{0}A_{1}\,{}^{1}A_{2}\,{}^{2}A_{3}, \qquad {}^{n-1}A_{n} = \mathrm{Rot}_{z}(\theta_{n})\,\mathrm{Trans}_{z}(d_{n})\,\mathrm{Trans}_{x}(a_{n})\,\mathrm{Rot}_{x}(\alpha_{n})$$
where θ1 is the angle between link 1 and the initial position, θ2 is the angle between link 2 and link 1, θ3 is the angle between link 3 and link 2, L2 is the distance from O2 to O3, and L3 is the distance from O3 to O4.
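The forward solution of step (1) can be sketched with standard D-H link transforms. The exoskeleton's actual D-H table is not given in the text, so the table passed in any call is illustrative:

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard D-H link transform Rot_z(theta)·Trans_z(d)·Trans_x(a)·Rot_x(alpha)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(thetas, dh_table):
    """Chain the per-link transforms; dh_table rows are (d, a, alpha).
    Returns the 4x4 pose of the last link (the hand) in the base frame."""
    t = np.eye(4)
    for theta, (d, a, alpha) in zip(thetas, dh_table):
        t = t @ dh_transform(theta, d, a, alpha)
    return t
```

For a planar two-link check, `forward_kinematics([0, 0], [(0, L2, 0), (0, L3, 0)])` places the hand at x = L2 + L3, which matches the fully extended arm.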
(2) Solving inverse kinematics through a geometric method, and solving the angle of rotation required by each joint when the rehabilitation robot assists the patient to move the hand to the position according to the position of the target object in the space;
as shown in fig. 5, θ1,θ2,θ3The rotation angles of the shoulder abduction/adduction, the shoulder flexion/retroflexion, and the elbow flexion/extension, respectively, when the coordinate of the end of the mechanical arm is Q ═ x, y, z, θ is solved from fig. 5 by a geometric method1,θ2,θ3The calculation formulas are respectively as follows:
Figure BDA0003312595260000052
Figure BDA0003312595260000053
θ3=arcos(x2+y2+z2-L2 2-L3 2)/2L2L3
wherein the content of the first and second substances,
Figure BDA0003312595260000054
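The elbow-angle relation can be implemented directly. The clamping of the cosine argument is an added numerical guard, not part of the source formula:

```python
import math

def elbow_angle(x, y, z, l2, l3):
    """theta3 = arccos((x^2 + y^2 + z^2 - L2^2 - L3^2) / (2*L2*L3)).
    With this convention theta3 = 0 means the arm is fully extended."""
    r2 = x * x + y * y + z * z
    c = (r2 - l2 * l2 - l3 * l3) / (2.0 * l2 * l3)
    c = max(-1.0, min(1.0, c))  # guard against rounding just outside [-1, 1]
    return math.acos(c)
```

For L2 = L3 = 1, a target at distance 2 gives θ3 = 0 (arm straight), and a target at distance √2 gives θ3 = π/2, consistent with the law-of-cosines triangle in fig. 5.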
(3) carrying out forward solution again on the angle obtained by the inverse solution, and entering the next step when the space coordinate obtained by the second forward solution is consistent with the coordinate of the target object;
(4) planning a robot track, reading a joint angle value measured by an angle sensor and a joint rotation angle range preset by a doctor, and moving when the joint rotation angle solved in the steps (1), (2) and (3) is in the range; otherwise the out-of-range joint rotation angle is modified to the maximum value within the allowed range.
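The range check of step (4) can be sketched as a simple clamp; the limits would come from the angle ranges the doctor sets in step one:

```python
def clamp_joint_angles(solved, limits):
    """Keep each solved joint angle if it lies inside the physician-set
    (lo, hi) range; otherwise replace it with the nearest bound, i.e.
    the maximum value within the allowed range."""
    return [min(max(angle, lo), hi) for angle, (lo, hi) in zip(solved, limits)]
```

A planner would apply this before generating the trajectory, so the commanded motion never leaves the safe envelope even when the inverse solution requests an out-of-range angle.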
Step six, connecting the upper computer and the PLC in the same network segment through a TCP/IP protocol, and reading and writing data through a Snap7 protocol; when the upper computer is successfully communicated with the PLC, successfully reading the numerical value of the angle sensor in the PLC analog input module, and carrying out the next step;
Step seven, the upper computer sends the required rotation angle and speed of each joint to the PLC; the PLC converts these values into a high-speed pulse train (PTO) and sends it to the servo motor driver, which drives the motor to rotate through the corresponding angle at the rated speed; meanwhile the angle-sensor values are read in real time and returned to the upper computer in a multithreaded manner;
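The angle-to-pulse conversion the PLC performs can be sketched as follows. The pulses-per-revolution and reducer ratio are illustrative, as the text does not specify the drive parameters:

```python
def angle_to_pulses(angle_deg, pulses_per_rev=10000, gear_ratio=50):
    """Number of PTO pulses needed to rotate a joint by angle_deg,
    given the driver's electronic gear (pulses per motor revolution)
    and the joint reducer ratio (both values are assumptions)."""
    motor_revs = (angle_deg / 360.0) * gear_ratio
    return round(motor_revs * pulses_per_rev)
```

The PLC would emit this pulse count at a frequency corresponding to the rated speed, so angle and speed commands map directly to pulse count and pulse rate.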
and step eight, assisting the patient to grab the target object by the upper limb exoskeleton rehabilitation robot, and returning to the initial position after the grabbing is finished.
As shown in fig. 2, the control system for the target-driven upper limb exoskeleton rehabilitation robot has a hardware configuration including: an Azure Kinect sensor 10, a main control computer 20, a router 30, a PLC 40, a PLC analog input module 50, a servo motor driver 60, a servo motor 70, an angle sensor 80, and a three-degree-of-freedom upper limb exoskeleton mechanical body 90.
The Azure Kinect sensor 10 is connected to the main control computer 20 through USB 3.0. The PLC 40 and the PLC analog input module 50 are installed together; a network cable socket is provided on the PLC 40, which is connected to a LAN port of the router 30 through a network cable. The IP addresses of the PLC 40, the router 30 and the main control computer 20 are set within the same network segment, realizing wireless communication between the upper computer and the PLC;
the servo motor driver 60 is connected with the PLC40 through a signal line, the servo motor 70 is connected with the servo motor driver 60 through a signal line, the angle sensor 80 is installed on a corresponding joint of the three-degree-of-freedom upper limb exoskeleton mechanical body 90, and the angle sensor 80 is connected with the PLC analog input module 50 through a signal line.
The three-degree-of-freedom upper limb exoskeleton mechanical body 90 comprises three degrees of freedom including shoulder abduction/adduction, forward flexion/backward extension and elbow flexion/extension.
In this embodiment, as shown in fig. 3, a specific process of the target-driven upper limb exoskeleton rehabilitation robot training mode is as follows:
the target-driven rehabilitation training is carried out under the monitoring of a doctor, and an upper computer interface is displayed in a main control screen for man-machine interaction; the doctor sets three parameters of the length of the upper limb, the range of the joint rotation angle and the maximum joint rotation speed of the patient in the upper computer; displaying a picture shot by the Azure Kinect in real time in the upper computer, and marking a target to be tracked; if the target tracking is successful, the target driving type rehabilitation training mode is started, the exoskeleton rehabilitation robot drives the affected limb to do spatial motion, and the target object watched is grabbed and trained. The upper computer displays the movement speed and the rotation angle of each joint in real time, and a doctor can start or stop target driving type rehabilitation training through the upper computer at any time.
In summary, with the assistance of the exoskeleton rehabilitation robot, the patient performs grabbing training on the target object in the upper limb movement range through the cooperative action of the shoulder joint and the elbow joint. Compared with the traditional passive rehabilitation training mode, the target-driven rehabilitation training method can fully stimulate the subjective movement intention of the patient, and achieves better rehabilitation treatment effect; compared with the current electromyographic signals and brain wave technology, the target tracking and positioning method is more mature and stable, does not need to wear complicated equipment, is more comfortable, and can bring better humanistic care to patients.
The embodiments of the present invention have been described in detail, but the description is only for the preferred embodiments of the present invention and should not be construed as limiting the scope of the present invention. All equivalent changes and modifications made within the scope of the present invention shall fall within the scope of the present invention.

Claims (5)

1. A control method of a target-driven upper limb exoskeleton rehabilitation robot is characterized in that: the method comprises the following steps:
step one, a doctor inputs the length of an upper limb, the rotating speed of each joint of an exoskeleton and the rotating angle range of each joint in an upper computer according to the actual condition of a patient;
step two, a doctor places a training target in a touch range of a patient, requires the patient to watch the target object and generate a grabbing intention for the target object, and issues a system starting instruction on the upper computer;
step three, acquiring a depth map and a color map of a target object area in real time by the Azure Kinect, tracking and positioning the target object in real time, and recording a space coordinate of the target object at a static moment;
step four, converting the space coordinate of the target object from an Azure Kinect coordinate system to a robot coordinate system through coordinate conversion;
step five, calculating the motion parameters of each joint of the exoskeleton through forward and backward solution and trajectory planning of the robot;
step six, connecting the upper computer and the PLC in the same network segment through a TCP/IP protocol, and reading and writing data through a Snap7 protocol; when the upper computer is successfully communicated with the PLC, successfully reading the numerical value of the angle sensor in the PLC analog input module, and carrying out the next step;
step seven, the upper computer sends the rotation angle and the rotation speed required by each joint to the PLC, the PLC converts the numerical value into a high-speed Pulse (PTO) and sends the PTO to a servo motor driver, the driving motor rotates the corresponding angle at the rated rotation speed, and the value of the angle sensor is read in real time and returned to the upper computer in a multithreading mode;
and step eight, assisting the patient to grab the target object by the upper limb exoskeleton rehabilitation robot, and returning to the initial position after the grabbing is finished.
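The coordinate conversion in step four amounts to a standard rigid-body transformation between the camera frame and the robot frame. A minimal sketch in Python with NumPy follows; the 4x4 transformation matrix shown is a hypothetical calibration value, since the patent does not disclose the actual extrinsics between the Azure Kinect and the exoskeleton base:

```python
import numpy as np

def kinect_to_robot(p_kinect, T_robot_kinect):
    """Transform a 3-D point from the Kinect frame to the robot frame
    using a 4x4 homogeneous transformation matrix."""
    p_h = np.append(np.asarray(p_kinect, dtype=float), 1.0)  # homogeneous coordinates
    return (T_robot_kinect @ p_h)[:3]

# Hypothetical calibration: robot frame equals the Kinect frame rotated
# 90 degrees about z and translated by (0.5, 0.0, 0.2) metres.
theta = np.pi / 2
T = np.array([
    [np.cos(theta), -np.sin(theta), 0.0, 0.5],
    [np.sin(theta),  np.cos(theta), 0.0, 0.0],
    [0.0,            0.0,           1.0, 0.2],
    [0.0,            0.0,           0.0, 1.0],
])

target_kinect = [0.3, 0.1, 0.8]   # target position as seen by the camera (m)
target_robot = kinect_to_robot(target_kinect, T)
```

In practice the matrix would be obtained once by calibrating the fixed mounting of the camera relative to the robot base, then applied to every tracked target coordinate.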
2. The control method of a target-driven upper limb exoskeleton rehabilitation robot according to claim 1, wherein the specific process of step three is as follows:
(1) extracting the target object from its three-dimensional point cloud data by point cloud segmentation and coordinate mapping;
(2) establishing a color-shape model of the target object, fused with depth information, in the HSV color space;
(3) applying depth and color filtering in real time to the RGB images acquired by the Azure Kinect;
(4) performing connected-domain segmentation, using depth variance, to separate the target object from similarly colored background, thereby obtaining candidate targets;
(5) completing target tracking and localization by means of the target object's shape deviation rate, the histogram Bhattacharyya distance, and the mean depth of the target object's centroid region.
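Step (5) above relies on the Bhattacharyya distance between color histograms. A minimal illustrative sketch of that metric for normalized histograms follows (OpenCV's `cv2.compareHist` with `HISTCMP_BHATTACHARYYA` computes a closely related quantity; the pure-NumPy version here is only for exposition):

```python
import numpy as np

def bhattacharyya_distance(p, q):
    """Bhattacharyya distance between two histograms.
    0 means identical distributions; values near 1 mean almost no overlap."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()                    # normalize to probability distributions
    q = q / q.sum()
    bc = np.sum(np.sqrt(p * q))        # Bhattacharyya coefficient
    return float(np.sqrt(max(0.0, 1.0 - bc)))

identical = bhattacharyya_distance([1, 2, 3, 4], [1, 2, 3, 4])   # -> ~0
disjoint  = bhattacharyya_distance([1, 0, 0, 0], [0, 0, 0, 1])   # -> 1
```

A candidate region whose histogram distance to the stored target model falls below a threshold would be accepted as the tracked target.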
3. The control method of a target-driven upper limb exoskeleton rehabilitation robot according to claim 1, wherein the specific process of step five is as follows:
(1) reading a pre-established D-H parameter table of the robot, simplifying the upper limb exoskeleton rehabilitation arm into a link model, establishing a coordinate system for each link, solving the forward kinematics, and obtaining the spatial coordinate position of the patient's hand when each robot joint rotates through a given angle;
(2) solving the inverse kinematics by a geometric method, obtaining, from the spatial position of the target object, the angle through which each joint must rotate for the rehabilitation robot to move the patient's hand to that position;
(3) applying the forward kinematics again to the angles obtained from the inverse solution, and proceeding to the next step when the spatial coordinates given by this second forward solution agree with the coordinates of the target object;
(4) planning the robot trajectory: reading the joint angle values measured by the angle sensors and the joint rotation angle ranges preset by the doctor, and moving when the joint rotation angles solved in steps (1) to (3) lie within those ranges; otherwise, clamping each out-of-range joint rotation angle to the maximum value within its allowed range.
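The forward/inverse/re-forward consistency check of steps (1) to (3) and the range clamping of step (4) can be illustrated on a simplified two-link planar arm. The link lengths, target point, and joint limits below are hypothetical, since the patent does not publish its actual D-H parameter table:

```python
import math

L1, L2 = 0.30, 0.25   # hypothetical upper-arm / forearm link lengths (m)

def forward(t1, t2):
    """Forward kinematics of a planar two-link arm: joint angles -> hand (x, y)."""
    x = L1 * math.cos(t1) + L2 * math.cos(t1 + t2)
    y = L1 * math.sin(t1) + L2 * math.sin(t1 + t2)
    return x, y

def inverse(x, y):
    """Geometric inverse kinematics (one of the two elbow solutions)."""
    d2 = x * x + y * y
    c2 = (d2 - L1 * L1 - L2 * L2) / (2 * L1 * L2)   # law of cosines
    c2 = max(-1.0, min(1.0, c2))                    # guard against rounding
    t2 = math.acos(c2)
    t1 = math.atan2(y, x) - math.atan2(L2 * math.sin(t2),
                                       L1 + L2 * math.cos(t2))
    return t1, t2

def clamp(angle, lo, hi):
    """Step (4): clamp an out-of-range joint angle to the allowed limit."""
    return max(lo, min(hi, angle))

# Steps (1)-(3): solve the inverse kinematics for a reachable target,
# then verify by running the forward kinematics on the result.
target = (0.35, 0.20)
t1, t2 = inverse(*target)
check = forward(t1, t2)   # should reproduce the target coordinates
```

The real system performs the same round-trip check in three dimensions for the three-DOF exoskeleton before commanding any joint motion.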
4. A control system of a target-driven upper limb exoskeleton rehabilitation robot applying the method of claim 1, characterized in that the hardware of the control system comprises: an Azure Kinect sensor, a main control computer, a router, a PLC, a PLC analog input module, a servo motor driver, a servo motor, angle sensors, and a three-degree-of-freedom upper limb exoskeleton mechanical body;
the Azure Kinect sensor is connected to the main control computer via USB 3.0; the PLC and the PLC analog input module are mounted together; the PLC is provided with a network cable socket and is connected to a LAN port of the router via a network cable; and the IP addresses of the PLC, the router, and the main control computer are set within the same network segment, thereby enabling wireless communication between the upper computer and the PLC;
the angle sensors are mounted on the corresponding joints of the three-degree-of-freedom upper limb exoskeleton mechanical body and are connected to the PLC analog input module via signal wires.
5. The control system of the target-driven upper limb exoskeleton rehabilitation robot of claim 4, wherein the three-degree-of-freedom upper limb exoskeleton mechanical body provides three degrees of freedom: shoulder abduction/adduction, shoulder flexion/extension, and elbow flexion/extension.
CN202111220948.4A 2021-10-20 2021-10-20 Control method and control system of target-driven upper limb exoskeleton rehabilitation robot Active CN113925742B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111220948.4A CN113925742B (en) 2021-10-20 2021-10-20 Control method and control system of target-driven upper limb exoskeleton rehabilitation robot


Publications (2)

Publication Number Publication Date
CN113925742A true CN113925742A (en) 2022-01-14
CN113925742B CN113925742B (en) 2022-06-21

Family

ID=79280776

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111220948.4A Active CN113925742B (en) 2021-10-20 2021-10-20 Control method and control system of target-driven upper limb exoskeleton rehabilitation robot

Country Status (1)

Country Link
CN (1) CN113925742B (en)


Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110071675A1 (en) * 2009-09-22 2011-03-24 Gm Global Technology Operations, Inc. Visual perception system and method for a humanoid robot
WO2015177634A1 (en) * 2014-05-22 2015-11-26 Toyota Jidosha Kabushiki Kaisha Rehabilitation apparatus, control method, and control program
CN105527980A (en) * 2015-12-01 2016-04-27 上海宇航系统工程研究所 Target tracking control method of binocular visual system
CN107291811A (en) * 2017-05-18 2017-10-24 浙江大学 A kind of sense cognition enhancing robot system based on high in the clouds knowledge fusion
CN108098761A (en) * 2016-11-24 2018-06-01 广州映博智能科技有限公司 A kind of the arm arm device and method of novel robot crawl target
CN108257173A (en) * 2017-12-29 2018-07-06 上海物景智能科技有限公司 Object separation method and apparatus and system in a kind of image information
CN108638069A (en) * 2018-05-18 2018-10-12 南昌大学 A kind of mechanical arm tail end precise motion control method
CN108814894A (en) * 2018-04-12 2018-11-16 山东大学 The upper limb rehabilitation robot system and application method of view-based access control model human body pose detection
CN108858199A (en) * 2018-07-27 2018-11-23 中国科学院自动化研究所 The method of the service robot grasp target object of view-based access control model
CN108972549A (en) * 2018-07-03 2018-12-11 华南理工大学 Industrial machinery arm Real Time Obstacle Avoiding based on Kinect depth camera plans grasping system
CN109605385A (en) * 2018-11-28 2019-04-12 东南大学 A kind of rehabilitation auxiliary robot of mixing brain-computer interface driving
TW201931296A (en) * 2018-01-02 2019-08-01 元智大學 Skeleton tracking system and method for rehabilitation
CN110123572A (en) * 2019-04-04 2019-08-16 华南理工大学 A kind of healing robot training system of the multi-modal interaction of hemiplegic upper limb compensatory activity
CN113197754A (en) * 2021-06-04 2021-08-03 山东建筑大学 Upper limb exoskeleton rehabilitation robot system and method
CN113276120A (en) * 2021-05-25 2021-08-20 中国煤炭科工集团太原研究院有限公司 Control method and device for mechanical arm movement and computer equipment
CN113510690A (en) * 2021-03-30 2021-10-19 上海机电工程研究所 Four-degree-of-freedom series robot inverse kinematics solving method and system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
瞿畅 et al.: "Development and Application of a Kinect-Based Upper Limb Rehabilitation Training System", Chinese Journal of Biomedical Engineering *
黄玲涛 et al.: "Research on a Kinect-Based Robot Grasping System", Transactions of the Chinese Society for Agricultural Machinery *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114732668A (en) * 2022-03-28 2022-07-12 上海神泰医疗科技有限公司 Method, system, equipment and medium for measuring motion precision of limb rehabilitation training robot
CN114732668B (en) * 2022-03-28 2023-08-11 上海神泰医疗科技有限公司 Method, system, equipment and medium for measuring motion precision of limb rehabilitation training robot

Also Published As

Publication number Publication date
CN113925742B (en) 2022-06-21

Similar Documents

Publication Publication Date Title
WO2021243918A1 (en) Upper-limb function evaluation apparatus and method, and upper-limb rehabilitation training system and method
WO2021068542A1 (en) Force feedback technology-based robot system for active and passive rehabilitation training of upper limbs
Barea et al. Wheelchair guidance strategies using EOG
Schröer et al. An autonomous robotic assistant for drinking
CN109820695A A horizontal bilateral lower limb rehabilitation robot for cerebral palsy patients in the ICU ward, with communication and autonomous navigation locomotion functions
CN107349570A Kinect-based upper limb rehabilitation training and assessment method
CN113925742B (en) Control method and control system of target-driven upper limb exoskeleton rehabilitation robot
Martin et al. A novel approach of prosthetic arm control using computer vision, biosignals, and motion capture
CN106214163B Postoperative rehabilitation and psychological counseling device for lower limb deformity correction
CN115741732A (en) Interactive path planning and motion control method of massage robot
Zhao et al. SSVEP-based hierarchical architecture for control of a humanoid robot with mind
Úbeda et al. Control strategies of an assistive robot using a Brain-Machine Interface
CN113730190A (en) Upper limb rehabilitation robot system with three-dimensional space motion
Ning et al. Human brain control of electric wheelchair with eye-blink electrooculogram signal
CN111127991A (en) 3D medical teaching system
CN113995629B (en) Mirror image force field-based upper limb double-arm rehabilitation robot admittance control method and system
CN108888482B (en) Lower limb exoskeleton rehabilitation training system based on motor cortex related potential
CN114010184A Motion data acquisition and mirroring method for a planar rehabilitation robot
CN111134974A (en) Wheelchair robot system based on augmented reality and multi-mode biological signals
CN209253488U A fully bionic brain-inspired intelligent electromechanical hand exoskeleton and its control system
Bento et al. The SWORD tele-rehabilitation system.
Abiri et al. Real-time brain machine interaction via social robot gesture control
CN113996025A (en) Planar rehabilitation mirror image robot control system and training mode implementation method
Ai et al. Design and implementation of haptic sensing interface for ankle rehabilitation robotic platform
CN110584790B (en) Arm stiffness-based teleoperation proportion control method for surgical robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant