CN111015649A - Driving and controlling integrated control system - Google Patents

Driving and controlling integrated control system

Info

Publication number
CN111015649A
CN111015649A (application CN201911102910.XA)
Authority
CN
China
Prior art keywords
module
force
robot
joint
cooperative robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911102910.XA
Other languages
Chinese (zh)
Other versions
CN111015649B (en)
Inventor
冯伟
吴新宇
张艳辉
陈清朋
徐天添
马跃
孙建铨
王大帅
郭师峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Nozoli Machine Tools Technology Co Ltd
Original Assignee
Shenzhen Institute of Advanced Technology of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Institute of Advanced Technology of CAS filed Critical Shenzhen Institute of Advanced Technology of CAS
Publication of CN111015649A
Application granted
Publication of CN111015649B
Legal status: Active (granted)

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1682Dual arm manipulator; Coordination of several manipulators
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1628Programme controls characterised by the control loop
    • B25J9/163Programme controls characterised by the control loop learning, adaptive, model based, rule based expert control
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1628Programme controls characterised by the control loop
    • B25J9/1633Programme controls characterised by the control loop compliant, force, torque control, e.g. combined with position control
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1628Programme controls characterised by the control loop
    • B25J9/1641Programme controls characterised by the control loop compensation for backlash, friction, compliance, elasticity in the joints
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1674Programme controls characterised by safety, monitoring, diagnostic
    • B25J9/1676Avoiding collision or forbidden zones

Abstract

The application discloses an intelligent drive-control integrated control system. The control system includes: a control module, an intelligent dynamic parameter identification module, a sensorless active compliance control module, a force feedback man-machine cooperation anti-collision control module and a multi-axis driving module. The control module is used for controlling the multi-axis driving module in real time. The intelligent dynamic parameter identification module provides a mechanical parameter identification signal of the cooperative robot to the control module in a timely manner according to the motion state of the cooperative robot. The sensorless active compliance control module provides a position signal, a force signal and an environment signal of the cooperative robot to the control module in a timely manner according to the motion state of the cooperative robot. The force feedback man-machine cooperation anti-collision control module provides a safety state signal of the cooperative robot to the control module in a timely manner according to the motion state of the cooperative robot. The multi-axis driving module controls the motion of the cooperative robot in real time according to the instructions of the control module. In this way, the control capability of the robot can be improved.

Description

Driving and controlling integrated control system
Technical Field
The invention belongs to the field of cooperative robots, and particularly relates to a driving and controlling integrated control system.
Background
With the development of industrial automation technology, industrial robots play an important role in more and more production tasks. Limited by technical maturity and implementation cost, however, some complex operation tasks must still be completed manually, which has driven the emergence of cooperative robots capable of operating in a shared human-robot environment.
Compared with a traditional industrial robot, a cooperative robot does not need an isolated workspace and can work closely with humans to complete production tasks. On a 3C product assembly line, for example, humans complete complex assembly tasks while the cooperative robot rapidly and accurately picks and places parts; this division of labour greatly improves production efficiency and reduces production cost. To achieve this cooperation, a safe human-robot interaction environment must be ensured, and the accuracy and compliance requirements on the control of a cooperative robot are therefore far higher than those of a traditional robot.
At present, industrial robots generally adopt a distributed control architecture consisting of a central motion controller and several servo drivers; this architecture is convenient to lay out and simple to apply. Most traditional industrial robots work in a position control mode in which each joint performs accurate position-loop PID control in its driver and receives command references from the motion controller over a bus. A cooperative robot, however, must implement complex algorithms such as feedforward control and compliance control; a distributed architecture suffers from limited signal transmission rates and synchronization problems, and its real-time performance can hardly meet the requirements of a cooperative robot. To solve this problem, drive-control integrated controllers for cooperative robots have been proposed, which feature a compact structure, fast response, high control precision and low cost. However, existing integrated control devices still have the following problems in application: first, algorithm implementation still relies on an upper-level controller, which introduces data transmission, real-time and synchronization problems between different systems; second, most existing dynamic model parameter identification algorithms rely on iterative estimation based on conventional excitation trajectories and the least-squares method, so modeling is complex, estimation accuracy is limited, and parameters that cannot be modeled analytically cannot be identified; third, the control information for active compliance is usually obtained from force/torque sensors on the joints, but such sensors are bulky and ill-suited to cooperative robots, while smaller force/torque sensors are expensive; fourth, collision detection capability is limited and the safety protection strategies are not suitable for use on a cooperative robot.
Disclosure of Invention
In order to solve the above problems, the invention provides an intelligent drive-control integrated control system that can identify parameters which cannot be modeled analytically, performs active compliance control without force sensors, and offers good collision detection capability and safety protection strategies suitable for use on a cooperative robot.
The technical scheme of the invention is as follows: provided is a drive-control integrated control system for controlling a cooperative robot (6), the drive-control integrated control system including: a control module (1), an intelligent dynamic parameter identification module (2), a sensorless active compliance control module (3), a force feedback man-machine cooperation anti-collision control module (4) and a multi-axis driving module (5); the control module (1) is used for controlling the multi-axis driving module (5) in real time according to a preset instruction and/or the signals fed back by the intelligent dynamic parameter identification module (2), the sensorless active compliance control module (3) and the force feedback man-machine cooperation anti-collision control module (4) according to the state of the cooperative robot (6), so that the multi-axis driving module (5) controls the movement of the cooperative robot (6); the intelligent dynamic parameter identification module (2) is used for feeding back a mechanical parameter identification signal of the cooperative robot (6) to the control module (1) according to the motion state of the cooperative robot (6); the sensorless active compliance control module (3) is used for feeding back a position signal, a force signal and an environment signal of the cooperative robot (6) to the control module (1) according to the motion state of the cooperative robot (6); the force feedback man-machine cooperation anti-collision control module (4) is used for feeding back a safety state signal of the cooperative robot (6) to the control module (1) according to the motion state of the cooperative robot (6).
Wherein, the intelligent kinetic parameter identification module (2) comprises: a nominal model (21), the nominal model (21) being based on a Lagrangian dynamics model and used for: obtaining, according to the motion state of the cooperative robot (6), the motion velocity of any point on each link of the cooperative robot, and calculating the kinetic energy of each link of the cooperative robot during motion and the total kinetic energy of the cooperative robot in motion; calculating the potential energy of each link of the cooperative robot during motion and the total potential energy of the links relative to a reference potential energy plane during motion; constructing the Lagrangian function of the cooperative robot from the total kinetic energy and the total potential energy of the cooperative robot; and differentiating the Lagrangian function to obtain the nominal kinetic equation of the cooperative robot; an actual dynamic model (22) used for obtaining the actual kinetic equation of the actual dynamic model of the cooperative robot according to preset parameters; a parameter identification neural network (23) used for setting the cooperative robot into a torque working mode, selecting a smooth torque curve within the range from the minimum to the maximum joint torque as the input of the cooperative robot, and acquiring the angular displacement, angular velocity and angular acceleration of each joint from the encoder (code disc) of each joint, the sampling time (t) being set within a sampling period (T), and N groups of data containing torque, angular displacement, angular velocity and angular acceleration being taken as one set of training sample data; and a learning optimization module (24) used for obtaining, through the nominal model, the theoretical output value ŷ(k) corresponding to the torque τ(k) in the sample data, inputting the torque τ(k) together with the actual output value y(k) in the samples into the parameter identification neural network to obtain the output correction value Δy(k), combining the theoretical output value with the output correction value to obtain the identification output value y*(k) = ŷ(k) + Δy(k), taking the difference between the actual output value and the identification output value to obtain the output error e(k) = y(k) − y*(k), establishing the loss function of the parameter identification neural network from the output error, and training the parameter identification neural network to complete the correction of the dynamic model.
Wherein, the nominal kinetic equation is:

D(q)q̈ + C(q, q̇)q̇ + G(q) = τ

wherein D(q) ∈ R^(n×n) is the symmetric positive definite inertia matrix; C(q, q̇) ∈ R^(n×n) is the matrix of Coriolis and centrifugal forces; G(q) ∈ R^(n×1) is the gravity term matrix; q is the joint angular displacement vector of the mechanical arm, q̇ is the angular velocity vector of the mechanical arm, and q̈ is the angular acceleration vector of the mechanical arm; τ ∈ R^n is the control torque vector of the joints of the mechanical arm.
Wherein, the actual kinetic equation is:

D(q)q̈ + C(q, q̇)q̇ + G(q) + F(q) + τ_d = τ

wherein F(q) represents the friction of the joint movement and τ_d represents the disturbances in the motion of the mechanical arm.
The sensorless active compliance control module (3) comprises a position ring (31), a force ring (32) and a force-position hybrid control law output module (33); the position ring (31) comprises a tail end position input end (311), a position selection matrix (312) and a position control law module (313); the tail end position input end (311) is used for inputting a tail end position signal to the position selection matrix (312), and the position signal processed by the position selection matrix (312) and the position control law module (313) is input to the force-position hybrid control law output module (33); the force ring (32) comprises a tail end force input end (321), a force selection matrix (322), a force control law module (323) and a joint moment estimation module (324) based on motor current; the tail end force input end (321) is used for inputting a tail end force signal to the force selection matrix (322), the force signal processed by the force selection matrix (322) and the force control law module (323) is input to the force-position hybrid control law output module (33), and the joint moment estimation module (324) feeds back the real-time current of the joint motor (35) to the tail end force input end (321); the force-position hybrid control law output module (33) outputs a force-position hybrid control law output signal (G) to the joint motor (35).
The sensorless active compliance control module (3) further comprises a robot kinematic model (314), and the robot kinematic model (314) feeds back joint angles and angular velocities of the cooperative robot (6) to the tail end position input end (311).
The sensorless active compliance control module (3) further comprises a compensation module (34), and the compensation module (34) is arranged between the force-position hybrid control law output module (33) and the joint motor (35).
The joint moment estimation module is constructed from the complete robot dynamic equation:

M(q)q̈ + C(q, q̇)q̇ + G(q) = τ

wherein M ∈ R^(n×n) is the joint space inertia matrix; C ∈ R^(n×n) is the Coriolis and centripetal force matrix; G ∈ R^(n×1) is the gravity term vector; q ∈ R^(n×1) is the drive joint angle vector; τ ∈ R^(n×1) is the drive joint torque. The drive equation of the motor torque τ_m is τ = Nτ_m, wherein N ∈ R^(n×n) is the diagonal matrix of the reduction ratios of the joints and J_m is the inertia of the motor rotor. In the derivation process, the friction term f_m(θ̇_m) at the motor rotor, together with the rotor inertia term J_m·θ̈_m, is substituted into the relation between the motor torque and the joint torque to obtain the joint torque estimation module based on motor current, and thus the force detection output:

τ̂_ext = N·Ψ(i) − N·(J_m·θ̈_m + f_m(θ̇_m)) − (M(q)q̈ + C(q, q̇)q̇ + G(q))

wherein Ψ(i) = τ_m is the mapping between the motor current and the motor output torque, and θ_m is the motor rotor angle.
Wherein, the force feedback man-machine cooperation anti-collision detection module (4) comprises: a dynamic equation establishing module (41) used for establishing a link coordinate system on a preset robot platform by the D-H parameter method and establishing the robot dynamic equation according to the Lagrangian dynamics formulation; a collision detection operator and disturbance observer establishing module (42) which constructs a collision detection operator based on the conservation of the robot energy and a disturbance observer based on the variation of the generalized momentum, according to the robot dynamic equation and the momentum equation; a data analysis module (43) which determines the relation between each joint torque and the collision force based on the real-time current feedback of the robot system, provides a Jacobian matrix solving method for the robot, and analyzes the effectiveness of collision detection; a safety protection strategy making module (44) used for making different safety protection strategies for different collision situations, in combination with the actual working conditions, based on the detection result of the collision detection model; a simulation verification and optimization module (45) used for verifying and optimizing, by simulation on an ADAMS-Simulink co-simulation platform, the effectiveness of the robot collision detection operator and the rationality of the safety protection strategies; and an actual effect verification module (46) used for verifying and evaluating, on a preset robot platform, the actual effect of the force-feedback-based obstacle avoidance and protection safety strategy.
Wherein, the force feedback man-machine cooperation anti-collision detection module (4) further comprises: a monocular binocular stereo matching module (47) used for constructing an SVS-based monocular binocular stereo matching model, optimizing the geometric constraint conditions in the loss function, and accurately estimating the depth of the detected target in a monocular image through a left-right view synthesis process and binocular stereo matching; a convolution feature extraction module (48) used for extracting deep convolutional features with a ResNet model from the RGB images collected by the monocular camera; a human skeleton key point processing module (49) which optimizes the structure of a dual-branch deep convolutional neural network according to the geometric prior knowledge of human skeleton joints and the association relations among the joints, so that joint points and the associations among joints are processed synchronously, one branch performing human skeleton key point regression by combining a probability heat map with offsets, and the other branch detecting the joint association information of multiple people in an image and forming human skeleton sequence data through bipartite graph matching; and a human body skeleton image data processing module (410) used for reconstructing a human body skeleton image data set in combination with the characteristics of industrial man-machine cooperation scenes, labeling the joint point data, and obtaining, with manual adjustment, a pose data set oriented to industrial cooperation scenes.
The invention has the advantages that it can identify parameters which cannot be modeled analytically, performs active compliance control without force sensors, and provides good collision detection capability and safety protection strategies, making it suitable for use on a cooperative robot.
Drawings
FIG. 1 is a schematic block diagram of one embodiment of the control system of the present invention;
FIG. 2 is a block diagram of an intelligent kinetic parameter identification module according to the present invention;
FIG. 3 is a block diagram of a sensorless active compliance control module according to the present invention;
FIG. 4 is a block diagram of a force feedback cooperative human-machine collision avoidance control module according to the present invention;
FIG. 5 is a schematic diagram of the structure of another expression of FIG. 4.
Detailed Description
Referring to fig. 1, fig. 1 discloses an intelligent drive and control integrated control system for a multi-axis cooperative industrial robot, which comprises: the system comprises a control module 1, an intelligent dynamic parameter identification module 2, a sensorless active compliance control module 3, a force feedback human-computer cooperation anti-collision control module 4 and a multi-axis driving module 5.
The control module 1 is used for controlling the multi-axis driving module 5 in real time according to a preset instruction and/or the signals fed back by the intelligent dynamic parameter identification module 2, the sensorless active compliance control module 3 and the force feedback man-machine cooperation anti-collision control module 4 according to the state of the cooperative robot 6.
The intelligent kinetic parameter identification module 2 provides a mechanical parameter identification signal of the cooperative robot 6 to the control module 1 in time according to the motion state of the cooperative robot 6.
The sensorless active compliance control module 3 provides the position signal, the force signal and the environment signal of the cooperative robot 6 to the control module 1 in time according to the motion state of the cooperative robot 6.
And the force feedback man-machine cooperation anti-collision control module 4 is used for providing a safety state signal of the cooperation robot 6 to the control module 1 in time according to the motion state of the cooperation robot 6.
And the multi-axis driving module 5 controls the movement of the cooperative robot 6 in real time according to the instruction of the control module 1.
The utility power input interface 7 in fig. 1 supplies power to the control module 1 and the multi-axis driving module 5, and since the control module 1 needs low-voltage direct current, a power adapter 8 is provided between the utility power input interface 7 and the control module 1.
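For orientation, the following minimal Python sketch mirrors the Fig. 1 data flow described above: one control cycle collects the three feedback signals and issues a command to the multi-axis drive. The class and method names (identify, estimate, check, apply) are hypothetical placeholders, not interfaces defined by the patent.

```python
from dataclasses import dataclass

@dataclass
class Feedback:
    dyn_params: dict      # mechanical parameter identification signal from module 2
    force_position: dict  # position / force / environment signals from module 3
    safety_state: str     # safety state signal from module 4

class DriveControlSystem:
    def __init__(self, identifier, compliance, collision, drive, period_s=0.001):
        self.identifier = identifier  # intelligent dynamic parameter identification module 2
        self.compliance = compliance  # sensorless active compliance control module 3
        self.collision = collision    # force feedback anti-collision control module 4
        self.drive = drive            # multi-axis driving module 5
        self.period_s = period_s      # real-time control period

    def step(self, robot_state, preset_command):
        # Gather the feedback signals for the current motion state of the robot.
        fb = Feedback(
            dyn_params=self.identifier.identify(robot_state),
            force_position=self.compliance.estimate(robot_state),
            safety_state=self.collision.check(robot_state),
        )
        # Fuse the preset instruction with the feedback and command the drive module
        # within the same control cycle.
        self.drive.apply(self._compute_command(preset_command, fb))
        return fb

    def _compute_command(self, preset_command, fb):
        # Placeholder fusion rule: override the reference when the safety module objects.
        if fb.safety_state != "safe":
            return {"mode": "stop"}
        return {"mode": "torque", "reference": preset_command, "params": fb.dyn_params}
```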
Referring to fig. 2, the intelligent kinetic parameter identification module 2 of the present invention includes a nominal model 21 based on a lagrangian kinetic model, and obtains the movement speed of any point on each link of the cooperative robot according to the movement state of the cooperative robot 6; calculating the kinetic energy of each connecting rod of the cooperative robot in the motion process and the total kinetic energy of the whole cooperative robot in motion; calculating potential energy of each connecting rod of the cooperative robot in the motion process and total potential energy of the whole cooperative robot relative to a reference potential energy surface in the motion process; constructing a Lagrange function of the cooperative robot system according to the obtained total kinetic energy and total potential energy of the cooperative robot; and performing derivation operation on the Lagrange function obtained in the process to obtain a nominal kinetic equation of the cooperative robot system.
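The patent figures do not reproduce the intermediate formulas; the steps described above correspond to the standard Lagrangian derivation, summarized here for reference (T_i and V_i denote the kinetic and potential energy of link i, and the notation matches the nominal kinetic equation given below):

```latex
T = \sum_{i=1}^{n} T_i, \qquad V = \sum_{i=1}^{n} V_i, \qquad L = T - V,
\qquad \frac{d}{dt}\!\left(\frac{\partial L}{\partial \dot{q}_j}\right)
      - \frac{\partial L}{\partial q_j} = \tau_j, \quad j = 1,\dots,n.
```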
The actual dynamic model 22 is established on the basis of the nominal model by adding preset parameters that are difficult to model, drawn from the actual use of the cooperative robot system, so as to obtain the actual kinetic equation of the actual dynamic model of the cooperative robot.
The neural network training sample acquisition module 23 is used for acquiring neural network training sample data: the cooperative robot is set into a torque working mode, a smooth torque curve within the range from the minimum to the maximum joint torque is selected as the input of the cooperative robot, and the angular displacement, angular velocity and angular acceleration of each joint are acquired from the encoder (code disc) of each joint; the sampling time is set to t within a sampling period T, and N groups of data containing torque, angular displacement, angular velocity and angular acceleration are adopted as one set of training sample data.
The parameter identification neural network training module 24 obtains, through the nominal model, the theoretical output value ŷ(k) corresponding to the torque τ(k) in the sample data; the torque τ(k) and the actual output value y(k) in the samples are input into the parameter identification neural network to obtain the output correction value Δy(k); the theoretical output value and the output correction value are combined to obtain the identification output value y*(k) = ŷ(k) + Δy(k); the difference between the actual output value and the identification output value gives the output error e(k) = y(k) − y*(k); the output error is used to establish the loss function of the parameter identification neural network; a self-learning evolutionary optimization strategy is adopted; and the neural network is trained to complete the correction of the dynamic model.
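A sketch of this residual training scheme is shown below, assuming a small PyTorch multilayer perceptron as the parameter identification network; the network architecture, the tensor shapes and the way the nominal model output is supplied are illustrative assumptions, since the patent does not fix them.

```python
import torch
import torch.nn as nn

class CorrectionNet(nn.Module):
    """Maps (torque, actual output) samples to an output correction value."""
    def __init__(self, n_joints):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * n_joints, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, n_joints),
        )

    def forward(self, tau, y_actual):
        return self.net(torch.cat([tau, y_actual], dim=-1))

def train_step(net, optimizer, tau, y_actual, y_nominal):
    # y_nominal is the theoretical output of the nominal Lagrangian model for tau.
    correction = net(tau, y_actual)            # output correction value
    y_ident = y_nominal + correction           # identification output value
    loss = ((y_actual - y_ident) ** 2).mean()  # loss built from the output error
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Usage (batch of N samples, n_joints joints):
# net = CorrectionNet(n_joints=6)
# opt = torch.optim.Adam(net.parameters(), lr=1e-3)
# loss = train_step(net, opt, tau, y_actual, y_nominal)
```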
Preferably, the nominal kinetic equation is:

D(q)q̈ + C(q, q̇)q̇ + G(q) = τ

wherein D(q) ∈ R^(n×n) is the symmetric positive definite inertia matrix; C(q, q̇) ∈ R^(n×n) is the matrix of Coriolis and centrifugal forces; G(q) ∈ R^(n×1) is the gravity term matrix; q is the joint angular displacement vector of the mechanical arm, q̇ is the angular velocity vector of the mechanical arm, and q̈ is the angular acceleration vector of the mechanical arm; τ ∈ R^n is the control torque vector of the joints of the mechanical arm.
Preferably, the actual kinetic equation is:

D(q)q̈ + C(q, q̇)q̇ + G(q) + F(q) + τ_d = τ

In the above formula, F(q) represents the friction of the joint movement and τ_d represents the disturbances in the motion of the mechanical arm.
Preferably, the disturbance includes load variation, modeling error and/or electrical interference.
Preferably, the parameters that are difficult to model include friction parameters, clearance parameters and/or deformation parameters of the cooperative robot.
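To make the relation between the nominal and actual kinetic equations concrete, the short sketch below evaluates the nominal torque and then adds friction and disturbance terms; the viscous-plus-Coulomb friction model and the placeholder functions D_fn, C_fn, G_fn are assumptions for illustration only.

```python
import numpy as np

def nominal_torque(q, qd, qdd, D_fn, C_fn, G_fn):
    """tau_nominal = D(q)qdd + C(q,qd)qd + G(q)."""
    return D_fn(q) @ qdd + C_fn(q, qd) @ qd + G_fn(q)

def actual_torque(q, qd, qdd, D_fn, C_fn, G_fn, Fv, Fc, tau_d):
    """Adds a joint friction term F and a disturbance term tau_d to the nominal equation."""
    friction = Fv * qd + Fc * np.sign(qd)   # assumed viscous + Coulomb friction model
    return nominal_torque(q, qd, qdd, D_fn, C_fn, G_fn) + friction + tau_d
```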
Referring to fig. 3, fig. 3 shows a sensorless active compliance control module 3, which includes a position ring 31 and a force ring 32, where the position ring 31 includes an end position input 311, a position selection matrix 312, and a position control law module 313; the end position input terminal 311 is configured to input an end position signal to the position selection matrix 312, and the end position signal is input to the force-position hybrid control law output module 33 according to a position signal processed by the position selection matrix 312 and the position control law module 313; the force loop 32 comprises a terminal force input end 321, a force selection matrix 322, a force control law module 323 and a joint moment estimation module 324 based on motor current, wherein the terminal force input end 321 is used for inputting a terminal force signal to the force selection matrix 322, the terminal force signal is input to the force position hybrid control law output module 33 according to the force signal processed by the force selection matrix 322 and the force control law module 323, and the joint moment estimation module 324 feeds back the real-time current of the joint motor 35 to the terminal force input end 321; the force position hybrid control law output module 33 inputs a force position hybrid control law output signal G to the joint motor 35, and the joint motor 35 controls the cooperative robot 6 to act through the transmission mechanism 36.
In Fig. 3, x_d and f_d denote the desired tail end position and the desired contact force, respectively.
Preferably, the present invention further includes a robot kinematics model 314, and the robot kinematics model 314 feeds back the joint angle and the angular velocity of the cooperative robot 6 to the end position input 311.
Preferably, the present invention further comprises a compensation module 34, wherein the compensation module 34 is interposed between the force position hybrid control law output module 33 and the joint motor 35.
Preferably, the position selection matrix 312 and the force selection matrix 322 are combined into one, referred to as the compliance selection matrix S.
Preferably, the expression of the compliance selection matrix S is:
S = diag(s_1, s_2, ..., s_n);
when the i-th degree of freedom of the tail end of the mechanical arm is under position control, s_i = 1; when it is under force control, s_i = 0. The compliance selection matrix S is adjusted online based on the force sense information.
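A minimal sketch of how the compliance selection matrix combines the two loops is given below; only the selection logic follows the text above, while the control-law outputs here are random placeholders.

```python
import numpy as np

def hybrid_control(S, u_position, u_force):
    """Combine the position-loop and force-loop outputs per degree of freedom."""
    S = np.diag(S) if S.ndim == 1 else S
    I = np.eye(S.shape[0])
    return S @ u_position + (I - S) @ u_force

# Example: 6-DOF tail end, force control along the 3rd Cartesian axis only.
s = np.array([1, 1, 0, 1, 1, 1], dtype=float)
u_pos = np.random.randn(6)   # output of the position control law module (313)
u_f = np.random.randn(6)     # output of the force control law module (323)
u = hybrid_control(s, u_pos, u_f)
```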
Preferably, the motor current-based joint torque estimation module 324 is constructed from the complete robot dynamic equation:

M(q)q̈ + C(q, q̇)q̇ + G(q) = τ

wherein M ∈ R^(n×n) is the joint space inertia matrix; C ∈ R^(n×n) is the Coriolis and centripetal force matrix; G ∈ R^(n×1) is the gravity term vector; q ∈ R^(n×1) is the drive joint angle vector; τ ∈ R^(n×1) is the drive joint torque.

The drive equation of the motor torque τ_m is:

τ = Nτ_m

wherein N ∈ R^(n×n) is the diagonal matrix of the reduction ratios of the joints and J_m denotes the inertia of the motor rotor. In the derivation process, the friction term f_m(θ̇_m) at the motor rotor, together with the rotor inertia term J_m·θ̈_m, is substituted into the relation between the motor torque and the joint torque to obtain the joint torque estimation module based on motor current, and thus the force detection output:

τ̂_ext = N·Ψ(i) − N·(J_m·θ̈_m + f_m(θ̇_m)) − (M(q)q̈ + C(q, q̇)q̇ + G(q))

where Ψ(i) = τ_m is the mapping module between the motor current and the motor output torque, and θ_m is the motor rotor angle.
Referring to fig. 4 and 5, in the present invention, the force feedback human-machine cooperation anti-collision detection module 4 includes:
the dynamic equation establishing module 41 is used for establishing a connecting rod coordinate system on a preset robot platform by adopting a D-H parameter method and establishing a robot dynamic equation according to a Lagrange dynamic formula;
the collision detection operator and disturbance observer establishing module 42 is used for constructing a collision detection operator based on the conservation of the robot energy and a disturbance observer based on the variation of the generalized momentum, according to the robot dynamic equation and the momentum equation (a discrete-time sketch of such a generalized momentum observer is given after this list);
the data analysis module 43 determines the relation between each joint torque and the collision force based on the real-time current feedback of the robot system, provides a Jacobian matrix solving method for the robot, and analyzes the effectiveness of collision detection;
the safety protection strategy making module 44 is used for making different safety protection strategies according to different collision situations and by combining actual working conditions based on the detection result of the collision detection model;
the simulation verification and optimization module 45 is used for performing simulation verification and optimization on the effectiveness of the robot collision detection operator and the rationality of the safety protection strategy based on the ADAMS-Simulink combined simulation platform;
and the actual effect verification module 46 is used for verifying and evaluating the actual effect of the obstacle avoidance and protection safety strategy based on force feedback based on a preset robot platform.
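As referenced in module 42 above, a generalized-momentum disturbance observer can be sketched in discrete time as follows. This recursion is the standard textbook form of such an observer and is given only as an illustration; the patent does not disclose its exact formulation.

```python
import numpy as np

class MomentumObserver:
    def __init__(self, n_joints, K_O, dt):
        self.K_O = K_O * np.eye(n_joints)   # diagonal observer gain
        self.dt = dt
        self.integral = np.zeros(n_joints)
        self.r = np.zeros(n_joints)         # residual: estimate of the collision torque
        self.p0 = None

    def update(self, q, qd, tau, M_fn, C_fn, G_fn):
        p = M_fn(q) @ qd                    # generalized momentum p = M(q) * qd
        if self.p0 is None:
            self.p0 = p
        # d(p)/dt = tau + C(q,qd)^T qd - G(q) + tau_ext; integrate the known part plus r.
        self.integral += (tau + C_fn(q, qd).T @ qd - G_fn(q) + self.r) * self.dt
        self.r = self.K_O @ (p - self.p0 - self.integral)
        return self.r                       # a collision is flagged when |r| exceeds a threshold
```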
As an improvement of the present invention, the safety protection strategy includes: (1) stopping after collision, i.e. as soon as the robot control system detects a collision signal, the control system immediately de-energizes the servo drivers; or (2) after collision, the robot control system switches the control mode from position mode to torque mode; or (3) after collision, the robot leaves its original motion trajectory and moves away from the collision area.
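A hypothetical dispatcher for these three strategies is sketched below; the controller interface (disable_drives, set_torque_mode, retreat_from_contact, near_human) and the thresholds are assumptions, not part of the patent.

```python
from enum import Enum

class Strategy(Enum):
    STOP = 1          # (1) de-energize the servo drivers immediately
    TORQUE_MODE = 2   # (2) switch from position mode to torque mode
    RETREAT = 3       # (3) leave the trajectory and move away from the collision area

def react_to_collision(controller, residual_norm, severe_limit):
    """Called once a collision has been detected; picks a strategy from its severity.
    The thresholds and the mapping to strategies are illustrative working-condition choices."""
    if residual_norm > severe_limit:
        controller.disable_drives()           # Strategy (1): stop
        return Strategy.STOP
    if controller.near_human():
        controller.retreat_from_contact()     # Strategy (3): leave the collision area
        return Strategy.RETREAT
    controller.set_torque_mode()              # Strategy (2): become compliant
    return Strategy.TORQUE_MODE
```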
Preferably, the present invention further comprises:
the monocular binocular stereo matching module 47 is used for constructing a monocular binocular stereo matching model based on SVS, optimizing geometric constraint conditions on a loss function, and realizing accurate estimation of the depth of a detection target in a monocular image through a left-right view synthesis process and binocular stereo matching;
the convolution feature extraction module 48 is used for performing deep convolution feature extraction by adopting a ResNet model based on the RGB image acquired by the monocular camera;
the human skeleton key point processing module 49 optimizes the structure of a dual-branch deep convolutional neural network according to the geometric prior knowledge of human skeleton joints and the association relations among the joints, so that the joint points and the associations among joints are processed synchronously: one branch performs human skeleton key point regression by combining a probability heat map with offsets (see the decoding sketch after this list), and the other branch detects the joint association information of multiple people in the image and forms human skeleton sequence data through bipartite graph matching;
the human body skeleton image data processing module 410 is used for reconstructing a human body skeleton image data set based on the Microsoft COCO data set in combination with the characteristics of industrial man-machine cooperation scenes, labeling the joint point data with the open-source AlphaPose toolkit of Shanghai Jiao Tong University, and obtaining, with manual adjustment, a pose data set oriented to industrial cooperation scenes.
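For the key-point branch of module 49, decoding a probability heat map together with offset maps can look like the following sketch; the tensor layout and the stride are assumptions, since the patent only names the heat-map-plus-offset formulation.

```python
import numpy as np

def decode_keypoints(heatmaps, offsets, stride=8):
    """heatmaps: (K, H, W) per-joint probabilities; offsets: (K, 2, H, W) sub-cell offsets.
    Returns a (K, 3) array of x, y, score in input-image coordinates."""
    K, H, W = heatmaps.shape
    out = np.zeros((K, 3))
    for k in range(K):
        idx = np.argmax(heatmaps[k])
        cy, cx = np.unravel_index(idx, (H, W))        # most probable cell for joint k
        dx, dy = offsets[k, 0, cy, cx], offsets[k, 1, cy, cx]
        out[k] = (cx * stride + dx, cy * stride + dy, heatmaps[k, cy, cx])
    return out
```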

Claims (10)

1. A drive-control integrated control system for controlling a cooperative robot (6), the drive-control integrated control system comprising: the system comprises a control module (1), an intelligent dynamic parameter identification module (2), a sensorless active compliance control module (3), a force feedback man-machine cooperation anti-collision control module (4) and a multi-axis driving module (5);
the control module (1) is used for controlling the multi-axis driving module (5) in real time according to a preset instruction and/or the signals fed back by the intelligent dynamic parameter identification module (2), the sensorless active compliance control module (3) and the force feedback man-machine cooperation anti-collision control module (4) according to the state of the cooperative robot (6), so that the multi-axis driving module (5) controls the movement of the cooperative robot (6);
the intelligent kinetic parameter identification module (2) is used for feeding back a mechanical parameter identification signal of the cooperative robot (6) to the control module (1) according to the motion state of the cooperative robot (6); the sensorless active compliance control module (3) is used for feeding back a position signal, a force signal and an environment signal of the cooperative robot (6) to the control module (1) according to the motion state of the cooperative robot (6); the force feedback man-machine cooperation anti-collision control module (4) is used for feeding back a safety state signal of the cooperation robot (6) to the control module (1) according to the motion state of the cooperation robot (6).
2. The drive-control integrated control system according to claim 1,
the intelligent kinetic parameter identification module (2) comprises:
a nominal model (21), said nominal model (21) being based on a Lagrangian dynamics model for: according to the motion state of the cooperative robot (6), the motion speed of any point on each connecting rod of the cooperative robot is obtained, and the kinetic energy of each connecting rod of the cooperative robot in the motion process and the total kinetic energy of the cooperative robot in motion are calculated; calculating potential energy of each connecting rod of the cooperative robot in the motion process and total potential energy of each connecting rod of the cooperative robot relative to a reference potential energy plane in the motion process; constructing a Lagrange function of the cooperative robot according to the total kinetic energy and the total potential energy of the cooperative robot; performing derivation operation on the Lagrangian function to obtain a nominal kinetic equation of the cooperative robot;
the actual dynamic model (22) is used for obtaining an actual dynamic equation of the actual dynamic model of the cooperative robot according to preset parameters;
the parameter identification neural network (23) is used for setting the cooperative robot into a torque working mode, selecting a smooth torque curve within the range from the minimum to the maximum joint torque as the input of the cooperative robot, and acquiring the angular displacement, angular velocity and angular acceleration of each joint from the encoder (code disc) of each joint; the sampling time (t) is set within a sampling period (T), and N groups of data containing torque, angular displacement, angular velocity and angular acceleration are taken as one set of training sample data;
a learning optimization module (24) used for obtaining, through the nominal model, the theoretical output value ŷ(k) corresponding to the torque τ(k) in the sample data, inputting the torque τ(k) together with the actual output value y(k) in the samples into the parameter identification neural network to obtain the output correction value Δy(k), combining the theoretical output value with the output correction value to obtain the identification output value y*(k) = ŷ(k) + Δy(k), subtracting the identification output value from the actual output value to obtain the output error e(k) = y(k) − y*(k), and establishing the loss function of the parameter identification neural network from the output error and training the parameter identification neural network to complete the correction of the dynamic model.
3. The drive-control integrated control system according to claim 2,
the nominal kinetic equation is:

D(q)q̈ + C(q, q̇)q̇ + G(q) = τ

wherein D(q) ∈ R^(n×n) is the symmetric positive definite inertia matrix; C(q, q̇) ∈ R^(n×n) is the matrix of Coriolis and centrifugal forces; G(q) ∈ R^(n×1) is the gravity term matrix; q is the joint angular displacement vector of the mechanical arm, q̇ is the angular velocity vector of the mechanical arm, and q̈ is the angular acceleration vector of the mechanical arm; τ ∈ R^n is the control torque vector of the joints of the mechanical arm.
4. The drive-control integrated control system according to claim 2,
the actual kinetic equation is as follows:

D(q)q̈ + C(q, q̇)q̇ + G(q) + F(q) + τ_d = τ

wherein F(q) represents the friction of the joint movement and τ_d represents the disturbances in the motion of the mechanical arm.
5. The drive-control integrated control system according to claim 1,
the sensorless active compliance control module (3) comprises a position ring (31), a force ring (32) and a force-position hybrid control law output module (33);
wherein the position ring (31) comprises a terminal position input (311), a position selection matrix (312) and a position control law module (313); the tail end position input end (311) is used for inputting a tail end position signal to the position selection matrix (312), and the tail end position signal is input to the force-position hybrid control law output module (33) according to the position signal processed by the position selection matrix (312) and the position control law module (313);
the force ring (32) comprises a tail end force input end (321), a force selection matrix (322), a force control law module (323) and a joint moment estimation module (324) based on motor current, wherein the tail end force input end (321) is used for inputting a tail end force signal to the force selection matrix (322), the force signal processed by the force selection matrix (322) and the force control law module (323) is input to the force-position hybrid control law output module (33), and the joint moment estimation module (324) feeds back the real-time current of the joint motor (35) to the tail end force input end (321); and the force-position hybrid control law output module (33) inputs a force-position hybrid control law output signal (G) to the joint motor (35).
6. The drive-control integrated control system according to claim 5,
the sensorless active compliance control module (3) further comprises a robot kinematics model (314), and the robot kinematics model (314) feeds back joint angles and angular velocities of the cooperative robot (6) to the end position input end (311).
7. The drive-control integrated control system according to claim 5 or 6,
the sensorless active compliance control module (3) further comprises a compensation module (34), and the compensation module (34) is arranged between the force and position hybrid control law output module (33) and the joint motor (35).
8. The drive-control integrated control system according to claim 5 or 6,
the joint moment estimation module is constructed from the complete robot dynamic equation:

M(q)q̈ + C(q, q̇)q̇ + G(q) = τ

wherein M ∈ R^(n×n) is the joint space inertia matrix; C ∈ R^(n×n) is the Coriolis and centripetal force matrix; G ∈ R^(n×1) is the gravity term vector; q ∈ R^(n×1) is the drive joint angle vector; τ ∈ R^(n×1) is the drive joint torque;

the drive equation of the motor torque τ_m is:

τ = Nτ_m

wherein N ∈ R^(n×n) is the diagonal matrix of the reduction ratios of the joints and J_m denotes the inertia of the motor rotor; in the derivation process, the friction term f_m(θ̇_m) at the motor rotor, together with the rotor inertia term J_m·θ̈_m, is substituted into the relation between the motor torque and the joint torque to obtain the joint torque estimation module based on motor current, and thus the force detection output:

τ̂_ext = N·Ψ(i) − N·(J_m·θ̈_m + f_m(θ̇_m)) − (M(q)q̈ + C(q, q̇)q̇ + G(q))

where Ψ(i) = τ_m is the mapping module between the motor current and the motor output torque, and θ_m is the motor rotor angle.
9. The drive-control integrated control system according to claim 1 or 2,
the force feedback human-computer cooperation anti-collision detection module (4) comprises:
the dynamic equation establishing module (41) is used for establishing a connecting rod coordinate system on a preset robot platform by adopting a D-H parameter method and establishing a robot dynamic equation according to a Lagrange dynamic formula;
a collision detection operator and disturbance observer establishing module (42) which constructs a collision detection operator based on the conservation of the robot energy and a disturbance observer based on the variation of the generalized momentum, according to the robot dynamic equation and the momentum equation;
the data analysis module (43) determines the relation between each joint torque and the collision force based on the real-time current feedback of the robot system, provides a Jacobian matrix solving method for the robot, and analyzes the effectiveness of collision detection;
the safety protection strategy making module (44) is used for making different safety protection strategies according to different collision situations and by combining actual working conditions based on the detection result of the collision detection model;
the simulation verification and optimization module (45) is used for performing simulation verification and optimization on the effectiveness of a robot collision detection operator and the rationality of a safety protection strategy based on an ADAMS-Simulink combined simulation platform;
and the actual effect verification module (46) is used for verifying and evaluating the actual effect of the obstacle avoidance and protection safety strategy based on force feedback based on a preset robot platform.
10. The drive-control integrated control system according to claim 9,
the force feedback man-machine cooperation anti-collision detection module (4) further comprises:
the monocular binocular stereo matching module (47) is used for constructing a monocular binocular stereo matching model based on SVS, optimizing geometric constraint conditions on a loss function, and realizing accurate estimation of the depth of a detected target in a monocular image through a left-right view synthesis process and binocular stereo matching;
the convolution feature extraction module (48) is used for extracting deep convolution features by adopting a ResNet model based on the RGB images collected by the monocular camera;
the human skeleton key point processing module (49) optimizes the structure of a dual-branch deep convolutional neural network according to the geometric prior knowledge of human skeleton joints and the association relations among the joints, so that the joint points and the associations among joints are processed synchronously: one branch performs human skeleton key point regression by combining a probability heat map with offsets, and the other branch detects the joint association information of multiple people in the image and forms human skeleton sequence data through bipartite graph matching;
and the human body skeleton image data processing module (410) is used for reconstructing a human body skeleton image data set in combination with the characteristics of industrial man-machine cooperation scenes, labeling the joint point data, and obtaining, with manual adjustment, a pose data set oriented to industrial cooperation scenes.
CN201911102910.XA 2019-10-12 2019-11-12 Driving and controlling integrated control system Active CN111015649B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910969504 2019-10-12
CN2019109695047 2019-10-12

Publications (2)

Publication Number Publication Date
CN111015649A true CN111015649A (en) 2020-04-17
CN111015649B CN111015649B (en) 2020-12-25

Family

ID=70205588

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911102910.XA Active CN111015649B (en) 2019-10-12 2019-11-12 Driving and controlling integrated control system

Country Status (1)

Country Link
CN (1) CN111015649B (en)


Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07285402A (en) * 1994-04-20 1995-10-31 Nippon Steel Corp Collision softening device for shiftable device
JP2000277039A (en) * 1999-03-26 2000-10-06 Toshiba Corp Image display device and manufacture thereof
US20110071676A1 (en) * 2009-09-22 2011-03-24 Gm Global Technology Operations, Inc. Interactive robot control system and method of use
CN103279031A (en) * 2013-05-03 2013-09-04 北京航空航天大学 Robust convergence control method of uncertain multi-agent system
CN104985598A (en) * 2015-06-24 2015-10-21 南京埃斯顿机器人工程有限公司 Industrial robot collision detection method
CN105159084A (en) * 2015-09-06 2015-12-16 台州学院 Manipulator nerve network control system with interference observer and control method
CN108115669A (en) * 2016-11-26 2018-06-05 沈阳新松机器人自动化股份有限公司 A kind of robot floating control method, apparatus and system
CN108801255A (en) * 2017-05-04 2018-11-13 罗伯特·博世有限公司 Methods, devices and systems for avoiding robot from colliding
CN108582070A (en) * 2018-04-17 2018-09-28 上海达野智能科技有限公司 robot collision detecting system and method, storage medium, operating system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SUN HAIBO: "Force/Position Hybrid Control of a Six-Degree-of-Freedom Manipulator", China Master's Theses Full-text Database, Information Science and Technology *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021243945A1 (en) * 2020-06-03 2021-12-09 杭州键嘉机器人有限公司 Method for robotic arm high-precision force feedback in stationary or low-speed working condition, robotic arm-assisted surgical method, and nonvolatile computer-readable medium having processor-executable program code
CN112936267A (en) * 2021-01-29 2021-06-11 华中科技大学 Man-machine cooperation intelligent manufacturing method and system
CN112936267B (en) * 2021-01-29 2022-05-27 华中科技大学 Man-machine cooperation intelligent manufacturing method and system
WO2022166328A1 (en) * 2021-02-05 2022-08-11 深圳市优必选科技股份有限公司 Task execution control method and apparatus, control device, and readable storage medium
CN113517827A (en) * 2021-04-13 2021-10-19 高创传动科技开发(深圳)有限公司 Motor servo system, control method and device thereof, braiding machine, equipment and medium
CN113517839A (en) * 2021-04-13 2021-10-19 高创传动科技开发(深圳)有限公司 Motor servo system, control method and device thereof, component inserter, equipment and medium
CN113517839B (en) * 2021-04-13 2024-01-05 高创传动科技开发(深圳)有限公司 Motor servo system, control method and device thereof, plug-in machine, equipment and medium
CN113517827B (en) * 2021-04-13 2024-01-05 高创传动科技开发(深圳)有限公司 Motor servo system, control method and device thereof, braiding machine, equipment and medium
CN113552807A (en) * 2021-09-22 2021-10-26 中国科学院自动化研究所 Data set generation method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN111015649B (en) 2020-12-25

Similar Documents

Publication Publication Date Title
CN111015649B (en) Driving and controlling integrated control system
US9862090B2 (en) Surrogate: a body-dexterous mobile manipulation robot with a tracked base
CN108656112B (en) Mechanical arm zero-force control experiment system for direct teaching
CN111055281B (en) ROS-based autonomous mobile grabbing system and method
CN106945043B (en) Multi-arm cooperative control system of master-slave teleoperation surgical robot
CN109848983A (en) A kind of method of highly conforming properties people guided robot work compound
Jafarinasab et al. Model-based motion control of a robotic manipulator with a flying multirotor base
CN113829343B (en) Real-time multitasking and multi-man-machine interaction system based on environment perception
CN112894821B (en) Current method based collaborative robot dragging teaching control method, device and equipment
CN110497405B (en) Force feedback man-machine cooperation anti-collision detection method and module for driving and controlling integrated control system
CN110385694A (en) Action teaching device, robot system and the robot controller of robot
CN111515951A (en) Teleoperation system and teleoperation control method for robot
Guo et al. A small opening workspace control strategy for redundant manipulator based on RCM method
CN114800517A (en) Multi-degree-of-freedom hydraulic mechanical arm real-time control system and method
CN112894827B (en) Method, system and device for controlling motion of mechanical arm and readable storage medium
Ye et al. Velocity decomposition based planning algorithm for grasping moving object
Siradjuddin et al. A real-time model based visual servoing application for a differential drive mobile robot using beaglebone black embedded system
Tong et al. Neural network based visual servo control under the condition of heavy loading
Matsuda et al. Control system for object transportation by a mobile robot with manipulator combined with manual operation and autonomous control
Li et al. A novel semi-autonomous teleoperation method for the tiangong-2 manipulator system
Ma et al. Unknown constrained mechanisms operation based on dynamic hybrid compliance control
Zhou et al. A cooperative shared control scheme based on intention recognition for flexible assembly manufacturing
CN110488608B (en) Intelligent kinetic parameter identification method and module for driving and controlling integrated control system
CN109773773A (en) A kind of master-slave control device, the system and method for novel six freedom parallel connection platform
CN114888768A (en) Mobile duplex robot cooperative grabbing system and method based on multi-sensor fusion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20231016

Address after: 518000 A-301, office building, Shenzhen Institute of advanced technology, No. 1068, Xue Yuan Avenue, Shenzhen University Town, Shenzhen, Guangdong, Nanshan District, China

Patentee after: Shenzhen shen-tech advanced Cci Capital Ltd.

Address before: 1068 No. 518055 Guangdong city in Shenzhen Province, Nanshan District City Xili University School Avenue

Patentee before: SHENZHEN INSTITUTES OF ADVANCED TECHNOLOGY

TR01 Transfer of patent right

Effective date of registration: 20240108

Address after: 200120 Building 1, No. 1235 and 1237, Miaoxiang Road, Lingang New Area, China (Shanghai) Pilot Free Trade Zone, Pudong New Area, Shanghai

Patentee after: SHANGHAI NOZOLI MACHINE TOOLS TECHNOLOGY Co.,Ltd.

Address before: 518000 A-301, office building, Shenzhen Institute of advanced technology, No. 1068, Xue Yuan Avenue, Shenzhen University Town, Shenzhen, Guangdong, Nanshan District, China

Patentee before: Shenzhen shen-tech advanced Cci Capital Ltd.