CN110716557B - Machine parameter identification and contact force monitoring method based on priori dynamics knowledge - Google Patents

Machine parameter identification and contact force monitoring method based on priori dynamics knowledge

Info

Publication number
CN110716557B
Authority
CN
China
Prior art keywords
robot
joint
moment
contact force
actual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911127310.9A
Other languages
Chinese (zh)
Other versions
CN110716557A (en)
Inventor
郭士杰
朱立爽
刘今越
李洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hebei University of Technology
Original Assignee
Hebei University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hebei University of Technology filed Critical Hebei University of Technology
Priority to CN201911127310.9A priority Critical patent/CN110716557B/en
Publication of CN110716557A publication Critical patent/CN110716557A/en
Application granted granted Critical
Publication of CN110716557B publication Critical patent/CN110716557B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0223 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
    • G05D1/0214 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0221 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G05D1/0276 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to a robot parameter identification and contact force monitoring method based on prior dynamics knowledge. Joint position and velocity information and joint acceleration information of the robot are obtained through encoders, and a data acquisition card acquires in real time the actual joint torque derived from the current signal at the analog monitoring terminal of each motor driver. The robot is controlled to move along as many different trajectories as possible in space, and the joint angles, angular velocities, angular accelerations and actual torques at different instants are collected as samples; the dynamic parameters of the robot are then identified by off-line learning with a gradient descent method, yielding an ideal dynamic model of the robot. During real-time monitoring, the acquired and processed signals are substituted into the ideal dynamic model to obtain the theoretical torque at the current instant. The actual joint torque is compared with the theoretical torque, and if the difference exceeds the threshold range, a collision is indicated. The method can acquire contact force data accurately and at low cost.

Description

Machine parameter identification and contact force monitoring method based on priori dynamics knowledge
Technical Field
The invention relates to the technical field of monitoring the contact force between a co-fusion robot and the external environment, and in particular to a robot parameter identification and contact force monitoring method based on priori dynamics knowledge.
Background
Robots offer large payloads, fast response, and high precision, are generally used to replace humans in repetitive, heavy, or dangerous tasks, and are widely applied throughout industrial manufacturing. With progress in sensors, artificial intelligence, automatic control and related disciplines, robots are gradually developing into intelligent equipment with perception, cognition, and autonomous action capabilities; in particular, the concept of the co-fusion robot has greatly enriched the scope of robot tasks. A co-fusion robot shares its workspace and production activities with humans: it combines the more direct cognition and highly intelligent decision-making of people with the robot's load capacity and high-precision execution, completing non-deterministic tasks in unstructured environments through human-robot cooperation. Because the robot and the worker occupy the same workspace and usually need to cooperate, robot safety is critical; without suitable anti-collision measures, equipment damage or casualties may result, so the problem of safe contact between the robot and the external environment must be solved first. To ensure safety, the contact force between the robot and the environment must be detected, and the necessary control strategies applied in time to avoid serious collisions and keep the collision contact force within a safely tolerable range.
One way to address robot safety is collision detection, and a number of researchers have made progress in this area. A common approach adds external sensors, for example a force sensor at the robot wrist: Chinese patent CN201510006024.2 mounts a six-dimensional force sensor between the end flange of a manipulator and the load and continuously samples it at a set frequency to detect and respond to collision contact forces. Another approach wraps the outer surface of the robot in sensitive skin, which can detect both the collision contact force and its location: Chinese patent CN201480076246.5 covers the robot with a gas-filled soft covering and detects collisions from the change in differential pressure between the gas inside the covering and the external pressure; however, this increases the difficulty of processing external information, complicates the robot's wiring, enlarges its volume, and reduces its flexibility. Vision-based detection has also been proposed: Chinese patent CN201810365980.3 obtains the position of a vacuum manipulator from multi-dimensional vision sensors and judges dangerous states in real time. Vision can capture comprehensive environmental information, but the volume of image data is large and real-time performance is difficult to guarantee.
Once the robot contacts the external environment, the contact force rises almost instantaneously. A collision detection algorithm therefore needs to be real-time, accurate, and able to identify the contact direction, so that the contact force can be controlled and collision injury reduced.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a machine parameter identification and contact force monitoring method based on priori dynamics knowledge. An ideal dynamic model of the robot in free space, i.e., when the robot is not in collision or contact with the surrounding environment including people, is obtained by off-line learning with a gradient descent method; the joint moment given by this model at a given instant is called the theoretical moment. The actual joint moment of the robot is compared with the theoretical moment, and collision information is obtained through the computation program of the upper computer.
The invention aims to conveniently and quickly identify the contact force and the contact position of the robot with the external environment so as to provide reference for the control of the robot.
The technical solution adopted to solve this problem is a robot parameter identification and contact force monitoring method based on the priori dynamics knowledge. The method obtains the joint position and velocity information of the robot from encoders, removes noise interference by median filtering, and differentiates the filtered velocity information and applies mean filtering to obtain the joint acceleration of the robot. A data acquisition card acquires in real time the current signal at the analog monitoring terminal of each motor driver, and the actual joint torque of the robot is obtained by median filtering and proportional amplification;
the robot is controlled to move along as many different trajectories as possible in space while the joint angles, angular velocities, angular accelerations and actual moments at different instants are collected as samples; based on the priori dynamics knowledge of the robot, off-line learning with a gradient descent method identifies the dynamic parameters of the robot, including the joint link masses, the moments of inertia, and the distances from the link centers of mass to their respective rotation axes, yielding an ideal dynamic model of the robot in free space, i.e., when it is not in collision or contact with the surrounding environment including people;
during real-time monitoring, the encoder and motor-driver signals are acquired and processed as above and substituted into the free-space ideal dynamic model to obtain the theoretical moment of the robot at that instant; at the same time, the current signal acquired at the driver's analog monitoring terminal is median-filtered and proportionally amplified to obtain the actual joint moment at that instant;
the actual joint moment is compared with the theoretical moment; if the difference exceeds the threshold range, a collision has occurred. The contact force on the robot and its direction are then obtained by transforming the residual through the force Jacobian matrix, which relates the robot joint moments to the end contact force, and the contact force information is used as a control basis to adjust the robot motion so as to ensure safe human-robot cooperation.
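As an illustrative, non-limiting sketch of one cycle of this monitoring scheme, the following Python fragment shows the order of operations; the helper names read_encoder, read_drive_current, theoretical_torque and force_jacobian are hypothetical placeholders, not part of the claimed method:

```python
import numpy as np

def monitoring_step(read_encoder, read_drive_current, theoretical_torque,
                    force_jacobian, n, v, eta, thresholds):
    """One monitoring cycle: torque residual test and contact force estimation (sketch)."""
    q, dq, ddq = read_encoder()                   # filtered joint angles, velocities, accelerations
    current = read_drive_current()                # median-filtered drive currents
    tau_actual = n * v * eta * current            # proportional current-to-torque conversion
    tau_theory = theoretical_torque(q, dq, ddq)   # free-space ideal dynamic model
    residual = tau_actual - tau_theory
    if np.any(np.abs(residual) > thresholds):     # collision if any axis exceeds its threshold
        J = force_jacobian(q)                     # force Jacobian at the current pose
        f_contact = np.linalg.solve(J.T, residual)  # solve tau_e = J^T * F_e for F_e
        return f_contact                          # contact force magnitude and direction
    return None                                   # no contact detected
```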
A real-time collision detection system based on robot dynamics comprises a robot body, a data acquisition module and an upper computer. The robot body is a two-degree-of-freedom articulated manipulator and carries two mechanical arms; each arm has two joint links, one end of the first joint link being connected to the robot shoulder and the other end to the second joint link, with a motor, reducer and motor driver arranged at each connection for driving;
the data acquisition module comprises encoders and a data acquisition card, wherein the encoders are used for measuring the robot joint angles q = [q1, q2]^T and joint angular velocities q̇ = [q̇1, q̇2]^T; the data acquisition card is used for acquiring the joint angles q and angular velocities q̇ measured by the encoders and the current A detected at the analog monitoring terminal of the driver, and transmits them to the upper computer. Encoders are arranged at the connection between the two joint links and at the connection between the joint link and the robot shoulder, and communicate and transfer data with the data acquisition card through signal lines; the upper computer communicates with the data acquisition card through a PCI bus, receives its data, carries out the corresponding processing, and controls the motion of the motors;
the upper computer comprises a main controller, a data processing unit, a robot dynamics parameter identification unit and a real-time monitoring unit;
the main controller is used for controlling the robot to realize the motion of the appointed track, and controlling the robot to make the robot move in different tracks as much as possible in a free space so as to provide a large amount of sample data for the kinetic parameter identification unit;
the data processing unit processes the information transmitted by the data acquisition card on line in real time: it mean-filters the velocity information to compensate for fluctuations caused by the low sampling frequency, obtains the angular acceleration q̈ of each joint by first-order differentiation of the mean-filtered angular velocities, and amplifies the current signals in proportion, according to the proportional relation between current and torque, to obtain the actual torque of each joint;
τ=n·v·η·A
wherein τ represents the actual torque of the robot joint, n represents the reduction ratio of the robot joint reducer, v is the ratio of rated torque to rated current, η is the reducer efficiency, and A is the actually monitored current;
the dynamic parameter identification unit collects and accumulates, through the data processing unit, the angles, angular velocities, angular accelerations and actual moments of the robot at different instants as samples; from this large set of data samples the dynamic parameters of the robot are identified by a gradient descent method, yielding an ideal dynamic model of the robot;
the real-time monitoring unit is used for judging whether collision occurs or not and calculating the position of contact and the magnitude and the direction of contact force.
Compared with the prior art, the invention has the beneficial effects that:
the invention provides a robot body information-based real-time actual moment of each joint of a robot, which is obtained by an encoder, angle and angular velocity information of each joint and a current signal acquired by a driver analog quantity monitoring end, wherein the angular velocity is subjected to mean value filtering and then differentiation to obtain angular acceleration information, and the current signal is subjected to median filtering and proportional amplification to obtain the real-time actual moment of each joint of the robot. The robot is controlled to move in different tracks as much as possible in a free space, the angles, the angular velocities, the angular accelerations and the actual moments of the robot at different moments are collected as samples, the kinetic parameters of the robot are identified in an off-line learning mode by adopting a gradient descent method based on the priori kinetic knowledge of the robot, so that an ideal kinetic model of the robot in the free space is obtained, and the corresponding joint moment is called as a theoretical moment under the condition. During real-time monitoring, current signals collected by a driver analog quantity monitoring end are converted into actual joint torque according to the method, angle and angular velocity information of each joint obtained from an encoder is obtained according to the method to obtain angle, angular velocity and angular acceleration information of each joint, the angle, angular velocity and angular acceleration information are brought into an ideal dynamic model of the robot in a free space, so that theoretical torque of the robot at the moment is obtained, the actual torque of the robot is compared with the theoretical torque, and if the difference exceeds a certain threshold value, collision is warned. And converting the magnitude of the output force according to the corresponding relation between the robot joint torque and the terminal force. The traditional method for identifying the dynamics of the robot based on the neural network is only accurate in an observation space, the accuracy of the dynamics in the unobserved space cannot be guaranteed, in a control algorithm of the co-fusion robot, the pose adjustment of the robot can be carried out at any time according to the contact force between the robot and the outside, the neural network based on the priori knowledge can be generalized to the working space of the whole robot, and the unobserved space can be effectively predicted.
Furthermore, a median filter is designed to remove environmental noise interference, a mean filter is designed to process the velocity fed back by the encoders, and the joint acceleration is obtained by differentiation, making the contact force monitoring result more accurate.
Furthermore, based on the priori knowledge of robot dynamics, the dynamic parameters of the robot are identified by a gradient descent method, giving an accurate (ideal) dynamic model of the robot in free space with an accuracy above 98%. Once the parameters have converged, the identified values can be used directly without continual correction during control; that is, the model's applicability is not limited to the observed space but generalizes to the whole robot workspace, so the unobserved space can be predicted effectively.
The method learns and identifies the corresponding dynamic parameters off line with the network, including the joint link masses, the moments of inertia, and the distances from the two link centers of mass to their rotation axes; the friction parameters are modeled with a viscous-plus-Coulomb friction model. The learning samples are real data from the robot's motion, so the result is closer to the real model and more accurate.
Further, the motor current signal is converted into the corresponding actual joint torque and its difference from the theoretical torque is checked against a threshold; if the difference suddenly exceeds the threshold, a collision or contact with the external environment has occurred, and the joint torque is mapped to the end through the robot force Jacobian matrix to obtain the direction and magnitude of the end contact force.
In conclusion, the invention identifies the dynamic parameters of the robot by a gradient descent method and establishes an accurate dynamic model of the robot in free space. The encoder angles and angular velocities are filtered and differentiated, and the resulting real-time information is fed into this model to obtain the theoretical moment corresponding to the contact-free case; in parallel, the motor driver current signal is converted in proportion into the real-time actual joint moment. The actual moment is compared with the theoretical moment; if the difference exceeds a threshold, a collision or contact has occurred, and the joint moment is mapped to the end through the robot force Jacobian matrix to obtain the direction and magnitude of the end contact force. The invention needs no force/torque sensor or vision sensor at the end or at the joints to monitor whether the robot collides, which reduces cost, wiring and communication complexity, shortens the control period, improves system response speed and guarantees real-time monitoring; whole-arm contact monitoring is possible, and the force information fed into the robot controller becomes the basis for motion adjustment, better ensuring the safety of the co-fusion robot and making it easier for industrial personnel and users to accept.
Drawings
FIG. 1 is a diagram of the system for machine parameter identification and contact force monitoring based on priori dynamics knowledge;
FIG. 2 is a schematic diagram of the implementation process of the method for machine parameter identification and contact force monitoring based on priori dynamics knowledge;
FIG. 3 is a schematic flow chart of a kinetic parameter identification unit;
FIG. 4 is a schematic diagram of a real-time impact force monitoring module;
FIG. 5(a) is a comparison, for axis 1, of the actual torque of a sample with the theoretical torque obtained by substituting the identified parameter values into the dynamic equation;
FIG. 5(b) is the same comparison for axis 2;
FIG. 6(a) is a comparison, for axis 1, of the actual torque of non-sample data, i.e., data outside the observation space, with the theoretical torque obtained by substituting the identified parameter values into the dynamic equation;
FIG. 6(b) is the same comparison for axis 2;
FIG. 7 compares the contact force estimated by the method of this patent with the contact force measured by a six-dimensional force sensor.
Detailed Description
Specific examples of the present invention are given below. The specific examples are intended to be illustrative of the invention only and are not intended to limit the scope of the claims of the present application.
The invention relates to a machine parameter identification and contact force monitoring method based on priori dynamics knowledge, which needs neither a force sensor nor a vision sensor and obtains the contact force direction and magnitude from priori dynamics knowledge.
The invention identifies the dynamic parameters of the robot by a gradient descent method and establishes an accurate dynamic model of the robot in free space. The encoder angles and angular velocities are filtered and differentiated and the real-time information is fed into this model to obtain the theoretical moment corresponding to the contact-free case; at the same time the motor driver current signal is converted in proportion into the real-time actual joint moment. The actual moment is compared with the theoretical moment; if the difference exceeds a threshold, a collision or contact has occurred, and the joint moment is mapped to the end through the Jacobian matrix of the robot to obtain the direction and magnitude of the end contact force. The contact force information is used as a control basis to adjust the robot motion so as to ensure safe operation of the co-fusion robot.
The invention discloses a robot parameter identification and contact force monitoring method based on priori dynamics knowledge. The robot body is a two-degree-of-freedom articulated manipulator consisting mainly of drive devices and links; a schematic diagram of one arm is shown in FIG. 1. A single arm has two joint links: link 1 is connected to the robot shoulder, with encoder 1, motor 1 and reducer 1 arranged at that connection, and link 1 is connected to link 2 through a drive device consisting of reducer 2, motor 2 and encoder 2, the drives comprising AC synchronous motors and servo drivers. The data acquisition module comprises the encoders and a data acquisition card: the encoders measure the robot joint angles q = [q1, q2]^T and joint angular velocities q̇ = [q̇1, q̇2]^T; the data acquisition card acquires the joint angles q and angular velocities q̇ measured by the encoders and the current A detected at the driver's analog monitoring terminal and transmits them to the upper computer, and the encoders and the data acquisition card communicate and transfer data through signal lines. The upper computer communicates with the data acquisition card through a PCI bus; it receives the data of the data acquisition card, performs the corresponding processing, and controls the motion of the drive devices. The upper computer comprises a main controller, a data processing unit, a robot dynamic parameter identification unit and a real-time monitoring unit.
More specifically, the main controller controls the robot to follow specified trajectories and drives it along as many different trajectories as possible in free space, so as to provide a large amount of sample data for the dynamic parameter identification unit; to track these different trajectories, an adaptive position controller based on an RBF neural network is preferably adopted to compensate for dynamic modeling errors.
More specifically, the data processing unit processes the information transmitted by the data acquisition card on line in real time: it mean-filters the velocity information to compensate for fluctuations caused by the low sampling frequency and obtains the angular acceleration q̈ of each joint by first-order differentiation of the mean-filtered angular velocities. Observation and analysis of the signals show that the environmental interference is mainly irregular noise whose amplitude greatly exceeds twice the current signal, so a simple and effective median filter is designed to filter the current signal of each joint and remove this large-amplitude irregular noise. The current signals are then amplified in proportion, according to the proportional relation between current and torque, to obtain the actual torque of each joint:
τ=n·v·η·A
where τ represents the actual torque of the robot joint, n represents the reduction ratio of the robot joint reducer, v is the ratio of rated torque to rated current, η is the reducer efficiency, and A is the actually monitored current.
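For illustration only, the filtering and current-to-torque conversion just described might be sketched as below in Python/NumPy; the window lengths and the use of scipy.signal.medfilt are assumptions of this sketch, not values taken from the patent:

```python
import numpy as np
from scipy.signal import medfilt

def joint_acceleration(dq_samples, dt, window=5):
    """Mean-filter the encoder angular velocity, then differentiate to get acceleration."""
    kernel = np.ones(window) / window
    dq_smooth = np.convolve(dq_samples, kernel, mode="same")   # mean (moving-average) filter
    return np.gradient(dq_smooth, dt)                          # first-order differentiation

def actual_joint_torque(current_samples, n, v, eta, window=5):
    """Median-filter the drive current A, then scale it to joint torque tau = n*v*eta*A."""
    a_filtered = medfilt(current_samples, kernel_size=window)  # removes large irregular noise spikes
    return n * v * eta * a_filtered
```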
More specifically, the robot dynamics parameter identification unit uses the data processing unit to collect and accumulate the angles, angular velocities, angular accelerations and actual moments of the robot at different instants as samples, and identifies the dynamic parameters of the robot with a gradient descent method, thereby obtaining an accurate dynamic model of the robot.
Assume the ideal values θ* of the robot's dynamic parameters are known and form the vector
θ* = [m1*, m2*, I1*, I2*, p1*, p2*]
where m1*, m2* are the ideal values of the masses of the two links, I1*, I2* the ideal moments of inertia of the two links about their respective rotation axes, and p1*, p2* the ideal distances from the link centers of mass to their respective rotation axes.
The robot dynamics model is then established according to the Newton-Euler method as
τ* = D(q)·q̈ + C(q, q̇)·q̇ + G(q) + τf + τe
where τ* is the corresponding joint moment under ideal conditions, q = [q1, q2]^T is the vector of joint angles, q̇ is the vector of joint angular velocities, q̈ is the vector of joint angular accelerations, D(q) is the inertia matrix, C(q, q̇)·q̇ is the centrifugal and Coriolis force vector, G(q) is the gravity vector, τf is the joint friction torque, and τe is the moment that balances the contact force with the outside.
For the two-degree-of-freedom rotary robot,
D(q) = [ m1·p1² + m2·(l1² + p2² + 2·l1·p2·c2) + I1 + I2    m2·(p2² + l1·p2·c2) + I2
         m2·(p2² + l1·p2·c2) + I2                          m2·p2² + I2 ]
C(q, q̇) = [ -m2·l1·p2·s2·q̇2    -m2·l1·p2·s2·(q̇1 + q̇2)
             m2·l1·p2·s2·q̇1     0 ]
G(q) = [ (m1·p1 + m2·l1)·g·c1 + m2·p2·g·c12    m2·p2·g·c12 ]^T
wherein ci = cos(qi), si = sin(qi), c12 = cos(q1 + q2), g is the gravitational acceleration, mi represents the mass of each joint arm, Ii represents the moment of inertia of each joint, li indicates the length of each link, and pi represents the distance from the link center of mass to its axis. The friction of the robot is complex and is modeled in this application with a viscous-plus-Coulomb friction model, i.e.
τf = fv·q̇ + fc·sgn(q̇)
wherein fv is the viscous friction coefficient and fc is the Coulomb friction coefficient.
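To make the model concrete, the following Python sketch evaluates the two-link dynamics with viscous-plus-Coulomb friction; the matrix expressions are the standard planar two-link formulas assumed above, and the link length l1 and the friction coefficients are passed in separately from the identified parameter vector:

```python
import numpy as np

def predicted_torque(q, dq, ddq, theta, l1, fv, fc, g=9.81):
    """tau_hat = D(q)*ddq + C(q,dq)*dq + G(q) + tau_f for a planar two-link arm (sketch)."""
    m1, m2, I1, I2, p1, p2 = theta                  # identified dynamic parameters theta_hat
    c1, c2 = np.cos(q[0]), np.cos(q[1])
    s2 = np.sin(q[1])
    c12 = np.cos(q[0] + q[1])

    d11 = m1*p1**2 + m2*(l1**2 + p2**2 + 2*l1*p2*c2) + I1 + I2
    d12 = m2*(p2**2 + l1*p2*c2) + I2
    D = np.array([[d11, d12],
                  [d12, m2*p2**2 + I2]])            # inertia matrix D(q)
    C = np.array([[-m2*l1*p2*s2*dq[1], -m2*l1*p2*s2*(dq[0] + dq[1])],
                  [ m2*l1*p2*s2*dq[0], 0.0]])       # centrifugal/Coriolis matrix C(q, dq)
    G = np.array([(m1*p1 + m2*l1)*g*c1 + m2*p2*g*c12,
                  m2*p2*g*c12])                     # gravity vector G(q)
    tau_f = np.asarray(fv)*dq + np.asarray(fc)*np.sign(dq)   # viscous + Coulomb friction
    return D @ ddq + C @ dq + G + tau_f
```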
As shown above, the robot dynamics depend on the mass mi of each joint, the moment of inertia Ii, the link length li, the distance pi from the link center of mass to its axis, the viscous friction coefficient fv, the Coulomb friction coefficient fc, the joint angles q1, q2, the angular velocities q̇, and the angular accelerations q̈. The link lengths li can be obtained by measurement, and the angle information q, the angular velocities q̇, the angular accelerations q̈ and the actual moments τ can be acquired by the data acquisition card and processed by the data processing unit, so the remaining unknown parameters can be identified with a gradient descent method based on the robot dynamic equation. The parameter vector to be identified, θ̂, is
θ̂ = [m̂1, m̂2, Î1, Î2, p̂1, p̂2]
where m̂1, m̂2 are the identified values of the masses of the two links, Î1, Î2 the identified moments of inertia of the two links about their respective rotation axes, and p̂1, p̂2 the identified distances from the link centers of mass to their respective rotation axes.
The network constructed from the priori knowledge of the robot dynamics is
τ̂ = D(q; θ̂)·q̈ + C(q, q̇; θ̂)·q̇ + G(q; θ̂) + τ̂f
and its unknown parameters are obtained by a gradient descent method. The initial value θ̂0 of θ̂ is a manually estimated value close to the true value, obtained from operator experience or from a computer-aided three-dimensional model. The learning rate of the parameters is β, and the loss function adopts the quadratic cost function
L = (1/2)·(τ - τ̂)^T·(τ - τ̂).
the relative of the loss function L
Figure GDA0002987623580000072
Has a gradient of
Figure GDA0002987623580000073
The parameter updating rule is as follows:
Figure GDA0002987623580000074
the final result of the parameter update and optimization is
Figure GDA0002987623580000075
The unit learns off line, taking a large amount of data processed by the data processing unit as learning samples. The dynamic parameters θ̂ of the robot identified by this optimization yield an ideal dynamic model of the robot in free space, i.e., when it is not in collision or contact with the surrounding environment including people; the joint moment given by this model at a given instant is hereinafter called the theoretical moment.
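An off-line identification loop of the kind described above could be sketched as follows; for brevity this sketch uses a numerical gradient of the quadratic loss instead of the analytic gradient, and model(q, dq, ddq, theta) stands for any torque predictor built from the dynamics, for example the predicted_torque sketch above with l1 and the friction coefficients fixed:

```python
import numpy as np

def identify_parameters(samples, theta0, model, beta=1e-4, epochs=200, eps=1e-6):
    """Gradient-descent identification of the dynamic parameters from motion samples.

    samples: list of (q, dq, ddq, tau_actual) tuples collected on different trajectories.
    theta0:  manually estimated initial parameter vector (close to the true values).
    model:   callable returning the predicted torque tau_hat(q, dq, ddq, theta).
    """
    theta = np.asarray(theta0, dtype=float)
    for _ in range(epochs):
        grad = np.zeros_like(theta)
        for q, dq, ddq, tau in samples:
            err0 = model(q, dq, ddq, theta) - tau                # tau_hat - tau
            loss0 = 0.5 * err0 @ err0                            # quadratic cost L
            for i in range(theta.size):                          # numerical dL/dtheta_i
                t = theta.copy()
                t[i] += eps
                err1 = model(q, dq, ddq, t) - tau
                grad[i] += (0.5 * err1 @ err1 - loss0) / eps
        theta -= beta * grad / len(samples)                      # update: theta <- theta - beta*dL/dtheta
    return theta
```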
More specifically, the real-time monitoring unit judges whether a collision has occurred and calculates the contact position and the magnitude and direction of the contact force. The data processed by the data processing unit are fed into the real-time monitoring unit, which substitutes the joint angles q, angular velocities q̇ and angular accelerations q̈ into the ideal dynamic model obtained by the dynamic parameter identification unit to obtain the theoretical moment corresponding to that position, velocity and acceleration, and compares the actual moment transmitted by the data processing unit with the theoretical moment to compute the moment difference vector
Δτ = τ - τ̂
where τ represents the actual moment vector, τ̂ the theoretical moment vector, and Δτ the difference vector between the actual and theoretical moments.
If the difference on any axis exceeds the threshold range (around 10% of the theoretical moment), contact has occurred, i.e.
Δτn > Δτn*
where n is 1 or 2, Δτn denotes the difference between the actual and theoretical moment of the n-th axis, and Δτn* denotes the difference threshold of the n-th axis. If Δτn > Δτn* and Δτn+1 ≤ Δτn+1*, contact occurred at the n-th joint.
From
Δτ = τ - τ̂ = τe
where τ represents the actual moment vector, τ̂ the theoretical moment vector and τe the torque vector produced at the robot joints by the external collision, and from
τe = J^T·Fe
where Fe is the contact force between the outside and the robot and J^T is the transpose of the robot force Jacobian matrix, the contact force is obtained as
Fe = (J^T)^(-1)·Δτ.
The contact force vector (the magnitude and direction of the contact force between the robot and the outside) is then input to the main controller; in a later stage the robot is controlled to make a corresponding compliant response according to the specific control strategy, either stopping the motion or converting the force information into robot position adjustments according to an impedance control strategy.
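The per-axis threshold test and the conversion of the torque residual into an end contact force could be sketched as follows; the planar two-link Jacobian used here is an illustrative assumption rather than the Jacobian prescribed by the patent:

```python
import numpy as np

def planar_two_link_jacobian(q, l1, l2):
    """Geometric Jacobian of a planar two-link arm (illustrative assumption)."""
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-l1*s1 - l2*s12, -l2*s12],
                     [ l1*c1 + l2*c12,  l2*c12]])

def detect_contact(tau_actual, tau_theory, thresholds, q, l1, l2):
    """Return (contact joint, contact force) if any axis residual exceeds its threshold, else None."""
    dtau = tau_actual - tau_theory                 # delta tau = tau - tau_hat
    exceeded = np.abs(dtau) > thresholds           # thresholds ~ 10% of the theoretical moment
    if not exceeded.any():
        return None                                # no collision detected
    joint = int(np.argmax(exceeded)) + 1           # first axis whose residual exceeds its threshold
    J = planar_two_link_jacobian(q, l1, l2)
    f_contact = np.linalg.solve(J.T, dtau)         # from tau_e = J^T * F_e
    return joint, f_contact
```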
The implementation flow of the method is shown in FIG. 2. The robot is controlled to move along different trajectories; the data acquisition module acquires the joint angles, angular velocities and current information, the data processing unit processes them on line in real time, and the resulting data are accumulated and stored, giving about 30,000 sets of data as samples; the dynamic parameter identification unit then performs off-line identification to obtain the dynamic parameters of the robot. FIGS. 5(a) and 5(b) compare, for the first joint (axis 1) and the second joint (axis 2) of the sample data, the actual torque converted from the motor currents with the theoretical torque obtained by substituting the identified dynamic parameters, joint angles, angular velocities and angular accelerations into the free-space ideal dynamic equation; the solid line is the actual torque and the dotted line the theoretical torque. The two curves almost coincide, with small error, indicating that the identified dynamic parameters are highly accurate. FIGS. 6(a) and 6(b) show the robot moving along a new trajectory, i.e., data outside the samples used for parameter identification and therefore outside the observation space, and compare the actual torques of the first and second joints with the theoretical torques obtained by substituting the identified parameter values into the dynamic equation; again the solid line is the actual torque and the dotted line the theoretical torque, and the two curves almost coincide with small error, showing that the identified dynamic parameters remain applicable and accurate outside the observation space, i.e., their applicability is not limited to the observed space and generalizes to the whole robot workspace. A six-dimensional force sensor was mounted at the robot end, the robot was controlled to move, an external contact force was applied at the end, and the contact force between the robot and the environment was estimated with the robot contact force monitoring method based on priori dynamics knowledge. FIG. 7 compares the end contact force measured by the six-dimensional force sensor (solid line) with the contact force estimated by this method (dotted line); the two curves differ little, which illustrates the accuracy of the method in estimating the contact force between the robot and the external environment.
In conclusion, the feasibility and accuracy of the method in identifying robot parameters and monitoring contact force can be demonstrated.
Matters not described in this specification are applicable to the prior art.

Claims (3)

1. A robot parameter identification and contact force monitoring method based on priori dynamics knowledge is characterized in that joint angles and angular velocities of a robot are obtained through an encoder, noise interference is removed through median filtering, filtered velocity information is differentiated and subjected to mean filtering, joint angular acceleration of the robot is obtained, a data acquisition card acquires current information of a motor driver analog quantity monitoring end in real time, and actual joint torque of the robot is obtained through median filtering and proportional amplification;
controlling the robot to move in different tracks as much as possible in space, simultaneously acquiring angles, angular velocities, angular accelerations and actual moments of the robot at different moments as samples, and performing offline learning by a gradient descent method based on the priori dynamics knowledge of the robot to identify dynamics parameters of the robot, including joint connecting rod mass, rotational inertia and distances from a mass center of a connecting rod to respective rotating shafts, so as to obtain an ideal dynamics model of the robot when the robot does not collide and contact with the surrounding environment including the human in free space;
in the real-time monitoring process, the joint angle and angular velocity of the robot and the joint angular acceleration of the robot are introduced into an ideal dynamic model of a free space to obtain the theoretical moment of the robot at the moment; during real-time monitoring, current information collected at the analog quantity monitoring end of the motor driver is subjected to median filtering and proportional amplification to obtain the actual joint torque at the moment;
comparing the actual joint moment with the theoretical moment; if the difference exceeds the threshold range, a collision has occurred; obtaining the contact force on the robot and its direction by transforming through the force Jacobian matrix according to the relation between the robot joint moments and the end contact force; and adjusting the robot motion with the contact force information as a control basis so as to ensure safe human-robot cooperation;
the identification process of the kinetic parameters comprises the following steps:
assuming the ideal values θ* of the robot's dynamic parameters are known and form the vector
θ* = [m1*, m2*, I1*, I2*, p1*, p2*]
where m1*, m2* are the ideal values of the masses of the two links, I1*, I2* the ideal moments of inertia of the two links about their respective rotation axes, and p1*, p2* the ideal distances from the link centers of mass to their respective rotation axes;
then establishing the robot dynamics model according to the Newton-Euler method as
τ* = D(q)·q̈ + C(q, q̇)·q̇ + G(q) + τf + τe
where τ* is the corresponding joint moment under ideal conditions, q = [q1, q2]^T is the vector of joint angles, q̇ is the vector of joint angular velocities, q̈ is the vector of joint angular accelerations, D(q) is the inertia matrix, C(q, q̇)·q̇ is the centrifugal and Coriolis force vector, G(q) is the gravity vector, τf is the joint friction torque, and τe is the moment that balances the contact force with the outside;
for two-freedom rotary robot, there are
Figure FDA0002987623570000015
Figure FDA0002987623570000016
Figure FDA0002987623570000017
Wherein c isi=cos(qi),si=sin(qi),miRepresenting the mass of each articulated arm, IiRepresenting the moment of inertia of each joint,/iIndicating the length, p, of each linkiRepresenting the length of the connecting rod centroid to the shaft; g represents the gravitational acceleration; i is 1 or 2; c. C12=cos(q1+q2);
Using viscous friction and Coulomb friction model to obtain joint friction torque, i.e.
Figure FDA0002987623570000021
Wherein f isvIs the coefficient of viscous friction, fcIs the coulomb friction coefficient;
setting the parameter vector to be identified, θ̂, as
θ̂ = [m̂1, m̂2, Î1, Î2, p̂1, p̂2]
where m̂1, m̂2 are the identified values of the masses of the two links, Î1, Î2 the identified moments of inertia of the two links about their respective rotation axes, and p̂1, p̂2 the identified distances from the link centers of mass to their respective rotation axes;
the network constructed from the priori knowledge of the robot dynamics is
τ̂ = D(q; θ̂)·q̈ + C(q, q̇; θ̂)·q̇ + G(q; θ̂) + τ̂f
and its unknown parameters are obtained by a gradient descent method; the initial value θ̂0 of θ̂ is a manually estimated value; the learning rate of the parameters is β, and the loss function adopts the quadratic cost function
L = (1/2)·(τ - τ̂)^T·(τ - τ̂);
the gradient of the loss function L with respect to θ̂ is
∂L/∂θ̂ = -(τ - τ̂)^T·∂τ̂/∂θ̂
and the parameter update rule is
θ̂ ← θ̂ - β·∂L/∂θ̂;
the final result of the parameter update and optimization is that the identified parameters θ̂ approach the ideal values θ*;
thereby each identified dynamic parameter is obtained, yielding an ideal dynamic model of the robot in free space, namely when the robot is not in collision or contact with the surrounding environment including people.
2. The monitoring method of claim 1, wherein the contact force vector is calculated as follows: if the difference on a certain axis exceeds the threshold, contact has occurred, i.e.
Δτn > Δτn*
where n is 1 or 2, Δτn denotes the difference between the actual and theoretical moment of the n-th axis, and Δτn* denotes the difference threshold of the n-th axis;
if Δτn > Δτn* and Δτn+1 ≤ Δτn+1*, contact occurred at the n-th joint;
from
Δτ = τ - τ̂ = τe
where τ represents the actual moment vector, τ̂ the theoretical moment vector and τe the torque vector produced at the robot joints by the external collision, and from
τe = J^T·Fe
where Fe is the contact force between the outside and the robot and J^T is the transpose of the robot force Jacobian matrix, the contact force is obtained as
Fe = (J^T)^(-1)·Δτ
thereby obtaining the contact force vector of the robot with the outside, including its magnitude and direction.
3. A real-time collision detection system based on robot dynamics, characterized in that the detection system applies the robot parameter identification and contact force monitoring method based on priori dynamics knowledge as claimed in claim 1 or 2; the detection system comprises a robot body, a data acquisition module and an upper computer, wherein the robot body is a two-degree-of-freedom articulated manipulator provided with two mechanical arms, a single arm having two joint links, one end of one joint link being connected to the robot shoulder and the other end to the other joint link, with a motor, reducer and motor driver arranged at the connection for driving;
the data acquisition module comprises an encoder and a data acquisition card, wherein the encoder is used for measuring the robot joint angles q = [q1, q2]^T and joint angular velocities q̇ = [q̇1, q̇2]^T, and the data acquisition card is used for acquiring the joint angles q and angular velocities q̇ measured by the encoder;
The analog quantity monitoring end of the driver detects the current A and transmits the current A to the upper computer, the encoder is arranged at the connecting position of the two joint connecting rods and the connecting position of the joint connecting rods and the shoulder of the robot, and communication and data transmission are realized between the encoder and the data acquisition card through signal lines; the upper computer is communicated with the data acquisition card through a PCI bus, and is used for receiving data of the data acquisition card, carrying out corresponding processing and controlling the motor to move;
the upper computer comprises a main controller, a data processing unit, a robot dynamics parameter identification unit and a real-time monitoring unit;
the main controller is used for controlling the robot to realize the motion of the appointed track, and controlling the robot to make the robot move in different tracks as much as possible in a free space so as to provide a large amount of sample data for the kinetic parameter identification unit;
the data processing unit is used for processing the information transmitted by the data acquisition card on line in real time, including mean-filtering the velocity information to compensate for fluctuations caused by the low sampling frequency, obtaining the angular acceleration q̈ of each joint by first-order differentiation of the mean-filtered angular velocity of each joint, and amplifying the current signals in proportion, according to the proportional relation between current and torque, to obtain the actual torque of each joint;
τ=n·v·η·A
wherein τ represents the actual torque of the robot joint, n represents the reduction ratio of the robot joint reducer, v is the ratio of rated torque to rated current, η is the reducer efficiency, and A is the actually monitored current;
the dynamic parameter identification unit is used for simultaneously collecting and counting angles, angular velocities, angular accelerations and actual moments of the robot at different moments as samples by the data processing unit to obtain a large number of data samples, and the dynamic parameters of the robot are identified by a gradient descent method, so that an ideal dynamic model of the robot is obtained;
the real-time monitoring unit is used for judging whether collision occurs or not and calculating the position of contact and the magnitude and the direction of contact force.
CN201911127310.9A 2019-11-18 2019-11-18 Machine parameter identification and contact force monitoring method based on priori dynamics knowledge Active CN110716557B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911127310.9A CN110716557B (en) 2019-11-18 2019-11-18 Machine parameter identification and contact force monitoring method based on priori dynamics knowledge

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911127310.9A CN110716557B (en) 2019-11-18 2019-11-18 Machine parameter identification and contact force monitoring method based on priori dynamics knowledge

Publications (2)

Publication Number Publication Date
CN110716557A CN110716557A (en) 2020-01-21
CN110716557B true CN110716557B (en) 2021-05-11

Family

ID=69215183

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911127310.9A Active CN110716557B (en) 2019-11-18 2019-11-18 Machine parameter identification and contact force monitoring method based on priori dynamics knowledge

Country Status (1)

Country Link
CN (1) CN110716557B (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111546345B (en) * 2020-05-26 2021-08-17 广州纳丽生物科技有限公司 Skin material mechanical property measuring method based on contact dynamics model
CN111618859B (en) * 2020-06-03 2021-04-13 杭州键嘉机器人有限公司 Method for feeding back mechanical arm high-precision force under static or low-speed working condition
CN111897289B (en) * 2020-08-05 2022-02-18 上海柏楚电子科技股份有限公司 Torque information processing method, device, equipment and medium for motor driving mechanism
TWI764377B (en) * 2020-11-16 2022-05-11 達明機器人股份有限公司 System and method for safely compensating weight of robot
CN112528434B (en) * 2020-12-04 2023-01-06 上海新时达机器人有限公司 Information identification method and device, electronic equipment and storage medium
CN112894821B (en) * 2021-01-30 2022-06-28 同济大学 Current method based collaborative robot dragging teaching control method, device and equipment
CN113126659B (en) * 2021-04-06 2022-04-08 北京理工大学 System and method for detecting jumping and landing state of humanoid robot
CN113253274B (en) * 2021-04-30 2024-02-06 西南电子技术研究所(中国电子科技集团公司第十研究所) Fusion processing method for anti-collision ground surface power line of helicopter
CN113610218B (en) * 2021-07-23 2023-04-25 广州大学 Load identification method, system, device and storage medium based on extreme learning machine
CN113601516A (en) * 2021-08-16 2021-11-05 安徽元古纪智能科技有限公司 Sensorless robot dragging teaching method and system
CN114115013A (en) * 2021-11-19 2022-03-01 深圳市汇川技术股份有限公司 Robot motor control method, terminal device, and storage medium
CN114660948B (en) * 2022-05-24 2022-08-02 河北工业大学 High-precision control method for piezoelectric impact type micro-spraying valve
CN115752321A (en) * 2022-11-09 2023-03-07 中山大学 Medical robot motion trajectory measurement and comparison method and computer-readable storage medium

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104626171A (en) * 2015-01-07 2015-05-20 北京卫星环境工程研究所 Mechanical arm collision detection and response method based on six-dimensional force sensor
CN106125548A (en) * 2016-06-20 2016-11-16 珞石(北京)科技有限公司 Industrial robot kinetic parameters discrimination method
CN106426174B (en) * 2016-11-05 2019-01-11 上海大学 A kind of robotic contact power detection method based on torque observation and Friction identification
CN107263466B (en) * 2017-05-11 2020-07-17 西北工业大学 Base undisturbed control method of space robot based on quadratic programming problem
CN108772838B (en) * 2018-06-19 2021-04-27 河北工业大学 Mechanical arm safe collision strategy based on external force observer
CN109176532B (en) * 2018-11-09 2020-09-29 中国科学院自动化研究所 Method, system and device for planning path of mechanical arm
CN109773794B (en) * 2019-02-26 2022-01-25 浙江大学 6-axis robot dynamics parameter identification method based on neural network
CN110146093B (en) * 2019-06-19 2020-12-15 北京理工大学 Double-body asteroid detection autonomous collaborative optical navigation method
CN110327187B (en) * 2019-07-10 2021-07-13 河北工业大学 Model-free control method with prior moment for exoskeleton

Also Published As

Publication number Publication date
CN110716557A (en) 2020-01-21

Similar Documents

Publication Publication Date Title
CN110716557B (en) Machine parameter identification and contact force monitoring method based on priori dynamics knowledge
CN108772838B (en) Mechanical arm safe collision strategy based on external force observer
CN106426174B (en) A kind of robotic contact power detection method based on torque observation and Friction identification
CN108058188B (en) Control method of robot health monitoring and fault diagnosis system
Shang et al. Synchronization control in the cable space for cable-driven parallel robots
Geravand et al. Human-robot physical interaction and collaboration using an industrial robot with a closed control architecture
CN112549024B (en) Robot sensorless collision detection method based on time series analysis and application
CN110539302B (en) Industrial robot overall dynamics modeling and dynamics parameter identification method
Makarov et al. Adaptive filtering for robust proprioceptive robot impact detection under model uncertainties
CN111267105A (en) Kinetic parameter identification and collision detection method for six-joint robot
CN112276944A (en) Man-machine cooperation system control method based on intention recognition
CN113977578A (en) Soft measurement method for end force of hydraulic mechanical arm
CN113189950B (en) Double-robot cooperative flexible assembly and adjustment method for assembling large weak-rigidity structural member
Indri et al. Friction modeling and identification for industrial manipulators
CN112936260A (en) Sensor-free collision detection method and system for six-axis industrial robot
CN113246137A (en) Robot collision detection method based on external moment estimation model
CN111113488A (en) Robot collision detection device and method
CN112318501B (en) Method for improving detection precision and protection sensitivity of collision force of robot
KR101487624B1 (en) Method for controlling robot manipulator device with a redundant dof for detecting abnormal external force
CN113021353A (en) Robot collision detection method
CN116330259A (en) Collaborative robot collision detection method based on decision tree
CN113143465B (en) Method for dragging, guiding and positioning mechanical arm based on joint torque
CN116330344A (en) Cooperative robot collision detection method based on supervised learning support vector machine
CN113043269A (en) Robot contact force observation system based on robot model
He et al. Identification Method of Robot Dynamics Parameters Based on Improved Friction Model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant