CN110716557A - Machine parameter identification and contact force monitoring method based on priori dynamics knowledge - Google Patents
Machine parameter identification and contact force monitoring method based on priori dynamics knowledge
- Publication number
- CN110716557A (application number CN201911127310.9A)
- Authority
- CN
- China
- Prior art keywords
- robot
- joint
- moment
- contact force
- actual
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0223—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
- G05D1/0214—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
- G05D1/0221—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
- G05D1/0276—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Manipulator (AREA)
Abstract
The invention relates to a robot parameter identification and contact force monitoring method based on prior dynamics knowledge. Joint position and velocity information of the robot is obtained from encoders, and joint acceleration information is derived from the velocity signal; a data acquisition card acquires, in real time, the current signal at the analog monitoring terminal of each motor driver, from which the actual joint torque is obtained. The robot is controlled to move along trajectories that are as varied as possible in space, and the joint angles, angular velocities, angular accelerations and actual torques at different moments are collected as samples; the dynamic parameters of the robot are then identified by offline learning with a gradient descent method, yielding an ideal dynamic model of the robot. During real-time monitoring, the acquired and processed signals are substituted into the ideal dynamic model to obtain the theoretical torque of the robot at that moment. The actual joint torque is compared with the theoretical torque, and if the difference exceeds a threshold range, a collision is indicated. The method can accurately acquire contact force data at low cost.
Description
Technical Field
The invention relates to the technical field of methods for monitoring the contact force between a co-fusion robot and the external environment, and in particular to a robot parameter identification and contact force monitoring method based on prior dynamics knowledge.
Background
Robots offer large payload capacity, fast response and high precision, are generally used in place of human beings for repetitive, heavy or dangerous tasks, and are widely applied in many fields of industrial manufacturing. With progress in sensors, artificial intelligence, automatic control and related disciplines, robots are gradually developing into intelligent equipment with perception, cognition and autonomous action capabilities; in particular, the concept of the co-fusion robot has greatly enriched the scope of robot tasks. A co-fusion robot shares its workspace and production activity with human beings: the robot contributes its large load capacity and high-precision execution, while the human contributes direct perception and highly intelligent decision-making, so that non-deterministic tasks in unstructured environments are completed through human-robot cooperation. Because the robot and the worker occupy the same workspace and usually need to cooperate during operation, safety is of great importance; if the robot takes no anti-collision measures, equipment damage or casualties may result, so the problem of contact safety between the robot and the external environment must be solved first. To ensure safety, the contact force between the robot and the external environment needs to be detected, and appropriate control strategies must be adopted in time to avoid severe collisions and to keep the collision contact force within a fully tolerable range.
The safety problem of robots can be addressed starting from collision detection, and researchers have made efforts and achieved certain results in this respect. One approach is to add an external sensor, for example a force sensor at the robot wrist: Chinese patent CN201510006024.2 mounts a six-dimensional force sensor between the end flange of a mechanical arm and the load, and continuously collects the sensor measurements at a set sampling frequency to detect and respond to the collision contact force on the arm. Another approach is a sensitive skin wrapped around the outer surface of the robot, which can detect both the collision contact force and the collision position: Chinese patent CN201480076246.5 wraps the robot in a soft covering component that can hold gas, and detects collisions from the change in differential pressure between the gas inside the covering and the external pressure. Such methods, however, increase the difficulty of processing external information, complicate the wiring of the robot, enlarge its volume and reduce its flexibility. Vision-based detection has also been proposed: Chinese patent CN201810365980.3 obtains the position of a vacuum manipulator from multi-dimensional vision sensors and judges in real time whether the manipulator is in a dangerous state. This approach can capture comprehensive information about the external environment, but the amount of image data is large and real-time performance is difficult to guarantee.
Once the robot comes into contact with the external environment, the contact force can increase almost instantaneously. A collision detection algorithm therefore needs to be real-time and accurate, and to have a certain ability to identify the contact direction, so that the contact force can be better controlled and collision injury reduced.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a robot parameter identification and contact force monitoring method based on prior dynamics knowledge. The method obtains, by offline learning with a gradient descent method, an ideal dynamic model of the robot in free space, i.e. when the robot is not in collision or contact with the surrounding environment, including people; the joint torque corresponding to this condition at a given moment is called the theoretical torque. The actual joint torque of the robot is compared with the theoretical torque, and collision information of the robot is obtained through a solving program on the upper computer.
The invention aims to conveniently and quickly identify the contact force and the contact position between the robot and the external environment, so as to provide a reference for robot control.
The technical solution adopted to solve the above technical problem is a robot parameter identification and contact force monitoring method based on prior dynamics knowledge. The method obtains the joint position and velocity information of the robot from encoders, removes noise interference by median filtering, and differentiates the filtered velocity followed by mean filtering to obtain the joint acceleration information of the robot; a data acquisition card acquires, in real time, the current signal at the analog monitoring terminal of each motor driver, and the actual joint torque of the robot is obtained by median filtering and proportional amplification;
the robot is controlled to move along trajectories that are as varied as possible in space, while the angles, angular velocities, angular accelerations and actual torques of the robot at different moments are collected as samples; based on prior knowledge of the robot dynamics, the dynamic parameters of the robot, including the joint link masses, the moments of inertia and the distances from the link centers of mass to their respective rotation axes, are identified by offline learning with a gradient descent method, so as to obtain an ideal dynamic model of the robot in free space, i.e. when the robot is not in collision or contact with the surrounding environment, including people;
in the real-time monitoring process, the information from the encoders and motor drivers at the current moment is acquired and processed as described above and substituted into the ideal free-space dynamic model to obtain the theoretical torque of the robot at that moment; during real-time monitoring, the current signal collected at the driver analog monitoring terminal is median-filtered and proportionally amplified to obtain the actual joint torque at that moment;
the actual joint torque is compared with the theoretical torque; if the difference exceeds the threshold range, a collision is indicated, and the external torque is converted through the force Jacobian matrix, according to the relation between the robot joint torques and the end contact force, to obtain the magnitude and direction of the contact force on the robot; the contact force information is used as a control basis to adjust the motion of the robot, so as to ensure the safety of human-robot cooperation.
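By way of illustration only, the following Python sketch outlines one monitoring cycle of the scheme described above; the function and variable names, the use of NumPy, and the per-joint threshold are assumptions made for the example and not part of the patented implementation.

```python
import numpy as np

def monitoring_step(model, q, dq, ddq, i_measured, n, v, eta, threshold):
    """One cycle of the contact monitor (illustrative sketch).

    model       : callable returning the theoretical joint torques tau_hat(q, dq, ddq)
    q, dq, ddq  : filtered joint angles, velocities and accelerations
    i_measured  : median-filtered motor currents from the driver analog monitoring terminal
    n, v, eta   : reducer ratio, rated-torque/rated-current ratio, reducer efficiency
    threshold   : per-joint torque-difference threshold
    """
    tau_actual = n * v * eta * i_measured      # proportional amplification, tau = n*v*eta*i
    tau_theory = model(q, dq, ddq)             # ideal free-space dynamic model
    delta_tau = tau_actual - tau_theory        # residual torque
    collided = bool(np.any(np.abs(delta_tau) > threshold))
    return tau_actual, tau_theory, delta_tau, collided
```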
A real-time collision detection system based on robot dynamics comprises a robot body, a data acquisition module and an upper computer, wherein the robot body is a two-degree-of-freedom articulated mechanical arm; the robot body is provided with two mechanical arms, each mechanical arm has two joint links, one end of one joint link is connected to the shoulder of the robot and the other end is connected to the other joint link, and a motor, a speed reducer and a motor driver are arranged at each connection for driving;
the data acquisition module comprises encoders and a data acquisition card, wherein the encoders are used for measuring the robot joint angles q = [q1, q2]^T and the joint angular velocities q̇; the data acquisition card is used for acquiring the joint angles q = [q1, q2]^T and joint angular velocities q̇ measured by the encoders, together with the current I detected at the driver analog monitoring terminal, and transmitting them to the upper computer; the encoders are arranged at the connection between the two joint links and at the connection between the joint link and the robot shoulder, and communication and data transmission between the encoders and the data acquisition card are realized through signal lines; the upper computer communicates with the data acquisition card through a PCI bus, receives the data from the data acquisition card, performs the corresponding processing and controls the motion of the motors;
the upper computer comprises a main controller, a data processing unit, a robot dynamics parameter identification unit and a real-time monitoring unit;
the main controller is used for controlling the robot to follow a specified trajectory, and for driving the robot through trajectories that are as varied as possible in free space so as to provide a large amount of sample data for the dynamic parameter identification unit;
the data processing unit is used for processing the information transmitted by the data acquisition card online in real time, including mean-filtering the velocity information to compensate for fluctuations caused by the low sampling frequency, obtaining the angular acceleration q̈ of each joint by first-order differentiation of the mean-filtered joint angular velocities, and amplifying the current signals in proportion, according to the proportional relation between current and torque, to obtain the actual torque of each joint:
τ=n·v·η·i
wherein τ represents the actual torque of the robot joint, n represents the reduction ratio of the robot joint reducer, v is the ratio of the rated torque to the rated current, η is the reducer efficiency, and i is the actual monitored current;
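As an illustrative sketch of this data processing chain, the snippet below mean-filters the sampled joint velocities, differentiates them to obtain accelerations, and converts the median-filtered driver currents into joint torques with τ = n·v·η·i; the window length, the use of np.gradient for the first-order difference, and the signal names are assumptions for the example.

```python
import numpy as np

def process_samples(dq_raw, i_filtered, dt, n, v, eta, window=5):
    """Data processing chain for one joint (illustrative sketch).

    dq_raw     : raw joint angular velocity samples from the encoder (rad/s)
    i_filtered : median-filtered driver currents at the same instants (A)
    dt         : sampling period (s)
    n, v, eta  : reducer ratio, rated-torque/rated-current ratio, reducer efficiency
    """
    # Mean filtering compensates velocity fluctuations caused by the low sampling frequency
    kernel = np.ones(window) / window
    dq = np.convolve(dq_raw, kernel, mode="same")

    # First-order differentiation of the mean-filtered velocity gives the angular acceleration
    ddq = np.gradient(dq, dt)

    # Proportional amplification of the current gives the actual joint torque: tau = n*v*eta*i
    tau_actual = n * v * eta * i_filtered
    return dq, ddq, tau_actual
```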
the dynamic parameter identification unit takes the angles, angular velocities, angular accelerations and actual torques of the robot at different moments, processed by the data processing unit, as samples, accumulates a large number of such data samples, and identifies the dynamic parameters of the robot by a gradient descent method, thereby obtaining the ideal dynamic model of the robot;
the real-time monitoring unit is used for judging whether collision occurs or not and calculating the position of contact and the magnitude and the direction of contact force.
Compared with the prior art, the invention has the beneficial effects that:
the invention provides a robot body information-based real-time actual moment of each joint of a robot, which is obtained by an encoder, angle and angular velocity information of each joint and a current signal acquired by a driver analog quantity monitoring end, wherein the angular velocity is subjected to mean value filtering and then differentiation to obtain angular acceleration information, and the current signal is subjected to median filtering and proportional amplification to obtain the real-time actual moment of each joint of the robot. The robot is controlled to move in different tracks as much as possible in a free space, the angles, the angular velocities, the angular accelerations and the actual moments of the robot at different moments are collected as samples, the kinetic parameters of the robot are identified in an off-line learning mode by adopting a gradient descent method based on the priori kinetic knowledge of the robot, so that an ideal kinetic model of the robot in the free space is obtained, and the corresponding joint moment is called as a theoretical moment under the condition. During real-time monitoring, current signals collected by a driver analog quantity monitoring end are converted into actual joint torque according to the method, angle and angular velocity information of each joint obtained from an encoder is obtained according to the method to obtain angle, angular velocity and angular acceleration information of each joint, the angle, angular velocity and angular acceleration information are brought into an ideal dynamic model of the robot in a free space, so that theoretical torque of the robot at the moment is obtained, the actual torque of the robot is compared with the theoretical torque, and if the difference exceeds a certain threshold value, collision is warned. And converting the magnitude of the output force according to the corresponding relation between the robot joint torque and the terminal force. The traditional method for identifying the dynamics of the robot based on the neural network is only accurate in an observation space, the accuracy of the dynamics in the unobserved space cannot be guaranteed, in a control algorithm of the co-fusion robot, the pose adjustment of the robot can be carried out at any time according to the contact force between the robot and the outside, the neural network based on the priori knowledge can be generalized to the working space of the whole robot, and the unobserved space can be effectively predicted.
Furthermore, a median filter is designed to remove environmental noise interference, a mean filter is designed to process the velocity fed back by the encoder, and the joint acceleration is obtained by differentiation, so that the contact force monitoring result is more accurate.
Furthermore, based on prior knowledge of the robot dynamics, the dynamic parameters of the robot are identified by a gradient descent method, so that an accurate dynamic model (the ideal dynamic model) of the robot in free space is obtained with an accuracy above 98%. Once the parameters have converged, the identified values can be used directly without further correction during control; that is, the applicability of the model is not limited to the observed space but generalizes to the whole workspace of the robot, so that unobserved regions can be predicted effectively.
The method uses a neural network to learn and identify the corresponding dynamic parameters offline, including the joint link masses, the moments of inertia and the distances from the link centers of mass to their rotation axes; the friction parameters are obtained by modelling with a viscous-plus-Coulomb friction model. The learning samples are real data from the robot motion, so the result is closer to the true model and more accurate.
Further, the motor current signal is converted into the corresponding actual joint torque, and it is checked whether the difference between this actual torque and the theoretical torque exceeds the threshold; if the difference suddenly exceeds the threshold, collision or contact with the external environment is indicated, and the joint torque is mapped to the end effector through the force Jacobian matrix of the robot, so as to obtain the direction and magnitude of the end contact force.
In conclusion, the invention identifies the dynamic parameters of the robot by a gradient descent method and establishes an accurate dynamic model of the robot in free space. The encoder angles and angular velocities are filtered and differentiated, and the resulting real-time information is fed into this model to obtain the theoretical torque corresponding to the contact-free case; at the same time, the motor driver current signal is proportionally converted into the real-time actual joint torque. The actual torque is compared with the theoretical torque; if the difference exceeds the threshold, collision or contact is indicated, and the joint torque is mapped to the end effector through the force Jacobian matrix of the robot to obtain the direction and magnitude of the end contact force. The invention requires no force/torque sensor or vision sensor at the end effector or joints to monitor whether the robot collides, which reduces cost, wiring and communication complexity, shortens the control period, improves the system response speed and guarantees real-time monitoring; it can realize whole-arm contact monitoring, and the force information is fed into the robot controller as the basis for motion adjustment, so that the safety of the co-fusion robot is better ensured and the method is more readily accepted by industrial personnel and users.
Drawings
FIG. 1 is a diagram of the robot parameter identification and contact force monitoring system based on prior dynamics knowledge;
FIG. 2 is a schematic diagram of the implementation process of the robot parameter identification and contact force monitoring method based on prior dynamics knowledge;
FIG. 3 is a schematic flow chart of the dynamic parameter identification unit;
FIG. 4 is a schematic diagram of the real-time collision force monitoring module;
FIG. 5(a) is a comparison, for axis 1, of the actual torque of the samples with the theoretical torque obtained by substituting the identified parameter values into the dynamic equation;
FIG. 5(b) is the same comparison for axis 2;
FIG. 6(a) is a comparison, for axis 1, of the actual torque of non-sample data, i.e. data outside the observation space, with the theoretical torque obtained by substituting the identified parameter values into the dynamic equation;
FIG. 6(b) is the same comparison for axis 2;
FIG. 7 is a comparison of the contact force estimated by the method described in this patent with the contact force measured by a six-dimensional force sensor.
Detailed Description
Specific examples of the present invention are given below. The specific examples are intended to be illustrative of the invention only and are not intended to limit the scope of the claims of the present application.
The invention relates to a robot parameter identification and contact force monitoring method based on prior dynamics knowledge, which requires neither force sensors nor vision sensors and obtains the direction and magnitude of the contact force from prior dynamics knowledge.
The invention identifies the dynamic parameters of the robot by a gradient descent method and establishes an accurate dynamic model of the robot in free space. The encoder angles and angular velocities are filtered and differentiated, and the resulting real-time information is fed into this model to obtain the theoretical torque corresponding to the contact-free case; at the same time, the motor driver current signal is proportionally converted into the real-time actual joint torque. The actual torque is compared with the theoretical torque; if the difference exceeds the threshold, collision or contact is indicated, and the joint torque is mapped to the end effector through the Jacobian matrix of the robot to obtain the direction and magnitude of the end contact force. The contact force information is used as a control basis to adjust the motion of the robot, so as to ensure that the co-fusion robot works safely.
The invention discloses a robot parameter identification and contact force monitoring method based on prior dynamics knowledge. The robot body is a two-degree-of-freedom articulated mechanical arm consisting mainly of drive devices and links; FIG. 1 is a schematic diagram of one mechanical arm of the robot body. A single mechanical arm has two joint links, link 1 and link 2. Link 1 is connected to the shoulder of the robot, and encoder 1, motor 1 and speed reducer 1 are arranged at this connection; speed reducer 2, motor 2 and encoder 2 are arranged between link 1 and link 2, the drive devices comprising AC synchronous motors and servo drivers. The data acquisition module comprises the encoders and a data acquisition card: the encoders measure the robot joint angles q = [q1, q2]^T and joint angular velocities q̇; the data acquisition card acquires the joint angles q = [q1, q2]^T and joint angular velocities q̇ measured by the encoders, together with the current I detected at the driver analog monitoring terminal, and transmits them to the upper computer; communication and data transmission between the encoders and the data acquisition card are realized through signal lines. The upper computer communicates with the data acquisition card through a PCI bus, receives and processes the data from the data acquisition card and controls the motion of the drive devices; it comprises a main controller, a data processing unit, a robot dynamic parameter identification unit and a real-time monitoring unit.
More specifically, the main controller is used for controlling the robot to follow specified trajectories and for driving the robot through trajectories that are as varied as possible in free space so as to provide a large amount of sample data for the dynamic parameter identification unit; to realize control of the robot along the different trajectories, an adaptive position controller based on an RBF neural network is preferably adopted to compensate the dynamic modelling error.
More specifically, the data processing unit processes the information transmitted by the data acquisition card online in real time. It mean-filters the velocity information to compensate for fluctuations caused by the low sampling frequency, and obtains the angular acceleration q̈ of each joint by first-order differentiation of the mean-filtered joint angular velocities. Observation and analysis of the current signal show that the environmental interference is mainly irregular noise whose amplitude greatly exceeds twice the current signal, so a simple and effective median filter is designed to filter the current signal of each joint and remove this large-amplitude irregular noise. The current signals are then amplified in proportion, according to the proportional relation between current and torque, to obtain the actual torque of each joint:
τ=n·v·η·i
Where τ represents the actual torque of the robot joint, n represents the robot joint reducer reduction ratio, v is the ratio of the rated torque to the rated current, η is the reducer efficiency, and i is the actual monitored current.
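A minimal sketch of such a median filter is given below; the window length and the example current trace are assumptions chosen only to show how an isolated large-amplitude spike is suppressed.

```python
import numpy as np

def median_filter(signal, window=5):
    """Sliding-window median filter for removing large-amplitude noise spikes.
    window is assumed odd; the ends of the signal are padded by repetition."""
    half = window // 2
    padded = np.concatenate([np.repeat(signal[0], half), signal, np.repeat(signal[-1], half)])
    return np.array([np.median(padded[k:k + window]) for k in range(len(signal))])

# Example: a current trace corrupted by an isolated spike more than twice its amplitude
i_raw = np.array([1.0, 1.1, 1.0, 3.5, 1.0, 0.9, 1.0])
i_filtered = median_filter(i_raw)   # the spike at index 3 is suppressed
```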
More specifically, the robot dynamics parameter identification unit mainly acquires and counts angles, angular velocities, angular accelerations, and actual moments of the robot at different moments as samples by using the data processing unit, and identifies dynamics parameters of the robot by using a gradient descent method, thereby obtaining an accurate dynamics model of the robot.
Assume that the ideal values of the robot dynamic parameters are known and form the vector
θ* = [m1*, m2*, I1*, I2*, p1*, p2*]
where m1*, m2* are the ideal values of the masses of the two links, I1*, I2* are the ideal values of the moments of inertia of the two links about their respective rotation axes, and p1*, p2* are the ideal values of the distances from the centers of mass of the two links to their respective rotation axes.
Then, the robot dynamic model is established according to the Newton-Euler method as
τ* = D(q)q̈ + C(q, q̇)q̇ + G(q) + τf + τe
where τ* is the corresponding joint torque in the ideal case, q = [q1, q2]^T is the vector of joint angles, q̇ is the vector of joint angular velocities, q̈ is the vector of joint angular accelerations, D(q) is the inertia matrix, C(q, q̇)q̇ is the vector of centrifugal and Coriolis forces, G(q) is the gravity vector, τf is the joint friction torque, and τe is the torque that overcomes the contact force with the outside.
For the two-degree-of-freedom rotary robot, D(q), C(q, q̇) and G(q) are expressed in terms of the link parameters, with ci = cos(qi) and si = sin(qi), where mi denotes the mass of each joint arm, Ii the moment of inertia of each joint, li the length of each link, and pi the distance from the link center of mass to its axis. The friction of the robot is complex and is modelled in this application using a viscous-plus-Coulomb friction model, i.e.
τf = fv·q̇ + fc·sgn(q̇)
where fv is the viscous friction coefficient and fc is the Coulomb friction coefficient.
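Since the explicit expressions are not reproduced here, the following sketch evaluates the free-space joint torque of a planar two-link arm using the standard textbook forms of D(q), the centrifugal/Coriolis vector and G(q); these expressions, the assumption that I1 and I2 are taken about the link centers of mass, and the vertical-plane gravity term are illustrative assumptions and should be checked against the actual robot.

```python
import numpy as np

G_ACC = 9.81  # gravitational acceleration, assuming the arm moves in a vertical plane

def free_space_torque(theta, q, dq, ddq, l1, fv, fc):
    """Theoretical joint torque of a planar two-link arm (standard textbook model).
    theta = [m1, m2, I1, I2, p1, p2]; I1, I2 are assumed about the link centers of mass."""
    m1, m2, I1, I2, p1, p2 = theta
    q1, q2 = q
    c2, s2 = np.cos(q2), np.sin(q2)

    # Inertia matrix D(q)
    d11 = m1 * p1**2 + I1 + I2 + m2 * (l1**2 + p2**2 + 2 * l1 * p2 * c2)
    d12 = I2 + m2 * (p2**2 + l1 * p2 * c2)
    d22 = I2 + m2 * p2**2
    D = np.array([[d11, d12], [d12, d22]])

    # Centrifugal and Coriolis vector C(q, dq)*dq
    h = m2 * l1 * p2 * s2
    C = np.array([-h * (2 * dq[0] * dq[1] + dq[1]**2), h * dq[0]**2])

    # Gravity vector G(q)
    g1 = (m1 * p1 + m2 * l1) * G_ACC * np.cos(q1) + m2 * p2 * G_ACC * np.cos(q1 + q2)
    g2 = m2 * p2 * G_ACC * np.cos(q1 + q2)
    Gv = np.array([g1, g2])

    # Viscous plus Coulomb friction
    tau_f = fv * dq + fc * np.sign(dq)

    return D @ ddq + C + Gv + tau_f
```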
From the above it is known that the robot dynamics are related to the mass mi of each joint, the moment of inertia Ii, the link length li, the distance pi from the link center of mass to its axis, the viscous friction coefficient fv, the Coulomb friction coefficient fc, the joint angles q1, q2, the angular velocities q̇ and the angular accelerations q̈. The link lengths li can be obtained by measurement, and the angle information q, q̇, q̈ and the actual torque τ can be acquired by the data acquisition card and processed by the data processing unit, so the remaining unknown parameters can be identified with a gradient descent method based on the robot dynamic equation. The parameter vector to be identified is set as
θ̂ = [m̂1, m̂2, Î1, Î2, p̂1, p̂2]
where m̂1, m̂2 are the identified values of the masses of the two links, Î1, Î2 are the identified values of the moments of inertia of the two links about their respective rotation axes, and p̂1, p̂2 are the identified values of the distances from the centers of mass of the two links to their respective rotation axes.
The network constructed from prior knowledge of the robot dynamics computes the predicted torque from the dynamic equation with the identified parameters:
τ̂ = D(q; θ̂)q̈ + C(q, q̇; θ̂)q̇ + G(q; θ̂) + τ̂f
The unknown parameters are obtained by the gradient descent method. The initial value θ̂0 is estimated manually; the manual estimate is a value close to the true value and can be obtained by the operator from experience or from a computer-aided three-dimensional model. The learning rate of the parameters is β, and the loss function adopts the quadratic cost
E = (1/2)·||τ − τ̂||²
The parameter update rule is
θ̂ ← θ̂ − β·∂E/∂θ̂
and the final result of the parameter update and optimization is the identified parameter vector θ̂.
The unit adopts an offline learning mode and takes the large amount of data processed by the data processing unit as learning samples. The dynamic parameters of the robot are thus identified according to the above optimization, and an ideal dynamic model of the robot in free space, i.e. when the robot is not in collision or contact with the surrounding environment, including people, is obtained; the corresponding joint torque under this condition is called the theoretical torque.
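A minimal sketch of this offline identification step is shown below; it assumes the free_space_torque model sketched earlier, approximates the gradient of the quadratic cost by central finite differences, and uses an arbitrary learning rate and epoch count, all of which are illustrative assumptions rather than the patented procedure.

```python
import numpy as np

def identify_parameters(samples, theta0, l1, fv, fc, beta=1e-4, epochs=2000, eps=1e-6):
    """Offline gradient-descent identification of theta = [m1, m2, I1, I2, p1, p2].

    samples : list of (q, dq, ddq, tau_actual) tuples collected along varied trajectories
    theta0  : manual initial estimate close to the true values (e.g. from a CAD model)
    The quadratic cost E = 1/2 * sum ||tau_actual - tau_model||^2 is minimized;
    the gradient is approximated here by central finite differences for brevity.
    """
    theta = np.array(theta0, dtype=float)

    def cost(th):
        return 0.5 * sum(
            np.sum((tau - free_space_torque(th, q, dq, ddq, l1, fv, fc)) ** 2)
            for q, dq, ddq, tau in samples)

    for _ in range(epochs):
        grad = np.zeros_like(theta)
        for k in range(len(theta)):
            d = np.zeros_like(theta)
            d[k] = eps
            grad[k] = (cost(theta + d) - cost(theta - d)) / (2 * eps)
        theta -= beta * grad          # update rule: theta <- theta - beta * dE/dtheta
    return theta
```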
More specifically, the real-time monitoring unit is used to judge whether a collision has occurred and to calculate the contact position and the magnitude and direction of the contact force. The data processed by the data processing unit are input into the real-time monitoring unit, which substitutes the joint angles q, angular velocities q̇ and angular accelerations q̈ into the ideal dynamic model obtained by the dynamic parameter identification unit to obtain the theoretical torque of the robot at that position, velocity and acceleration. The actual torque transmitted by the data processing unit is compared with the theoretical torque, and the difference of the torque vectors is calculated, i.e.
Δτ = τ − τ̂
where τ is the actual torque vector, τ̂ is the theoretical torque vector, and Δτ is the difference vector between the actual and theoretical torques.
If the difference for some axis exceeds the threshold range (around 10% of the theoretical torque), contact is indicated, i.e.
Δτn > Δτn*
where n = 1 or 2, Δτn denotes the difference between the actual and theoretical torque of the n-th axis, and Δτn* denotes the difference threshold of the n-th axis. If Δτn > Δτn* and Δτn+1 ≤ Δτn+1*, contact occurred at the n-th joint.
From
τ = τ̂ + τe
where τ is the actual torque vector, τ̂ is the theoretical torque vector, and τe is the torque vector produced at the robot joints by the external collision, and from
τe = J^T·Fe
where Fe is the contact force between the outside and the robot and J^T is the transposed force Jacobian matrix of the robot, it follows that
Fe = (J^T)^(-1)·(τ - τ̂)
The contact force vector between the robot and the outside (the magnitude and direction of the contact force) is then obtained and input into the main controller; the robot is subsequently controlled to make a compliant response according to a specific control strategy, either stopping the motion or converting the force information into position adjustments of the robot according to an impedance control strategy.
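The detection and force-estimation step can be sketched as follows; the 10% relative threshold, the planar two-link geometric Jacobian and the variable names are assumptions for illustration, consistent with the geometry assumed in the earlier sketches.

```python
import numpy as np

def estimate_contact_force(tau_actual, tau_theory, q, l1, l2, rel_threshold=0.10):
    """Collision check and end-effector contact force estimate for a planar two-link arm.

    A collision is flagged when any joint's torque residual exceeds rel_threshold
    (about 10 percent) of the corresponding theoretical torque; the residual is then
    mapped to an end contact force through Fe = (J^T)^-1 * delta_tau.
    """
    delta_tau = tau_actual - tau_theory
    threshold = rel_threshold * np.abs(tau_theory)
    if not np.any(np.abs(delta_tau) > threshold):
        return None                       # no contact detected

    q1, q2 = q
    s1, c1 = np.sin(q1), np.cos(q1)
    s12, c12 = np.sin(q1 + q2), np.cos(q1 + q2)
    # Standard geometric Jacobian of a planar two-link arm (assumed geometry)
    J = np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                  [ l1 * c1 + l2 * c12,  l2 * c12]])
    Fe = np.linalg.solve(J.T, delta_tau)  # solves J^T * Fe = delta_tau
    return Fe
```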
The implementation flow of the method is shown in FIG. 2. The robot is controlled to move along different trajectories, the data acquisition module acquires the joint angles, angular velocities and current signals, and the data processing unit processes them online in real time; the resulting data are accumulated and stored, giving about 30,000 sets of data as samples, and the dynamic parameter identification unit identifies the dynamic parameters of the robot offline. FIGS. 5(a) and 5(b) compare, for the first joint (axis 1) and the second joint (axis 2) within the sample set, the actual torque converted from the motor current with the theoretical torque obtained by substituting the identified dynamic parameters, joint angles, angular velocities and angular accelerations into the ideal free-space dynamic equation; the solid line is the actual torque and the dotted line is the theoretical torque. The two curves almost coincide, with small error, indicating that the identified dynamic parameters are highly accurate. FIGS. 6(a) and 6(b) show the same comparison for a new trajectory, i.e. data outside the sample set used for identification (outside the observation space), for the first and second joints; again the two curves almost coincide with small error, indicating that the identified parameters remain applicable and accurate outside the observation space, so the applicability is not limited to the observed space and generalizes to the whole workspace of the robot. A six-dimensional force sensor is then installed at the robot end effector, the robot is controlled to move, an external contact force is applied at the end, and the contact force between the robot and the environment is estimated with the contact force monitoring method based on prior dynamics knowledge. FIG. 7 compares the end contact force measured by the six-dimensional force sensor (solid line) with the contact force estimated by the method (dotted line); the two curves differ little, which demonstrates the accuracy of the method in estimating the contact force between the robot and the external environment.
These results demonstrate the feasibility and accuracy of the method in identifying the robot parameters and monitoring the contact force.
Matters not described in this specification are applicable to the prior art.
Claims (4)
1. A robot parameter identification and contact force monitoring method based on prior dynamics knowledge, characterized in that: joint position and velocity information of the robot is obtained from encoders, noise interference is removed by median filtering, and the filtered velocity information is differentiated and mean-filtered to obtain the joint acceleration information of the robot; a data acquisition card acquires, in real time, the current signal at the analog monitoring terminal of each motor driver, and the actual joint torque of the robot is obtained by median filtering and proportional amplification;
the robot is controlled to move along trajectories that are as varied as possible in space, while the angles, angular velocities, angular accelerations and actual torques of the robot at different moments are collected as samples; based on prior knowledge of the robot dynamics, the dynamic parameters of the robot, including the joint link masses, the moments of inertia and the distances from the link centers of mass to their respective rotation axes, are identified by offline learning with a gradient descent method, so as to obtain an ideal dynamic model of the robot in free space, i.e. when the robot is not in collision or contact with the surrounding environment, including people;
in the real-time monitoring process, the information from the encoders and motor drivers at the current moment is acquired and processed as described above and substituted into the ideal free-space dynamic model to obtain the theoretical torque of the robot at that moment; during real-time monitoring, the current signal collected at the driver analog monitoring terminal is median-filtered and proportionally amplified to obtain the actual joint torque at that moment;
the actual joint torque is compared with the theoretical torque; if the difference exceeds the threshold range, a collision is indicated, and the external torque is converted through the force Jacobian matrix, according to the relation between the robot joint torques and the end contact force, to obtain the magnitude and direction of the contact force on the robot; the contact force information is used as a control basis to adjust the motion of the robot, so as to ensure the safety of human-robot cooperation.
2. The method of claim 1, wherein the dynamic parameters are identified as follows:
assume that the ideal values of the robot dynamic parameters are known and form the vector
θ* = [m1*, m2*, I1*, I2*, p1*, p2*]
where m1*, m2* are the ideal values of the masses of the two links, I1*, I2* are the ideal values of the moments of inertia of the two links about their respective rotation axes, and p1*, p2* are the ideal values of the distances from the centers of mass of the two links to their respective rotation axes;
then, according to the Newton-Euler method, the robot dynamic model is established as
τ* = D(q)q̈ + C(q, q̇)q̇ + G(q) + τf + τe
where τ* is the corresponding joint torque in the ideal case, q = [q1, q2]^T is the vector of joint angles, q̇ is the vector of joint angular velocities, q̈ is the vector of joint angular accelerations, D(q) is the inertia matrix, C(q, q̇)q̇ is the vector of centrifugal and Coriolis forces, G(q) is the gravity vector, τf is the joint friction torque, and τe is the torque that overcomes the contact force with the outside;
for the two-degree-of-freedom rotary robot, D(q), C(q, q̇) and G(q) are expressed in terms of the link parameters, with ci = cos(qi) and si = sin(qi), where mi denotes the mass of each joint arm, Ii the moment of inertia of each joint, li the length of each link, and pi the distance from the link center of mass to its axis;
the joint friction torque is obtained using a viscous-plus-Coulomb friction model, i.e.
τf = fv·q̇ + fc·sgn(q̇)
where fv is the viscous friction coefficient and fc is the Coulomb friction coefficient;
the parameter vector to be identified is set as
θ̂ = [m̂1, m̂2, Î1, Î2, p̂1, p̂2]
where m̂1, m̂2 are the identified values of the masses of the two links, Î1, Î2 are the identified values of the moments of inertia of the two links about their respective rotation axes, and p̂1, p̂2 are the identified values of the distances from the centers of mass of the two links to their respective rotation axes;
the network constructed from prior knowledge of the robot dynamics computes the predicted torque from the dynamic equation with the identified parameters:
τ̂ = D(q; θ̂)q̈ + C(q, q̇; θ̂)q̇ + G(q; θ̂) + τ̂f
the unknown parameters are obtained by the gradient descent method, the initial estimate θ̂0 being a manual estimate close to the true values; the learning rate of the parameters is β, and the loss function adopts the quadratic cost
E = (1/2)·||τ − τ̂||²
the parameter update rule is
θ̂ ← θ̂ − β·∂E/∂θ̂
and the final result of the parameter update and optimization is the identified parameter vector θ̂;
the identified dynamic parameters are thereby obtained, together with an ideal dynamic model of the robot in free space, i.e. when the robot is not in collision or contact with the surrounding environment, including people.
3. The monitoring method of claim 1, wherein the contact force vector is calculated as follows: if the difference for some axis exceeds the threshold range (around 10% of the theoretical torque), contact is indicated, i.e.
Δτn > Δτn*
where n = 1 or 2, Δτn denotes the difference between the actual and theoretical torque of the n-th axis, and Δτn* denotes the difference threshold of the n-th axis;
if Δτn > Δτn* and Δτn+1 ≤ Δτn+1*, contact occurred at the n-th joint;
from
τ = τ̂ + τe
where τ is the actual torque vector, τ̂ is the theoretical torque vector, and τe is the torque vector produced at the robot joints by the external collision, and from
τe = J^T·Fe
where Fe is the contact force between the outside and the robot and J^T is the transposed force Jacobian matrix of the robot, it follows that
Fe = (J^T)^(-1)·(τ - τ̂)
so that the contact force vector between the robot and the outside, including its magnitude and direction, is obtained.
4. A real-time collision detection system based on robot dynamics, comprising a robot body, a data acquisition module and an upper computer, wherein the robot body is a two-degree-of-freedom articulated mechanical arm; the robot body is provided with two mechanical arms, each mechanical arm has two joint links, one end of one joint link is connected to the shoulder of the robot and the other end is connected to the other joint link, and a motor, a speed reducer and a motor driver are arranged at each connection for driving;
the data acquisition module comprises encoders and a data acquisition card, wherein the encoders are used for measuring the robot joint angles q = [q1, q2]^T and the joint angular velocities q̇; the data acquisition card is used for acquiring the joint angles q = [q1, q2]^T and joint angular velocities q̇ measured by the encoders, together with the current I detected at the driver analog monitoring terminal, and transmitting them to the upper computer; the encoders are arranged at the connection between the two joint links and at the connection between the joint link and the robot shoulder, and communication and data transmission between the encoders and the data acquisition card are realized through signal lines; the upper computer communicates with the data acquisition card through a PCI bus, receives the data from the data acquisition card, performs the corresponding processing and controls the motion of the motors;
the upper computer comprises a main controller, a data processing unit, a robot dynamics parameter identification unit and a real-time monitoring unit;
the main controller is used for controlling the robot to follow a specified trajectory, and for driving the robot through trajectories that are as varied as possible in free space so as to provide a large amount of sample data for the dynamic parameter identification unit;
the data processing unit is used for processing the information transmitted by the data acquisition card online in real time, including mean-filtering the velocity information to compensate for fluctuations caused by the low sampling frequency, obtaining the angular acceleration q̈ of each joint by first-order differentiation of the mean-filtered joint angular velocities, and amplifying the current signals in proportion, according to the proportional relation between current and torque, to obtain the actual torque of each joint:
τ=n·v·η·i
wherein τ represents an actual torque of the robot joint, n represents a reduction ratio of the robot joint reducer, v is a ratio of a rated torque to a rated current, η is reducer efficiency, and i is an actual monitoring current;
the dynamic parameter identification unit takes the angles, angular velocities, angular accelerations and actual torques of the robot at different moments, processed by the data processing unit, as samples, accumulates a large number of such data samples, and identifies the dynamic parameters of the robot by a gradient descent method, thereby obtaining the ideal dynamic model of the robot;
the real-time monitoring unit is used for judging whether collision occurs or not and calculating the position of contact and the magnitude and the direction of contact force.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911127310.9A CN110716557B (en) | 2019-11-18 | 2019-11-18 | Machine parameter identification and contact force monitoring method based on priori dynamics knowledge |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911127310.9A CN110716557B (en) | 2019-11-18 | 2019-11-18 | Machine parameter identification and contact force monitoring method based on priori dynamics knowledge |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110716557A true CN110716557A (en) | 2020-01-21 |
CN110716557B CN110716557B (en) | 2021-05-11 |
Family
ID=69215183
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911127310.9A Active CN110716557B (en) | 2019-11-18 | 2019-11-18 | Machine parameter identification and contact force monitoring method based on priori dynamics knowledge |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110716557B (en) |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111546345A (en) * | 2020-05-26 | 2020-08-18 | 广州纳丽生物科技有限公司 | Skin material mechanical property measuring method based on contact dynamics model |
CN111897289A (en) * | 2020-08-05 | 2020-11-06 | 上海柏楚电子科技股份有限公司 | Torque information processing method, device, equipment and medium for motor driving mechanism |
CN112528434A (en) * | 2020-12-04 | 2021-03-19 | 上海新时达机器人有限公司 | Information identification method and device, electronic equipment and storage medium |
CN112894821A (en) * | 2021-01-30 | 2021-06-04 | 同济大学 | Current method based collaborative robot dragging teaching control method, device and equipment |
CN113126659A (en) * | 2021-04-06 | 2021-07-16 | 北京理工大学 | System and method for detecting jumping and landing state of humanoid robot |
CN113253274A (en) * | 2021-04-30 | 2021-08-13 | 西南电子技术研究所(中国电子科技集团公司第十研究所) | Fusion processing method for helicopter anti-collision ground surface power line |
CN113610218A (en) * | 2021-07-23 | 2021-11-05 | 广州大学 | Load identification method, system and device based on extreme learning machine and storage medium |
CN113601516A (en) * | 2021-08-16 | 2021-11-05 | 安徽元古纪智能科技有限公司 | Sensorless robot dragging teaching method and system |
WO2021243945A1 (en) * | 2020-06-03 | 2021-12-09 | 杭州键嘉机器人有限公司 | Method for robotic arm high-precision force feedback in stationary or low-speed working condition, robotic arm-assisted surgical method, and nonvolatile computer-readable medium having processor-executable program code |
CN114115013A (en) * | 2021-11-19 | 2022-03-01 | 深圳市汇川技术股份有限公司 | Robot motor control method, terminal device, and storage medium |
CN114505890A (en) * | 2020-11-16 | 2022-05-17 | 达明机器人股份有限公司 | System and method for robot safety compensation weight |
CN114660948A (en) * | 2022-05-24 | 2022-06-24 | 河北工业大学 | Piezoelectric impact type micro-spraying valve high-precision control method |
CN115752321A (en) * | 2022-11-09 | 2023-03-07 | 中山大学 | Medical robot motion trajectory measurement and comparison method and computer-readable storage medium |
CN118424766A (en) * | 2024-07-05 | 2024-08-02 | 江苏深蓝航天有限公司 | Test device, recovery device design method and application thereof |
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104626171A (en) * | 2015-01-07 | 2015-05-20 | 北京卫星环境工程研究所 | Mechanical arm collision detection and response method based on six-dimensional force sensor |
CN106125548A (en) * | 2016-06-20 | 2016-11-16 | 珞石(北京)科技有限公司 | Industrial robot kinetic parameters discrimination method |
CN106426174A (en) * | 2016-11-05 | 2017-02-22 | 上海大学 | Robot contact force detecting method based on torque observation and friction identification |
CN107263466A (en) * | 2017-05-11 | 2017-10-20 | 西北工业大学 | Pedestal unperturbed control method of the robot for space based on quadratic programming problem |
CN108772838A (en) * | 2018-06-19 | 2018-11-09 | 河北工业大学 | A kind of mechanical arm safety collision strategy based on outer force observer |
CN109176532A (en) * | 2018-11-09 | 2019-01-11 | 中国科学院自动化研究所 | A kind of robotic arm path planing method, system and device |
CN109773794A (en) * | 2019-02-26 | 2019-05-21 | 浙江大学 | A kind of 6 axis Identification of Dynamic Parameters of Amanipulator method neural network based |
CN110146093A (en) * | 2019-06-19 | 2019-08-20 | 北京理工大学 | Binary asteroid detection independently cooperates with optical navigation method |
CN110327187A (en) * | 2019-07-10 | 2019-10-15 | 河北工业大学 | A kind of band priori torque non-model control method of ectoskeleton |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111546345A (en) * | 2020-05-26 | 2020-08-18 | 广州纳丽生物科技有限公司 | Skin material mechanical property measuring method based on contact dynamics model |
CN111546345B (en) * | 2020-05-26 | 2021-08-17 | 广州纳丽生物科技有限公司 | Skin material mechanical property measuring method based on contact dynamics model |
WO2021243945A1 (en) * | 2020-06-03 | 2021-12-09 | 杭州键嘉机器人有限公司 | Method for robotic arm high-precision force feedback in stationary or low-speed working condition, robotic arm-assisted surgical method, and nonvolatile computer-readable medium having processor-executable program code |
CN111897289A (en) * | 2020-08-05 | 2020-11-06 | 上海柏楚电子科技股份有限公司 | Torque information processing method, device, equipment and medium for motor driving mechanism |
CN111897289B (en) * | 2020-08-05 | 2022-02-18 | 上海柏楚电子科技股份有限公司 | Torque information processing method, device, equipment and medium for motor driving mechanism |
CN114505890B (en) * | 2020-11-16 | 2024-07-02 | 达明机器人股份有限公司 | System and method for robot safety compensation weight |
CN114505890A (en) * | 2020-11-16 | 2022-05-17 | 达明机器人股份有限公司 | System and method for robot safety compensation weight |
CN112528434B (en) * | 2020-12-04 | 2023-01-06 | 上海新时达机器人有限公司 | Information identification method and device, electronic equipment and storage medium |
CN112528434A (en) * | 2020-12-04 | 2021-03-19 | 上海新时达机器人有限公司 | Information identification method and device, electronic equipment and storage medium |
CN112894821B (en) * | 2021-01-30 | 2022-06-28 | 同济大学 | Current method based collaborative robot dragging teaching control method, device and equipment |
CN112894821A (en) * | 2021-01-30 | 2021-06-04 | 同济大学 | Current method based collaborative robot dragging teaching control method, device and equipment |
CN113126659A (en) * | 2021-04-06 | 2021-07-16 | 北京理工大学 | System and method for detecting jumping and landing state of humanoid robot |
CN113253274A (en) * | 2021-04-30 | 2021-08-13 | 西南电子技术研究所(中国电子科技集团公司第十研究所) | Fusion processing method for helicopter anti-collision ground surface power line |
CN113253274B (en) * | 2021-04-30 | 2024-02-06 | 西南电子技术研究所(中国电子科技集团公司第十研究所) | Fusion processing method for anti-collision ground surface power line of helicopter |
CN113610218A (en) * | 2021-07-23 | 2021-11-05 | 广州大学 | Load identification method, system and device based on extreme learning machine and storage medium |
CN113601516A (en) * | 2021-08-16 | 2021-11-05 | 安徽元古纪智能科技有限公司 | Sensorless robot dragging teaching method and system |
CN114115013A (en) * | 2021-11-19 | 2022-03-01 | 深圳市汇川技术股份有限公司 | Robot motor control method, terminal device, and storage medium |
CN114660948A (en) * | 2022-05-24 | 2022-06-24 | 河北工业大学 | Piezoelectric impact type micro-spraying valve high-precision control method |
CN115752321A (en) * | 2022-11-09 | 2023-03-07 | 中山大学 | Medical robot motion trajectory measurement and comparison method and computer-readable storage medium |
CN118424766A (en) * | 2024-07-05 | 2024-08-02 | 江苏深蓝航天有限公司 | Test device, recovery device design method and application thereof |
Also Published As
Publication number | Publication date |
---|---|
CN110716557B (en) | 2021-05-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110716557B (en) | Machine parameter identification and contact force monitoring method based on priori dynamics knowledge | |
CN108772838B (en) | Mechanical arm safe collision strategy based on external force observer | |
CN109940622B (en) | Non-sensing collision detection method for robot mechanical arm based on motor current | |
CN108015774B (en) | Sensor-free mechanical arm collision detection method | |
CN106426174B (en) | A kind of robotic contact power detection method based on torque observation and Friction identification | |
Geravand et al. | Human-robot physical interaction and collaboration using an industrial robot with a closed control architecture | |
CN108058188B (en) | Control method of robot health monitoring and fault diagnosis system | |
CN110539302A (en) | industrial robot overall dynamics modeling and dynamics parameter identification method | |
CN112549024B (en) | Robot sensorless collision detection method based on time series analysis and application | |
Makarov et al. | Adaptive filtering for robust proprioceptive robot impact detection under model uncertainties | |
CN111267105A (en) | Kinetic parameter identification and collision detection method for six-joint robot | |
CN110103222A (en) | A kind of industrial robot collision checking method | |
US20140318246A1 (en) | Inertia estimating method and inertia estimating appartus of position control apparatus | |
CN112276944A (en) | Man-machine cooperation system control method based on intention recognition | |
CN109202889A (en) | A kind of Flexible Multi-joint robot electric current Force control system and method | |
CN111347416B (en) | Detection robot collision detection method without external sensor | |
Indri et al. | Friction modeling and identification for industrial manipulators | |
CN115157260B (en) | Gravity and inertial force compensation method for six-dimensional force sensor at tail end of mechanical arm | |
CN113977578A (en) | Soft measurement method for end force of hydraulic mechanical arm | |
CN112936260A (en) | Sensor-free collision detection method and system for six-axis industrial robot | |
Falco et al. | Performance metrics and test methods for robotic hands | |
Namiki et al. | Ball catching in kendama game by estimating grasp conditions based on a high-speed vision system and tactile sensors | |
KR101487624B1 (en) | Method for controlling robot manipulator device with a redundant dof for detecting abnormal external force | |
CN113246137A (en) | Robot collision detection method based on external moment estimation model | |
CN111113488A (en) | Robot collision detection device and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||