CN110561424A - online robot kinematic calibration method based on multi-sensor hybrid filter - Google Patents

online robot kinematic calibration method based on multi-sensor hybrid filter Download PDF

Info

Publication number
CN110561424A
CN110561424A (application CN201910765787.3A)
Authority
CN
China
Prior art keywords
representing
robot
particle
sensor
quaternion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910765787.3A
Other languages
Chinese (zh)
Inventor
杜广龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology SCUT filed Critical South China University of Technology SCUT
Priority to PCT/CN2019/114533 priority Critical patent/WO2021031346A1/en
Publication of CN110561424A publication Critical patent/CN110561424A/en
Pending legal-status Critical Current

Links

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1679: Programme controls characterised by the tasks executed
    • B25J9/1692: Calibration of manipulator
    • B25J9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10: Navigation by using measurements of speed or acceleration
    • G01C21/12: Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16: Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/18: Stabilised platforms, e.g. by gyroscope

Abstract

The invention provides an online robot kinematic calibration method based on a multi-sensor hybrid filter, which comprises the following steps: S1, measuring the orientation and the position of the robot end effector with an inertial sensor and a position sensor, respectively; S2, estimating the attitude of the robot end with Kalman filtering; S3, estimating the position of the robot end with particle filtering; and S4, collecting the poses of the robot with the hybrid sensors and calculating the differential errors of the kinematic parameters with extended Kalman filtering, thereby optimizing the kinematic errors. The invention realizes rapid online correction of the robot parameters without stopping the robot, thereby greatly improving the operating precision and efficiency of the robot.

Description

Online robot kinematic calibration method based on multi-sensor hybrid filter
Technical Field
The invention belongs to the field of robotics, and particularly relates to an online robot kinematic calibration method based on a multi-sensor hybrid filter.
Background
Owing to tolerances inherent in the production process, the kinematic parameters of a robot deviate from their nominal values. When the nominal kinematic parameters are used to estimate the position and orientation of the robot, these kinematic errors produce errors at the robot end effector. A variety of measurement techniques exist for the calibration task, such as expensive custom fixtures, coordinate measuring machines and laser-tracking interferometer systems.
With the development of sensors, conventional off-line robot kinematic calibration has moved toward online operation. Traditional vision-based and laser-based robot calibration techniques achieve online measurement of position and orientation errors by adding auxiliary equipment. A more convenient alternative is to mount contactless position and orientation measurement sensors, such as an orientation sensor (commonly an inertial measurement unit, IMU for short) and a position sensor.
However, current calibration methods perform only in-situ calibration and cannot start from an arbitrary position.
Disclosure of Invention
The present invention aims to overcome the above disadvantages of the prior art and proposes an online robot kinematic calibration method based on a multi-sensor hybrid filter. The method allows cooperation between various robots, which means that a robot should be able to calibrate its parameters automatically within a short time.
The online robot kinematic calibration method based on the multi-sensor hybrid filter comprises the following steps:
S1, respectively measuring the orientation and the position of the robot end effector by using an inertial sensor and a position sensor;
S2, estimating the attitude of the robot end effector by using Kalman filtering (KF);
S3, estimating the position of the robot end effector by using particle filtering (PF);
S4, calculating the differential errors of the DH kinematic parameters by using extended Kalman filtering (EKF) to optimize the kinematic errors; the DH kinematic parameters include the link length, link twist angle, link offset, and joint angle.
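For context, the four DH parameters per joint determine the end-effector pose through a chain of homogeneous transforms; calibration perturbs these four parameters per joint. A minimal forward-kinematics sketch (an illustration of the classic DH convention, not code from the patent):

```python
import numpy as np

def dh_transform(a, alpha, d, theta):
    """Standard DH homogeneous transform for one joint:
    a = link length, alpha = link twist, d = link offset, theta = joint angle."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    ct, st = np.cos(theta), np.sin(theta)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(dh_params):
    """Chain the per-joint transforms; dh_params is a list of (a, alpha, d, theta)."""
    T = np.eye(4)
    for a, alpha, d, theta in dh_params:
        T = T @ dh_transform(a, alpha, d, theta)
    return T
```

With small errors added to each (a, alpha, d, theta), the resulting end-effector pose deviation is exactly the quantity that step S4 estimates.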
Further, the step S1 specifically includes:
In order to measure the rotation angles of the robot end about the x, y and z axes of the base rectangular coordinate system, an inertial measurement unit comprising a magnetometer, two gyroscopes and an accelerometer is rigidly fixed on the robot end effector. A Factored Quaternion Algorithm (FQA) based on the measured magnetic and gravitational field data is used to improve the accuracy of the estimation process. The measured Euler angles represent the attitude of the end effector, and the quaternion [q0, q1, q2, q3] is obtained from the Euler angles by the standard conversion:

q0 = cos(φ/2)cos(θ/2)cos(ψ/2) + sin(φ/2)sin(θ/2)sin(ψ/2)
q1 = sin(φ/2)cos(θ/2)cos(ψ/2) - cos(φ/2)sin(θ/2)sin(ψ/2)
q2 = cos(φ/2)sin(θ/2)cos(ψ/2) + sin(φ/2)cos(θ/2)sin(ψ/2)
q3 = cos(φ/2)cos(θ/2)sin(ψ/2) - sin(φ/2)sin(θ/2)cos(ψ/2)

where φ, θ and ψ denote the rotation angles about the x, y and z axes, respectively.
The quaternion satisfies the following unit-norm relationship:

q0² + q1² + q2² + q3² = 1

where q0 is the quaternion scalar part and [q1, q2, q3] is the quaternion vector part.
The direction cosine matrix from the inertial sensor coordinate system to the world coordinate system is represented as:

C = | q0²+q1²-q2²-q3²   2(q1q2-q0q3)      2(q1q3+q0q2)    |
    | 2(q1q2+q0q3)      q0²-q1²+q2²-q3²   2(q2q3-q0q1)    |
    | 2(q1q3-q0q2)      2(q2q3+q0q1)      q0²-q1²-q2²+q3² |
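The Euler-angle-to-quaternion conversion and the direction cosine matrix can be sketched numerically as follows (a minimal illustration using the standard roll-pitch-yaw convention; not code from the patent):

```python
import numpy as np

def euler_to_quaternion(phi, theta, psi):
    """Roll (phi), pitch (theta), yaw (psi) -> unit quaternion [q0, q1, q2, q3]."""
    cr, sr = np.cos(phi / 2), np.sin(phi / 2)
    cp, sp = np.cos(theta / 2), np.sin(theta / 2)
    cy, sy = np.cos(psi / 2), np.sin(psi / 2)
    return np.array([
        cr * cp * cy + sr * sp * sy,
        sr * cp * cy - cr * sp * sy,
        cr * sp * cy + sr * cp * sy,
        cr * cp * sy - sr * sp * cy,
    ])

def quaternion_to_dcm(q):
    """Direction cosine matrix (sensor frame -> world frame) from a unit quaternion."""
    q0, q1, q2, q3 = q
    return np.array([
        [q0**2 + q1**2 - q2**2 - q3**2, 2 * (q1*q2 - q0*q3),           2 * (q1*q3 + q0*q2)],
        [2 * (q1*q2 + q0*q3),           q0**2 - q1**2 + q2**2 - q3**2, 2 * (q2*q3 - q0*q1)],
        [2 * (q1*q3 - q0*q2),           2 * (q2*q3 + q0*q1),           q0**2 - q1**2 - q2**2 + q3**2],
    ])
```

Because the quaternion built from Euler angles is a product of unit rotations, it is unit-norm by construction, and the resulting matrix is a proper rotation (orthogonal, determinant 1).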
Further, in step S2, the attitude of the robot end effector is estimated by using Kalman filtering as follows:
The differential equations of the quaternion components q0, q1, q2 and q3 with respect to time t are used to reduce the measurement errors of the magnetometer and the gyroscopes:

dq0/dt = -(1/2)(ξx·q1 + ξy·q2 + ξz·q3)
dq1/dt =  (1/2)(ξx·q0 + ξz·q2 - ξy·q3)
dq2/dt =  (1/2)(ξy·q0 - ξz·q1 + ξx·q3)
dq3/dt =  (1/2)(ξz·q0 + ξy·q1 - ξx·q2)

where ξx, ξy and ξz denote the angular velocity components about the xs, ys and zs axes of the inertial sensor frame;
A state transition matrix is defined from the discretized quaternion kinematics, where Δt is the sample time. The state xKF,k consists of the quaternion state and the angular velocities, described as xKF,k = [q0,k, q1,k, q2,k, q3,k, ξx,k, ξy,k, ξz,k], where q0,k, q1,k, q2,k and q3,k denote the quaternion state at time k, and ξx,k, ξy,k and ξz,k denote the angular velocities about the xs, ys and zs axes of the inertial frame at time k;
Defining a process noise vector:

wk = [0 0 0 0 wx wy wz]T

where wx, wy and wz denote the process noise of the angular velocities. Assuming the calibrated gyroscope detects the angular velocity, the observation matrix is Hk = [0n×p In×n], where 0n×p is an n×p zero matrix, In×n is an n×n identity matrix, n is the number of angular velocity components and p is the dimension of the quaternion. The observation matrix determines the quaternion qk at time k, whose normalized form is:

qk = [q0,k/M q1,k/M q2,k/M q3,k/M]

where M = sqrt(q0,k² + q1,k² + q2,k² + q3,k²) is the quaternion norm.
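One predict/update cycle with the post-update quaternion normalization described above can be sketched as follows (a generic linear Kalman filter skeleton over the state [q0..q3, ξx, ξy, ξz]; the transition matrix F and the noise matrices are supplied by the caller, since the patent's concrete Φ is model-specific):

```python
import numpy as np

def kf_step(x, P, F, Q, H, R, z):
    """One predict/update cycle of a linear Kalman filter, followed by
    renormalization of the quaternion part of the state (first 4 entries)."""
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with measurement z
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    # Enforce unit norm on the quaternion [q0, q1, q2, q3]
    M = np.linalg.norm(x[:4])
    x[:4] = x[:4] / M
    return x, P
```

With H = [0 I] as in the text, the measurement z is the calibrated gyroscope's angular velocity, and only the ξ part of the state is directly observed.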
Further, in step S3, the position of the robot end effector is estimated by using particle filtering as follows:
The position and acceleration of each sample point are estimated by a particle filter based on the inertial motion model:

v_k^i = v_{k-1}^i + (C_i·a - g)·Δt
p_k^i = p_{k-1}^i + v_{k-1}^i·Δt + (1/2)(C_i·a - g)·Δt²

where v^i denotes the velocity of the ith particle, a denotes the acceleration measured by the IMU, g denotes the local gravity vector, p^i denotes the end position, and C_i denotes the rotation matrix from the inertial measurement sensor coordinate system to the world coordinate system.
The position state of the robot end is defined as xPF = [px, py, pz, ax, ay, az], where px, py, pz and ax, ay, az denote the position and acceleration on the x, y and z axes, respectively. To obtain more accurate weights, the position difference accumulated over a time period ΔTs is used instead of the instantaneous position difference at time k, where s denotes the iteration number. The cumulative position difference between the estimated and calculated values of the ith particle is used for the likelihood calculation, evaluated over

Ms = ΔTs/t

sampling instants, where Ms denotes the number of intervals of length t within ΔTs over which the accumulated position difference of each particle is computed; x_k^i, y_k^i and z_k^i denote the position states of the ith particle on the x, y and z axes at time k, and x̂_k^i, ŷ_k^i and ẑ_k^i denote the corresponding position states estimated by Kalman filtering at time k. The approximate posterior is expressed as:
p(x_{0:s} | E_{1:s}) ≈ Σ_{i=1}^{N} w_s^i · δ(x_{0:s} - x_{0:s}^i)

where x_{0:s} denotes the current particle positions, x_{0:s}^i denotes the positions of the ith particle up to iteration s, and E_{1:s} denotes the accumulated position differences up to iteration s; N is the number of particles in the particle filtering algorithm, w_k^i is the normalized weight of the ith particle at time k, and δ(·) is the Dirac delta function. The posterior probability satisfies the following proportionality:

p(x_{0:s} | E_{1:s}) ∝ p(E_s | x_s) · p(x_s | x_{s-1}) · p(x_{0:s-1} | E_{1:s-1})

where p(x_s | x_{s-1}) denotes the prior probability, p(x_{0:s-1} | E_{1:s-1}) denotes the posterior probability at iteration s-1, p(E_s | x_s) denotes the probability of the calculated accumulated position error E_s given x_s, and x_s denotes the particle position at iteration s;
The normalized weight w_s^i of the ith particle at iteration s is:

w_s^i ∝ w_{s-1}^i · p(E_s | x_s^i) · p(x_s^i | x_{s-1}^i) / r(x_s^i | x_{0:s-1}^i, E_{1:s})

where r(·) is the importance density and x_s^i denotes the position state of the ith particle at iteration s. By resampling and taking p(x_s | x_{s-1}) as the importance density, the normalized weight reduces to:

w_s^i ∝ w_{s-1}^i · p(E_s | x_s^i)

where p(E_s | x_s^i) denotes the probability of the accumulated position error E_s given x_s^i, the actual position of the ith particle at iteration s.
The weight of a position particle is defined by its most likely value; the minimum cumulative error E_s^i is considered to correspond to the most likely position value. Thus, the normalized weight is calculated by the following formula:

w_s^i = exp(-(E_s^i)² / (2σ²)) / Σ_{j=1}^{N} exp(-(E_s^j)² / (2σ²))

where σ is the standard deviation calculated over the cumulative errors E_s^i.
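The reweighting step, with a Gaussian likelihood of the accumulated position error followed by resampling, can be sketched as follows (a bootstrap-particle-filter illustration; the Gaussian likelihood and the effective-sample-size resampling trigger are common choices assumed here, not taken verbatim from the patent):

```python
import numpy as np

rng = np.random.default_rng(0)

def pf_update(particles, weights, cum_err, sigma):
    """Reweight particles by a Gaussian likelihood of their accumulated
    position error, normalize, and resample when the weights degenerate."""
    likelihood = np.exp(-0.5 * (cum_err / sigma) ** 2)
    weights = weights * likelihood
    weights = weights / weights.sum()
    # Resample if the effective sample size collapses below N/2
    n_eff = 1.0 / np.sum(weights ** 2)
    if n_eff < len(particles) / 2:
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        particles = particles[idx]
        weights = np.full(len(particles), 1.0 / len(particles))
    return particles, weights
```

A particle with the smallest accumulated error receives the largest weight, matching the text's claim that the minimum cumulative error marks the most likely position.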
Further, step S4 is specifically as follows: while the position and attitude information of the robot is collected using the position sensor and the inertial measurement unit, the measurement errors increase over time due to the inherent noise of the sensors, and the extended Kalman filter is applied to optimize the kinematic errors (the following formulas constitute the optimization process).
If, in addition to the 4 DH kinematic parameters (link length, link twist angle, link offset and joint angle) of each of the N revolute joints, the 4 pose transformation parameters from the IMU sensor to the robot end are considered, the total number of parameters is 4N + 4. Therefore, in the pose estimation process based on the extended Kalman filter (EKF), the estimated pose is calculated from the 4(N + 1) DH parameters. The model for the state prediction is:

x̂_{k+1|k} = x̂_{k|k}
P_{k+1|k} = P_{k|k} + Q_k
where x̂_{k|k} and P_{k|k} denote the estimated parameter state and its covariance matrix, respectively, and Q_k denotes the covariance matrix of the system noise at time k. The Jacobian matrix J_{k+1}, the measurement residual e_{k+1} and the residual covariance matrix S_{k+1} are obtained by the following formulas:

e_{k+1} = m_{k+1} - ĥ_{k+1|k}
S_{k+1} = J_{k+1} P_{k+1|k} J_{k+1}^T + R_{k+1}

where m_k and R_k denote the measured pose value and the covariance matrix of the measurement noise at time k, respectively, and T denotes the transformation matrix between joints; ĥ_{k+1|k} is the previously calculated estimated pose at time k+1;
Therefore, the filter gain K_{k+1} of the extended Kalman filter, the state estimate x̂_{k+1|k+1} and the state error covariance matrix P_{k+1|k+1} are updated as:

K_{k+1} = P_{k+1|k} J_{k+1}^T S_{k+1}^{-1}
x̂_{k+1|k+1} = x̂_{k+1|k} + K_{k+1} e_{k+1}
P_{k+1|k+1} = (I - K_{k+1} J_{k+1}) P_{k+1|k}

where I denotes the identity matrix.
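The gain and update equations above amount to a standard EKF measurement step over the parameter vector; a minimal sketch, where `h` (the forward-kinematics pose prediction) and `jacobian` are placeholders the patent leaves to the DH model:

```python
import numpy as np

def ekf_parameter_update(x, P, m, h, jacobian, R):
    """One EKF measurement update for the DH-parameter error state.
    h(x) predicts the end-effector pose from the current parameter estimate;
    jacobian(x) is its Jacobian J; m is the measured pose from the sensors."""
    J = jacobian(x)
    e = m - h(x)                      # measurement residual
    S = J @ P @ J.T + R               # residual covariance
    K = P @ J.T @ np.linalg.inv(S)    # filter gain
    x = x + K @ e                     # corrected parameter estimate
    P = (np.eye(len(x)) - K @ J) @ P  # corrected covariance
    return x, P
```

Iterating this update as new sensor poses arrive is what allows the parameters to be refined online, without stopping the robot.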
Compared with the prior art, the invention has the following advantages and effects:
1. The proposed online method estimates the attitude of the robot end with Kalman filtering, estimates the position of the robot end with particle filtering, and finally obtains the kinematic parameter errors with extended Kalman filtering.
2. The method uses an inertial sensor and a position sensor to calibrate the errors quickly and accurately.
3. Compared with current calibration methods, the invention is not restricted to in-situ calibration: the robot can start from an arbitrary position. Furthermore, the robot does not need to perform specific motions for off-line measurement, which makes the method more convenient and efficient; more importantly, it has higher fault tolerance and is easier to use.
Drawings
Fig. 1 is a flowchart of an online robot kinematic calibration method based on a multi-sensor hybrid filter of an embodiment.
Detailed Description
The present invention is described in further detail below with reference to an embodiment, but embodiments of the present invention are not limited thereto; where a procedure is not specifically described below, those skilled in the art can realize or understand it by referring to the prior art.
Fig. 1 shows an online robot kinematic calibration method based on a multi-sensor hybrid filter, which includes the following steps:
S1, measuring the orientation and the position of the robot end effector by using an inertial sensor and a position sensor
For measuring the rotation angles of the robot end about the x, y and z axes, an inertial measurement unit comprising a magnetometer, two gyroscopes and an accelerometer is rigidly fixed to the robot end effector. A Factored Quaternion Algorithm (FQA) based on the measured magnetic and gravitational field data is used to improve the accuracy of the estimation process. The measured Euler angles represent the attitude of the end effector, and the quaternion [q0, q1, q2, q3] is obtained from the Euler angles by the standard conversion:

q0 = cos(φ/2)cos(θ/2)cos(ψ/2) + sin(φ/2)sin(θ/2)sin(ψ/2)
q1 = sin(φ/2)cos(θ/2)cos(ψ/2) - cos(φ/2)sin(θ/2)sin(ψ/2)
q2 = cos(φ/2)sin(θ/2)cos(ψ/2) + sin(φ/2)cos(θ/2)sin(ψ/2)
q3 = cos(φ/2)cos(θ/2)sin(ψ/2) - sin(φ/2)sin(θ/2)cos(ψ/2)

where φ, θ and ψ denote the rotation angles about the x, y and z axes, respectively.
The quaternion satisfies the following unit-norm relationship:

q0² + q1² + q2² + q3² = 1

where q0 is the quaternion scalar part and [q1, q2, q3] is the quaternion vector part.
The direction cosine matrix from the inertial sensor coordinate system to the world coordinate system is represented as:

C = | q0²+q1²-q2²-q3²   2(q1q2-q0q3)      2(q1q3+q0q2)    |
    | 2(q1q2+q0q3)      q0²-q1²+q2²-q3²   2(q2q3-q0q1)    |
    | 2(q1q3-q0q2)      2(q2q3+q0q1)      q0²-q1²-q2²+q3² |
S2, estimating the attitude of the tail end of the robot by using Kalman filtering
The differential equations of the quaternion components q0, q1, q2 and q3 with respect to time t are used to reduce the measurement errors of the magnetometer and the gyroscopes:

dq0/dt = -(1/2)(ξx·q1 + ξy·q2 + ξz·q3)
dq1/dt =  (1/2)(ξx·q0 + ξz·q2 - ξy·q3)
dq2/dt =  (1/2)(ξy·q0 - ξz·q1 + ξx·q3)
dq3/dt =  (1/2)(ξz·q0 + ξy·q1 - ξx·q2)

where ξx, ξy and ξz denote the angular velocity components about the xs, ys and zs axes of the inertial sensor frame;
A state transition matrix is defined from the discretized quaternion kinematics, where Δt is the sample time. The state xKF,k consists of the quaternion state and the angular velocities, described as xKF,k = [q0,k, q1,k, q2,k, q3,k, ξx,k, ξy,k, ξz,k], where q0,k, q1,k, q2,k and q3,k denote the quaternion state at time k, and ξx,k, ξy,k and ξz,k denote the angular velocities about the xs, ys and zs axes of the inertial frame at time k;
Defining a process noise vector:

wk = [0 0 0 0 wx wy wz]T

where wx, wy and wz denote the process noise of the angular velocities. Assuming the calibrated gyroscope detects the angular velocity, the observation matrix is Hk = [0n×p In×n], where 0n×p is an n×p zero matrix, In×n is an n×n identity matrix, n is the number of angular velocity components and p is the dimension of the quaternion. The observation matrix determines the quaternion qk at time k, whose normalized form is:

qk = [q0,k/M q1,k/M q2,k/M q3,k/M]

where M = sqrt(q0,k² + q1,k² + q2,k² + q3,k²) is the quaternion norm.
S3, estimating the position of the robot end by using particle filtering
The position and acceleration of each sample point are estimated by a particle filter based on the inertial motion model:

v_k^i = v_{k-1}^i + (C_i·a - g)·Δt
p_k^i = p_{k-1}^i + v_{k-1}^i·Δt + (1/2)(C_i·a - g)·Δt²

where v^i denotes the velocity of the ith particle, a denotes the acceleration measured by the IMU, g denotes the local gravity vector, p^i denotes the end position, and C_i denotes the rotation matrix from the inertial measurement sensor coordinate system to the world coordinate system.
The position state of the robot end is defined as xPF = [px, py, pz, ax, ay, az], where px, py, pz and ax, ay, az denote the position and acceleration on the x, y and z axes, respectively. To obtain more accurate weights, the position difference accumulated over a time period ΔTs is used instead of the instantaneous position difference at time k, where s denotes the iteration number. The cumulative position difference between the estimated and calculated values of the ith particle is used for the likelihood calculation, evaluated over

Ms = ΔTs/t

sampling instants, where Ms denotes the number of intervals of length t within ΔTs over which the accumulated position difference of each particle is computed; x_k^i, y_k^i and z_k^i denote the position states of the ith particle on the x, y and z axes at time k, and x̂_k^i, ŷ_k^i and ẑ_k^i denote the corresponding position states estimated by Kalman filtering at time k. The approximate posterior is expressed as:
p(x_{0:s} | E_{1:s}) ≈ Σ_{i=1}^{N} w_s^i · δ(x_{0:s} - x_{0:s}^i)

where x_{0:s} denotes the current particle positions, x_{0:s}^i denotes the positions of the ith particle up to iteration s, and E_{1:s} denotes the accumulated position differences up to iteration s; N is the number of particles in the particle filtering algorithm, w_k^i is the normalized weight of the ith particle at time k, and δ(·) is the Dirac delta function. The posterior probability satisfies the following proportionality:

p(x_{0:s} | E_{1:s}) ∝ p(E_s | x_s) · p(x_s | x_{s-1}) · p(x_{0:s-1} | E_{1:s-1})

where p(x_s | x_{s-1}) denotes the prior probability, p(x_{0:s-1} | E_{1:s-1}) denotes the posterior probability at iteration s-1, p(E_s | x_s) denotes the probability of the calculated accumulated position error E_s given x_s, and x_s denotes the particle position at iteration s;
The normalized weight w_s^i of the ith particle at iteration s is:

w_s^i ∝ w_{s-1}^i · p(E_s | x_s^i) · p(x_s^i | x_{s-1}^i) / r(x_s^i | x_{0:s-1}^i, E_{1:s})

where r(·) is the importance density and x_s^i denotes the position state of the ith particle at iteration s. By resampling and taking p(x_s | x_{s-1}) as the importance density, the normalized weight reduces to:

w_s^i ∝ w_{s-1}^i · p(E_s | x_s^i)

where p(E_s | x_s^i) denotes the probability of the accumulated position error E_s given x_s^i, the actual position of the ith particle at iteration s.
The weight of a position particle is defined by its most likely value; the minimum cumulative error E_s^i is considered to correspond to the most likely position value. Thus, the normalized weight is calculated by the following formula:

w_s^i = exp(-(E_s^i)² / (2σ²)) / Σ_{j=1}^{N} exp(-(E_s^j)² / (2σ²))

where σ is the standard deviation calculated over the cumulative errors E_s^i.
S4, calculating the differential errors of the DH kinematic parameters (link length, link twist angle, link offset and joint angle) by using the extended Kalman filter (EKF), thereby optimizing the kinematic errors. The dynamic modeling in Fig. 1 refers to DH kinematic modeling, a general robot kinematics modeling approach, i.e. a mathematical model of the relationships between the variables.
While the position and attitude information of the robot is collected using the position sensor and the inertial measurement unit, the measurement errors increase over time due to the inherent noise of the sensors, and the extended Kalman filter is applied to optimize the kinematic errors (the following formulas constitute the optimization process).
If, in addition to the 4 DH kinematic parameters (link length, link twist angle, link offset and joint angle) of each of the N revolute joints, the 4 pose transformation parameters from the IMU sensor to the robot end are considered, the total number of parameters is 4N + 4. Therefore, in the pose estimation process based on the extended Kalman filter (EKF), the estimated pose is calculated from the 4(N + 1) DH parameters. The model for the state prediction is:

x̂_{k+1|k} = x̂_{k|k}
P_{k+1|k} = P_{k|k} + Q_k
where x̂_{k|k} and P_{k|k} denote the estimated parameter state and its covariance matrix, respectively, and Q_k denotes the covariance matrix of the system noise at time k. The Jacobian matrix J_{k+1}, the measurement residual e_{k+1} and the residual covariance matrix S_{k+1} are obtained by the following formulas:

e_{k+1} = m_{k+1} - ĥ_{k+1|k}
S_{k+1} = J_{k+1} P_{k+1|k} J_{k+1}^T + R_{k+1}

where m_k and R_k denote the measured pose value and the covariance matrix of the measurement noise at time k, respectively, and T denotes the transformation matrix between joints; ĥ_{k+1|k} is the previously calculated estimated pose at time k+1;
Therefore, the filter gain K_{k+1} of the extended Kalman filter, the state estimate x̂_{k+1|k+1} and the state error covariance matrix P_{k+1|k+1} are updated as:

K_{k+1} = P_{k+1|k} J_{k+1}^T S_{k+1}^{-1}
x̂_{k+1|k+1} = x̂_{k+1|k} + K_{k+1} e_{k+1}
P_{k+1|k+1} = (I - K_{k+1} J_{k+1}) P_{k+1|k}

where I denotes the identity matrix.
The above embodiment is a preferred embodiment of the present invention, but the present invention is not limited thereto; any other changes, modifications, substitutions, combinations and simplifications that do not depart from the spirit and principle of the present invention should be regarded as equivalents and are intended to be included within the scope of the present invention.

Claims (5)

1. The online robot kinematic calibration method based on the multi-sensor hybrid filter is characterized by comprising the following steps of:
S1, respectively measuring the orientation and the position of the robot end effector by using an inertial sensor and a position sensor;
S2, estimating the attitude of the robot end effector by using Kalman filtering;
S3, estimating the position of the robot end effector by using particle filtering;
S4, calculating differential errors of DH kinematic parameters by using extended Kalman filtering to optimize the kinematic errors; the DH kinematic parameters include link length, link twist angle, link offset, and joint angle.
2. The method for online robot kinematic calibration based on multi-sensor hybrid filter according to claim 1, wherein the step S1 specifically comprises:
For measuring the rotation angles of the robot end about the x, y and z axes, an inertial measurement unit comprising a magnetometer, two gyroscopes and an accelerometer is rigidly fixed to the robot end effector; a Factored Quaternion Algorithm (FQA) based on the measured magnetic and gravitational field data is used to improve the accuracy of the estimation process; the measured Euler angles represent the attitude of the end effector, and the quaternion [q0, q1, q2, q3] is obtained from the Euler angles by the standard conversion:

q0 = cos(φ/2)cos(θ/2)cos(ψ/2) + sin(φ/2)sin(θ/2)sin(ψ/2)
q1 = sin(φ/2)cos(θ/2)cos(ψ/2) - cos(φ/2)sin(θ/2)sin(ψ/2)
q2 = cos(φ/2)sin(θ/2)cos(ψ/2) + sin(φ/2)cos(θ/2)sin(ψ/2)
q3 = cos(φ/2)cos(θ/2)sin(ψ/2) - sin(φ/2)sin(θ/2)cos(ψ/2)

where φ, θ and ψ denote the rotation angles about the x, y and z axes, respectively;
the quaternion satisfies the following unit-norm relationship:

q0² + q1² + q2² + q3² = 1

where q0 is the quaternion scalar part and [q1, q2, q3] is the quaternion vector part;
the direction cosine matrix from the inertial sensor coordinate system to the world coordinate system is represented as:

C = | q0²+q1²-q2²-q3²   2(q1q2-q0q3)      2(q1q3+q0q2)    |
    | 2(q1q2+q0q3)      q0²-q1²+q2²-q3²   2(q2q3-q0q1)    |
    | 2(q1q3-q0q2)      2(q2q3+q0q1)      q0²-q1²-q2²+q3² |
3. The method for online robot kinematic calibration based on the multi-sensor hybrid filter of claim 1, wherein in step S2, the attitude of the robot end effector is estimated by using Kalman filtering as follows:
The differential equations of the quaternion components q0, q1, q2 and q3 with respect to time t are used to reduce the measurement errors of the magnetometer and the gyroscopes:

dq0/dt = -(1/2)(ξx·q1 + ξy·q2 + ξz·q3)
dq1/dt =  (1/2)(ξx·q0 + ξz·q2 - ξy·q3)
dq2/dt =  (1/2)(ξy·q0 - ξz·q1 + ξx·q3)
dq3/dt =  (1/2)(ξz·q0 + ξy·q1 - ξx·q2)

where ξx, ξy and ξz denote the angular velocity components about the xs, ys and zs axes of the inertial sensor frame;
a state transition matrix is defined from the discretized quaternion kinematics, where Δt is the sample time; the state xKF,k consists of the quaternion state and the angular velocities, described as xKF,k = [q0,k, q1,k, q2,k, q3,k, ξx,k, ξy,k, ξz,k], where q0,k, q1,k, q2,k and q3,k denote the quaternion state at time k, and ξx,k, ξy,k and ξz,k denote the angular velocities about the xs, ys and zs axes of the inertial frame at time k;
defining a process noise vector:

wk = [0 0 0 0 wx wy wz]T

where wx, wy and wz denote the process noise of the angular velocities; assuming the calibrated gyroscope detects the angular velocity, the observation matrix is Hk = [0n×p In×n], where 0n×p is an n×p zero matrix, In×n is an n×n identity matrix, n is the number of angular velocity components and p is the dimension of the quaternion; the observation matrix defines the quaternion qk at time k, whose normalized form is:

qk = [q0,k/M q1,k/M q2,k/M q3,k/M]

where M = sqrt(q0,k² + q1,k² + q2,k² + q3,k²) is the quaternion norm.
4. The method for online robot kinematic calibration based on the multi-sensor hybrid filter of claim 1, wherein in step S3, the position of the robot end effector is estimated by using particle filtering as follows:
The position and acceleration of each sample point are estimated by a particle filter based on the inertial motion model:

v_k^i = v_{k-1}^i + (C_i·a - g)·Δt
p_k^i = p_{k-1}^i + v_{k-1}^i·Δt + (1/2)(C_i·a - g)·Δt²

where v^i denotes the velocity of the ith particle, a denotes the acceleration measured by the IMU, g denotes the local gravity vector, p^i denotes the end position, and C_i denotes the rotation matrix from the inertial measurement sensor coordinate system to the world coordinate system;
The position state of the robot end is defined as xPF = [px, py, pz, ax, ay, az], where px, py, pz and ax, ay, az denote the position and acceleration on the x, y and z axes, respectively; to obtain more accurate weights, the position difference accumulated over a time period ΔTs is used instead of the instantaneous position difference at time k, where s denotes the iteration number; the cumulative position difference between the estimated and calculated values of the ith particle is used for the likelihood calculation, evaluated over

Ms = ΔTs/t

sampling instants, where Ms denotes the number of intervals of length t within ΔTs over which the accumulated position difference of each particle is computed; x_k^i, y_k^i and z_k^i denote the position states of the ith particle on the x, y and z axes at time k, and x̂_k^i, ŷ_k^i and ẑ_k^i denote the corresponding position states estimated by Kalman filtering at time k; the approximate posterior is expressed as:
p(x_{0:s} | E_{1:s}) ≈ Σ_{i=1}^{N} w_s^i · δ(x_{0:s} - x_{0:s}^i)

where x_{0:s} denotes the current particle positions, x_{0:s}^i denotes the positions of the ith particle up to iteration s, and E_{1:s} denotes the accumulated position differences up to iteration s; N is the number of particles in the particle filtering algorithm, w_k^i is the normalized weight of the ith particle at time k, and δ(·) is the Dirac delta function; the posterior probability satisfies the following proportionality:

p(x_{0:s} | E_{1:s}) ∝ p(E_s | x_s) · p(x_s | x_{s-1}) · p(x_{0:s-1} | E_{1:s-1})

where p(x_s | x_{s-1}) denotes the prior probability, p(x_{0:s-1} | E_{1:s-1}) denotes the posterior probability at iteration s-1, p(E_s | x_s) denotes the probability of the calculated accumulated position error E_s given x_s, and x_s denotes the particle position at iteration s;
the normalized weight w_s^i of the ith particle at iteration s is:

w_s^i ∝ w_{s-1}^i · p(E_s | x_s^i) · p(x_s^i | x_{s-1}^i) / r(x_s^i | x_{0:s-1}^i, E_{1:s})

where r(·) is the importance density and x_s^i denotes the position state of the ith particle at iteration s; by resampling and taking p(x_s | x_{s-1}) as the importance density, the normalized weight reduces to:

w_s^i ∝ w_{s-1}^i · p(E_s | x_s^i)

where p(E_s | x_s^i) denotes the probability of the accumulated position error E_s given x_s^i, the actual position of the ith particle at iteration s;
the weight of a position particle is defined by its most likely value; the minimum cumulative error E_s^i is considered to correspond to the most likely position value; thus, the normalized weight is calculated by the following formula:

w_s^i = exp(-(E_s^i)² / (2σ²)) / Σ_{j=1}^{N} exp(-(E_s^j)² / (2σ²))

where σ is the standard deviation calculated over the cumulative errors E_s^i.
5. The method for online robot kinematic calibration based on the multi-sensor hybrid filter of claim 1, wherein in step S4, the optimization of the kinematic errors specifically includes:
in the process of collecting the position and attitude information of the robot by using the position sensor and the inertial measurement unit, the measurement errors increase over time due to the inherent noise of the sensors, and the extended Kalman filter is applied to optimize the kinematic errors;
if, in addition to the 4 DH kinematic parameters (link length, link twist angle, link offset and joint angle) of each of the N revolute joints, the 4 pose transformation parameters from the IMU sensor to the robot end are considered, the total number of parameters is 4N + 4; therefore, in the pose estimation process based on the extended Kalman filter (EKF), the estimated pose is calculated from the 4(N + 1) DH parameters; the model for the state prediction is:

x̂_{k+1|k} = x̂_{k|k}
P_{k+1|k} = P_{k|k} + Q_k
where x̂_{k|k} and P_{k|k} denote the estimated parameter state and its covariance matrix, respectively, and Q_k denotes the covariance matrix of the system noise at time k; the Jacobian matrix J_{k+1}, the measurement residual e_{k+1} and the residual covariance matrix S_{k+1} are obtained by the following formulas:

e_{k+1} = m_{k+1} - ĥ_{k+1|k}
S_{k+1} = J_{k+1} P_{k+1|k} J_{k+1}^T + R_{k+1}

where m_k and R_k denote the measured pose value and the covariance matrix of the measurement noise at time k, respectively, and T denotes the transformation matrix between joints; ĥ_{k+1|k} is the previously calculated estimated pose at time k+1;
therefore, the filter gain K_{k+1} of the extended Kalman filter, the state estimate x̂_{k+1|k+1} and the state error covariance matrix P_{k+1|k+1} are updated as:

K_{k+1} = P_{k+1|k} J_{k+1}^T S_{k+1}^{-1}
x̂_{k+1|k+1} = x̂_{k+1|k} + K_{k+1} e_{k+1}
P_{k+1|k+1} = (I - K_{k+1} J_{k+1}) P_{k+1|k}

where I denotes the identity matrix.
CN201910765787.3A 2019-07-28 2019-08-19 online robot kinematic calibration method based on multi-sensor hybrid filter Pending CN110561424A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/114533 WO2021031346A1 (en) 2019-07-28 2019-10-31 Online robot kinematic calibration method based on plurality of sensors combined with filters

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2019106917873 2019-07-28
CN201910691787 2019-07-28

Publications (1)

Publication Number Publication Date
CN110561424A true CN110561424A (en) 2019-12-13

Family

ID=68774081

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910765787.3A Pending CN110561424A (en) 2019-07-28 2019-08-19 online robot kinematic calibration method based on multi-sensor hybrid filter

Country Status (2)

Country Link
CN (1) CN110561424A (en)
WO (1) WO2021031346A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170276501A1 (en) * 2016-03-28 2017-09-28 Fetch Robotics, Inc. System and Method for Localization of Robots
CN107270893A (en) * 2017-05-27 2017-10-20 Southeast University Lever-arm and time-synchronization error estimation and compensation method for actual measurements
CN108318038A (en) * 2018-01-26 2018-07-24 Nanjing University of Aeronautics and Astronautics Quaternion Gaussian particle filter based mobile robot pose calculation method
CN108827301A (en) * 2018-04-16 2018-11-16 Nanjing University of Aeronautics and Astronautics Improved error-quaternion Kalman filter based robot pose calculation method
CN109395375A (en) * 2018-09-18 2019-03-01 South China University of Technology 3D game interface method based on augmented reality and motion interaction
CN109521868A (en) * 2018-09-18 2019-03-26 South China University of Technology Virtual assembly method based on augmented reality and motion interaction

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100936710B1 (en) * 2007-03-16 2010-01-13 Electronics and Telecommunications Research Institute Position estimation method, system and recording media of mobile system using odometer and heading sensor
EP3054265B1 (en) * 2015-02-04 2022-04-20 Hexagon Technology Center GmbH Coordinate measuring machine
US9921578B2 (en) * 2016-03-09 2018-03-20 International Business Machines Corporation Automatic database filtering system utilizing robotic filters
US10766144B2 (en) * 2018-01-08 2020-09-08 Digital Dream Labs, Llc Map related acoustic filtering by a mobile robot
CN108692701B (en) * 2018-05-28 2020-08-07 佛山市南海区广工大数控装备协同创新研究院 Mobile robot multi-sensor fusion positioning method based on particle filter

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
GUANGLONG DU: "Human–Manipulator Interface Based on Multisensory Process via Kalman Filters", IEEE Transactions on Industrial Electronics *
GUANGLONG DU: "Online Serial Manipulator Calibration Based on Multisensory Process Via Extended Kalman and Particle Filters", IEEE Transactions on Industrial Electronics *
SEONG-HOON PETER WON: "A Kalman/particle filter-based position and orientation estimation method using a position sensor/inertial measurement unit hybrid system", IEEE Transactions on Industrial Electronics *
ZHANG GUOLIANG: "SLAM and VSLAM Methods for Mobile Robots", 31 October 2018, Xi'an Jiaotong University Press *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110991085A (en) * 2019-12-20 2020-04-10 上海有个机器人有限公司 Robot image simulation data construction method, medium, terminal and device
CN111222225A (en) * 2019-12-20 2020-06-02 浙江欣奕华智能科技有限公司 Method and device for determining pose of sensor in robot
CN110991085B (en) * 2019-12-20 2023-08-29 上海有个机器人有限公司 Method, medium, terminal and device for constructing robot image simulation data
CN111222225B (en) * 2019-12-20 2023-08-29 浙江欣奕华智能科技有限公司 Method and device for determining pose of sensor in robot
CN111086001A (en) * 2019-12-25 2020-05-01 广东省智能制造研究所 State estimation method and system for multi-modal perception of foot robot
CN111076721A (en) * 2020-01-19 2020-04-28 浙江融芯导航科技有限公司 Fast-convergence inertial measurement unit installation attitude estimation method
CN111076721B (en) * 2020-01-19 2023-03-28 浙江融芯导航科技有限公司 Fast-convergence inertial measurement unit installation attitude estimation method
WO2021218212A1 (en) * 2020-04-26 2021-11-04 珠海格力智能装备有限公司 Robot control method and apparatus, and storage medium and processor
CN113587920A (en) * 2020-04-30 2021-11-02 阿里巴巴集团控股有限公司 Motion measurement method, motion measurement device, electronic device and computer-readable storage medium
CN113587920B (en) * 2020-04-30 2024-02-20 阿里巴巴集团控股有限公司 Motion measurement method, motion measurement device, electronic equipment and computer readable storage medium
CN114683259A (en) * 2020-12-28 2022-07-01 财团法人工业技术研究院 Mechanical arm correction system and mechanical arm correction method
US11904482B2 (en) 2020-12-28 2024-02-20 Industrial Technology Research Institute Mechanical arm calibration system and mechanical arm calibration method

Also Published As

Publication number Publication date
WO2021031346A1 (en) 2021-02-25

Similar Documents

Publication Publication Date Title
CN110561424A (en) online robot kinematic calibration method based on multi-sensor hybrid filter
CN111207774B (en) Method and system for laser-IMU external reference calibration
Du et al. Online serial manipulator calibration based on multisensory process via extended Kalman and particle filters
CN110926460B UWB positioning outlier processing method based on IMU
JP5219467B2 (en) Method, apparatus, and medium for posture estimation of mobile robot based on particle filter
CN110887481B (en) Carrier dynamic attitude estimation method based on MEMS inertial sensor
JP4876204B2 (en) Small attitude sensor
CN113311411B (en) Laser radar point cloud motion distortion correction method for mobile robot
CN111949929B (en) Design method of multi-sensor fusion quadruped robot motion odometer
CN110017850B (en) Gyroscope drift estimation method and device and positioning system
CN106125548A Industrial robot dynamic parameter identification method
CN107941212B (en) Vision and inertia combined positioning method
Du et al. An online method for serial robot self-calibration with CMAC and UKF
CN108871323B (en) High-precision navigation method of low-cost inertial sensor in locomotive environment
CN115533915A (en) Active contact detection control method for aerial work robot in uncertain environment
CN115855048A (en) Multi-sensor fusion pose estimation method
CN106595669B (en) Method for resolving attitude of rotating body
CN110967017A (en) Cooperative positioning method for rigid body cooperative transportation of double mobile robots
TW202224872A (en) Mechanical arm calibration system and mechanical arm calibration method
CN116338719A (en) Laser radar-inertia-vehicle fusion positioning method based on B spline function
Zhe et al. Adaptive complementary filtering algorithm for IMU based on MEMS
Luo et al. End-Effector Pose Estimation in Complex Environments Using Complementary Enhancement and Adaptive Fusion of Multisensor
CN114111772B (en) Underwater robot soft operation hand position tracking method based on data glove
CN114046800B (en) High-precision mileage estimation method based on double-layer filtering frame
CN108107882B (en) Automatic calibration and detection system of service robot based on optical motion tracking

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20191213