CN110542429B - Error compensation method for omnidirectional mobile robot - Google Patents

Error compensation method for omnidirectional mobile robot

Info

Publication number
CN110542429B
Authority
CN
China
Prior art keywords
mobile robot
omnidirectional mobile
coordinate system
formula
kinematic
Prior art date
Legal status
Active
Application number
CN201910636415.0A
Other languages
Chinese (zh)
Other versions
CN110542429A (en)
Inventor
Du Yu (杜宇)
Current Assignee
Dalian Dahuazhongtian Technology Co ltd
Original Assignee
Dalian Dahuazhongtian Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Dalian Dahuazhongtian Technology Co., Ltd.
Priority to CN201910636415.0A
Publication of CN110542429A
Application granted
Publication of CN110542429B


Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 25/00 — Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass

Landscapes

  • Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

An error compensation method for an omnidirectional mobile robot, belonging to the technical field of mobile robots. First, a kinematic model of the three-wheeled omnidirectional mobile robot is established; second, the robot's odometry information is computed by a discrete method; then, with the interference terms eliminated, the parameters are estimated by the least-squares method of multiple linear regression to obtain calibrated kinematic-parameter formulas; finally, the pose is identified by a vision sensor, repeated experiments are performed with the three-wheeled omnidirectional mobile robot, and the vision recognition results are substituted into the obtained formulas to calculate the estimated values of the robot's kinematic parameters. Unlike methods that simply multiply the kinematic equation by correction coefficients, this method does not impose relationships between the unknown parameters of the motion equation, so the resulting motion equation describes the actual motion of the robot more accurately, and the kinematic calibration markedly improves the robot's odometry accuracy.

Description

Error compensation method for omnidirectional mobile robot
Technical Field
The invention belongs to the technical field of mobile robots, and provides an error compensation method for an omnidirectional mobile robot.
Background
Odometry-based positioning simplifies the fundamental localization problem of a mobile robot, which greatly reduces its cost. However, few researchers have directly studied the distance-measurement accuracy of mobile-robot odometry; a large part of the related work has been carried out by researchers in the field of Artificial Intelligence (AI). Because upper-layer algorithms have shortcomings and robot cost must be considered, odometry-based positioning has received increasing attention. A well-known drawback of odometry is the unbounded accumulation of position error: after long operation, the position error grows ever larger and the position estimate becomes inaccurate, so the odometry must be calibrated to improve positioning accuracy.
Existing odometry calibration methods can be divided into off-line and on-line calibration. Larsen and Martinelli used an augmented extended Kalman filter (AKF) to calibrate a two-wheeled differential robot on line; this is the most influential on-line calibration method, and many researchers have built on it to calibrate the odometry of differential robots. Borenstein drove a differential mobile robot clockwise and counterclockwise along a preset 4 × 4 square track, compared the measured end points with theoretical values, and calibrated the robot using its motion model; this UMBmark method is a well-known off-line calibration method. Most existing calibration methods target differential robots. Y. Maddahi calibrated a three-wheeled omnidirectional mobile robot by correcting the kinematic equation with correction coefficients, but that approach imposes relationships among the parameters and cannot fully reflect the robot's actual running behavior. The present invention proposes a new off-line calibration method for error compensation of the omnidirectional mobile robot.
Disclosure of Invention
In order to solve the technical problem, the invention provides an error compensation method for an omnidirectional mobile robot, which is used for identifying the pose of the omnidirectional mobile robot based on vision, calculating to obtain calibrated kinematic parameters and further enabling the omnidirectional mobile robot to move more accurately.
In order to achieve the purpose, the technical scheme of the invention is as follows:
An error compensation method for an omnidirectional mobile robot: first, the kinematic model of the omnidirectional mobile robot is derived, showing that the kinematic parameters satisfy a multiple linear equation and can therefore be estimated with a multiple-linear-regression model. Next, a least-squares multiple-linear-regression model is given and the parameters are estimated, yielding the relationship between the kinematic parameters and the observed quantities. Finally, the pose and velocity observations of the omnidirectional mobile robot are acquired by vision and encoders, the estimated values of its kinematic parameters are calculated, and the kinematic model is updated by substitution, realizing error compensation for the omnidirectional mobile robot. The specific steps are as follows:
firstly, a vision sensor is installed on the omnidirectional mobile robot, and the relation between the angular velocity of the wheel and the velocity in the world coordinate system is obtained through a kinematic model of the omnidirectional mobile robot.
To calibrate the odometry error of the omnidirectional mobile robot through its kinematic equation, one must know how the kinematic equation describes the influence of each wheel on the robot's motion. The omnidirectional mobile robot studied here has three uniformly distributed wheels, each a double-row Mecanum wheel. The robot coordinate system is established at the center of its base, and the world coordinate system is established with the initial position as the origin.
The initial position of the omnidirectional mobile robot defines the global coordinate system X_W O_W Y_W. When the omnidirectional mobile robot is in a non-slip state, the relationship between the velocity in the global coordinate system and the wheel speeds is:
[Formula (1): global-frame velocity as a function of the wheel speeds — equation image not reproduced]
where i = 1, 2, 3; L_i denotes the distance from the plane of the i-th wheel to the origin of the robot coordinate system; α_i denotes the angle between the axial direction of the i-th wheel and the positive X axis of the global coordinate system; r denotes the radius of the wheels of the omnidirectional mobile robot; β is the steering angle of the wheels; γ is the angle between the principal plane of the wheel and the wheel's roller axis; the velocity vector collects the world-frame velocities V_x, V_y, V_ω; ω_i denotes the angular velocity of the i-th wheel; and θ denotes the heading of the omnidirectional mobile robot in the world coordinate system. The matrix R(θ) defines the transformation between the robot coordinate system and the global coordinate system:
R(θ) = [cos θ, −sin θ, 0; sin θ, cos θ, 0; 0, 0, 1]  (2)
Transforming the result according to formulas (1) and (2), and expressing the kinematic parameter matrices by the matrices P and M, gives the conversion between the wheel angular velocities and the world-frame velocity — the kinematic model of the omnidirectional mobile robot — shown in formula (3), with P and M given in formulas (4), (5) and (6).
[Formula (3): kinematic model — conversion between the wheel angular velocities and the world-frame velocity; equation image not reproduced]
[Formulas (4)-(6): explicit forms of the kinematic parameter matrices P and M; equation images not reproduced]
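The first step can be sketched numerically. Since the exact entries of P and M in formulas (4)-(6) are not reproduced here, the sketch below substitutes a common three-omni-wheel parameterization; the wheel-axis angles, radius and offset are assumed illustration values, not the patent's:

```python
import numpy as np

def rotation(theta):
    """R(theta): planar rotation between the robot frame and the world
    frame (formula (2)), acting on [Vx, Vy, Vomega]."""
    return np.array([[np.cos(theta), -np.sin(theta), 0.0],
                     [np.sin(theta),  np.cos(theta), 0.0],
                     [0.0,            0.0,           1.0]])

def wheel_matrix(r, L, alphas):
    """Matrix P mapping body velocity [Vx, Vy, Vomega] to the three wheel
    angular velocities. The patent's exact P (with steering angle beta and
    roller angle gamma) is not reproduced, so this is an illustrative
    stand-in using the common three-omni-wheel form."""
    return np.array([[-np.sin(a), np.cos(a), L] for a in alphas]) / r

r, L = 0.05, 0.20                          # assumed wheel radius / offset [m]
alphas = np.deg2rad([90.0, 210.0, 330.0])  # assumed wheel-axis angles
P = wheel_matrix(r, L, alphas)
M = np.linalg.inv(P)                       # forward map: wheel speeds -> body velocity

omega = np.array([1.0, -0.5, -0.5])        # measured wheel angular velocities [rad/s]
v_body = M @ omega                         # [Vx, Vy, Vomega] in the robot frame
v_world = rotation(np.pi / 4) @ v_body     # velocity in the world frame (formula (3))
```

Note that the angular rate V_ω is unchanged by the planar rotation, since the last row of R(θ) is [0, 0, 1].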
Secondly, based on the kinematic model of the omnidirectional mobile robot obtained in the first step, the discrete odometry model of the omnidirectional mobile robot shown in formula (7) is obtained; then formula (4) is combined with formula (7) to obtain the relationship between velocity and displacement, establishing the relationship between the error parameters and the observed quantities.
The computation of the motion pose information of the omnidirectional mobile robot is discretized into a summation. Assume the state of the omnidirectional mobile robot at time k is S_k = [x_k, y_k, θ_k]^T, comprising the position (x_k, y_k) relative to the world coordinate system and the rotation θ_k. V_{x,k} and V_{y,k} denote the velocities in the X and Y directions of the global coordinate system at time k. Δθ_k denotes the change of the rotation angle from the k-th sampling point to the (k+1)-th sampling point, and T denotes the sampling period. After the translation and rotation, the robot reaches the state at time k+1, S_{k+1} = [x_{k+1}, y_{k+1}, θ_{k+1}]^T. The discrete motion relationship is then expressed as:
[Formula (7): discrete odometry update from S_k to S_{k+1}; equation image not reproduced]
From the conversion between the wheel angular velocities and the world-frame velocity in the first step, together with the discrete motion relationship of formula (7), the conversion relationship shown in formula (8) is obtained.
[Formula (8): pose [X_m Y_m θ_m] in terms of the initial pose, the elements m_{gl} of M, and the sampled wheel angular velocities; equation image not reproduced]
where m_{gl} (g = 1…3, l = 1…3) denotes each element of the matrix M; ω_{i,k} (i = 1, 2, 3) denotes the angular velocities of the three wheels at time k; [X_m, Y_m, θ_m] denotes the pose of the omnidirectional mobile robot at time m; and [X_0, Y_0, θ_0] denotes its initial pose.
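The discrete odometry of the second step — accumulating world-frame velocities over each sampling period T, per the description around formula (7) — can be sketched as follows; the velocity sequence is synthetic:

```python
import numpy as np

def integrate_odometry(pose0, v_world_seq, T):
    """Discrete odometry update (per formula (7)): accumulate world-frame
    velocities over sampling period T to propagate [x, y, theta].
    v_world_seq: iterable of [Vx_k, Vy_k, Vomega_k] at each sample."""
    x, y, theta = pose0
    for vx, vy, vw in v_world_seq:
        x += T * vx
        y += T * vy
        theta += T * vw          # Delta theta_k = T * Vomega_k
    return np.array([x, y, theta])

# straight run at 0.3 m/s in +X for 100 samples of 20 ms
# (linear velocity and sampling period taken from the experiment section)
samples = [[0.3, 0.0, 0.0]] * 100
pose = integrate_odometry([0.0, 0.0, 0.0], samples, T=0.02)
# expected displacement: 0.3 m/s * 2 s = 0.6 m in X
```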
Thirdly, the pose information of the robot is acquired by running the omnidirectional mobile robot repeatedly along the same track. Parameter estimation is performed by the least-squares method of multiple linear regression, and the estimated values of the kinematic parameters are calculated from the relationship between translation, rotation, and wheel angular velocity.
The calculation formula for estimating the X-direction parameters by adopting a least square method is as follows:
[Formula (9): least-squares estimation of the X-direction parameters; equation image not reproduced]
where X_{m,I}, Y_{m,I} (I = 1…N) denote the position of the omnidirectional mobile robot at time m in the I-th run, and K_{X,I} (I = 1…N) denotes the X-direction position information computed from each wheel; the set of the K_{X,I} forms the regression matrix. K_{X,I} is computed as:
[Formula (10): definition of K_{X,I} in terms of the sampled wheel angular velocities; equation image not reproduced]
the estimated values of the kinematic parameters in the X direction are obtained as follows:
[Formula (11): least-squares estimate of the X-direction kinematic parameters; equation image not reproduced]
similarly, the calculation formula for estimating the Y-direction parameters by adopting the least square method is as follows:
[Formula (12): least-squares estimation of the Y-direction parameters; equation image not reproduced]
where K_{Y,I} (I = 1…N) denotes the Y-direction position information computed from each wheel; the set of the K_{Y,I} forms the regression matrix. K_{Y,I} is computed as:
[Formula (13): definition of K_{Y,I} in terms of the sampled wheel angular velocities; equation image not reproduced]
the estimated values of the kinematic parameters in the Y direction are obtained as follows:
[Formula (14): least-squares estimate of the Y-direction kinematic parameters; equation image not reproduced]
similarly, the calculation formula for estimating the theta direction parameter by adopting the least square method is as follows:
[Formula (15): least-squares estimation of the θ-direction parameters; equation image not reproduced]
where K_{θ,I} (I = 1…N) denotes the θ-direction angle information computed from each wheel; the set of the K_{θ,I} forms the regression matrix. K_{θ,I} is computed as:
[Formula (16): definition of K_{θ,I} in terms of the sampled wheel angular velocities; equation image not reproduced]
the estimated values of the kinematic parameters in the theta direction are obtained as follows:
[Formula (17): least-squares estimate of the θ-direction kinematic parameters; equation image not reproduced]
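Formulas (9)-(17) are not reproduced as images in this text, but the structure described — N runs, per-run wheel-derived quantities K_{X,I} regressed against the observed displacements — can be sketched with synthetic data. The true parameters, the scale of the regressors, and the noise level below are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
m_true = np.array([0.02, -0.01, -0.01])   # hypothetical X-direction row of M

# N experimental runs: each yields integrated wheel rotations (K_{X,I})
# and an observed X displacement X_m - X_0. Here both are simulated; in
# the patent they come from the encoders and the vision sensor.
N = 50
K = rng.normal(size=(N, 3)) * 10.0                      # regression matrix
x_disp = K @ m_true + rng.normal(scale=1e-4, size=N)    # noisy observations

# least-squares estimate, as in formulas (9)/(11): m_hat = (K^T K)^{-1} K^T x
m_hat, *_ = np.linalg.lstsq(K, x_disp, rcond=None)
```

The Y- and θ-direction estimates of formulas (12)-(17) follow the same pattern with their own regressors and observations.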
Fourthly, the three wheels of the omnidirectional mobile robot are uniformly distributed; the vision sensor is mounted on top of the omnidirectional mobile robot, and a tag to be identified is fixed on the ceiling above it. The omnidirectional mobile robot runs a specific track, and its start and end poses are obtained by identifying the tag with the vision sensor.
First, the vision sensor is calibrated to obtain its intrinsic and extrinsic parameters; second, the coordinates of the omnidirectional mobile robot are converted according to the relation in formula (18); finally, the odometry information of the omnidirectional mobile robot as measured by the vision sensor is obtained.
[Formula (18): conversion from the camera coordinate system to the world coordinate system; equation image not reproduced]
where a and b are internal parameters of the omnidirectional mobile robot; c and d are the initial readings of the sensor; η denotes the rotation of the omnidirectional mobile robot in the world coordinate system; [X_C, Y_C] denotes the position of the omnidirectional mobile robot in the camera coordinate system; and [X_G, Y_G] denotes its position in the world coordinate system.
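Formula (18) itself is not reproduced, but given the variables defined above (scales a, b; initial readings c, d; rotation η), one plausible similarity-transform reading of the camera-to-world conversion can be sketched as follows; the exact form, and all numeric values, are assumptions for illustration:

```python
import numpy as np

def camera_to_world(xc, yc, a, b, c, d, eta):
    """A plausible form of the camera-to-world conversion (formula (18) is
    not reproduced): scale the camera reading by (a, b), subtract the
    initial readings (c, d), and rotate by eta into the world frame.
    This parameterization is an assumption, not the patent's exact formula."""
    p = np.array([a * (xc - c), b * (yc - d)])
    R = np.array([[np.cos(eta), -np.sin(eta)],
                  [np.sin(eta),  np.cos(eta)]])
    return R @ p

# hypothetical pixel reading and calibration constants
xg, yg = camera_to_world(120.0, 80.0, a=0.01, b=0.01, c=100.0, d=60.0, eta=0.0)
```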
During the experiments, the omnidirectional mobile robot runs at a linear velocity of 0.3 m/s with a sampling period of 20 ms. The start and end poses of the omnidirectional mobile robot are obtained and substituted into formulas (11), (14) and (17) of the third step, and the estimated values of the kinematic parameters are calculated; together they form the estimate of the inverse of the kinematic parameter matrix:
[Formula (19): estimated inverse of the kinematic parameter matrix; equation image not reproduced]
Fifthly, the estimated value of the inverse of the kinematic parameter matrix obtained in the fourth step is substituted into formula (5) to update the kinematic model, realizing the error compensation of the omnidirectional mobile robot.
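The fifth step can be sketched as follows: once a calibrated kinematic matrix is available, error compensation amounts to using it in place of the nominal matrix inside the odometry loop. All matrix entries and correction factors below are hypothetical placeholders, not values from the patent:

```python
import numpy as np

# Hypothetical nominal kinematic matrix M and a per-row calibration correction.
M_nominal = np.array([[0.020, -0.010, -0.010],
                      [0.000,  0.017, -0.017],
                      [0.080,  0.080,  0.080]])
M_calibrated = M_nominal * np.array([[1.02], [0.98], [1.01]])

def odometry_step(pose, omega, M, T):
    """One discrete odometry update using kinematic matrix M: body velocity
    from wheel speeds, rotated into the world frame, integrated over T."""
    vx, vy, vw = M @ omega
    x, y, th = pose
    c, s = np.cos(th), np.sin(th)
    return np.array([x + T * (c * vx - s * vy),
                     y + T * (s * vx + c * vy),
                     th + T * vw])

pose = np.zeros(3)
for _ in range(100):                   # 100 samples at T = 0.02 s
    pose = odometry_step(pose, np.array([1.0, -0.5, -0.5]), M_calibrated, 0.02)
```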
Furthermore, the parameter calibration in the third step is an off-line method. Compared with on-line calibration, it places lower demands on sensor accuracy and system real-time performance and needs fewer restrictive conditions, so it is convenient to implement. Moreover, the calibration does not impose relationships between the unknown parameters, so the obtained result is closer to the actual parameters of the robot.
The error compensation method for the omnidirectional mobile robot is also applicable to other types of wheeled robots, and the pose identification can be realized with a vision sensor, manual measurement, or laser positioning. In the calibration process, only the kinematic model needs to be modified according to the robot being calibrated, after which error compensation is carried out as described.
Compared with the prior art, the invention has the following beneficial effects. Compared with existing off-line calibration methods, error compensation is performed without assuming correlations among the kinematic parameters, so the calibration result better reflects the real state of the omnidirectional mobile robot. Compared with on-line calibration methods, no complex modeling process is needed, no high-precision, high-real-time system is required, and the operation is simple and reliable. The vision method adopted by the invention is simple: the camera is mounted on the robot body, which is convenient to operate and gives good recognition results.
Drawings
Fig. 1 shows the coordinate systems and the three-wheel arrangement of the omnidirectional mobile robot used in the embodiment of the invention.
Fig. 2 shows the wheel assembly of the omnidirectional mobile robot used in the embodiment of the invention.
Fig. 3 shows the characteristics of the wheels of the omnidirectional mobile robot used in the embodiment of the invention.
Fig. 4 shows the transformation process of the coordinate systems used in the embodiment of the invention.
Fig. 5 shows the relative relationship of the coordinate systems used in the embodiment of the invention.
Detailed Description
The invention is further illustrated with reference to the following figures and examples.
The invention provides a novel odometer offline calibration method for an omnidirectional mobile robot based on a kinematic equation. The following describes the implementation of the present invention in detail.
The invention provides an error compensation method for an omnidirectional mobile robot. The odometry is implemented by combining encoder readings with the kinematic model, so the accuracy of the kinematic parameters directly determines the positioning accuracy. Based on the kinematic equation, a novel odometry error compensation method for the omnidirectional mobile robot is developed. Kinematic analysis shows that the unknown parameters in the kinematic model satisfy a multiple linear equation and can be estimated by the least-squares method. To make the observations easy to measure, the relationship between velocity and pose (position and orientation) is obtained with a discrete method instead of integration. The equations for the parameters to be estimated are then obtained from the formulas of multiple linear regression. Unlike methods that simply multiply the kinematic equation by correction coefficients, this method does not impose relationships among the unknown parameters of the motion equation, so the obtained motion equation describes the actual motion of the omnidirectional mobile robot more accurately. Finally, experiments with a three-wheeled omnidirectional mobile robot yield the estimated values of the kinematic parameters. The specific steps are as follows:
firstly, a vision sensor is installed on the omnidirectional mobile robot, and the relation between the angular velocity of the wheel and the velocity in the world coordinate system is obtained through a kinematic model of the omnidirectional mobile robot.
To calibrate the odometry error of the omnidirectional mobile robot through its kinematic equation, one must know how the kinematic equation describes the influence of each wheel on the robot's motion. The parameters of the studied omnidirectional mobile robot are shown in Figs. 1, 2 and 3, which detail the construction of the coordinate systems and the characteristics and assembly of the wheels. The center of the base is defined as the origin of the robot coordinate system, which is established as shown in Fig. 1. The studied omnidirectional mobile robot has three uniformly distributed wheels, each a double-row Mecanum wheel; its coordinate system is established at the center of its base, and the world coordinate system is established with the initial position as the origin. Along the axis of each wheel, the distance L_i from the wheel plane to the origin of the robot coordinate system serves as the wheelbase. α_i denotes the angle between the axial direction of the i-th wheel and the positive X axis of the global coordinate system. According to the wheel assembly of the omnidirectional mobile robot, the steering angle is β, as shown in Fig. 2. Finally, by the characteristics of an omni wheel, the angle between the principal plane of the wheel and the roller axis is γ, as shown in Fig. 3.
In this method, the start position of the omnidirectional mobile robot defines the global coordinate system X_W O_W Y_W. When the omnidirectional mobile robot is in a non-slip state, the relationship between the velocity in the global coordinate system and the wheel speeds can be written as:
[Formula (1): global-frame velocity as a function of the wheel speeds — equation image not reproduced]
where r denotes the radius of the wheels of the omnidirectional mobile robot; each element of the velocity vector represents a world-frame velocity V_x, V_y, V_ω; ω_i denotes the angular velocity of each wheel, i.e. ω_1, ω_2, ω_3; and θ denotes the heading of the omnidirectional mobile robot in the world coordinate system. The matrix R(θ) defines the transformation between the robot coordinate system and the global coordinate system.
R(θ) = [cos θ, −sin θ, 0; sin θ, cos θ, 0; 0, 0, 1]  (2)
From equations (1) and (2), substituting the above elements and transforming the result, with the kinematic parameter matrices expressed by the matrices P and M, the conversion between the wheel angular velocities and the world-frame velocity is readily obtained as shown in formula (3). p_{sf} (s = 1…3, f = 1…3) denotes each element of the inverse of the matrix P; the specific forms of P and M are shown in formulas (4), (5) and (6).
[Formula (3): kinematic model — conversion between the wheel angular velocities and the world-frame velocity; equation image not reproduced]
[Formulas (4)-(6): explicit forms of the kinematic parameter matrices P and M; equation images not reproduced]
Secondly, based on the kinematic model of the omnidirectional mobile robot obtained in the first step, the discrete odometry model shown in formula (7) is obtained; then, using multiple sets of experimental data and combining formula (4) with formula (7), the relationship between velocity and displacement is obtained and the relationship between the error parameters and the observed quantities is established.
The computation of the motion pose information of the omnidirectional mobile robot is discretized into a summation. Assume the state of the omnidirectional mobile robot at time k is S_k = [x_k, y_k, θ_k]^T, comprising the position (x_k, y_k) relative to the world coordinate system and the rotation θ_k. V_{x,k} and V_{y,k} denote the velocities in the X and Y directions of the global coordinate system at time k. Δθ_k denotes the change of the rotation angle from the k-th sampling point to the (k+1)-th sampling point, and T denotes the sampling period. After the translation and rotation, the robot reaches the state at time k+1, S_{k+1} = [x_{k+1}, y_{k+1}, θ_{k+1}]^T. It can be expressed as:
[Formula (7): discrete odometry update from S_k to S_{k+1}; equation image not reproduced]
From the conversion between the wheel angular velocities and the world-frame velocity in the first step, together with the discrete motion relationship of formula (7), the conversion relationship shown in formula (8) is obtained.
[Formula (8): pose [X_m Y_m θ_m] in terms of the initial pose, the elements m_{gl} of M, and the sampled wheel angular velocities; equation image not reproduced]
where m_{gl} (g = 1…3, l = 1…3) denotes each element of the matrix M; ω_{i,k} (i = 1, 2, 3) denotes the angular velocities of the three wheels at time k; [X_m, Y_m, θ_m] denotes the pose of the omnidirectional mobile robot at time m; and [X_0, Y_0, θ_0] denotes its initial pose.
Thirdly, the pose information of the robot is acquired by running the omnidirectional mobile robot repeatedly along the same track. Parameter estimation is performed by the least-squares method of multiple linear regression, and the estimated values of the kinematic parameters are calculated from the relationship between translation, rotation, and wheel angular velocity.
The X-direction parameter estimation calculation formula by adopting a least square method is as follows:
[Formula (9): least-squares estimation of the X-direction parameters; equation image not reproduced]
where X_{m,I}, Y_{m,I} (I = 1…N) denote the position of the omnidirectional mobile robot at time m in the I-th run, and K_{X,I} (I = 1…N) denotes the X-direction position information computed from each wheel; the set of the K_{X,I} forms the regression matrix. K_{X,I} is computed as:
[Formula (10): definition of K_{X,I} in terms of the sampled wheel angular velocities; equation image not reproduced]
the estimated values of the kinematic parameters in the X direction are obtained as follows:
[Formula (11): least-squares estimate of the X-direction kinematic parameters; equation image not reproduced]
similarly, the Y-direction parameter estimation calculation formula by using the least square method is as follows:
[Formula (12): least-squares estimation of the Y-direction parameters; equation image not reproduced]
where K_{Y,I} (I = 1…N) denotes the Y-direction position information computed from each wheel; the set of the K_{Y,I} forms the regression matrix. K_{Y,I} is computed as:
[Formula (13): definition of K_{Y,I} in terms of the sampled wheel angular velocities; equation image not reproduced]
the estimated values of the kinematic parameters in the Y direction are obtained as follows:
[Formula (14): least-squares estimate of the Y-direction kinematic parameters; equation image not reproduced]
similarly, the calculation formula for estimating the theta direction parameter by adopting the least square method is as follows:
[Formula (15): least-squares estimation of the θ-direction parameters; equation image not reproduced]
where K_{θ,I} (I = 1…N) denotes the θ-direction angle information computed from each wheel; the set of the K_{θ,I} forms the regression matrix. K_{θ,I} is computed as:
[Formula (16): definition of K_{θ,I} in terms of the sampled wheel angular velocities; equation image not reproduced]
the estimated values of the kinematic parameters in the theta direction are obtained as follows:
[Formula (17): least-squares estimate of the θ-direction kinematic parameters; equation image not reproduced]
Fourthly, the three wheels of the omnidirectional mobile robot are uniformly distributed; the vision sensor is mounted on top of the omnidirectional mobile robot, and a tag to be identified is fixed on the ceiling above it. The omnidirectional mobile robot runs a specific track, and its start and end poses are obtained by identifying the tag with the vision sensor.
First, the vision sensor is calibrated to obtain its intrinsic and extrinsic parameters; second, the coordinates of the omnidirectional mobile robot are converted, as shown in Figs. 4 and 5, according to the relation in formula (18); finally, the odometry information of the omnidirectional mobile robot as measured by the vision sensor is obtained.
[Formula (18): conversion from the camera coordinate system to the world coordinate system; equation image not reproduced]
where a and b are internal parameters of the omnidirectional mobile robot; c and d are the initial readings of the sensor; η denotes the rotation of the omnidirectional mobile robot in the world coordinate system; [X_C, Y_C] denotes the position of the omnidirectional mobile robot in the camera coordinate system; and [X_G, Y_G] denotes its position in the world coordinate system.
During the experiments, the omnidirectional mobile robot runs at a linear velocity of 0.3 m/s with a sampling period of 20 ms. The start and end poses of the omnidirectional mobile robot are obtained and substituted into formulas (11), (14) and (17) of the third step, and the estimated values of the kinematic parameters are calculated; together they form the estimate of the inverse of the kinematic parameter matrix:
[Formula (19): estimated inverse of the kinematic parameter matrix; equation image not reproduced]
Fifthly, the estimated value of the inverse of the kinematic parameter matrix obtained in the fourth step is substituted into formula (5) to update the kinematic model, realizing the error compensation of the omnidirectional mobile robot.
At this point, the kinematic parameters of the three-wheeled omnidirectional mobile robot are determined.
The error compensation method for the omnidirectional mobile robot is also applicable to other types of wheeled robots, and the pose identification can be realized with a vision sensor, manual measurement, or laser positioning. In the calibration process, only the kinematic model needs to be modified according to the robot being calibrated, after which error compensation is carried out as described.

Claims (1)

1. An error compensation method for an omnidirectional mobile robot, characterized in that the error compensation method first calculates a kinematic model of the omnidirectional mobile robot to obtain kinematic parameters satisfying a multiple linear equation; then a least-squares multiple-linear-regression model is given and parameter estimation is performed, obtaining the relationship between the kinematic parameters and the observed quantities through this model; finally, the pose and velocity observations of the omnidirectional mobile robot are acquired by vision and encoders, the estimated values of the kinematic parameters of the omnidirectional mobile robot are calculated and substituted into the updated kinematic model, realizing error compensation of the omnidirectional mobile robot; the method comprises the following steps:
firstly, a vision sensor is arranged on an omnidirectional mobile robot, and the relationship between the angular velocity of a wheel and the velocity in a world coordinate system is obtained through a kinematic model of the omnidirectional mobile robot;
the omnidirectional mobile robot has three uniformly distributed wheels, each a double-row Mecanum wheel; its coordinate system is established at the center of its base, and a world coordinate system is established with the initial position as the origin; the initial position of the omnidirectional mobile robot is defined as the global coordinate system X_W O_W Y_W; when the omnidirectional mobile robot is in a non-slip state, the relationship between the velocity in the global coordinate system and the wheel speeds is:
[Formula (1): global-frame velocity as a function of the wheel speeds — equation image not reproduced]
where i = 1, 2, 3; L_i denotes the distance from the plane of the i-th wheel to the origin of the robot coordinate system; α_i denotes the angle between the axial direction of the i-th wheel and the positive X axis of the global coordinate system; r denotes the radius of the wheels of the omnidirectional mobile robot; β is the steering angle of the wheels; γ is the angle between the principal plane of the wheel and the wheel's roller axis; the velocity vector collects the world-frame velocities V_x, V_y, V_ω; ω_i denotes the angular velocity of the i-th wheel; θ denotes the heading of the omnidirectional mobile robot in the world coordinate system; the matrix R(θ) defines the transformation between the robot coordinate system and the global coordinate system:
    R(θ) = [  cos θ   sin θ   0
             −sin θ   cos θ   0     (2)
                0       0     1 ]
transforming according to formula (1) and formula (2), with P and M denoting the kinematic parameter matrices, the conversion relation between the wheel angular velocities and the velocity in the world coordinate system, namely the omnidirectional mobile robot kinematic model, is obtained as shown in formula (3); the matrices P and M are given in formulas (4), (5) and (6);
[Formula (3), image in the original: the kinematic model relating the wheel angular velocities to the world-coordinate-system velocity through R(θ) and the kinematic parameter matrices P and M]
[Formulas (4), (5) and (6), images in the original: the element-wise definitions of the kinematic parameter matrices P and M in terms of r, L_i, α_i, β and γ]
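Since formulas (3)–(6) are reproduced only as images in the original, the structure of the model can still be illustrated. The following is a minimal sketch, assuming a known 3×3 kinematic parameter matrix M as in the claim (the helper-function names are hypothetical), of mapping wheel angular velocities to the world-frame velocity through R(θ)⁻¹:

```python
import math

def rot_inv(theta):
    # Inverse of R(theta) in formula (2): rotates robot-frame velocities into the world frame.
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def mat_vec(A, v):
    # Plain 3x3 matrix-vector product.
    return [sum(A[r][k] * v[k] for k in range(3)) for r in range(3)]

def body_velocity(M, omega):
    # v_body = M * omega, with M the 3x3 kinematic parameter matrix
    # and omega the three wheel angular velocities.
    return mat_vec(M, omega)

def world_velocity(M, omega, theta):
    # [V_x, V_y, V_omega]^T = R(theta)^-1 * M * omega (the shape of formulas (3)/(4)).
    return mat_vec(rot_inv(theta), body_velocity(M, omega))
```

With M taken as the identity for illustration, a pure forward motion in the robot frame at heading θ = π/2 appears as motion along the world Y axis.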
secondly, based on the kinematic model of the omnidirectional mobile robot obtained in the first step, the discrete odometry model of the omnidirectional mobile robot is obtained as shown in formula (7); then formula (4) and formula (7) are combined to obtain the relation between velocity and displacement, establishing the relation between the error parameters and the observed quantities;
the calculation of the motion pose information of the omnidirectional mobile robot is discretized into an accumulation process; assume that the state of the omnidirectional mobile robot at time k is S_k = [x_k, y_k, θ_k]^T, comprising the position (x_k, y_k) relative to the world coordinate system and the orientation θ_k; V_{x,k} and V_{y,k} denote the velocities in the X and Y directions relative to the global coordinate system at time k; Δθ_k denotes the change of the rotation angle from the k-th to the (k+1)-th sampling point; T denotes the sampling period; after the translation and rotation, the robot reaches the state S_{k+1} = [x_{k+1}, y_{k+1}, θ_{k+1}]^T at time k+1; the discrete motion relationship is then expressed as:
    x_{k+1} = x_k + V_{x,k} · T
    y_{k+1} = y_k + V_{y,k} · T     (7)
    θ_{k+1} = θ_k + Δθ_k
according to the conversion relation between the wheel angular velocities and the velocity in the world coordinate system from the first step and the discrete motion relation of formula (7), the conversion relation shown in formula (8) is obtained;
    X_m = X_0 + T · Σ_{k=1}^{m} (m_11 ω_{1,k} + m_12 ω_{2,k} + m_13 ω_{3,k})
    Y_m = Y_0 + T · Σ_{k=1}^{m} (m_21 ω_{1,k} + m_22 ω_{2,k} + m_23 ω_{3,k})     (8)
    θ_m = θ_0 + T · Σ_{k=1}^{m} (m_31 ω_{1,k} + m_32 ω_{2,k} + m_33 ω_{3,k})
wherein m is gl Representing each element of the M matrix, where g ═ 13],l=[1 3],ω i,k Denotes the angular velocity of the three wheels at time k, where i ═ 1,2,3, [ X [ ] m Y m θ m ]Showing the pose of the omnidirectional mobile robot at time m, [ X ] 0 Y 0 θ 0 ]Representing the initial pose of the omnidirectional mobile robot;
thirdly, the pose information of the robot is acquired by making the omnidirectional mobile robot run repeatedly along the same track; parameter estimation is performed by the least-squares method of multiple linear regression, and the estimated value M̂ of the kinematic parameter matrix is calculated from the relation between translation, rotation and wheel angular velocities;
the calculation formula for estimating the X-direction parameters by the least-squares method is as follows:

    [X_{m,1} − X_0, …, X_{m,N} − X_0]^T = K_X · [m_11, m_12, m_13]^T     (9)
wherein X_{m,I} and Y_{m,I} denote the position of the omnidirectional mobile robot at time m in the I-th run, with I = 1, …, N; K_{X,I} denotes the X-direction position information calculated from the wheels in the I-th run, with I = 1, …, N; and K_X = [K_{X,1}; K_{X,2}; …; K_{X,N}] denotes the matrix formed by stacking the rows K_{X,I}; K_{X,I} is calculated as:

    K_{X,I} = T · [ Σ_{k=1}^{m} ω_{1,k},  Σ_{k=1}^{m} ω_{2,k},  Σ_{k=1}^{m} ω_{3,k} ]     (10)
the estimated values of the kinematic parameters in the X direction are then obtained as:

    [m̂_11, m̂_12, m̂_13]^T = (K_X^T K_X)^{−1} K_X^T · [X_{m,1} − X_0, …, X_{m,N} − X_0]^T     (11)
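The estimate of formula (11) is the standard normal-equation solution of a linear least-squares problem. A self-contained sketch in pure Python (helper names are hypothetical), solving m̂ = (KᵀK)⁻¹Kᵀ·ΔX for one row of the parameter matrix:

```python
def solve3(A, b):
    # Solve a 3x3 linear system by Gaussian elimination with partial pivoting.
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for col in range(3):
        p = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[p] = M[p], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

def least_squares_row(K, dX):
    # m_hat = (K^T K)^(-1) K^T dX, with K the N x 3 matrix of rows K_{X,I}
    # and dX the N observed displacements X_m - X_0 (formula (11)).
    KtK = [[sum(row[a] * row[b] for row in K) for b in range(3)] for a in range(3)]
    Ktd = [sum(row[a] * d for row, d in zip(K, dX)) for a in range(3)]
    return solve3(KtK, Ktd)
```

With noise-free synthetic data generated from a known parameter row, the row is recovered exactly (up to rounding), which is a convenient sanity check before using real runs.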
similarly, the calculation formula for estimating the Y-direction parameters by the least-squares method is:

    [Y_{m,1} − Y_0, …, Y_{m,N} − Y_0]^T = K_Y · [m_21, m_22, m_23]^T     (12)

wherein K_{Y,I} denotes the Y-direction position information calculated from the wheels in the I-th run, with I = 1, …, N, and K_Y = [K_{Y,1}; K_{Y,2}; …; K_{Y,N}] denotes the matrix formed by stacking the rows K_{Y,I}; K_{Y,I} is calculated as:

    K_{Y,I} = T · [ Σ_{k=1}^{m} ω_{1,k},  Σ_{k=1}^{m} ω_{2,k},  Σ_{k=1}^{m} ω_{3,k} ]     (13)
the estimated values of the kinematic parameters in the Y direction are then obtained as:

    [m̂_21, m̂_22, m̂_23]^T = (K_Y^T K_Y)^{−1} K_Y^T · [Y_{m,1} − Y_0, …, Y_{m,N} − Y_0]^T     (14)
similarly, the calculation formula for estimating the θ-direction parameters by the least-squares method is:

    [θ_{m,1} − θ_0, …, θ_{m,N} − θ_0]^T = K_θ · [m_31, m_32, m_33]^T     (15)

wherein K_{θ,I} denotes the θ-direction angle information calculated from the wheels in the I-th run, with I = 1, …, N, and K_θ = [K_{θ,1}; K_{θ,2}; …; K_{θ,N}] denotes the matrix formed by stacking the rows K_{θ,I}; K_{θ,I} is calculated as:

    K_{θ,I} = T · [ Σ_{k=1}^{m} ω_{1,k},  Σ_{k=1}^{m} ω_{2,k},  Σ_{k=1}^{m} ω_{3,k} ]     (16)
the estimated values of the kinematic parameters in the θ direction are then obtained as:

    [m̂_31, m̂_32, m̂_33]^T = (K_θ^T K_θ)^{−1} K_θ^T · [θ_{m,1} − θ_0, …, θ_{m,N} − θ_0]^T     (17)
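Because formula (8) uses the same regressor — the accumulated wheel rotations of a run — for the X, Y and θ directions, the rows K_{X,I}, K_{Y,I} and K_{θ,I} of formulas (10), (13) and (16) share one form. A minimal sketch of building such a row from a logged run (the function name is hypothetical):

```python
def k_row(wheel_speeds_seq, T):
    # One regression row per run: T * [sum_k w1_k, sum_k w2_k, sum_k w3_k],
    # i.e. the sampling period times the accumulated speed of each wheel.
    return [T * sum(w[i] for w in wheel_speeds_seq) for i in range(3)]
```

For a run of two samples with ω = [1, 2, 3] at T = 0.5 s, the row is [1.0, 2.0, 3.0]; stacking one such row per run yields K_X (equivalently K_Y or K_θ).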
fourthly, the three wheels of the omnidirectional mobile robot are uniformly distributed, the vision sensor is positioned on top of the omnidirectional mobile robot, and a label to be identified is arranged above the vision sensor; the omnidirectional mobile robot runs a specific track, and its start and end poses are obtained by identifying the label with the vision sensor;
firstly, the vision sensor is calibrated to obtain its intrinsic and extrinsic parameters; secondly, the coordinates of the omnidirectional mobile robot are converted, the conversion relation being shown in formula (18); finally, the odometry information of the omnidirectional mobile robot measured by the vision sensor is obtained;
[Formula (18), image in the original: the conversion of the position [X_C, Y_C] in the camera coordinate system to the position [X_G, Y_G] in the world coordinate system in terms of a, b, c, d and the rotation η]
wherein a and b are internal parameters of the omnidirectional mobile robot; c and d are the initial readings of the sensor; η denotes the rotation of the omnidirectional mobile robot in the world coordinate system; [X_C, Y_C] denotes the position of the omnidirectional mobile robot in the camera coordinate system, and [X_G, Y_G] denotes its position in the world coordinate system;
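Formula (18) is image-only in the original, so its exact expression is not recoverable here. One plausible shape consistent with the stated roles of a, b (scale parameters), c, d (initial readings) and η (rotation in the world frame) is a scaled offset followed by a planar rotation; everything below is an assumption for illustration only, not the claimed formula:

```python
import math

def camera_to_world(Xc, Yc, a, b, c, d, eta):
    # Hypothetical stand-in for formula (18): subtract the initial readings (c, d),
    # scale by the internal parameters (a, b), then rotate by eta into the world frame.
    u, v = a * (Xc - c), b * (Yc - d)
    ce, se = math.cos(eta), math.sin(eta)
    return (ce * u - se * v, se * u + ce * v)
```

With unit scales, zero offsets and η = 0 the mapping is the identity, which is the minimal consistency one would expect of any such conversion.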
substituting the initial and final poses of the omnidirectional mobile robot into formulas (11), (14) and (17) of the third step, the estimated values of the kinematic parameters are calculated, and M̂, the estimate of the inverse of the kinematic parameter matrix, is obtained:

    M̂ = [ m̂_11  m̂_12  m̂_13
          m̂_21  m̂_22  m̂_23      (19)
          m̂_31  m̂_32  m̂_33 ]
and fifthly, substituting the estimated value of the inverse of the kinematic parameter matrix obtained in the fourth step into a formula (5), updating a kinematic model, and realizing the error compensation of the omnidirectional mobile robot.
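Putting the five steps together, the calibration can be exercised on synthetic data: simulate runs with a known "true" M, form the regression rows and displacements of formulas (8)–(10), solve formulas (11)/(14)/(17) per direction, and assemble M̂ as in formula (19). The sketch below does exactly that under those assumptions (all names hypothetical); on noise-free data the true M is recovered:

```python
def solve3(A, b):
    # 3x3 Gaussian elimination with partial pivoting.
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for col in range(3):
        p = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[p] = M[p], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

def estimate_row(K, d):
    # Formulas (11)/(14)/(17): (K^T K)^{-1} K^T d for one row of M.
    KtK = [[sum(row[a] * row[b] for row in K) for b in range(3)] for a in range(3)]
    Ktd = [sum(row[a] * di for row, di in zip(K, d)) for a in range(3)]
    return solve3(KtK, Ktd)

# Synthetic check: recover a known "true" kinematic parameter matrix.
M_true = [[0.02, 0.01, -0.01], [0.00, 0.025, 0.005], [0.001, -0.002, 0.03]]
T = 0.05
runs = [[[1, 0, 0]] * 40, [[0, 1, 0]] * 40, [[0, 0, 1]] * 40, [[1, 1, 1]] * 40]
# Regression rows per run (formulas (10)/(13)/(16)).
K = [[T * sum(w[i] for w in run) for i in range(3)] for run in runs]
# Displacements per direction and run, generated by formula (8).
disp = [[sum(Krow[l] * M_true[g][l] for l in range(3)) for Krow in K] for g in range(3)]
# Row-by-row estimation and assembly of M_hat (formula (19)).
M_hat = [estimate_row(K, disp[g]) for g in range(3)]
```

The recovered M̂ would then be substituted back into formula (5), which is the update of the fifth step.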
CN201910636415.0A 2019-07-15 2019-07-15 Error compensation method for omnidirectional mobile robot Active CN110542429B (en)


Publications (2)

Publication Number Publication Date
CN110542429A CN110542429A (en) 2019-12-06
CN110542429B (en) 2022-09-20




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant