CN114252073B — Robot attitude data fusion method (Google Patents)
 Publication number: CN114252073B (application CN202111415634.XA)
 Authority: CN (China)
 Prior art keywords: angle, coordinate system, acceleration, quaternion, world coordinate
 Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Classifications

 G — PHYSICS
 G01 — MEASURING; TESTING
 G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
 G01C21/00 — Navigation; Navigational instruments not provided for in groups G01C1/00–G01C19/00
 G01C21/10 — … by using measurements of speed or acceleration
 G01C21/12 — … executed aboard the object being navigated; Dead reckoning
 G01C21/16 — … by integrating acceleration or speed, i.e. inertial navigation
 G01C21/165 — … combined with non-inertial navigation instruments
 G01C21/1652 — … with ranging devices, e.g. LIDAR or RADAR
 G01C21/20 — Instruments for performing navigational calculations
Abstract
The invention discloses a robot attitude data fusion method. The angular velocity output by a gyroscope is integrated to obtain an attitude angle quaternion; the three-axis acceleration is compared with the true gravity value [0, 0, 1]^T in the world coordinate system to obtain Δq_acc, which corrects the roll angle and the pitch angle; Δq_mag, obtained from the magnetometer output, corrects the yaw angle, and the attitude is then corrected. A map drawn with this data fusion method has distinct edge lines, smooth walls and clear right-angle corners.
Description
Technical Field
The invention belongs to the technical field of robots, and particularly relates to a robot attitude data fusion method.
Background
SLAM (simultaneous localization and mapping) is one of the key technologies for autonomous navigation of mobile robots. With the growth of artificial intelligence and hardware computing power, SLAM technology is becoming increasingly diverse. SLAM based on two-dimensional lidar has long been a research hotspot, yet it still faces relatively complex environments and difficult problems. Gmapping is a common open-source SLAM framework based on filtering. The algorithm can build an indoor map in real time with high precision for small scenes, but it depends heavily on odometry data. If the odometry estimates the position of the object only from motion-sensor data, positioning deviations can occur; in practice wheel slip and accumulated error also arise. Accurate odometry is important for the whole system, and relying solely on an encoder introduces a certain deviation.
An IMU (inertial measurement unit) is a high-sensitivity sensor. Owing to physical factors in the manufacturing process, there is a certain deviation between the actual and ideal coordinate axes, which gives rise to IMU noise, scale error and axis misalignment; the collected data therefore need to be calibrated to compensate for these errors. When fusing the calibrated IMU data, the high-frequency noise of the accelerometer and magnetometer and the low-frequency noise of the gyroscope can cause large errors in attitude calculation. It is therefore necessary to select an appropriate data fusion method.
Disclosure of Invention
The invention provides a robot attitude data fusion method aimed at the above problems in the prior art.
The invention solves the technical problems by the following technical means:
The accelerometer is sensitive to acceleration, so taking an instantaneous value gives a relatively large tilt-angle error; its tilt measurement responds slowly and its signal is unusable at high frequencies, so the high-frequency part can be suppressed with a low-pass filter. The angle obtained by integrating the gyroscope output is not affected by acceleration, but integral drift and temperature drift make its error grow over time; the gyroscope responds quickly, and the tilt angle can be obtained after integration.
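This complementary low-pass/high-pass behaviour can be illustrated with a minimal one-dimensional sketch (not part of the patent; the gain alpha and the simulated signals are assumptions chosen for illustration):

```python
# Minimal 1-D complementary filter: fuse a noisy accelerometer tilt
# estimate (trusted long-term, i.e. low-passed) with an integrated
# gyro rate (trusted short-term, i.e. high-passed).
# alpha is an assumed tuning gain, not a value from the patent.

def complementary_update(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One filter step: gyro integration for fast response, a small
    accelerometer blend to cancel the gyro's slow drift."""
    gyro_angle = angle + gyro_rate * dt      # fast, but drifts over time
    return alpha * gyro_angle + (1.0 - alpha) * accel_angle

# Simulated input: constant true rate of 10 deg/s; the accelerometer
# reports the true angle at each step.
angle = 0.0
for k in range(100):
    t = k * 0.01
    angle = complementary_update(angle, 10.0, 10.0 * t, dt=0.01)
# angle now tracks the ~10-degree true tilt with a small steady-state lag
```

The steady-state lag shrinks as alpha decreases, at the cost of admitting more accelerometer noise — the trade-off the paragraph above describes.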
According to the invention, the angular velocity output by the gyroscope is integrated to obtain the attitude angle; the roll angle and pitch angle are then corrected using the accelerometer output, and finally the yaw angle is corrected using the magnetometer.
The method comprises the following steps:
1) The attitude angle quaternion at the k-th moment in the discrete system is obtained with:

{}^G_L q_k = {}^G_L q_{k-1} + {}^G_L q̇_k · Δt

where G denotes the world coordinate system, L denotes the current coordinate system, and {}^G_L q̇_k denotes the derivative of the quaternion q at moment k.
2) The three-axis acceleration is converted into the world coordinate system using

{}^G a_p = R({}^G_L q_k)^{-1} · {}^L a

where R(q)^{-1} denotes the inverse transformation of the rotation given by the quaternion q.
3) The gravity estimate {}^G g_p = [g_x, g_y, g_z]^T obtained in the world coordinate system is compared with the actual gravity acceleration, and the error quaternion Δq_acc is calculated as

Δq_acc = [ √((g_z+1)/2),  −g_y/√(2(g_z+1)),  g_x/√(2(g_z+1)),  0 ]^T
4) Δq_mag is calculated as

Δq_mag = [Δq_0mag, 0, 0, Δq_3mag]^T

whose nonzero components are obtained from the magnetometer output projected into the world coordinate system.
5) Attitude correction is performed by composing the above quaternions:

q = {}^G_L q_k ⊗ Δq_acc ⊗ Δq_mag
The beneficial effects of the invention are as follows: a map drawn using this data fusion method has distinct edge lines, smooth walls and clear right-angle corners.
Drawings
FIG. 1 is a flow chart of the method of the present invention;
FIG. 2 is a map drawn without the method of the present invention;
FIG. 3 is a map drawn using the method of the present invention;
fig. 4 is an application scenario of the robot pose data fusion method of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the present invention will be clearly and completely described in conjunction with the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Examples
As shown in fig. 4, in an application scenario of the robot attitude data fusion method, the IMU attitude sensor is located on the central axis of a disinfection robot, coaxial with the laser radar. The robot is wheel-driven; the odometry is provided by a DS servo driver connected to hub motors, which are distributed on both sides of the central axis and are symmetric about it.
The rotation matrix R in SLAM can be represented by a quaternion q. A quaternion has one real part and three imaginary parts, written q = [q0, q1, q2, q3]^T or q = [w, x, y, z]^T, where q0 is the real part (a scalar) and [q1, q2, q3]^T is the imaginary part (a vector).
FIG. 1 is a flow chart of the method, and the specific steps of the method include:
Using the quaternion q to represent rotation, the attitude angle quaternion at the k-th moment in the discrete system can be calculated by formula (1):

{}^G_L q_k = {}^G_L q_{k-1} + {}^G_L q̇_k · Δt ......(1)

In formula (1), G denotes the world coordinate system and L denotes the current coordinate system; multiplying {}^G_L q by any vector expressed in the world coordinate system rotates that vector from the world coordinate system to the current coordinate system; {}^G_L q̇_k denotes the derivative of the quaternion q at moment k.
The quaternion derivative is given by formula (2):

{}^G_L q̇_k = ½ · {}^G_L q_{k-1} ⊗ ω̄_k ......(2)

where ω̄_k = [0, ω_x, ω_y, ω_z]^T is the angular velocity measured by the gyroscope. In formula (2), ⊗ denotes quaternion multiplication, calculated by formula (3):

p ⊗ q = [ p_0 q_0 − p_v · q_v,  p_0 q_v + q_0 p_v + p_v × q_v ]^T ......(3)

where p_0, q_0 are the scalar parts and p_v, q_v the vector parts. Applying the forward difference to formula (1), the quaternion product ⊗ expands into ordinary multiply-and-add operations.
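Formulas (1)–(3) can be sketched in Python (a sketch assuming the Hamilton product and [w, x, y, z] component ordering; the renormalization step is an added safeguard, not stated in the patent):

```python
import math

def quat_mult(p, q):
    """Hamilton product p ⊗ q (the circled operator), quaternions as [w, x, y, z]."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return [pw*qw - px*qx - py*qy - pz*qz,
            pw*qx + px*qw + py*qz - pz*qy,
            pw*qy - px*qz + py*qw + pz*qx,
            pw*qz + px*qy - py*qx + pz*qw]

def integrate_gyro(q, omega, dt):
    """Forward-difference update of formula (1): q_k = q_{k-1} + q_dot * dt."""
    # formula (2): q_dot = 1/2 * q ⊗ [0, wx, wy, wz]
    q_dot = [0.5 * c for c in quat_mult(q, [0.0, omega[0], omega[1], omega[2]])]
    q = [a + b * dt for a, b in zip(q, q_dot)]
    n = math.sqrt(sum(c * c for c in q))  # renormalize to keep |q| = 1
    return [c / n for c in q]

# Rotate about the z axis at pi/2 rad/s for one second in 1 ms steps:
q = [1.0, 0.0, 0.0, 0.0]
for _ in range(1000):
    q = integrate_gyro(q, (0.0, 0.0, math.pi / 2), 0.001)
# q now represents roughly a 90-degree rotation about z,
# i.e. close to [cos(pi/4), 0, 0, sin(pi/4)]
```

The renormalization after each step keeps the forward-Euler update on the unit-quaternion manifold, which is why the small-step integration stays accurate.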
using calculation of angular velocityConverting the threeaxis acceleration into the world coordinate system, and utilizing the acceleration true value [0, 1 ] in the converted world coordinate system] ^{T} And (g) comparing to obtain delta qacc, wherein the delta qacc only comprises correction of the roll angle and the pitch angle because the roll angle and the pitch angle can be calculated by using the threeaxis acceleration data. Using gyroscopes to get>Will be ^{L} a conversion to world coordinate system:
in the formula (8), the amino acid sequence of the compound,representation>The quaternion is inversely transformed.
The gravity estimate {}^G g_p = [g_x, g_y, g_z]^T obtained in the world coordinate system is compared with the actual gravity acceleration to calculate the error quaternion Δq_acc:

R(Δq_acc) · [0, 0, 1]^T = {}^G g_p ......(9)

R(q) denotes the rotation matrix corresponding to the quaternion q, with the specific formula:

R(q) = [ q0²+q1²−q2²−q3²   2(q1q2 − q0q3)    2(q1q3 + q0q2)
         2(q1q2 + q0q3)    q0²−q1²+q2²−q3²   2(q2q3 − q0q1)
         2(q1q3 − q0q2)    2(q2q3 + q0q1)    q0²−q1²−q2²+q3² ]

Rearranging, and setting the yaw component Δq_3acc = 0 (the accelerometer carries no yaw information), one finally solves:

Δq_acc = [ √((g_z+1)/2),  −g_y/√(2(g_z+1)),  g_x/√(2(g_z+1)),  0 ]^T

Superposing this correction on the gyroscope prediction finally gives:

q' = {}^G_L q_k ⊗ Δq_acc
because the accelerometer can only correct the roll angle and the yaw angle, and the magnetometer is used for correcting the yaw angle, the output of the magnetometer can be understood as the top of the magnetic line or projection of the threeaxis included angle of the machine body coordinate system.
Δq_mag = [Δq_0mag, 0, 0, Δq_3mag]^T ......(14)

Combining formulas (14) and (15) and solving yields:

Δq_0mag = √(Γ + l_x√Γ) / √(2Γ),  Δq_3mag = l_y / √(2(Γ + l_x√Γ)) ......(16)

In formula (16), Γ = l_x² + l_y², where [l_x, l_y, l_z]^T is the magnetometer reading rotated into the world coordinate system.
correcting the accelerated posture, and adding yaw angle correction:
simulation experiments are carried out in a Turtlebot simulation experiment environment, wherein FIG. 2 corresponds to a drawing map obtained by data fusion without adopting the invention, and FIG. 3 corresponds to a drawing map obtained by adopting the data fusion. As can be clearly seen from the simulation diagram, the map before data fusion has the defects that the wall identification deviation is unclear at some edge lines, and particularly, the position of a rightangle corner is blurred. The map edge line after data fusion is obvious, the wall is smooth, and the rightangle corner is clear.
It is noted that relational terms such as first and second, and the like, if any, are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a nonexclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above embodiments are only for illustrating the technical solution of the present invention, and are not limiting; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.
Claims (2)
1. A robot attitude data fusion method, characterized by comprising the following steps:
step 1), integrating the angular velocity output by a gyroscope to obtain an attitude angle quaternion;
step 2), comparing the three-axis acceleration with the true gravity value [0, 0, 1]^T in the world coordinate system to obtain Δq_acc, correcting the roll angle and the pitch angle;
step 3), obtaining Δq_mag from the magnetometer output to correct the yaw angle;
step 4), performing attitude correction with the following formula:

q = {}^G_L q_k ⊗ Δq_acc ⊗ Δq_mag
In step 2), the three-axis acceleration is converted into the world coordinate system using

{}^G a_p = R({}^G_L q_k)^{-1} · {}^L a

where R(q)^{-1} denotes the inverse transformation of the rotation given by the quaternion q; the gravity estimate {}^G g_p = [g_x, g_y, g_z]^T obtained in the world coordinate system is compared with the actual gravity acceleration, and the error quaternion Δq_acc is calculated as

Δq_acc = [ √((g_z+1)/2),  −g_y/√(2(g_z+1)),  g_x/√(2(g_z+1)),  0 ]^T
In step 3), Δq_mag is calculated as

Δq_mag = [Δq_0mag, 0, 0, Δq_3mag]^T

with Δq_0mag = √(Γ + l_x√Γ)/√(2Γ) and Δq_3mag = l_y/√(2(Γ + l_x√Γ)), where Γ = l_x² + l_y² and [l_x, l_y, l_z]^T is the magnetometer reading rotated into the world coordinate system.
2. The robot attitude data fusion method according to claim 1, wherein the attitude angle quaternion at the k-th moment in the discrete system in step 1) is calculated by:

{}^G_L q_k = {}^G_L q_{k-1} + {}^G_L q̇_k · Δt

where G denotes the world coordinate system, L denotes the current coordinate system, and {}^G_L q̇_k denotes the derivative of the quaternion q at moment k.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title

CN202111415634.XA (CN114252073B) | 2021-11-25 | 2021-11-25 | Robot attitude data fusion method
Publications (2)
Publication Number | Publication Date

CN114252073A (en) | 2022-03-29
CN114252073B (en) | 2023-09-15
Family
ID=80793301
Family Applications (1)
Application Number | Title | Priority Date | Filing Date

CN202111415634.XA (granted as CN114252073B, Active) | Robot attitude data fusion method | 2021-11-25 | 2021-11-25
Country Status (1)
Country  Link 

CN (1)  CN114252073B (en) 
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title

CN117288187B (en) * | 2023-11-23 | 2024-02-23 | 北京小米机器人技术有限公司 (Beijing Xiaomi Robot Technology Co., Ltd.) | Robot pose determining method and device, electronic equipment and storage medium
Citations (14)
Publication number  Priority date  Publication date  Assignee  Title 

US7216055B1 (en) * | 1998-06-05 | 2007-05-08 | Crossbow Technology, Inc. | Dynamic attitude measurement method and apparatus
CN102438521A (en) * | 2009-04-24 | 2012-05-02 | Commissariat à l'énergie atomique et aux énergies alternatives (CEA) | System and method for determining the posture of a person
CN108225370A (en) * | 2017-12-15 | 2018-06-29 | 路军 | Data fusion and calculation method for a motion attitude sensor
WO2018214226A1 (en) * | 2017-05-22 | 2018-11-29 | 深圳市靖洲科技有限公司 | Unmanned vehicle real-time attitude measurement method
CN109001787A (en) * | 2018-05-25 | 2018-12-14 | 北京大学深圳研究生院 | Attitude solving and positioning method and its fusion sensor
CN109108936A (en) * | 2018-10-24 | 2019-01-01 | 电子科技大学 | Self-balancing robot control system and control method with multi-source sensor data fusion
CN110231029A (en) * | 2019-05-08 | 2019-09-13 | 西安交通大学 | Underwater robot multi-sensor fusion data processing method
CN110440746A (en) * | 2019-08-05 | 2019-11-12 | 桂林电子科技大学 | No-dig underground drill bit attitude fusion method based on quaternion gradient descent
CN110595434A (en) * | 2019-09-10 | 2019-12-20 | 兰州交通大学 | Quaternion fusion attitude estimation method based on MEMS sensor
CN111156998A (en) * | 2019-12-26 | 2020-05-15 | 华南理工大学 | Mobile robot positioning method based on RGB-D camera and IMU information fusion
CN112082547A (en) * | 2020-09-08 | 2020-12-15 | 北京邮电大学 | Integrated navigation system optimization method and device, electronic equipment and storage medium
CN112256047A (en) * | 2020-09-30 | 2021-01-22 | 江苏集萃智能制造技术研究所有限公司 | Quaternion-based four-legged attitude control strategy
CN112665574A (en) * | 2020-11-26 | 2021-04-16 | 江苏科技大学 | Underwater robot attitude acquisition method based on momentum gradient descent
CN113091738A (en) * | 2021-04-09 | 2021-07-09 | 安徽工程大学 | Mobile robot map construction method based on visual-inertial fusion and related equipment

NonPatent Citations (5)
Title 

Saurav Agarwal et al. "Robust Pose-Graph SLAM Using Absolute Orientation Sensing." IEEE Robotics and Automation Letters, 2019, 981-988. *
吴骁航 (Wu Xiaohang). "Filtering and information fusion algorithms for nonlinear systems under non-ideal conditions." China Doctoral Dissertations Full-text Database, Information Science and Technology, 2020 (1), I135-81. *
孙玉杰 (Sun Yujie). "Research on attitude detection and attitude error compensation technology for collaborative robots." China Master's Theses Full-text Database, Information Science and Technology, 2019 (9), I140-311. *
章政 (Zhang Zheng) et al. "Dynamic step-size gradient descent attitude estimation algorithm with motion-acceleration suppression." Information and Control, 2017 (2), 136-143. *
赵梓乔 (Zhao Ziqiao). "Indoor mobile robot localization and mapping based on a 3D laser scanner." China Master's Theses Full-text Database, Information Science and Technology, 2018 (1), I140-552. *
Legal Events
Date  Code  Title  Description 

PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant