CN114252073B - Robot attitude data fusion method - Google Patents

Robot attitude data fusion method

Info

Publication number
CN114252073B
CN114252073B (granted from application CN202111415634.XA)
Authority
CN
China
Prior art keywords
angle
coordinate system
acceleration
quaternion
world coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111415634.XA
Other languages
Chinese (zh)
Other versions
CN114252073A (en)
Inventor
张琳
骆敏舟
杭小树
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Institute of Intelligent Manufacturing Technology JITRI
Original Assignee
Institute of Intelligent Manufacturing Technology JITRI
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institute of Intelligent Manufacturing Technology JITRI
Priority claimed from application CN202111415634.XA
Publication of CN114252073A
Application granted
Publication of CN114252073B
Legal status: Active

Links

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 — Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/10 — Navigation by using measurements of speed or acceleration
    • G01C21/12 — Navigation executed aboard the object being navigated; Dead reckoning
    • G01C21/16 — Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 — Inertial navigation combined with non-inertial navigation instruments
    • G01C21/1652 — Inertial navigation combined with ranging devices, e.g. LIDAR or RADAR
    • G01C21/20 — Instruments for performing navigational calculations

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a robot attitude data fusion method. The angular velocity output by a gyroscope is integrated to obtain the attitude angle quaternion; the three-axis acceleration, converted into the world coordinate system, is compared with the gravity reference $[0,\ 0,\ 1]^T$ to obtain $\Delta q_{acc}$, which corrects the roll angle and pitch angle; $\Delta q_{mag}$, obtained from the magnetometer output, corrects the yaw angle, completing the attitude correction. A map drawn using this data fusion method has distinct edge lines, smooth walls, and clear right-angle corners.

Description

Robot attitude data fusion method
Technical Field
The invention belongs to the technical field of robots, and particularly relates to a robot attitude data fusion method.
Background
SLAM (simultaneous localization and mapping) for mobile robots is one of the key technologies of autonomous navigation. With increasing artificial intelligence and hardware computing power, SLAM techniques are becoming increasingly diverse. SLAM based on two-dimensional lidar has long been a research hotspot, yet it still faces relatively complex environments and difficult problems. Gmapping is a common filter-based open-source SLAM framework. The algorithm can draw an indoor map in real time and achieves high accuracy when creating small-scene maps, but it depends heavily on odometry data. If the odometry estimates the position of the object only from motion-sensor data, positioning deviations can occur; in practice, wheel slip and accumulated errors also arise. Accurate odometry data is important for the whole system, and certain deviations occur if the odometry is obtained solely from an encoder.
The IMU (inertial measurement unit) is a high-sensitivity sensor. Owing to physical factors in the manufacturing process, there is a certain deviation between its actual coordinate axes and the ideal coordinate axes, which introduces IMU noise, scale error, and axis misalignment; the collected data must therefore be calibrated to compensate for these errors. When fusing the calibrated IMU data, the high-frequency noise of the accelerometer and magnetometer and the low-frequency noise of the gyroscope can cause large errors in attitude calculation. It is therefore necessary to select an appropriate data fusion method.
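The calibration mentioned above can be sketched with the standard linear IMU error model, corrected = T · K · (raw − b); the misalignment matrix T, scale factors K, and bias b below are hypothetical values standing in for the output of a real calibration routine:

```python
import numpy as np

def calibrate_imu_sample(raw, T, K, b):
    """Standard linear IMU error model: corrected = T @ K @ (raw - b),
    where T compensates axis misalignment, K the per-axis scale factors,
    and b the constant bias."""
    return T @ K @ (raw - b)

# Hypothetical calibration parameters (in practice, estimated offline).
T = np.array([[1.0, 0.01, -0.02],
              [0.0, 1.00,  0.01],
              [0.0, 0.00,  1.00]])          # axis-misalignment compensation
K = np.diag([1.02, 0.98, 1.01])             # scale-factor correction
b = np.array([0.05, -0.03, 0.10])           # constant bias (m/s^2)

# A stationary reading with +z up: after bias removal only gravity remains,
# scaled by the z-axis factor and leaked slightly into x/y by T.
raw = np.array([0.05, -0.03, 9.91])
a = calibrate_imu_sample(raw, T, K, b)
```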
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides a robot attitude data fusion method.
The invention solves the technical problems by the following technical means:
The accelerometer is sensitive to linear acceleration, so an instantaneous tilt reading carries a relatively large error; its dynamic response is slow and its signal is unreliable at high frequencies, so the high-frequency component can be suppressed by low-pass filtering. The angle obtained by integrating the gyroscope is not influenced by linear acceleration, and the gyroscope responds quickly, so the tilt angle can be measured after integration; however, integration drift and temperature drift make the error grow over time.
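This complementary behavior (trust the gyroscope at high frequency, the accelerometer at low frequency) can be illustrated with a minimal single-axis sketch; the blend factor and sensor values are illustrative assumptions, not parameters of the invention:

```python
def complementary_filter_1d(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Single-axis complementary filter: high-pass the integrated gyro rate,
    low-pass the accelerometer tilt angle, and blend with factor alpha."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Stationary sensor: the gyro drifts at 0.5 deg/s while the accelerometer
# keeps reporting the true 0-degree tilt; the estimate stays bounded
# instead of drifting without limit.
angle = 10.0  # deliberately wrong initial estimate, in degrees
for _ in range(1000):
    angle = complementary_filter_1d(angle, gyro_rate=0.5,
                                    accel_angle=0.0, dt=0.01)
```

The steady state is alpha · gyro_rate · dt / (1 − alpha) ≈ 0.245°: the accelerometer bounds the gyro drift rather than eliminating it.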
According to the invention, the angular velocity output by the gyroscope is integrated to obtain the attitude angle; the roll angle and pitch angle are then corrected using the accelerometer output, and finally the yaw angle is corrected by the magnetometer.
The method comprises the following steps:
1) The attitude angle quaternion at the k-th moment under a discrete system is obtained by the following formula:
$$ {}^{L}_{G}q_{\omega,k} = {}^{L}_{G}q_{k-1} + {}^{L}_{G}\dot q_{\omega,k}\,\Delta t $$
where $G$ denotes the world coordinate system, $L$ the current coordinate system, and ${}^{L}_{G}\dot q_{\omega,k}$ the derivative of the quaternion $q$ at moment $k$.
2) The three-axis acceleration is converted to the world coordinate system using:
$$ \left[\,0,\ {}^{G}g_{p}\,\right]^{T} = {}^{L}_{G}q_{\omega}^{-1} \otimes \left[\,0,\ {}^{L}a\,\right]^{T} \otimes {}^{L}_{G}q_{\omega} $$
where ${}^{L}_{G}q_{\omega}^{-1}$ denotes the inverse of the quaternion ${}^{L}_{G}q_{\omega}$.
3) The acceleration estimate ${}^{G}g_{p} = [\,g_x,\ g_y,\ g_z\,]^{T}$ obtained in the world coordinate system is compared with the actual gravity acceleration, and the error quaternion $\Delta q_{acc}$ is calculated:
$$ \Delta q_{acc} = \left[\ \sqrt{\tfrac{g_z+1}{2}},\ \ -\tfrac{g_y}{\sqrt{2(g_z+1)}},\ \ \tfrac{g_x}{\sqrt{2(g_z+1)}},\ \ 0\ \right]^{T} $$
4) $\Delta q_{mag}$ is calculated from the magnetometer output:
$$ \Delta q_{mag} = \left[\,\Delta q_{0mag},\ 0,\ 0,\ \Delta q_{3mag}\,\right]^{T}, \qquad \Delta q_{0mag} = \sqrt{\frac{\Gamma + l_x\sqrt{\Gamma}}{2\Gamma}}, \quad \Delta q_{3mag} = \frac{l_y}{\sqrt{2\left(\Gamma + l_x\sqrt{\Gamma}\right)}}, \quad \Gamma = l_x^{2} + l_y^{2} $$
5) Attitude correction is performed as:
$$ {}^{L}_{G}q = {}^{L}_{G}q_{\omega} \otimes \Delta q_{acc} \otimes \Delta q_{mag} $$
The beneficial effects of the invention are as follows: a map drawn using this data fusion method has distinct edge lines, smooth walls, and clear right-angle corners.
Drawings
FIG. 1 is a flow chart of the method of the present invention;
FIG. 2 is a map drawn without the method of the present invention;
FIG. 3 is a map drawn using the method of the present invention;
fig. 4 is an application scenario of the robot pose data fusion method of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the present invention will be clearly and completely described in conjunction with the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Examples
As shown in fig. 4, in an application scenario of the robot attitude data fusion method, the IMU attitude sensor is located on the central axis of a disinfection robot, coaxial with the laser radar. The robot is wheel-driven; the odometer resides in a DS servo driver connected to the hub motors, which are distributed on the two sides of the central axis and are symmetric about it.
The rotation matrix R in SLAM can be represented by a quaternion q. A quaternion has one real part and three imaginary parts, written as $q = [\,q_0,\ q_1,\ q_2,\ q_3\,]^{T}$ or $q = [\,w,\ x,\ y,\ z\,]^{T}$, where $q_0$ is the real part (a scalar) and $[\,q_1,\ q_2,\ q_3\,]^{T}$ is the imaginary part (a vector).
FIG. 1 is a flow chart of the method; the specific steps are as follows:
Using the quaternion q to represent rotation, the attitude angle quaternion at the k-th moment under the discrete system can be calculated by formula (1):
$$ {}^{L}_{G}q_{\omega,k} = {}^{L}_{G}q_{k-1} + {}^{L}_{G}\dot q_{\omega,k}\,\Delta t \ \ldots\ldots\ (1) $$
In formula (1), $G$ denotes the world coordinate system and $L$ the current coordinate system; multiplying ${}^{L}_{G}q$ with any vector in the world coordinate system rotates that vector from the world coordinate system to the current coordinate system. ${}^{L}_{G}\dot q_{\omega,k}$ denotes the derivative of the quaternion $q$ at moment $k$, calculated by formula (2):
$$ {}^{L}_{G}\dot q_{\omega,k} = -\tfrac{1}{2}\left[\,0,\ \omega_k^{T}\,\right]^{T} \otimes {}^{L}_{G}q_{k-1} \ \ldots\ldots\ (2) $$
In formula (2), the quaternion product $\otimes$ is calculated by formula (3):
$$ p \otimes q = \left[\begin{array}{c} p_0 q_0 - p_1 q_1 - p_2 q_2 - p_3 q_3 \\ p_0 q_1 + p_1 q_0 + p_2 q_3 - p_3 q_2 \\ p_0 q_2 - p_1 q_3 + p_2 q_0 + p_3 q_1 \\ p_0 q_3 + p_1 q_2 - p_2 q_1 + p_3 q_0 \end{array}\right] \ \ldots\ldots\ (3) $$
Substituting (2) into the forward difference (1), the product $\otimes$ expands into ordinary multiply-and-add operations on the quaternion components.
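Formulas (1)-(3) can be sketched as a single prediction step; the function names and the renormalization are implementation choices, not part of the patent text:

```python
import numpy as np

def quat_mult(p, q):
    """Hamilton product p ⊗ q (the expansion in formula (3))."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def gyro_predict(q_prev, omega, dt):
    """One forward-difference step of formulas (1)-(2):
    q_dot = -1/2 [0, omega] ⊗ q_prev, then q_k = q_prev + q_dot * dt.
    Renormalizing keeps the quaternion on the unit sphere despite the
    integration error of the forward difference."""
    q_dot = -0.5 * quat_mult(np.array([0.0, omega[0], omega[1], omega[2]]),
                             q_prev)
    q = q_prev + q_dot * dt
    return q / np.linalg.norm(q)

# Rotate at pi/2 rad/s about z for 1 s (1000 steps of 1 ms).
q = np.array([1.0, 0.0, 0.0, 0.0])
for _ in range(1000):
    q = gyro_predict(q, np.array([0.0, 0.0, np.pi / 2]), 0.001)
```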
The angular-rate estimate ${}^{L}_{G}q_{\omega}$ is then used to convert the three-axis acceleration into the world coordinate system, where it is compared with the gravity reference $[0,\ 0,\ 1]^{T}$ to obtain $\Delta q_{acc}$. Because the roll angle and pitch angle can be calculated from the three-axis acceleration data, $\Delta q_{acc}$ contains corrections of the roll angle and pitch angle only. Using the gyroscope estimate ${}^{L}_{G}q_{\omega}$, the body-frame acceleration ${}^{L}a$ is converted to the world coordinate system:
$$ \left[\,0,\ {}^{G}g_{p}\,\right]^{T} = {}^{L}_{G}q_{\omega}^{-1} \otimes \left[\,0,\ {}^{L}a\,\right]^{T} \otimes {}^{L}_{G}q_{\omega} \ \ldots\ldots\ (8) $$
In formula (8), ${}^{L}_{G}q_{\omega}^{-1}$ denotes the inverse of the quaternion ${}^{L}_{G}q_{\omega}$.
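The inverse quaternion transform of formula (8) can be sketched as a conjugate sandwich product; the helper names are assumptions:

```python
import numpy as np

def quat_mult(p, q):
    """Hamilton product p ⊗ q."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def quat_conj(q):
    """Conjugate; equals the inverse for a unit quaternion."""
    return np.array([q[0], -q[1], -q[2], -q[3]])

def body_accel_to_world(q_lg, a_body):
    """Formula (8): [0, g_p] = q^-1 ⊗ [0, a] ⊗ q, with q_lg the
    world-to-body quaternion and a_body the normalized accelerometer
    reading in the body frame."""
    v = np.concatenate(([0.0], a_body))
    return quat_mult(quat_mult(quat_conj(q_lg), v), q_lg)[1:]
```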
The acceleration estimate ${}^{G}g_{p} = [\,g_x,\ g_y,\ g_z\,]^{T}$ obtained in the world coordinate system is compared with the actual gravity acceleration to calculate the error quaternion $\Delta q_{acc}$, which must satisfy:
$$ R\!\left(\Delta q_{acc}\right)\,{}^{G}g_{p} = \left[\,0,\ 0,\ 1\,\right]^{T} $$
$R(q)$ denotes the rotation matrix corresponding to the quaternion $q$; the specific formula is:
$$ R(q) = \begin{bmatrix} q_0^2+q_1^2-q_2^2-q_3^2 & 2(q_1 q_2+q_0 q_3) & 2(q_1 q_3-q_0 q_2) \\ 2(q_1 q_2-q_0 q_3) & q_0^2-q_1^2+q_2^2-q_3^2 & 2(q_2 q_3+q_0 q_1) \\ 2(q_1 q_3+q_0 q_2) & 2(q_2 q_3-q_0 q_1) & q_0^2-q_1^2-q_2^2+q_3^2 \end{bmatrix} $$
Expanding and simplifying, then setting $\Delta q_{3acc} = 0$ (the accelerometer provides no yaw information), the solution is finally:
$$ \Delta q_{acc} = \left[\ \sqrt{\tfrac{g_z+1}{2}},\ \ -\tfrac{g_y}{\sqrt{2(g_z+1)}},\ \ \tfrac{g_x}{\sqrt{2(g_z+1)}},\ \ 0\ \right]^{T} $$
Superposing this correction on the gyroscope estimate finally gives:
$$ {}^{L}_{G}q' = {}^{L}_{G}q_{\omega} \otimes \Delta q_{acc} $$
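The closed-form roll/pitch error quaternion described here can be sketched directly; this is the standard zero-yaw alignment of a predicted gravity vector with $[0, 0, 1]^T$, as reconstructed, not necessarily the patent's exact notation:

```python
import numpy as np

def delta_q_acc(g_pred):
    """Roll/pitch-only error quaternion aligning the predicted gravity
    g_pred = [gx, gy, gz] (unit norm, gz > -1) with [0, 0, 1]^T; the yaw
    component is constrained to zero."""
    gx, gy, gz = g_pred
    s = np.sqrt(2.0 * (gz + 1.0))
    return np.array([np.sqrt((gz + 1.0) / 2.0), -gy / s, gx / s, 0.0])
```

For a pure roll tilt g = [0, sin φ, cos φ] this reduces to the half-angle form [cos(φ/2), −sin(φ/2), 0, 0], and the result is always unit-norm.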
Because the accelerometer can only correct the roll angle and pitch angle, the magnetometer is used to correct the yaw angle; the magnetometer output can be understood as the projection of the magnetic field vector onto the three axes of the body coordinate system.
$$ \Delta q_{mag} = \left[\,\Delta q_{0mag},\ 0,\ 0,\ \Delta q_{3mag}\,\right]^{T} \ \ldots\ldots\ (14) $$
Combining (14) and (15) and solving yields:
$$ \Delta q_{0mag} = \sqrt{\frac{\Gamma + l_x\sqrt{\Gamma}}{2\Gamma}}, \qquad \Delta q_{3mag} = \frac{l_y}{\sqrt{2\left(\Gamma + l_x\sqrt{\Gamma}\right)}} \ \ldots\ldots\ (16) $$
In formula (16), $\Gamma = l_x^{2} + l_y^{2}$, where $[\,l_x,\ l_y,\ l_z\,]^{T}$ is the magnetometer measurement rotated into the world coordinate system. Finally, the acceleration-corrected attitude is further corrected by the yaw angle:
$$ {}^{L}_{G}q = {}^{L}_{G}q' \otimes \Delta q_{mag} $$
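Formulas (14) and (16) and the final yaw correction can be combined into one fusion step; the function names are assumptions, and `delta_q_mag` follows the reconstructed closed form with Γ = lx² + ly²:

```python
import numpy as np

def quat_mult(p, q):
    """Hamilton product p ⊗ q."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def delta_q_mag(l_world):
    """Yaw-only correction quaternion from the magnetometer reading rotated
    into the world frame, l_world = [lx, ly, lz]; Gamma = lx^2 + ly^2."""
    lx, ly = l_world[0], l_world[1]
    gamma = lx * lx + ly * ly
    root = np.sqrt(gamma + lx * np.sqrt(gamma))
    return np.array([root / np.sqrt(2.0 * gamma), 0.0, 0.0,
                     ly / (np.sqrt(2.0) * root)])

def fuse(q_omega, dq_acc, l_world):
    """Final attitude: q = q_omega ⊗ Δq_acc ⊗ Δq_mag."""
    return quat_mult(quat_mult(q_omega, dq_acc), delta_q_mag(l_world))
```

A magnetic field already pointing along +x yields the identity correction; a field along +y yields a 90-degree yaw correction.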
Simulation experiments were carried out in a Turtlebot simulation environment. FIG. 2 shows the map drawn without the data fusion of the invention, and FIG. 3 the map drawn with it. The simulation comparison clearly shows that in the map before data fusion, wall recognition deviates and some edge lines are unclear, with the right-angle corners especially blurred; in the map after data fusion, the edge lines are distinct, the walls are smooth, and the right-angle corners are clear.
It is noted that relational terms such as first and second, and the like, if any, are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above embodiments are only for illustrating the technical solution of the present invention, and are not limiting; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (2)

1. A robot attitude data fusion method, characterized by comprising the following steps:
step 1), integrating the angular velocity output by a gyroscope to obtain an attitude angle quaternion;
step 2), comparing the three-axis acceleration, converted into the world coordinate system, with the gravity reference $[0,\ 0,\ 1]^{T}$ to obtain $\Delta q_{acc}$, which corrects the roll angle and pitch angle;
step 3), obtaining $\Delta q_{mag}$ from the magnetometer output to correct the yaw angle;
step 4), performing attitude correction by the following formula:
$$ {}^{L}_{G}q = {}^{L}_{G}q_{\omega} \otimes \Delta q_{acc} \otimes \Delta q_{mag} $$
wherein in step 2), the three-axis acceleration is converted into the world coordinate system by:
$$ \left[\,0,\ {}^{G}g_{p}\,\right]^{T} = {}^{L}_{G}q_{\omega}^{-1} \otimes \left[\,0,\ {}^{L}a\,\right]^{T} \otimes {}^{L}_{G}q_{\omega} $$
where ${}^{L}_{G}q_{\omega}^{-1}$ denotes the inverse of the quaternion ${}^{L}_{G}q_{\omega}$; the acceleration estimate ${}^{G}g_{p} = [\,g_x,\ g_y,\ g_z\,]^{T}$ in the world coordinate system is compared with the actual gravity acceleration, and the error quaternion is calculated by:
$$ \Delta q_{acc} = \left[\ \sqrt{\tfrac{g_z+1}{2}},\ \ -\tfrac{g_y}{\sqrt{2(g_z+1)}},\ \ \tfrac{g_x}{\sqrt{2(g_z+1)}},\ \ 0\ \right]^{T} $$
and in step 3), $\Delta q_{mag}$ is calculated by:
$$ \Delta q_{mag} = \left[\,\Delta q_{0mag},\ 0,\ 0,\ \Delta q_{3mag}\,\right]^{T}, \qquad \Delta q_{0mag} = \sqrt{\frac{\Gamma + l_x\sqrt{\Gamma}}{2\Gamma}}, \quad \Delta q_{3mag} = \frac{l_y}{\sqrt{2\left(\Gamma + l_x\sqrt{\Gamma}\right)}}, \quad \Gamma = l_x^{2} + l_y^{2} $$
2. The robot attitude data fusion method according to claim 1, wherein the attitude angle quaternion at the k-th moment under the discrete system in step 1) is calculated by the following formula:
$$ {}^{L}_{G}q_{\omega,k} = {}^{L}_{G}q_{k-1} + {}^{L}_{G}\dot q_{\omega,k}\,\Delta t $$
where $G$ denotes the world coordinate system, $L$ the current coordinate system, and ${}^{L}_{G}\dot q_{\omega,k}$ the derivative of the quaternion $q$ at moment $k$.
CN202111415634.XA 2021-11-25 2021-11-25 Robot attitude data fusion method Active CN114252073B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111415634.XA CN114252073B (en) 2021-11-25 2021-11-25 Robot attitude data fusion method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111415634.XA CN114252073B (en) 2021-11-25 2021-11-25 Robot attitude data fusion method

Publications (2)

Publication Number Publication Date
CN114252073A CN114252073A (en) 2022-03-29
CN114252073B true CN114252073B (en) 2023-09-15

Family

ID=80793301

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111415634.XA Active CN114252073B (en) 2021-11-25 2021-11-25 Robot attitude data fusion method

Country Status (1)

Country Link
CN (1) CN114252073B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117288187B (en) * 2023-11-23 2024-02-23 北京小米机器人技术有限公司 Robot pose determining method and device, electronic equipment and storage medium

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7216055B1 (en) * 1998-06-05 2007-05-08 Crossbow Technology, Inc. Dynamic attitude measurement method and apparatus
CN102438521A (en) * 2009-04-24 2012-05-02 原子能和辅助替代能源委员会 System and method for determining the posture of a person
CN108225370A (en) * 2017-12-15 2018-06-29 路军 A kind of data fusion and calculation method of athletic posture sensor
WO2018214226A1 (en) * 2017-05-22 2018-11-29 深圳市靖洲科技有限公司 Unmanned vehicle real-time posture measurement method
CN109001787A (en) * 2018-05-25 2018-12-14 北京大学深圳研究生院 A kind of method and its merge sensor of solving of attitude and positioning
CN109108936A (en) * 2018-10-24 2019-01-01 电子科技大学 A kind of the self-balance robot control system and control method of Multiple Source Sensor data fusion
CN110231029A (en) * 2019-05-08 2019-09-13 西安交通大学 A kind of underwater robot Multi-sensor Fusion data processing method
CN110440746A (en) * 2019-08-05 2019-11-12 桂林电子科技大学 A kind of no-dig technique subterranean drill bit posture fusion method based on the decline of quaternary number gradient
CN110595434A (en) * 2019-09-10 2019-12-20 兰州交通大学 Quaternion fusion attitude estimation method based on MEMS sensor
CN111156998A (en) * 2019-12-26 2020-05-15 华南理工大学 Mobile robot positioning method based on RGB-D camera and IMU information fusion
CN112082547A (en) * 2020-09-08 2020-12-15 北京邮电大学 Integrated navigation system optimization method and device, electronic equipment and storage medium
CN112256047A (en) * 2020-09-30 2021-01-22 江苏集萃智能制造技术研究所有限公司 Quaternion-based four-foot attitude control strategy
CN112665574A (en) * 2020-11-26 2021-04-16 江苏科技大学 Underwater robot attitude acquisition method based on momentum gradient descent method
CN113091738A (en) * 2021-04-09 2021-07-09 安徽工程大学 Mobile robot map construction method based on visual inertial navigation fusion and related equipment

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Saurav Agarwal et al. Robust Pose-Graph SLAM Using Absolute Orientation Sensing. IEEE Robotics and Automation Letters. 2019, 981-988. *
吴骁航. Filtering and Information Fusion Algorithms for Nonlinear Systems under Non-ideal Conditions. China Doctoral Dissertations Full-text Database, Information Science and Technology. 2020, (1), I135-81. *
孙玉杰. Research on Attitude Detection and Attitude Error Compensation Technology for Collaborative Robots. China Master's Theses Full-text Database, Information Science and Technology. 2019, (9), I140-311. *
章政 et al. A Dynamic Step-Size Gradient-Descent Attitude Algorithm with Motion Acceleration Suppression. Information and Control. 2017, (2), 136-143. *
赵梓乔. Indoor Mobile Robot Localization and Mapping Based on a 3D Laser Scanner. China Master's Theses Full-text Database, Information Science and Technology. 2018, (1), I140-552. *

Also Published As

Publication number Publication date
CN114252073A (en) 2022-03-29

Similar Documents

Publication Publication Date Title
CN111207774B (en) Method and system for laser-IMU external reference calibration
WO2020253854A1 (en) Mobile robot posture angle calculation method
CN111024066B (en) Unmanned aerial vehicle vision-inertia fusion indoor positioning method
CN108731670B (en) Inertial/visual odometer integrated navigation positioning method based on measurement model optimization
CN108253963B (en) Robot active disturbance rejection positioning method and positioning system based on multi-sensor fusion
CN107607113B (en) Method for measuring inclination angles of two-axis attitude
JP2012173190A (en) Positioning system and positioning method
CN105910606A (en) Direction adjustment method based on angular velocity difference
CN113155129B (en) Holder attitude estimation method based on extended Kalman filtering
CN115540860A (en) Multi-sensor fusion pose estimation algorithm
CN114252073B (en) Robot attitude data fusion method
CN108871323B (en) High-precision navigation method of low-cost inertial sensor in locomotive environment
CN116817896A (en) Gesture resolving method based on extended Kalman filtering
CN110986928A (en) Real-time drift correction method for triaxial gyroscope of photoelectric pod
CN110645976B (en) Attitude estimation method of mobile robot and terminal equipment
CN115752507A (en) Online single-steering-wheel AGV parameter calibration method and system based on two-dimensional code navigation
CN109443355B (en) Visual-inertial tight coupling combined navigation method based on self-adaptive Gaussian PF
CN113503872B (en) Low-speed unmanned aerial vehicle positioning method based on fusion of camera and consumption-level IMU
CN115015956A (en) Laser and vision SLAM system of indoor unmanned vehicle
CN115993089B (en) PL-ICP-based online four-steering-wheel AGV internal and external parameter calibration method
CN109674480B (en) Human motion attitude calculation method based on improved complementary filtering
CN114543786B (en) Wall climbing robot positioning method based on visual inertial odometer
CN110733671B (en) Dynamic correction method for small celestial body spin angular velocity
CN113237485A (en) SLAM method and system based on multi-sensor fusion
CN115717901B (en) Inertial/visual odometer installation error estimation method based on filtering

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant