CN116242372A - UWB-LiDAR-inertial navigation fusion positioning method in a GNSS-denied environment - Google Patents

UWB-LiDAR-inertial navigation fusion positioning method in a GNSS-denied environment

Info

Publication number
CN116242372A
Authority
CN
China
Prior art keywords
uwb
observation
positioning
matrix
laser radar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310002241.9A
Other languages
Chinese (zh)
Inventor
潘树国
刘宏
高旺
吴鹏博
贾丰硕
赵庆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southeast University
Original Assignee
Southeast University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southeast University filed Critical Southeast University
Priority to CN202310002241.9A priority Critical patent/CN116242372A/en
Publication of CN116242372A publication Critical patent/CN116242372A/en
Pending legal-status Critical Current

Classifications

    • G01C21/28 - Navigation specially adapted for a road network, with correlation of data from several navigational instruments
    • G01C21/165 - Dead reckoning by inertial navigation combined with non-inertial navigation instruments
    • G01C21/1652 - Inertial navigation combined with ranging devices, e.g. LIDAR or RADAR
    • G01C21/188 - Compensation of accumulated inertial errors, e.g. by coupling inertial systems with absolute positioning systems
    • G01C21/20 - Instruments for performing navigational calculations
    • Y02D30/70 - Reducing energy consumption in wireless communication networks

Abstract

The invention discloses a UWB-LiDAR-inertial navigation (INS) fusion positioning method for GNSS-denied environments, comprising the following steps: rejecting non-line-of-sight errors in the UWB ranging information and compensating motion distortion in the raw LiDAR point cloud observations; solving the UWB position with an extended Kalman filter to compute the initial position and attitude and complete the initialization of the INS; screening UWB fixes jointly by the dilution of precision (DOP), the number of base stations, and the observation residuals, removing fixes with large errors; running the INS mechanization algorithm for high-rate pose computation, whose result also serves as a prior for the LiDAR scan-matching process; and maintaining a lightweight back-end factor graph that fuses the LiDAR odometry observations, the IMU pre-integration, and the UWB absolute observations to reduce accumulated error. The invention provides a reliable solution to real-time positioning in GNSS-denied environments and maintains decimeter-level positioning accuracy.

Description

UWB-LiDAR-inertial navigation fusion positioning method in a GNSS-denied environment
Technical Field
The invention belongs to the field of positioning in GNSS-denied environments, relates to multi-source fusion positioning, and in particular to a back-end optimization algorithm combining filtering with a factor graph.
Background
Reliable state estimation is a fundamental requirement of autonomous unmanned systems such as self-driving vehicles. In open outdoor environments, RTK GNSS can provide centimeter-level positioning accuracy. Inertial navigation (INS) is self-contained, offers high short-term accuracy, and can bridge partial GNSS outages, but its error diverges rapidly over time. How to achieve high-precision, robust positioning in GNSS-denied scenes such as underground spaces and indoor malls remains a technical challenge.
To address such problems, positioning with sensors such as LiDAR and cameras has developed rapidly in recent years. These methods estimate the platform's own state from the surrounding environment and build a perception map that can serve as a basis for navigation planning. Cameras are inexpensive but sensitive to illumination; monocular cameras in particular are hard to initialize and degrade in texture-less environments. LiDAR is robust to environmental conditions, measures depth directly, and achieves higher positioning accuracy. Moreover, as LiDAR hardware costs continue to fall, it has been widely adopted in civil and commercial applications in recent years.
However, during long GNSS outages, relying only on relative-positioning sensors such as LiDAR still accumulates error. UWB offers a high transmission rate, low transmit power, and strong penetration, but it is susceptible to multipath and non-line-of-sight (NLOS) errors.
Disclosure of Invention
To solve these problems, the invention discloses a UWB-LiDAR-inertial navigation fusion positioning method for GNSS-denied environments.
In order to achieve the above purpose, the technical scheme of the invention is as follows:
Step 1: first reject non-line-of-sight errors in the UWB ranging information, and compensate motion distortion in the raw LiDAR point cloud observations;
Step 2: solve the UWB position with an extended Kalman filter, compute the initial position and attitude, and complete the initialization of the INS;
Step 3: screen UWB fixes jointly by the dilution of precision (DOP), the number of base stations, and the observation residuals, removing fixes with large errors;
Step 4: run the INS mechanization algorithm for high-rate pose computation, while using the result as a prior for the LiDAR scan-matching process;
Step 5: build and maintain a lightweight factor graph at the back end, fusing the LiDAR odometry observations, the IMU pre-integration, and the UWB absolute observations to reduce accumulated error, while dynamically maintaining the feature map.
The method comprises the following specific steps:
Step 1: non-line-of-sight error rejection and point cloud motion compensation
The UWB prediction and non-line-of-sight error rejection are obtained as follows:

\hat{x}_k^- = A_k \hat{x}_{k-1}

P_k^- = A_k P_{k-1} A_k^\top + \Sigma_Q    (1)

K_k = P_k^- H_k^\top \left( H_k P_k^- H_k^\top + R \right)^{-1}

r_k = Z_k - H_k \hat{x}_k^-

In formula (1), A_k is the state transition matrix and A_k^\top its transpose; \hat{x}_{k-1} is the state estimate at the previous epoch; \hat{x}_k^- is the predicted state of the state equation and P_k^- its predicted covariance; \Sigma_Q is the process noise; K_k, R and H_k are, respectively, the Kalman gain, the observation noise matrix and the observation matrix of the observation equation, with H_k^\top the transpose of the observation matrix; Z_k is the observation vector; and r_k is the residual between observation and prediction. When the residual exceeds the set threshold (0.35 here), the observation is treated as a non-line-of-sight error and rejected by setting the Kalman gain to 0.
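The prediction-and-gating logic above can be sketched in NumPy as follows. This is a minimal illustration, not the patent's implementation: the state and observation models, dimensions, and noise values are assumptions; only the 0.35 residual threshold comes from the text.

```python
import numpy as np

RESIDUAL_THRESHOLD = 0.35  # NLOS gate from the text (meters, assumed units)

def predict_and_gate(x_prev, P_prev, A, Sigma_Q, H, R, z):
    """One Kalman prediction; the gain is zeroed when the residual flags NLOS."""
    x_pred = A @ x_prev                       # predicted state
    P_pred = A @ P_prev @ A.T + Sigma_Q       # predicted covariance
    S = H @ P_pred @ H.T + R                  # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)       # Kalman gain
    residual = z - H @ x_pred                 # observation minus prediction
    if np.linalg.norm(residual) > RESIDUAL_THRESHOLD:
        K = np.zeros_like(K)                  # NLOS: reject by zeroing the gain
    return x_pred, P_pred, K, residual
```

With the gain forced to zero, the subsequent update leaves the predicted state untouched, which is exactly the "rejection" the text describes.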
The LiDAR point cloud motion compensation formula is as follows:

\tilde{p}_i = \left( T_{BL} \right)^{-1} T_{k}^{k+1} T_{BL}\, p_i    (2)

where T_{k}^{k+1} is the relative inter-frame transformation measured by the INS, T_{BL} is the extrinsic matrix between the LiDAR and the INS, p_i is the laser point in the current frame that needs compensation, and \tilde{p}_i is that point re-expressed in the coordinates of the last point of the frame.
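The chain of transforms in formula (2) can be sketched as below, using 4x4 homogeneous matrices. This is an illustrative assumption about the matrix conventions (the patent's exact frame definitions are not given); the structure "extrinsic-inverse, relative motion, extrinsic" is what the formula states.

```python
import numpy as np

def compensate_point(p_lidar, T_rel_ins, T_BL):
    """De-skew one homogeneous LiDAR point: (T_BL)^-1 * T_rel * T_BL * p.

    p_lidar   : shape-(4,) homogeneous point in the current LiDAR frame
    T_rel_ins : 4x4 relative inter-frame transform measured by the INS
    T_BL      : 4x4 LiDAR-to-INS extrinsic matrix
    """
    T_LB = np.linalg.inv(T_BL)               # map back into the LiDAR frame
    return T_LB @ T_rel_ins @ T_BL @ p_lidar
```

With an identity extrinsic and a pure translation as the relative motion, the point is simply shifted by that translation, which matches the intuition of removing the motion accrued during the sweep.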
Step 2: UWB positioning solves the position
UWB solves its position with the extended Kalman filter, computing the initial position and attitude and completing the initialization of the INS, according to:

X_k = \hat{x}_k^- + K_k \left( Z_k - H_k \hat{x}_k^- \right), \qquad P_k = \left( I - K_k H_k \right) P_k^-    (3)

where X_k is the posterior state quantity and P_k is the covariance matrix solved by the Kalman filter.
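A minimal NumPy sketch of this measurement update (shapes, and the idea of feeding in a possibly-zeroed gain from step 1, are assumptions for illustration):

```python
import numpy as np

def ekf_update(x_pred, P_pred, K, H, z):
    """Posterior update: X_k = x^- + K (z - H x^-), P_k = (I - K H) P^-."""
    X_k = x_pred + K @ (z - H @ x_pred)
    P_k = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return X_k, P_k
```

Note that passing K = 0 (an NLOS-gated observation) returns the prior unchanged, so the two steps compose cleanly.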
Step 3: UWB quality control
UWB fixes are screened jointly by the dilution of precision (DOP), the number of base stations and the observation residuals, removing fixes with large errors, according to:

\mathrm{PDOP} = \sqrt{\mathrm{tr}\left( \mathrm{COV} \right)}, \qquad \mathrm{COV} = \left( H^\top H \right)^{-1}    (4)

where COV is the observation covariance matrix computed from the observation matrix, PDOP is the position dilution of precision, and tr is the trace operation. The smaller the PDOP value, the better the corresponding base-station geometry; hence, when the base-station geometry is poor, the fix can be rejected by its PDOP value (threshold set to 3.0). The number of base stations and the mean residual are also introduced as quality-control criteria: when fewer than 4 base stations are available, the UWB fix is unusable; after non-line-of-sight errors have been removed, the fix is accepted only when the mean residual is below a threshold (set to 0.15 here).
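The three gates of step 3 can be combined as in the sketch below. The 3.0 PDOP threshold, the minimum of 4 base stations, and the 0.15 mean-residual threshold are from the text; the construction of the design matrix H (one unit line-of-sight row per base station) is an assumption.

```python
import numpy as np

PDOP_MAX, MIN_STATIONS, MEAN_RES_MAX = 3.0, 4, 0.15  # thresholds from the text

def pdop(H):
    """PDOP = sqrt(tr((H^T H)^-1)); H has one row per base station."""
    cov = np.linalg.inv(H.T @ H)        # observation covariance matrix
    return float(np.sqrt(np.trace(cov)))

def fix_is_valid(H, residuals):
    """Accept a UWB fix only if geometry, station count, and residuals all pass."""
    return (len(residuals) >= MIN_STATIONS
            and pdop(H) <= PDOP_MAX
            and float(np.mean(np.abs(residuals))) <= MEAN_RES_MAX)
```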
Step 4: The INS adopts the mechanization algorithm to achieve high-rate pose computation, while its result serves as a prior for the LiDAR scan-matching process. The kinematic model is:

\dot{p}^W = v^W

\dot{v}^W = R_B^W a^B + g^W    (5)

\dot{R}_B^W = R_B^W \left[ \omega^B \right]_\times

\omega^B = \tilde{\omega}^B - R_W^B\, \omega_{ie}^W

where W denotes the world frame, defined as the east-north-up frame at the initial epoch, and B the body (carrier) frame; g^W is the gravity vector in the world frame; a^B is the acceleration measured in the body frame; \omega_{ie} is the earth rotation rate, and [\,\cdot\,]_\times denotes the skew-symmetric matrix of a vector; p^W and v^W are the position and velocity vectors in the W frame; R_B^W is the rotation-matrix expression of the attitude of B in the W frame, and the overdot denotes differentiation; \tilde{\omega}^B is the angular velocity measured by the INS and \omega^B its true value; R_W^B is the attitude of the W frame with respect to the B frame.
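One discrete step of this kinematic model can be sketched as below. This is a simplified Euler integration for illustration only: it omits the earth-rotation compensation term and any bias handling, and the first-order attitude update is an assumption (a full mechanization would use a proper rotation increment).

```python
import numpy as np

def skew(w):
    """Skew-symmetric matrix [w]x of a 3-vector."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def mechanize_step(p, v, R, a_b, w_b, g_w, dt):
    """Euler step of: p' = v, v' = R a_b + g_w, R' = R [w_b]x."""
    p_new = p + v * dt
    v_new = v + (R @ a_b + g_w) * dt
    R_new = R @ (np.eye(3) + skew(w_b) * dt)   # first-order attitude update
    return p_new, v_new, R_new
```

Run at IMU rate (hundreds of Hz), this propagation supplies the high-frequency pose that step 4 uses as a scan-matching prior.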
Step 5: The back end builds and maintains a lightweight factor graph, fusing the LiDAR odometry observations, the IMU pre-integration and the UWB absolute observations to reduce accumulated error, while dynamically maintaining the feature map. The optimization problem is:

\min_{\mathcal{X}} \left\{ \sum \left\| r_L(x) \right\|^2 + \sum \left\| r_I(x) \right\|^2 + \sum \left\| r_U(x) \right\|^2 \right\}    (6)

where min denotes the minimization problem; r_L(x) is the LiDAR odometry factor formed by the transformation between two adjacent frames; r_I(x) is the IMU pre-integration constraint of the INS; and r_U(x) is the UWB global constraint.
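The structure of this objective can be sketched with toy residual models, just to show how relative (LiDAR, IMU) and absolute (UWB) factors combine into one cost over the pose states. The position-only residuals here are stand-ins of my own; a real system would use SE(3) poses, pre-integrated IMU terms, and a solver such as GTSAM or Ceres.

```python
import numpy as np

def total_cost(states, lidar_odoms, imu_odoms, uwb_fixes):
    """Sum of squared residuals over the factor graph.

    states      : list of position vectors, one per keyframe
    lidar_odoms : relative displacement constraints between frames i, i+1
    imu_odoms   : relative displacement constraints from IMU pre-integration
    uwb_fixes   : (frame index, absolute position) global constraints
    """
    cost = 0.0
    for i, d in enumerate(lidar_odoms):              # r_L: relative factors
        cost += np.sum((states[i + 1] - states[i] - d) ** 2)
    for i, d in enumerate(imu_odoms):                # r_I: relative factors
        cost += np.sum((states[i + 1] - states[i] - d) ** 2)
    for i, z in uwb_fixes:                           # r_U: absolute factors
        cost += np.sum((states[i] - z) ** 2)
    return cost
```

Because the UWB terms anchor individual states to absolute coordinates, minimizing this cost pulls the relative LiDAR/IMU chain back toward the global frame, which is how the accumulated drift is bounded.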
The beneficial effects of the invention include:
The proposed UWB-LiDAR-inertial navigation fusion positioning method for GNSS-denied environments flexibly combines filtering with factor-graph optimization to realize a multi-source UWB/LiDAR/INS fusion positioning algorithm that outputs real-time, high-precision positioning results without relying on satellites. The scheme fully exploits the complementary characteristics of the three sensors, uses multi-source data to avoid positioning failure during long GNSS outages, and provides a new solution for continuous indoor-outdoor positioning. The method can be applied in indoor environments such as large malls and underground spaces, provides a reliable positioning basis for planning, control, perception and other unmanned-system modules, and is of reference value for other autonomous unmanned systems.
Drawings
FIG. 1 is a flowchart of the UWB/LiDAR/INS positioning and mapping algorithm of the invention;
FIG. 2 is a schematic diagram of the optimization model solution.
Detailed Description
The invention is further illustrated below with reference to the drawings and specific embodiments, which should be understood as merely illustrative of the invention and not as limiting its scope.
Example 1: As shown in FIG. 1, the invention discloses a UWB-LiDAR-inertial navigation fusion positioning method in a GNSS-denied environment, with the following specific steps:
Step 1: non-line-of-sight error rejection and point cloud motion compensation
The UWB prediction and non-line-of-sight error rejection are obtained as follows:

\hat{x}_k^- = A_k \hat{x}_{k-1}

P_k^- = A_k P_{k-1} A_k^\top + \Sigma_Q    (1)

K_k = P_k^- H_k^\top \left( H_k P_k^- H_k^\top + R \right)^{-1}

r_k = Z_k - H_k \hat{x}_k^-

In formula (1), A_k is the state transition matrix and A_k^\top its transpose; \hat{x}_{k-1} is the state estimate at the previous epoch; \hat{x}_k^- is the predicted state of the state equation and P_k^- its predicted covariance; \Sigma_Q is the process noise; K_k, R and H_k are, respectively, the Kalman gain, the observation noise matrix and the observation matrix of the observation equation, with H_k^\top the transpose of the observation matrix; Z_k is the observation vector; and r_k is the residual between observation and prediction. When the residual exceeds the set threshold (0.35 here), the observation is treated as a non-line-of-sight error and rejected by setting the Kalman gain to 0.
The LiDAR point cloud motion compensation formula is as follows:

\tilde{p}_i = \left( T_{BL} \right)^{-1} T_{k}^{k+1} T_{BL}\, p_i    (2)

where T_{k}^{k+1} is the relative inter-frame transformation measured by the INS, T_{BL} is the extrinsic matrix between the LiDAR and the INS, p_i is the laser point in the current frame that needs compensation, and \tilde{p}_i is that point re-expressed in the coordinates of the last point of the frame.
Step 2: UWB positioning solves the position
UWB solves its position with the extended Kalman filter, computing the initial position and attitude and completing the initialization of the INS, according to:

X_k = \hat{x}_k^- + K_k \left( Z_k - H_k \hat{x}_k^- \right), \qquad P_k = \left( I - K_k H_k \right) P_k^-    (3)

where X_k is the posterior state quantity and P_k is the covariance matrix solved by the Kalman filter.
Step 3: UWB quality control
UWB fixes are screened jointly by the dilution of precision (DOP), the number of base stations and the observation residuals, removing fixes with large errors, according to:

\mathrm{PDOP} = \sqrt{\mathrm{tr}\left( \mathrm{COV} \right)}, \qquad \mathrm{COV} = \left( H^\top H \right)^{-1}    (4)

where COV is the observation covariance matrix computed from the observation matrix, PDOP is the position dilution of precision, and tr is the trace operation. The smaller the PDOP value, the better the corresponding base-station geometry; hence, when the base-station geometry is poor, the fix can be rejected by its PDOP value (threshold set to 3.0). The number of base stations and the mean residual are also introduced as quality-control criteria: when fewer than 4 base stations are available, the UWB fix is unusable; after non-line-of-sight errors have been removed, the fix is accepted only when the mean residual is below a threshold (set to 0.15 here).
Step 4: The INS adopts the mechanization algorithm to achieve high-rate pose computation, while its result serves as a prior for the LiDAR scan-matching process. The kinematic model is:

\dot{p}^W = v^W

\dot{v}^W = R_B^W a^B + g^W    (5)

\dot{R}_B^W = R_B^W \left[ \omega^B \right]_\times

\omega^B = \tilde{\omega}^B - R_W^B\, \omega_{ie}^W

where W denotes the world frame, defined as the east-north-up frame at the initial epoch, and B the body (carrier) frame; g^W is the gravity vector in the world frame; a^B is the acceleration measured in the body frame; \omega_{ie} is the earth rotation rate, and [\,\cdot\,]_\times denotes the skew-symmetric matrix of a vector; p^W and v^W are the position and velocity vectors in the W frame; R_B^W is the rotation-matrix expression of the attitude of B in the W frame, and the overdot denotes differentiation; \tilde{\omega}^B is the angular velocity measured by the INS and \omega^B its true value; R_W^B is the attitude of the W frame with respect to the B frame.
Step 5: As shown in FIG. 2, the back end builds and maintains a lightweight factor graph, fusing the LiDAR odometry observations, the IMU pre-integration and the UWB absolute observations to reduce accumulated error, while dynamically maintaining the feature map. The optimization problem is:

\min_{\mathcal{X}} \left\{ \sum \left\| r_L(x) \right\|^2 + \sum \left\| r_I(x) \right\|^2 + \sum \left\| r_U(x) \right\|^2 \right\}    (6)

where min denotes the minimization problem; r_L(x) is the LiDAR odometry factor formed by the transformation between two adjacent frames; r_I(x) is the IMU pre-integration constraint of the INS; and r_U(x) is the UWB global constraint.
In this scheme, the UWB module is a Nooploop LinkTrack P-B; the INS is an ADIS16497; the LiDAR is a Velodyne VLP-16 mechanical spinning radar with 16 scan lines, 0.2° angular resolution, and a 100 m detection range. RTK/INS integrated navigation, with centimeter-level accuracy, serves as the evaluation reference. The spatial offsets between the dual antennas and the LiDAR/IMU/UWB were accurately measured. In the vehicle-mounted field experiment, the route included both UWB-favorable and occluded regions; the positioning accuracy (root mean square error) with UWB alone was 0.54 m, while the UWB/LiDAR/INS multi-source fusion system achieved 0.28 m, a 48.1% improvement over UWB alone.
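The reported 48.1% figure follows directly from the two RMSE values in the text:

```python
# Relative improvement of the fused solution over UWB-only positioning,
# using the RMSE values reported in the experiment (0.54 m and 0.28 m).
uwb_rmse, fused_rmse = 0.54, 0.28
improvement_pct = (uwb_rmse - fused_rmse) / uwb_rmse * 100  # ≈ 48.1
```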
It should be noted that the foregoing merely illustrates the technical idea of the invention and is not intended to limit its scope; a person skilled in the art may make improvements and refinements without departing from the principles of the invention, and such improvements and refinements fall within the scope of the claims of the invention.

Claims (6)

1. A UWB-LiDAR-inertial navigation fusion positioning method in a GNSS-denied environment, characterized by comprising the following steps:
step 1: first rejecting non-line-of-sight errors in the UWB ranging information, and compensating motion distortion in the raw LiDAR point cloud observations;
step 2: solving the UWB position with an extended Kalman filter, computing the initial position and attitude, and completing the initialization of the INS;
step 3: screening UWB fixes jointly by the dilution of precision (DOP), the number of base stations, and the observation residuals, and removing fixes with large errors;
step 4: running the INS mechanization algorithm for high-rate pose computation, while using the result as a prior for the LiDAR scan-matching process;
step 5: building and maintaining a lightweight factor graph at the back end, fusing the LiDAR odometry observations, the IMU pre-integration, and the UWB absolute observations to reduce accumulated error, while dynamically maintaining the feature map.
2. The UWB-LiDAR-inertial navigation fusion positioning method in a GNSS-denied environment according to claim 1, characterized in that in step 1 the UWB non-line-of-sight error rejection and the LiDAR point cloud motion compensation are realized according to the following formulas:

\hat{x}_k^- = A_k \hat{x}_{k-1}

P_k^- = A_k P_{k-1} A_k^\top + \Sigma_Q    (1)

K_k = P_k^- H_k^\top \left( H_k P_k^- H_k^\top + R \right)^{-1}

r_k = Z_k - H_k \hat{x}_k^-

In formula (1), A_k is the state transition matrix and A_k^\top its transpose; \hat{x}_{k-1} is the state estimate at the previous epoch; \hat{x}_k^- is the predicted state of the state equation and P_k^- its predicted covariance; \Sigma_Q is the process noise; K_k, R and H_k are, respectively, the Kalman gain, the observation noise matrix and the observation matrix of the observation equation, with H_k^\top the transpose of the observation matrix; Z_k is the observation vector; and r_k is the residual between observation and prediction; when the residual exceeds the set threshold (0.35 here), the observation is treated as a non-line-of-sight error and rejected by setting the Kalman gain to 0;
the LiDAR point cloud motion compensation formula is as follows:

\tilde{p}_i = \left( T_{BL} \right)^{-1} T_{k}^{k+1} T_{BL}\, p_i    (2)

where T_{k}^{k+1} is the relative inter-frame transformation measured by the INS, T_{BL} is the extrinsic matrix between the LiDAR and the INS, p_i is the laser point in the current frame that needs compensation, and \tilde{p}_i is that point re-expressed in the coordinates of the last point of the frame.
3. The UWB-LiDAR-inertial navigation fusion positioning method in a GNSS-denied environment according to claim 1, characterized in that in step 2 UWB solves its position with the extended Kalman filter, computing the initial position and attitude and completing the initialization of the INS, according to:

X_k = \hat{x}_k^- + K_k \left( Z_k - H_k \hat{x}_k^- \right), \qquad P_k = \left( I - K_k H_k \right) P_k^-    (3)

where X_k is the posterior state quantity and P_k is the covariance matrix solved by the Kalman filter.
4. The UWB-LiDAR-inertial navigation fusion positioning method in a GNSS-denied environment according to claim 1, characterized in that in step 3 UWB fixes are screened jointly by the dilution of precision (DOP), the number of base stations and the observation residuals, removing fixes with large errors, according to:

\mathrm{PDOP} = \sqrt{\mathrm{tr}\left( \mathrm{COV} \right)}, \qquad \mathrm{COV} = \left( H^\top H \right)^{-1}    (4)

where COV is the observation covariance matrix computed from the observation matrix, PDOP is the position dilution of precision, and tr is the trace operation; the smaller the PDOP value, the better the corresponding base-station geometry; hence, when the base-station geometry is poor, the fix can be rejected by its PDOP value (threshold set to 3.0); the number of base stations and the mean residual are also introduced as quality-control criteria: when fewer than 4 base stations are available, the UWB fix is unusable; after non-line-of-sight errors have been removed, the fix is accepted only when the mean residual is below a threshold (set to 0.15 here).
5. The UWB-LiDAR-inertial navigation fusion positioning method in a GNSS-denied environment according to claim 1, characterized in that in step 4 the INS adopts the mechanization algorithm to achieve high-rate pose computation, while its result serves as a prior for the LiDAR scan-matching process, the INS kinematic model being:

\dot{p}^W = v^W

\dot{v}^W = R_B^W a^B + g^W    (5)

\dot{R}_B^W = R_B^W \left[ \omega^B \right]_\times

\omega^B = \tilde{\omega}^B - R_W^B\, \omega_{ie}^W

where W denotes the world frame, defined as the east-north-up frame at the initial epoch, and B the body (carrier) frame; g^W is the gravity vector in the world frame; a^B is the acceleration measured in the body frame; \omega_{ie} is the earth rotation rate, and [\,\cdot\,]_\times denotes the skew-symmetric matrix of a vector; p^W and v^W are the position and velocity vectors in the W frame; R_B^W is the rotation-matrix expression of the attitude of B in the W frame, and the overdot denotes differentiation; \tilde{\omega}^B is the angular velocity measured by the INS and \omega^B its true value; R_W^B is the attitude of the W frame with respect to the B frame.
6. The UWB-laser radar-inertial navigation fusion positioning method in a GNSS-denied environment according to claim 1, wherein in step 5 the back end maintains a lightweight factor graph, fusing the laser radar odometry observations, the IMU pre-integration, and the UWB absolute observations to reduce the accumulated error, while dynamically maintaining the feature map; the optimization is solved as:

\min_{x} \left\{ R_{L}(x) + R_{I}(x) + R_{U}(x) \right\}

where \min denotes the minimization problem in the optimization; R_{L}(x) is the radar odometry factor relating two adjacent radar frames; R_{I}(x) is the IMU pre-integration constraint of the INS; and R_{U}(x) is the UWB global constraint.
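As an illustration of how such an optimization combines relative odometry factors with absolute UWB range constraints, here is a minimal nonlinear least-squares sketch. The planar poses, anchor positions, and the use of `scipy.optimize.least_squares` are assumptions for the example, not the patent's implementation:

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical setup (not from the patent): 4 planar poses, 2 UWB anchors.
anchors = np.array([[0.0, 0.0], [10.0, 4.0]])
truth = np.array([[1.0, 1.0], [3.0, 1.0], [5.0, 1.0], [7.0, 1.0]])
odom = np.diff(truth, axis=0)   # relative "odometry" factors (noiseless here)
ranges = np.linalg.norm(truth[:, None, :] - anchors[None, :, :], axis=2)  # UWB ranges

def residuals(flat):
    """Stacked residuals: relative-motion factors plus absolute range factors."""
    x = flat.reshape(-1, 2)
    r_rel = (x[1:] - x[:-1] - odom).ravel()                  # R_L-style constraints
    r_abs = (np.linalg.norm(x[:, None, :] - anchors[None, :, :], axis=2)
             - ranges).ravel()                               # R_U-style constraints
    return np.concatenate([r_rel, r_abs])

x0 = (truth + np.array([0.5, -0.5])).ravel()   # perturbed initial guess
est = least_squares(residuals, x0).x.reshape(-1, 2)
```

The relative factors fix the trajectory's shape while the UWB ranges anchor it globally; a full system would add IMU pre-integration residuals and per-factor covariance weighting in the same stacked form.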
CN202310002241.9A 2023-01-03 2023-01-03 UWB-laser radar-inertial navigation fusion positioning method under GNSS refusing environment Pending CN116242372A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310002241.9A CN116242372A (en) 2023-01-03 2023-01-03 UWB-laser radar-inertial navigation fusion positioning method under GNSS refusing environment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310002241.9A CN116242372A (en) 2023-01-03 2023-01-03 UWB-laser radar-inertial navigation fusion positioning method under GNSS refusing environment

Publications (1)

Publication Number Publication Date
CN116242372A true CN116242372A (en) 2023-06-09

Family

ID=86628876

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310002241.9A Pending CN116242372A (en) 2023-01-03 2023-01-03 UWB-laser radar-inertial navigation fusion positioning method under GNSS refusing environment

Country Status (1)

Country Link
CN (1) CN116242372A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117804448A (en) * 2024-02-23 2024-04-02 北京理工大学 Autonomous system positioning method, device, computer equipment and storage medium
CN117804448B (en) * 2024-02-23 2024-04-30 北京理工大学 Autonomous system positioning method, device, computer equipment and storage medium

Similar Documents

Publication Publication Date Title
CN110262546B (en) Tunnel intelligent unmanned aerial vehicle inspection method
CN110428467B (en) Robot positioning method combining camera, imu and laser radar
Fan et al. Data fusion for indoor mobile robot positioning based on tightly coupled INS/UWB
CN110702091B (en) High-precision positioning method for moving robot along subway rail
CN110243358A (en) The unmanned vehicle indoor and outdoor localization method and system of multi-source fusion
CN103983263A (en) Inertia/visual integrated navigation method adopting iterated extended Kalman filter and neural network
CN110764506B (en) Course angle fusion method and device of mobile robot and mobile robot
CN109855621B (en) Combined indoor pedestrian navigation system and method based on UWB and SINS
CN109752725A (en) A kind of low speed business machine people, positioning navigation method and Position Fixing Navigation System
CN112987065B (en) Multi-sensor-integrated handheld SLAM device and control method thereof
CN104197928A (en) Multi-camera collaboration-based method for detecting, positioning and tracking unmanned aerial vehicle
CN111426320B (en) Vehicle autonomous navigation method based on image matching/inertial navigation/milemeter
CN111025366B (en) Grid SLAM navigation system and method based on INS and GNSS
CN113124856A (en) Visual inertia tight coupling odometer based on UWB online anchor point and metering method
CN113155124B (en) Multi-source auxiliary navigation method and device
CN114019552A (en) Bayesian multi-sensor error constraint-based location reliability optimization method
CN111077907A (en) Autonomous positioning method of outdoor unmanned aerial vehicle
CN116242372A (en) UWB-laser radar-inertial navigation fusion positioning method under GNSS refusing environment
CN115435779A (en) Intelligent body pose estimation method based on GNSS/IMU/optical flow information fusion
CN112444245B (en) Insect-imitating vision integrated navigation method based on polarized light, optical flow vector and binocular vision sensor
Chiang et al. Semantic proximity update of GNSS/INS/VINS for Seamless Vehicular Navigation using Smartphone sensors
CN115930977A (en) Method and system for positioning characteristic degradation scene, electronic equipment and readable storage medium
CN114485623B (en) Focusing distance camera-IMU-UWB fusion accurate positioning method
CN115183767A (en) Monocular VIO/UWB indoor combined positioning method based on ARKF
CN115930948A (en) Orchard robot fusion positioning method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination