CN116242372A - UWB-LiDAR-inertial navigation fusion positioning method in a GNSS-denied environment - Google Patents
UWB-LiDAR-inertial navigation fusion positioning method in a GNSS-denied environment
- Publication number
- CN116242372A (application CN202310002241.9A)
- Authority
- CN
- China
- Prior art keywords
- uwb
- observation
- positioning
- matrix
- laser radar
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
- G01C21/1652—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/183—Compensation of inertial measurements, e.g. for temperature effects
- G01C21/188—Compensation of inertial measurements, e.g. for temperature effects for accumulated errors, e.g. by coupling inertial systems with absolute positioning systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D30/00—Reducing energy consumption in communication networks
- Y02D30/70—Reducing energy consumption in communication networks in wireless communication networks
Abstract
The invention discloses a UWB-LiDAR-inertial navigation (INS) fusion positioning method for GNSS-denied environments, comprising the following steps: eliminate non-line-of-sight errors in the UWB ranging information, and compensate motion distortion in the raw LiDAR point cloud observations; UWB uses an extended Kalman filter to solve the position, computes the initial position and attitude, and completes the initialization of the INS system; UWB positioning is quality-checked using the dilution of precision (DOP), the number of base stations, and the observation residuals, and positioning points with larger errors are removed; the INS system adopts a mechanization algorithm to realize high-frequency pose computation, and the result serves as a prior for the LiDAR scan-matching process; the back end constructs and maintains a lightweight factor graph, fusing LiDAR odometry observations, IMU pre-integration, and UWB absolute observations to reduce accumulated error. The invention provides a reliable solution to real-time positioning in GNSS-denied environments and can maintain decimeter-level positioning accuracy.
Description
Technical Field
The invention belongs to the field of positioning applications in GNSS-denied environments, relates to a multi-source fusion positioning method, and in particular to a back-end optimization algorithm combining filtering and factor graph optimization.
Background
Reliable state estimation is a fundamental requirement for autonomous unmanned systems such as autonomous driving. In open outdoor environments, RTK can provide centimeter-level positioning accuracy. Inertial navigation (INS) is the only existing navigation means that can realize positioning relying solely on its own device; it has high short-term accuracy and can maintain positioning through partial GNSS outages, but its errors diverge rapidly over time. How to achieve high-precision, robust positioning in GNSS-denied scenarios such as underground spaces and indoor shopping malls remains a technical difficulty.
To solve such problems, positioning based on sensors such as LiDAR and vision cameras has developed rapidly in recent years. These methods estimate the platform's own state from the surrounding environment and build a perception map that can serve as a basis for navigation planning. Vision cameras are low-cost but easily affected by illumination; monocular cameras in particular have demanding initialization requirements and are sensitive to texture-less environments. LiDAR is robust to environmental conditions, measures depth directly, and achieves higher positioning accuracy. Moreover, with continuing reductions in hardware cost, LiDAR has become increasingly popular in civil and commercial applications in recent years.
However, during long GNSS outages, relying only on relative positioning sensors such as LiDAR still accumulates error. UWB offers a high transmission rate, low transmission power, and strong penetration, but it is susceptible to multipath and non-line-of-sight (NLOS) errors.
Disclosure of Invention
In order to solve the above problems, the invention discloses a UWB-LiDAR-inertial navigation fusion positioning method for GNSS-denied environments.
In order to achieve the above purpose, the technical scheme of the invention is as follows:
Step 1: first eliminate non-line-of-sight errors in the UWB ranging information, and compensate motion distortion in the raw LiDAR point cloud observations;
Step 2: UWB uses an extended Kalman filter to solve the position, computes the initial position and attitude, and completes the initialization of the INS system;
Step 3: UWB positioning is quality-checked using the dilution of precision (DOP), the number of base stations, and the observation residuals, and positioning points with larger errors are removed;
Step 4: the INS system adopts a mechanization algorithm to realize high-frequency pose computation, and the result serves as a prior for the LiDAR scan-matching process;
Step 5: the back end constructs and maintains a lightweight factor graph, fusing LiDAR odometry observations, IMU pre-integration, and UWB absolute observations to reduce accumulated error, while dynamically maintaining the feature map.
The method comprises the following specific steps:
step 1, non-line-of-sight error rejection and point cloud motion compensation
The UWB position solution and non-line-of-sight error rejection are performed according to formula (1):

$$\hat{X}_k^- = A_k \hat{X}_{k-1}, \qquad P_k^- = A_k P_{k-1} A_k^T + \Sigma_Q$$
$$K_k = P_k^- H_k^T \left(H_k P_k^- H_k^T + R\right)^{-1}, \qquad residual = Z_k - H_k \hat{X}_k^- \quad (1)$$

In formula (1), $A_k$ is the state transition matrix and $A_k^T$ its transpose; $\hat{X}_{k-1}$ is the state estimate at the previous epoch; $\hat{X}_k^-$ is the predicted state of the state equation and $P_k^-$ its predicted covariance; $\Sigma_Q$ is the process noise; $K_k$, $R$, and $H_k$ are the Kalman gain, the observation noise matrix, and the state observation matrix of the observation equation, respectively, with $H_k^T$ the transpose of the state observation matrix; $Z_k$ is the observation vector; and $residual$ is the residual between observation and prediction. When the residual exceeds the set threshold (0.35), the observation is treated as a non-line-of-sight error and the Kalman gain is set to 0, rejecting the measurement.
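As an illustration, the residual-gated filter cycle described above can be sketched in NumPy; this is a minimal sketch, not the patented implementation, and the state dimensions and gating on the innovation norm are assumptions for the example (the 0.35 threshold is taken from the text).

```python
import numpy as np

def kf_predict_and_gate(x_prev, P_prev, A, Q, H, R, z, threshold=0.35):
    """One Kalman cycle with residual-based NLOS gating: if the innovation
    norm exceeds the threshold, the gain is zeroed and the observation is
    effectively rejected, leaving the pure prediction."""
    # Prediction step
    x_pred = A @ x_prev
    P_pred = A @ P_prev @ A.T + Q
    # Innovation (observation minus predicted observation)
    residual = z - H @ x_pred
    if np.linalg.norm(residual) > threshold:
        # Non-line-of-sight: zero gain rejects the measurement
        K = np.zeros((x_pred.size, z.size))
    else:
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)
    x_post = x_pred + K @ residual
    P_post = (np.eye(x_pred.size) - K @ H) @ P_pred
    return x_post, P_post
```

A rejected epoch simply propagates the prediction, so an isolated NLOS range does not corrupt the state.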
the LiDAR point cloud motion compensation formula is as follows:
in the middle ofRelative interframe transformation measured for INS, +.>External reference matrix for LiDAR and INS, < ->Compensating the laser spot for the need under the current frame, < >>To be the right ofThe front point compensates for the coordinate representation of the last point.
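Assuming both transforms are 4×4 homogeneous matrices, the compensation of a single point can be sketched as follows; the function and variable names are illustrative, not from the patent.

```python
import numpy as np

def compensate_point(p, T_ext, T_ins):
    """Map a mid-sweep laser point into the frame's reference pose.
    T_ext: LiDAR-to-INS extrinsic; T_ins: INS-measured relative motion
    over the point's time offset. Both are 4x4 homogeneous transforms."""
    ph = np.append(p, 1.0)  # homogeneous coordinates
    # Express the INS motion in the LiDAR frame and apply it to the point
    ph_comp = np.linalg.inv(T_ext) @ T_ins @ T_ext @ ph
    return ph_comp[:3]
```

With an identity extrinsic, the point is simply carried along the INS-measured motion.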
Step 2: UWB position solution
UWB uses an extended Kalman filter to solve the position, computes the initial position and attitude, and completes the initialization of the INS system according to the following formula:

$$X_k = \hat{X}_k^- + K_k\left(Z_k - H_k \hat{X}_k^-\right), \qquad P_k = \left(I - K_k H_k\right) P_k^-$$

where $X_k$ is the posterior state quantity and $P_k$ is the covariance matrix solved by the Kalman filter.
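For UWB, the observation model in this update is the range to each base station. A sketch of the linearized measurement model and the posterior update is shown below; the anchor layout is a hypothetical example, not specified by the patent.

```python
import numpy as np

def uwb_measurement_model(pos, anchors):
    """Predicted anchor ranges h(pos) and Jacobian H = dh/dpos for the
    EKF update. `anchors` is an (n, 3) array of base-station coordinates
    (a hypothetical layout, not from the patent)."""
    diffs = pos - anchors                  # (n, 3) anchor-to-tag vectors
    ranges = np.linalg.norm(diffs, axis=1)
    H = diffs / ranges[:, None]            # rows are unit line-of-sight vectors
    return ranges, H

def ekf_update(x_pred, P_pred, z, h, H, R):
    """Posterior update: X_k = x^- + K (z - h),  P_k = (I - K H) P^-."""
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_post = x_pred + K @ (z - h)
    P_post = (np.eye(x_pred.size) - K @ H) @ P_pred
    return x_post, P_post
```

When the observed ranges match the prediction exactly, the update leaves the state unchanged, which is a convenient sanity check.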
Step 3 UWB quality control
UWB positioning is quality-checked using the position dilution of precision (PDOP), the number of base stations, and the observation residuals; positioning points with larger errors are removed according to the following formula:

$$COV = \left(H^T H\right)^{-1}, \qquad PDOP = \sqrt{tr\left(COV\right)}$$

where $COV$ is the observation covariance matrix calculated from the observation matrix $H$, $PDOP$ is the position dilution of precision, and $tr$ is the trace operation. The smaller the PDOP value, the better the corresponding base-station geometry; therefore, when a UWB base station's quality is poor, it can be removed according to the PDOP value (threshold set to 3.0). The number of base stations and the average residual are also introduced as quality-control criteria: when fewer than 4 base stations are available, the UWB positioning result is unusable; after non-line-of-sight errors are removed, positioning is considered valid when the average residual is below a threshold (set here to 0.15).
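The PDOP computation and the three-part quality gate (anchor count, geometry, mean residual) can be sketched as follows; the thresholds 3.0, 0.15, and 4 come from the text above, while the function names are illustrative.

```python
import numpy as np

def pdop(H):
    """PDOP from the observation (design) matrix: sqrt(tr((H^T H)^-1))."""
    cov = np.linalg.inv(H.T @ H)  # observation covariance matrix
    return float(np.sqrt(np.trace(cov)))

def uwb_fix_valid(H, residuals, pdop_max=3.0, resid_max=0.15, min_anchors=4):
    """Three-part quality gate from step 3: enough base stations,
    acceptable geometry (PDOP), and small mean residual after NLOS removal."""
    if H.shape[0] < min_anchors:
        return False          # fewer than 4 base stations: fix unusable
    if pdop(H) > pdop_max:
        return False          # poor base-station geometry
    return bool(np.mean(np.abs(residuals)) < resid_max)
```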
Step 4: The INS system adopts a mechanization algorithm to realize high-frequency pose computation, and its result serves as a prior for the LiDAR scan-matching process. The dynamic model is given by:

$$\dot{p}^W = v^W, \qquad \dot{v}^W = R_B^W a^B + g^W, \qquad \dot{R}_B^W = R_B^W \left[\tilde{\omega}^B - R_W^B\, \omega_{ie}^W\right]_\times$$

where $W$ denotes the world (earth) coordinate system, defined as the east-north-up frame at the initial epoch, and $B$ is the body (carrier) frame; $g^W$ is the gravity vector in the world frame; $a^B$ is the acceleration measured in the body frame; $\omega_{ie}$ is the earth rotation rate and $[\,\cdot\,]_\times$ denotes the antisymmetric matrix of a vector; $p^W$ and $v^W$ are the position and velocity vectors in $W$; $R_B^W$ is the rotation matrix expressing the attitude of $B$ in $W$; the overdot denotes differentiation with respect to time; $\tilde{\omega}^B$ is the angular velocity measured by the INS; and $R_W^B$ is the attitude of the $W$ frame in the $B$ frame.
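A first-order (Euler) integration step of this dynamic model might look like the sketch below; for brevity it omits the earth-rotation and sensor-bias terms, so it is an assumption-laden simplification rather than a full mechanization.

```python
import numpy as np

def skew(v):
    """Antisymmetric matrix [v]_x such that skew(v) @ w == np.cross(v, w)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def mechanization_step(p, v, R, a_b, w_b, g_w, dt):
    """One Euler step of p' = v, v' = R a_b + g_w, R' = R [w_b]_x.
    Earth-rotation and bias terms are omitted for brevity."""
    p_new = p + v * dt
    v_new = v + (R @ a_b + g_w) * dt
    R_new = R @ (np.eye(3) + skew(w_b) * dt)  # first-order attitude update
    return p_new, v_new, R_new
```

A stationary platform whose accelerometer exactly cancels gravity stays put, which is the basic consistency check for the model.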
Step 5: The back end constructs and maintains a lightweight factor graph, fusing LiDAR odometry observations, IMU pre-integration, and UWB absolute observations to reduce accumulated error, while dynamically maintaining the feature map. The optimization equation is:

$$\min_x \left\{ \sum \left\|r_L(x)\right\|^2 + \sum \left\|r_I(x)\right\|^2 + \sum \left\|r_U(x)\right\|^2 \right\}$$

where $\min$ denotes the minimization problem, $r_L(x)$ is the LiDAR odometry factor derived from the transformation between two adjacent radar frames, $r_I(x)$ is the IMU pre-integration constraint of the INS system, and $r_U(x)$ is the UWB global constraint.
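The structure of this cost, relative odometry factors chaining consecutive poses plus absolute UWB factors anchoring individual poses, can be illustrated with a toy one-dimensional linear problem solved by least squares. A real back end would use SE(3) states, pre-integrated IMU factors, and robust kernels; this only shows how the two factor types combine.

```python
import numpy as np

def fuse_poses(odom_deltas, uwb_abs, uwb_idx, n):
    """Stack relative (odometry) and absolute (UWB) residual rows for n
    scalar poses and solve the linear least-squares problem."""
    rows, rhs = [], []
    for i, d in enumerate(odom_deltas):        # factor: x[i+1] - x[i] = d
        r = np.zeros(n); r[i] = -1.0; r[i + 1] = 1.0
        rows.append(r); rhs.append(d)
    for i, z in zip(uwb_idx, uwb_abs):         # factor: x[i] = z
        r = np.zeros(n); r[i] = 1.0
        rows.append(r); rhs.append(z)
    x, *_ = np.linalg.lstsq(np.vstack(rows), np.array(rhs), rcond=None)
    return x
```

Odometry alone fixes only the shape of the trajectory; the absolute UWB factors pin it down and suppress drift, mirroring the role of the $r_U(x)$ term.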
The beneficial effects of the invention include:
according to the UWB-laser radar-inertial navigation fusion positioning method under the GNSS rejection environment, provided by the invention, the model theory of filtering and factor graph optimization is flexibly utilized, so that the multi-source fusion positioning algorithm of UWB/LiDAR/INS is realized, and the real-time high-precision positioning result output is realized under the condition of not depending on satellites. The scheme fully digs the complementary characteristics of the three sensors, can effectively avoid the positioning deficiency under the long-time GNSS refusing environment by utilizing the multi-source data information, and provides a new solution for indoor and outdoor continuous positioning. The method can be applied to indoor environments such as large malls, underground spaces and the like, provides a reliable positioning basis for planning, controlling, sensing and other unmanned modules, and has reference significance for other autonomous unmanned systems.
Drawings
FIG. 1 is a flowchart of the UWB/LiDAR/INS positioning and mapping algorithm according to the present invention;
FIG. 2 is a schematic diagram of the factor graph optimization model.
Detailed Description
The present invention is further illustrated by the following drawings and detailed description, which are to be understood as merely illustrative of the invention and not limiting its scope.
Example 1: As shown in FIG. 1, the invention discloses a UWB-LiDAR-inertial navigation fusion positioning method in a GNSS-denied environment, comprising the following specific steps:
step 1, non-line-of-sight error rejection and point cloud motion compensation
The UWB position solution and non-line-of-sight error rejection are performed according to formula (1):

$$\hat{X}_k^- = A_k \hat{X}_{k-1}, \qquad P_k^- = A_k P_{k-1} A_k^T + \Sigma_Q$$
$$K_k = P_k^- H_k^T \left(H_k P_k^- H_k^T + R\right)^{-1}, \qquad residual = Z_k - H_k \hat{X}_k^- \quad (1)$$

In formula (1), $A_k$ is the state transition matrix and $A_k^T$ its transpose; $\hat{X}_{k-1}$ is the state estimate at the previous epoch; $\hat{X}_k^-$ is the predicted state of the state equation and $P_k^-$ its predicted covariance; $\Sigma_Q$ is the process noise; $K_k$, $R$, and $H_k$ are the Kalman gain, the observation noise matrix, and the state observation matrix of the observation equation, respectively, with $H_k^T$ the transpose of the state observation matrix; $Z_k$ is the observation vector; and $residual$ is the residual between observation and prediction. When the residual exceeds the set threshold (0.35), the observation is treated as a non-line-of-sight error and the Kalman gain is set to 0, rejecting the measurement.
the LiDAR point cloud motion compensation formula is as follows:
in the middle ofRelative interframe transformation measured for INS, +.>External reference matrix for LiDAR and INS, < ->Compensating the laser spot for the need under the current frame, < >>To compensate the current point to the coordinate representation of the last point.
Step 2: UWB position solution
UWB uses an extended Kalman filter to solve the position, computes the initial position and attitude, and completes the initialization of the INS system according to the following formula:

$$X_k = \hat{X}_k^- + K_k\left(Z_k - H_k \hat{X}_k^-\right), \qquad P_k = \left(I - K_k H_k\right) P_k^-$$

where $X_k$ is the posterior state quantity and $P_k$ is the covariance matrix solved by the Kalman filter.
Step 3 UWB quality control
UWB positioning is quality-checked using the position dilution of precision (PDOP), the number of base stations, and the observation residuals; positioning points with larger errors are removed according to the following formula:

$$COV = \left(H^T H\right)^{-1}, \qquad PDOP = \sqrt{tr\left(COV\right)}$$

where $COV$ is the observation covariance matrix calculated from the observation matrix $H$, $PDOP$ is the position dilution of precision, and $tr$ is the trace operation. The smaller the PDOP value, the better the corresponding base-station geometry; therefore, when a UWB base station's quality is poor, it can be removed according to the PDOP value (threshold set to 3.0). The number of base stations and the average residual are also introduced as quality-control criteria: when fewer than 4 base stations are available, the UWB positioning result is unusable; after non-line-of-sight errors are removed, positioning is considered valid when the average residual is below a threshold (set here to 0.15).
Step 4: The INS system adopts a mechanization algorithm to realize high-frequency pose computation, and its result serves as a prior for the LiDAR scan-matching process. The dynamic model is given by:

$$\dot{p}^W = v^W, \qquad \dot{v}^W = R_B^W a^B + g^W, \qquad \dot{R}_B^W = R_B^W \left[\tilde{\omega}^B - R_W^B\, \omega_{ie}^W\right]_\times$$

where $W$ denotes the world (earth) coordinate system, defined as the east-north-up frame at the initial epoch, and $B$ is the body (carrier) frame; $g^W$ is the gravity vector in the world frame; $a^B$ is the acceleration measured in the body frame; $\omega_{ie}$ is the earth rotation rate and $[\,\cdot\,]_\times$ denotes the antisymmetric matrix of a vector; $p^W$ and $v^W$ are the position and velocity vectors in $W$; $R_B^W$ is the rotation matrix expressing the attitude of $B$ in $W$; the overdot denotes differentiation with respect to time; $\tilde{\omega}^B$ is the angular velocity measured by the INS; and $R_W^B$ is the attitude of the $W$ frame in the $B$ frame.
Step 5: As shown in FIG. 2, the back end constructs and maintains a lightweight factor graph, fusing LiDAR odometry observations, IMU pre-integration, and UWB absolute observations to reduce accumulated error, while dynamically maintaining the feature map. The optimization equation is:

$$\min_x \left\{ \sum \left\|r_L(x)\right\|^2 + \sum \left\|r_I(x)\right\|^2 + \sum \left\|r_U(x)\right\|^2 \right\}$$

where $\min$ denotes the minimization problem, $r_L(x)$ is the LiDAR odometry factor derived from the transformation between two adjacent radar frames, $r_I(x)$ is the IMU pre-integration constraint of the INS system, and $r_U(x)$ is the UWB global constraint.
In this embodiment, the UWB module is a Nooploop LinkTrack P-B; the INS is an ADIS16497; the LiDAR is a Velodyne VLP-16, a mechanical rotating radar with 16 scan lines, 0.2° angular resolution, and a 100 m detection range. RTK/INS integrated navigation, whose positioning accuracy is at the centimeter level, is used as the evaluation reference. The spatial positions of the dual antennas and the LiDAR/IMU/UWB have been accurately measured. In the vehicle-mounted field experiment, the travel path included both UWB-favorable and occluded regions; the positioning accuracy (root mean square error) using UWB alone was 0.54 m, while the UWB/LiDAR/INS multi-source fusion system achieved 0.28 m, a 48.1% improvement over UWB-only positioning.
It should be noted that the foregoing merely illustrates the technical idea of the present invention and is not intended to limit its scope of protection; a person skilled in the art may make several improvements and modifications without departing from the principles of the invention, and such improvements and modifications fall within the scope of the claims.
Claims (6)
1. A UWB-LiDAR-inertial navigation fusion positioning method in a GNSS-denied environment, characterized by comprising the following steps:
Step 1: first eliminate non-line-of-sight errors in the UWB ranging information, and compensate motion distortion in the raw LiDAR point cloud observations;
Step 2: UWB uses an extended Kalman filter to solve the position, computes the initial position and attitude, and completes the initialization of the INS system;
Step 3: UWB positioning is quality-checked using the dilution of precision (DOP), the number of base stations, and the observation residuals, and positioning points with larger errors are removed;
Step 4: the INS system adopts a mechanization algorithm to realize high-frequency pose computation, and the result serves as a prior for the LiDAR scan-matching process;
Step 5: the back end constructs and maintains a lightweight factor graph, fusing LiDAR odometry observations, IMU pre-integration, and UWB absolute observations to reduce accumulated error, while dynamically maintaining the feature map.
2. The UWB-LiDAR-inertial navigation fusion positioning method in a GNSS-denied environment according to claim 1, wherein in step 1, UWB positioning with non-line-of-sight error rejection and LiDAR point cloud motion compensation are achieved according to the following formulas:

$$\hat{X}_k^- = A_k \hat{X}_{k-1}, \qquad P_k^- = A_k P_{k-1} A_k^T + \Sigma_Q$$
$$K_k = P_k^- H_k^T \left(H_k P_k^- H_k^T + R\right)^{-1}, \qquad residual = Z_k - H_k \hat{X}_k^- \quad (1)$$

In formula (1), $A_k$ is the state transition matrix and $A_k^T$ its transpose; $\hat{X}_{k-1}$ is the state estimate at the previous epoch; $\hat{X}_k^-$ is the predicted state of the state equation and $P_k^-$ its predicted covariance; $\Sigma_Q$ is the process noise; $K_k$, $R$, and $H_k$ are the Kalman gain, the observation noise matrix, and the state observation matrix of the observation equation, respectively, with $H_k^T$ the transpose of the state observation matrix; $Z_k$ is the observation vector; and $residual$ is the residual between observation and prediction; when the residual exceeds the set threshold (0.35), the observation is treated as a non-line-of-sight error and the Kalman gain is set to 0 to reject it.

The LiDAR point cloud motion compensation formula is as follows:

$$\tilde{p}_i = \left(T_{I}^{L}\right)^{-1} T_{k}^{k+1}\, T_{I}^{L}\, p_i$$
3. The UWB-LiDAR-inertial navigation fusion positioning method in a GNSS-denied environment according to claim 1, wherein the UWB in step 2 uses an extended Kalman filter to solve the position, computes the initial position and attitude, and completes the initialization of the INS system according to:

$$X_k = \hat{X}_k^- + K_k\left(Z_k - H_k \hat{X}_k^-\right), \qquad P_k = \left(I - K_k H_k\right) P_k^-$$
4. The UWB-LiDAR-inertial navigation fusion positioning method in a GNSS-denied environment according to claim 1, wherein in step 3, UWB positioning is quality-checked using the position dilution of precision (PDOP), the number of base stations, and the observation residuals, and positioning points with larger errors are removed according to:

$$COV = \left(H^T H\right)^{-1}, \qquad PDOP = \sqrt{tr\left(COV\right)}$$

where $COV$ is the observation covariance matrix calculated from the observation matrix $H$, $PDOP$ is the position dilution of precision, and $tr$ is the trace operation; the smaller the PDOP value, the better the corresponding base-station geometry; therefore, when a UWB base station's quality is poor, it can be removed according to the PDOP value (threshold set to 3.0); the number of base stations and the average residual are also introduced as quality-control criteria: when fewer than 4 base stations are available, the UWB positioning result is unusable; after non-line-of-sight errors are removed, positioning is considered valid when the average residual is below a threshold (set here to 0.15).
5. The UWB-LiDAR-inertial navigation fusion positioning method in a GNSS-denied environment according to claim 1, wherein in step 4, the INS system adopts a mechanization algorithm to realize high-frequency pose computation, its result serving as a prior for the LiDAR scan-matching process, and the INS dynamic model is:

$$\dot{p}^W = v^W, \qquad \dot{v}^W = R_B^W a^B + g^W, \qquad \dot{R}_B^W = R_B^W \left[\tilde{\omega}^B - R_W^B\, \omega_{ie}^W\right]_\times$$

where $W$ denotes the world (earth) coordinate system, defined as the east-north-up frame at the initial epoch, and $B$ is the body (carrier) frame; $g^W$ is the gravity vector in the world frame; $a^B$ is the acceleration measured in the body frame; $\omega_{ie}$ is the earth rotation rate and $[\,\cdot\,]_\times$ denotes the antisymmetric matrix of a vector; $p^W$ and $v^W$ are the position and velocity vectors in $W$; $R_B^W$ is the rotation matrix expressing the attitude of $B$ in $W$; the overdot denotes differentiation with respect to time; $\tilde{\omega}^B$ is the angular velocity measured by the INS; and $R_W^B$ is the attitude of the $W$ frame in the $B$ frame.
6. The UWB-LiDAR-inertial navigation fusion positioning method in a GNSS-denied environment according to claim 1, wherein in step 5, the back end constructs and maintains a lightweight factor graph, fusing LiDAR odometry observations, IMU pre-integration, and UWB absolute observations to reduce accumulated error while dynamically maintaining the feature map, with the optimization equation:

$$\min_x \left\{ \sum \left\|r_L(x)\right\|^2 + \sum \left\|r_I(x)\right\|^2 + \sum \left\|r_U(x)\right\|^2 \right\}$$
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310002241.9A CN116242372A (en) | 2023-01-03 | 2023-01-03 | UWB-LiDAR-inertial navigation fusion positioning method in GNSS-denied environment
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310002241.9A CN116242372A (en) | 2023-01-03 | 2023-01-03 | UWB-LiDAR-inertial navigation fusion positioning method in GNSS-denied environment
Publications (1)
Publication Number | Publication Date |
---|---|
CN116242372A true CN116242372A (en) | 2023-06-09 |
Family
ID=86628876
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310002241.9A Pending CN116242372A (en) | 2023-01-03 | 2023-01-03 | UWB-LiDAR-inertial navigation fusion positioning method in GNSS-denied environment
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116242372A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117804448A (en) * | 2024-02-23 | 2024-04-02 | 北京理工大学 | Autonomous system positioning method, device, computer equipment and storage medium |
- 2023-01-03: Application CN202310002241.9A filed in China, published as CN116242372A (status: Pending)
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117804448A (en) * | 2024-02-23 | 2024-04-02 | 北京理工大学 | Autonomous system positioning method, device, computer equipment and storage medium |
CN117804448B (en) * | 2024-02-23 | 2024-04-30 | 北京理工大学 | Autonomous system positioning method, device, computer equipment and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110262546B (en) | Tunnel intelligent unmanned aerial vehicle inspection method | |
CN110428467B (en) | Robot positioning method combining camera, imu and laser radar | |
Fan et al. | Data fusion for indoor mobile robot positioning based on tightly coupled INS/UWB | |
CN110702091B (en) | High-precision positioning method for moving robot along subway rail | |
CN110243358A (en) | The unmanned vehicle indoor and outdoor localization method and system of multi-source fusion | |
CN103983263A (en) | Inertia/visual integrated navigation method adopting iterated extended Kalman filter and neural network | |
CN110764506B (en) | Course angle fusion method and device of mobile robot and mobile robot | |
CN109855621B (en) | Combined indoor pedestrian navigation system and method based on UWB and SINS | |
CN109752725A (en) | A kind of low speed business machine people, positioning navigation method and Position Fixing Navigation System | |
CN112987065B (en) | Multi-sensor-integrated handheld SLAM device and control method thereof | |
CN104197928A (en) | Multi-camera collaboration-based method for detecting, positioning and tracking unmanned aerial vehicle | |
CN111426320B (en) | Vehicle autonomous navigation method based on image matching/inertial navigation/milemeter | |
CN111025366B (en) | Grid SLAM navigation system and method based on INS and GNSS | |
CN113124856A (en) | Visual inertia tight coupling odometer based on UWB online anchor point and metering method | |
CN113155124B (en) | Multi-source auxiliary navigation method and device | |
CN114019552A (en) | Bayesian multi-sensor error constraint-based location reliability optimization method | |
CN111077907A (en) | Autonomous positioning method of outdoor unmanned aerial vehicle | |
CN116242372A (en) | UWB-laser radar-inertial navigation fusion positioning method under GNSS refusing environment | |
CN115435779A (en) | Intelligent body pose estimation method based on GNSS/IMU/optical flow information fusion | |
CN112444245B (en) | Insect-imitating vision integrated navigation method based on polarized light, optical flow vector and binocular vision sensor | |
Chiang et al. | Semantic proximity update of GNSS/INS/VINS for Seamless Vehicular Navigation using Smartphone sensors | |
CN115930977A (en) | Method and system for positioning characteristic degradation scene, electronic equipment and readable storage medium | |
CN114485623B (en) | Focusing distance camera-IMU-UWB fusion accurate positioning method | |
CN115183767A (en) | Monocular VIO/UWB indoor combined positioning method based on ARKF | |
CN115930948A (en) | Orchard robot fusion positioning method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||