CN113124856A - Visual inertia tight coupling odometer based on UWB online anchor point and metering method - Google Patents


Info

Publication number
CN113124856A
CN113124856A (application CN202110558941.7A)
Authority
CN
China
Prior art keywords
imu
uwb
frame
unmanned aerial
anchor point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110558941.7A
Other languages
Chinese (zh)
Other versions
CN113124856B (en)
Inventor
田栢苓
李海松
谌宏鸣
宗群
王聪
何雷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin University
Original Assignee
Tianjin University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin University filed Critical Tianjin University
Priority to CN202110558941.7A priority Critical patent/CN113124856B/en
Publication of CN113124856A publication Critical patent/CN113124856A/en
Application granted granted Critical
Publication of CN113124856B publication Critical patent/CN113124856B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/10: Navigation by using measurements of speed or acceleration
    • G01C21/12: Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16: Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165: Inertial navigation combined with non-inertial navigation instruments
    • G01C21/005: Navigation with correlation of navigation data from several sources, e.g. map or contour matching
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00: Road transport of goods or passengers
    • Y02T10/10: Internal combustion engine [ICE] based vehicles
    • Y02T10/40: Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)

Abstract

The invention belongs to the field of navigation and control of quad-rotor unmanned aerial vehicles, and aims to solve the positioning and navigation problem of a quad-rotor unmanned aerial vehicle in a GNSS-denied environment. In the visual-inertial tightly-coupled odometer and odometry method based on UWB online anchor points, the anchor positions in the world coordinate system are obtained online with good accuracy. The UWB radio ranging modules, built with ultra-wideband technology, obtain the distance between two modules according to the TW-ToF measurement principle; after part of the noise is filtered by the outlier detection unit, the distance is sent as an input to the tight-coupling optimization unit. The IMU inertial measurements are likewise sent as an input to the tight-coupling optimization unit. The embedded onboard processor uses the tight-coupling optimization unit to process the inter-module UWB distance measurements, the IMU measurements and the binocular camera measurements, and resolves the position and attitude of the unmanned aerial vehicle by tightly-coupled optimization. The invention is mainly applied to navigation and control of quad-rotor unmanned aerial vehicles.

Description

Visual inertia tight coupling odometer based on UWB online anchor point and metering method
Technical Field
The invention belongs to the field of navigation and control of quad-rotor unmanned aerial vehicles and relates to embedded systems and sensor networks, in particular to a quad-rotor positioning system and its associated algorithms in a Global Navigation Satellite System (GNSS)-denied environment, and specifically to a visual-inertial tightly-coupled odometer based on UWB online anchor calibration.
Background
In recent years the quad-rotor unmanned aerial vehicle, with its small size, light weight, good concealment and suitability for multi-platform, multi-environment use, and with advantages such as flexible vertical take-off and landing from the ground or from ships without catapults or launchers, has shown great application value in both military and civil fields. In the civil field, quad-rotors can handle street-view photography, power-line inspection, traffic monitoring, express delivery, post-disaster rescue and the like; on the military side, they play an important role in tasks such as reconnaissance, strikes, information countermeasures, air early warning and communication relay. However, in tunnels, jungles, indoor scenes and the like, non-line-of-sight reception makes the GNSS positioning error extremely large. Positioning and navigation of drones in GNSS-denied environments therefore remains a challenge.
Ultra-wideband (UWB) is a radio ranging technique that has emerged in recent years; it computes the distance between two modules by measuring the arrival time, the arrival time difference or the arrival angle of a radio wave. Because the transmitted signal occupies the band from 3.1 GHz to 4.8 GHz, interference from other radio signals is effectively suppressed. In addition, the large bandwidth easily overcomes multipath effects and weakens the influence of non-line-of-sight measurement. The P440 UWB module produced by Time Domain uses the two-way time-of-flight method (TW-ToF) for ranging and communication; thanks to its good performance and low price, it can be used to build a positioning system that provides positioning information to the drone in a GNSS-denied environment.
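To make the TW-ToF principle concrete, the sketch below converts a round-trip time and the responder's turnaround delay into a range. The function name and its interface are illustrative only, not the P440 API:

```python
def twr_distance(t_round: float, t_reply: float, c: float = 299_792_458.0) -> float:
    """Two-way time-of-flight (TW-ToF) ranging.

    t_round: time from the tag sending a request until it receives the reply (s).
    t_reply: the anchor's internal turnaround delay (s), reported back to the tag.
    Returns the estimated tag-to-anchor distance in metres.
    """
    tof = (t_round - t_reply) / 2.0  # one-way flight time
    return c * tof

# a tag 10 m from the anchor with a 1 ms turnaround delay
d = twr_distance(t_round=1e-3 + 2 * 10 / 299_792_458.0, t_reply=1e-3)
```

Because only time differences enter the formula, tag and anchor need no synchronized clocks, which is why TW-ToF is preferred over one-way ToF here.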
UWB positioning generally proceeds by measuring, through the airborne UWB tag, the distances to the ground anchors in sequence, and then estimating the drone's position with a dedicated algorithm. Its advantages are low sensor cost and immunity to lighting and weather; its notable drawbacks are susceptibility to external radio interference and unsatisfactory positioning accuracy, which limit the drone's application in GNSS-denied environments.
Scholars at home and abroad have carried out a great deal of research on UWB-enhanced autonomous positioning of quad-rotor drones in GNSS-denied environments. In 2017, Chen Wang et al. of Nanyang Technological University, Singapore, proposed a UWB-enhanced rapid positioning and mapping system. Recognizing that a visual SLAM system is highly accurate over short periods but accumulates error during long operation, they corrected the accumulated error with UWB distance measurements and thereby improved system robustness. In 2018, Jianxin Li et al. of the National University of Singapore adopted an EKF (extended Kalman filter) algorithm to fuse IMU (inertial measurement unit) and UWB measurement data, alleviating the lag and low estimation accuracy of UWB positioning used alone. In 2020, Yanjun Cao et al. of Polytechnique Montréal proposed VIR-SLAM, which corrects the accumulated error of monocular visual-inertial odometry using UWB distance measurements. Multi-sensor fusion positioning based on UWB enhancement is therefore a promising scheme. However, UWB-enhanced methods require manual measurement and calculation in advance to obtain the relative pose between the drone's initial moment and the anchors, so every experiment needs a fresh calibration; the operating conditions are demanding indoors and even more restrictive outdoors.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention aims to solve the positioning and navigation problem of the quad-rotor unmanned aerial vehicle in a GNSS-denied environment. The invention provides a design method for a visual-inertial tightly-coupled odometer based on UWB (ultra-wideband) online anchor calibration. An online anchor calibration unit using this method obtains the anchor position in the world coordinate system fairly accurately online; the pose information of the drone is resolved in a tightly-coupled manner by fusing IMU (inertial measurement unit) measurements, binocular visual image information and UWB distance measurements; and by building a quad-rotor platform and running the positioning algorithm, the drone is positioned quickly and accurately, which has great application value. The technical scheme adopted by the invention, a visual-inertial tightly-coupled odometer based on UWB online anchor points, is structured as follows. The UWB radio ranging modules, built with ultra-wideband technology, are deployed in two roles: first, as anchors placed around the environment, whose positions are computed by the online anchor calibration unit; second, as a tag mounted on the drone, which actively sends ranging requests to the anchors in a preset order. The distance between two modules is obtained according to the TW-ToF measurement principle and, after part of the noise is filtered by the outlier detection unit, is sent as an input to the tight-coupling optimization unit. The IMU inertial measurement unit measures the drone's acceleration and angular velocity at high frequency; the IMU measurements are processed by the low-pass filter module and then likewise sent as an input to the tight-coupling optimization unit. The binocular camera measurements are processed by the visual odometer and then sent as an input to the tight-coupling optimization unit. The online anchor calibration unit, the outlier detection unit and the tight-coupling optimization unit are all modules called by the embedded onboard processor, which uses the tight-coupling optimization unit to process the inter-module UWB distance measurements, the IMU measurements and the binocular camera measurements, and resolves the drone's position and attitude by tightly-coupled optimization.
The visual-inertial tightly-coupled odometry method based on UWB online anchor points is realized by the following devices. The UWB radio ranging modules are deployed in two roles: first, as anchors placed around the environment, whose positions are computed by online anchor calibration; second, as a tag mounted on the drone. The system further comprises an onboard binocular camera and an IMU unit. The tag actively sends ranging requests to the anchors in a preset order; the distance between two modules is obtained according to the TW-ToF measurement principle and, after part of the noise is filtered by outlier detection, enters the tight-coupling optimization as an input. The IMU inertial measurement unit measures the drone's acceleration and angular velocity at high frequency; its measurements are low-pass filtered and then also enter the tight-coupling optimization as an input. The binocular camera measurements are processed by the visual odometer and then enter the tight-coupling optimization as an input. The embedded onboard processor processes the inter-module UWB distance measurements, the IMU measurements and the binocular camera measurements, and resolves the drone's position and attitude by tightly-coupled optimization.
IMU pre-integration: to make full use of the IMU data, the IMU data are processed while no UWB or image data are being received, giving a rough prediction of the pose; on the other hand, to avoid repeated computation, the pre-integration values are updated only after each update of the acceleration and angular-velocity zero biases.
The IMU pre-integration comprises the following specific steps. Using the known position, velocity and attitude of the camera at frame k, all IMU measurements between camera frames k and k+1 are integrated to obtain the position, velocity and attitude at frame k+1. The continuous form of the IMU integration is:

$$
\begin{aligned}
p^w_{b_{k+1}} &= p^w_{b_k} + v^w_{b_k}\Delta t_k + \iint_{t\in[t_k,t_{k+1}]}\big(R^w_t(\hat a_t - b_{a_t}) - g^w\big)\,dt^2\\
v^w_{b_{k+1}} &= v^w_{b_k} + \int_{t\in[t_k,t_{k+1}]}\big(R^w_t(\hat a_t - b_{a_t}) - g^w\big)\,dt\\
q^w_{b_{k+1}} &= q^w_{b_k}\otimes\int_{t\in[t_k,t_{k+1}]}\tfrac12\,\Omega(\hat\omega_t - b_{\omega_t})\,q^{b_k}_t\,dt
\end{aligned}\tag{1}
$$

with

$$
\Omega(\omega)=\begin{bmatrix}-\lfloor\omega\rfloor_\times & \omega\\ -\omega^{T} & 0\end{bmatrix},\qquad
\lfloor\omega\rfloor_\times=\begin{bmatrix}0 & -\omega_z & \omega_y\\ \omega_z & 0 & -\omega_x\\ -\omega_y & \omega_x & 0\end{bmatrix}
$$

where $p^w_{b_k}$, $v^w_{b_k}$ and $q^w_{b_k}$ are the position, velocity and quaternion attitude of the body frame b at the time of frame k, expressed in the world coordinate system w; $p^w_{b_{k+1}}$, $v^w_{b_{k+1}}$ and $q^w_{b_{k+1}}$ are the corresponding quantities at frame k+1; $\Delta t_k$ is the time between the two frames; $\hat a_t$ and $\hat\omega_t$ are the acceleration and angular velocity measured by the IMU at time t; $q^{b_k}_t$ is the attitude quaternion at time t relative to the body frame at frame k; $b_{a_t}$ and $b_{\omega_t}$ are the measurement biases of the accelerometer and gyroscope; $g^w$ is the gravitational acceleration in the world coordinate system; and $\omega_x$, $\omega_y$, $\omega_z$ are the three components of $\omega$.

Because the IMU samples discretely in practice, the mid-point discrete form that propagates the IMU state from time i to time i+1 is:

$$
\begin{aligned}
p_{i+1} &= p_i + v_i\,\delta t + \tfrac12\,\bar a_i\,\delta t^2\\
v_{i+1} &= v_i + \bar a_i\,\delta t\\
q_{i+1} &= q_i\otimes\begin{bmatrix}1\\ \tfrac12\,\bar\omega_i\,\delta t\end{bmatrix}
\end{aligned}\tag{2}
$$

where
$$
\bar a_i=\tfrac12\big(q_i(\hat a_i-b_{a_i})+q_{i+1}(\hat a_{i+1}-b_{a_i})\big)-g^w,\qquad
\bar\omega_i=\tfrac12(\hat\omega_i+\hat\omega_{i+1})-b_{\omega_i}
$$
are the measured acceleration and angular velocity, rotated from the IMU frame into the world frame and averaged over the two samples.

To avoid repeated computation, the optimization variables are separated from the IMU pre-integration terms between frames k and k+1:

$$
\begin{aligned}
R^{b_k}_w p^w_{b_{k+1}} &= R^{b_k}_w\big(p^w_{b_k}+v^w_{b_k}\Delta t_k-\tfrac12 g^w\Delta t_k^2\big)+\alpha^{b_k}_{b_{k+1}}\\
R^{b_k}_w v^w_{b_{k+1}} &= R^{b_k}_w\big(v^w_{b_k}-g^w\Delta t_k\big)+\beta^{b_k}_{b_{k+1}}\\
q^w_{b_{k+1}} &= q^w_{b_k}\otimes\gamma^{b_k}_{b_{k+1}}
\end{aligned}\tag{3}
$$

wherein:

$$
\begin{aligned}
\alpha^{b_k}_{b_{k+1}} &= \iint_{t\in[t_k,t_{k+1}]} R^{b_k}_t(\hat a_t-b_{a_t})\,dt^2\\
\beta^{b_k}_{b_{k+1}} &= \int_{t\in[t_k,t_{k+1}]} R^{b_k}_t(\hat a_t-b_{a_t})\,dt\\
\gamma^{b_k}_{b_{k+1}} &= \int_{t\in[t_k,t_{k+1}]} \tfrac12\,\Omega(\hat\omega_t-b_{\omega_t})\,\gamma^{b_k}_t\,dt
\end{aligned}\tag{4}
$$

Here $R^{b_k}_w$ is the rotation matrix from the world coordinate system to the body coordinate system at frame k. Converting formula (3) into the corresponding discrete form, the pre-integrated state at time i+1 is obtained from the state at time i as:

$$
\begin{aligned}
\hat\alpha_{i+1} &= \hat\alpha_i+\hat\beta_i\,\delta t+\tfrac12\,\tilde a_i\,\delta t^2\\
\hat\beta_{i+1} &= \hat\beta_i+\tilde a_i\,\delta t\\
\hat\gamma_{i+1} &= \hat\gamma_i\otimes\begin{bmatrix}1\\ \tfrac12\,\bar\omega_i\,\delta t\end{bmatrix}
\end{aligned}\tag{5}
$$

with $\tilde a_i=\tfrac12\big(\hat\gamma_i(\hat a_i-b_{a_i})+\hat\gamma_{i+1}(\hat a_{i+1}-b_{a_i})\big)$; gravity is absent here because it appears instead in formula (3). The increments $\hat\alpha$, $\hat\beta$, $\hat\gamma$ are the IMU delta states relative to the $b_k$ frame time and depend only on frame $b_k$; iterating equation (5) pre-integrates the relative pose between the two frames.
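The discrete increment propagation of this section can be sketched in a few lines. This is a minimal pure-Python illustration with hand-rolled quaternion helpers; all names are ours, biases are held fixed, and gravity is deliberately absent since it enters only when the increments are applied to the world-frame state:

```python
import math

def quat_mul(q, r):
    """Hamilton product of two quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q (q * v * q^-1)."""
    qv = (0.0, v[0], v[1], v[2])
    w, x, y, z = quat_mul(quat_mul(q, qv), (q[0], -q[1], -q[2], -q[3]))
    return (x, y, z)

def midpoint_step(p, v, q, ba, bg, acc0, gyr0, acc1, gyr1, dt):
    """One mid-point step of the pre-integration increments.

    p, v, q play the roles of alpha-hat, beta-hat, gamma-hat; ba and bg are the
    accelerometer and gyroscope biases, held fixed during the step.
    """
    w_mid = tuple(0.5*(g0 + g1) - b for g0, g1, b in zip(gyr0, gyr1, bg))
    dq = (1.0, 0.5*w_mid[0]*dt, 0.5*w_mid[1]*dt, 0.5*w_mid[2]*dt)
    q1 = quat_mul(q, dq)
    n = math.sqrt(sum(c*c for c in q1))       # renormalize the small-angle product
    q1 = tuple(c / n for c in q1)
    a0 = quat_rotate(q, tuple(a - b for a, b in zip(acc0, ba)))
    a1 = quat_rotate(q1, tuple(a - b for a, b in zip(acc1, ba)))
    a_mid = tuple(0.5*(u + w) for u, w in zip(a0, a1))
    p1 = tuple(pi + vi*dt + 0.5*ai*dt*dt for pi, vi, ai in zip(p, v, a_mid))
    v1 = tuple(vi + ai*dt for vi, ai in zip(v, a_mid))
    return p1, v1, q1
```

When the accelerometer or gyroscope zero bias is re-estimated, only these increments need to be re-propagated (or first-order corrected), not the full world-frame trajectory.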
The specific steps of outlier detection and partial noise filtering are as follows. While the program is in the online anchor calibration stage, the outlier detection unit filters the received UWB ranging information. Let d denote a UWB measurement and collect n sample points $d_i\ (i=1,\dots,n)$ in the current scene; the i-th distance measurement $d_i$ is determined to be an outlier if either of the following conditions holds:

$$
\Big|\,d_i-\frac1n\sum_{j=1}^{n} d_j\,\Big|>\gamma \tag{6}
$$

$$
|d_i-d_{i-1}|>\frac{v_{\max}}{f} \tag{7}
$$

where $\frac1n\sum_j d_j$ is the average of the n ranging values, $\gamma$ is an adjustable threshold, $v_{\max}$ is the maximum flight speed of the drone, and f is the ranging update rate of the UWB module. If the distance is judged to be an outlier it is discarded directly; otherwise $d_i$ is sent as input to the online anchor calibration unit.

When the online anchor calibration stage has finished, the two-norm of the difference between the current drone position and the calibrated anchor position is used as the criterion, and the i-th distance measurement $d_i$ is determined to be an outlier if:

$$
\big|\,\|P_k-P_a\|_2-d_i\,\big|>\gamma+\frac{v_{\max}}{f} \tag{8}
$$

where $P_k$ is the current drone position, $P_a$ is the anchor coordinate in the coordinate system of the drone's initial moment obtained by the online anchor calibration unit, $\gamma$ is an adjustable threshold, $v_{\max}$ is the maximum flight speed of the drone, and f is the ranging update rate of the UWB module. If the distance is judged to be an outlier it is discarded directly; otherwise $d_i$ is fed as input into the tight-coupling unit.
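A minimal sketch of the calibration-stage filter, implementing the two tests above (deviation from the running mean beyond the threshold, or a jump from the previous range larger than the distance the vehicle could cover in one ranging interval). Names and the exact thresholds are illustrative:

```python
def is_outlier(d_i, d_prev, samples, gamma, v_max, f):
    """Flag a UWB range as an outlier during the calibration stage.

    d_i:     the new range measurement (m)
    d_prev:  the previous accepted range, or None for the first sample
    samples: recent range samples used for the running mean
    gamma:   adjustable deviation threshold (m)
    v_max:   maximum flight speed of the drone (m/s)
    f:       UWB ranging update rate (Hz); v_max / f bounds the per-sample change
    """
    mean = sum(samples) / len(samples)
    if abs(d_i - mean) > gamma:              # condition (6): far from the mean
        return True
    if d_prev is not None and abs(d_i - d_prev) > v_max / f:
        return True                          # condition (7): physically impossible jump
    return False
```

For example, with samples averaging 5.0 m, gamma = 0.5 m and v_max / f = 0.2 m, a reading of 6.0 m fails the mean test and a reading of 5.05 m after a previous 4.7 m fails the jump test.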
Online anchor calibration: when UWB is used for positioning, the anchor's coordinates in the coordinate system of the drone's initial takeoff moment are calibrated using the output odometry information and the UWB ranging information.
The online anchor calibration comprises the following specific steps. A ranging sample is retained only if it differs sufficiently from the current reference sample; the i-th measurement is discarded if both of the following conditions hold:

$$
|d_i-d_n|<\gamma \tag{9}
$$

$$
\frac{\big\|\hat p^w_i-\hat p^w_n\big\|_2\,f}{i-n}<v_{thr} \tag{10}
$$

where $d_i$ is the i-th distance measurement, $d_n$ is the n-th distance measurement ($i>n$), $\gamma$ is an adjustable threshold, f is the UWB ranging update rate, $\hat p^w$ is the current drone pose in the world coordinate system, and $v_{thr}$ is a preset speed threshold. If the conditions hold, the i-th distance measurement is discarded and the (i+1)-th is substituted into the formula for judgment; by analogy, as soon as the m-th distance measurement no longer meets the conditions, n is reassigned to m.

Thus, for each retained i-th distance measurement, a residual on the ranging information is obtained:

$$
r_i=\big\|\hat p^w_i-p^w_a\big\|_2-d_i \tag{11}
$$

where $d_i$ is the i-th distance measurement, $\hat p^w_i$ is the drone pose in the world coordinate system at the same timestamp as the UWB ranging information, obtained from IMU pre-integration and the visual odometer, and $p^w_a$ is the three-dimensional coordinate of the anchor to be estimated in the world coordinate system. The optimized cost function $E_r$ is therefore:

$$
E_r=\sum_i \rho\big(\|r_i\|^2\big) \tag{12}
$$

where $\rho(\cdot)$ is the pseudo-Huber loss function. After the cost function is written using the automatic differentiation facility of Ceres, the variable to be optimized is $p^w_a$; the optimization can be carried out once the parameters $d_i$ and $\hat p^w_i$ are passed through the interface. The method judges the optimization to have terminated when the two-norm of the difference between the $p^w_a$ estimates obtained by two successive optimizations is less than 0.1.
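The anchor estimate can be sketched as a small Gauss-Newton solve on the range residuals. The patent minimizes a pseudo-Huber cost with Ceres; this illustration uses a plain L2 cost and a hand-rolled 3x3 solver to stay self-contained, and all names are ours:

```python
import math

def _det3(m):
    return (m[0][0]*(m[1][1]*m[2][2] - m[1][2]*m[2][1])
          - m[0][1]*(m[1][0]*m[2][2] - m[1][2]*m[2][0])
          + m[0][2]*(m[1][0]*m[2][1] - m[1][1]*m[2][0]))

def _solve3(H, b):
    """Cramer's rule for the 3x3 normal equations H x = b."""
    D = _det3(H)
    out = []
    for c in range(3):
        M = [row[:] for row in H]
        for r in range(3):
            M[r][c] = b[r]
        out.append(_det3(M) / D)
    return out

def calibrate_anchor(positions, ranges, x0, iters=100):
    """Gauss-Newton on the residuals r_i = ||p_i - x|| - d_i.

    positions: drone positions p_i from the odometry (world frame)
    ranges:    UWB distances d_i that survived outlier detection
    x0:        initial guess for the anchor position x
    """
    x = list(x0)
    for _ in range(iters):
        H = [[0.0]*3 for _ in range(3)]   # J^T J
        g = [0.0]*3                       # J^T r
        for p, d in zip(positions, ranges):
            diff = [xc - pc for xc, pc in zip(x, p)]
            rng = math.sqrt(sum(c*c for c in diff)) or 1e-9
            r = rng - d
            J = [c / rng for c in diff]   # gradient of ||x - p|| w.r.t. x
            for a in range(3):
                g[a] += J[a] * r
                for b in range(3):
                    H[a][b] += J[a] * J[b]
        if abs(_det3(H)) < 1e-12:
            break                         # degenerate geometry, stop
        dx = _solve3(H, g)
        x = [xc - dc for xc, dc in zip(x, dx)]
        if sum(d*d for d in dx) < 1e-16:
            break                         # converged
    return tuple(x)
```

The geometry matters: the drone positions must not be collinear or coplanar with the anchor, otherwise the normal equations become singular, which is exactly why the sample-selection conditions above demand sufficient motion between retained samples.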
The tight-coupling optimization comprises an IMU factor, a UWB factor and a visual factor. Within a window, the IMU factor yields the pose residual between each pair of adjacent frames via pre-integration, the UWB factor yields the distance residual between the drone and the anchor at each frame time, and the visual factor yields the pose residual at each frame via reprojection. After the residuals of all factors are obtained, a solver that models and solves the optimization problem uses automatic differentiation to optimize the parameters of every frame in the window, producing the pose information and the zero biases of acceleration and angular velocity. Let $t_k$ be the timestamp of the (k+1)-th image in the window, $t_{k+1}$ the timestamp of the (k+2)-th image, $t_{ki}$ the timestamp of the i-th IMU datum between those two images, and $t_{ku}$ the timestamp of the u-th UWB datum between them.
UWB factor

For each UWB ranging measurement $d_u$ at a time inside the window, a residual on the ranging information and a cost function $\Gamma_\alpha$ are obtained:

$$
r_u=\Big\|\,p^w_{b_k}+v^w_{b_k}\Delta t_{uk}-\tfrac12 g^w\Delta t_{uk}^2+R^w_{b_k}\hat\alpha^{b_k}_{t_{ku}}-p^w_a\,\Big\|_2-d_u
$$

$$
\Delta t_{uk}=t_{ku}-t_k,\qquad \Gamma_\alpha=\sum_u\rho\big(\|r_u\|^2\big)
$$

where $p^w_{b_k}$ is the drone's position in the world coordinate system at time $t_k$, $v^w_{b_k}$ is the drone's velocity in the world coordinate system at that time, $\Delta t_{uk}$ is the difference between the UWB data timestamp and the image timestamp, $g^w$ is the three-dimensional gravitational acceleration in the world coordinate system, $R^w_{b_k}$ is the drone's rotation matrix in the world coordinate system at time $t_k$, $\hat\alpha^{b_k}_{t_{ku}}$ is the pre-integrated position component derived from the IMU information, $p^w_a$ is the anchor's three-dimensional coordinate in the world coordinate system obtained from the online anchor calibration unit, and $\rho(\cdot)$ is the pseudo-Huber loss function.
IMU factor

For the IMU measurement information between two frames, the cost function $\Gamma_\beta$ on the IMU measurement is obtained:

$$
\Gamma_\beta=\left\|\begin{bmatrix}
R^{b_k}_w\big(p^w_{b_{k+1}}-p^w_{b_k}-v^w_{b_k}\Delta t_k+\tfrac12 g^w\Delta t_k^2\big)-\hat\alpha^{b_k}_{b_{k+1}}\\
R^{b_k}_w\big(v^w_{b_{k+1}}-v^w_{b_k}+g^w\Delta t_k\big)-\hat\beta^{b_k}_{b_{k+1}}\\
2\big[(\hat\gamma^{b_k}_{b_{k+1}})^{-1}\otimes(q^w_{b_k})^{-1}\otimes q^w_{b_{k+1}}\big]_{xyz}\\
b_{a_{k+1}}-b_{a_k}\\
b_{\omega_{k+1}}-b_{\omega_k}
\end{bmatrix}\right\|^2
$$

where $\hat\alpha^{b_k}_{b_{k+1}}$, $\hat\beta^{b_k}_{b_{k+1}}$ and $\hat\gamma^{b_k}_{b_{k+1}}$ are the position, velocity and attitude increments from frame k to frame k+1 obtained by the odometer, $b_{\omega_{k+1}}$ and $b_{\omega_k}$ are the gyroscope biases in the world coordinate systems at frames k+1 and k, and $b_{a_{k+1}}$ and $b_{a_k}$ are the corresponding accelerometer biases at frames k+1 and k.
visual factor
Vision is defined as the reprojection error of the tracked features, the reprojection of all features in the current frame is compared with their first observation, a cost function Γ for the reprojection error is obtainedγ
Figure BDA00030783624300000612
Wherein
Figure BDA00030783624300000613
Is the l < th > frame image of the camerakThe coordinates of each feature, pi is a projection function that converts homogeneous coordinates to image coordinates,
Figure BDA00030783624300000614
respectively representing the frame rotation matrix and translation matrix from the ith frame to the jth frame of the camera,
Figure BDA00030783624300000615
is the 3D position of the kth feature in the first observation frame i, the visual factor iterates continuously over all frames and all tracked features in the estimation state.
The method also uses a sliding window together with a marginalization strategy. A frame moved out of the sliding window becomes a marginalized frame, and any constraint involving it is called a marginalization constraint. The marginalized frame is no longer updated, but the marginalization constraint becomes prior information for the frames still being optimized normally, which is equivalent to changing from solving a joint probability distribution to solving a conditional probability distribution with that prior.
One window holds ten keyframes, with the result of the tight-coupling optimization as input. Each frame carries its own timestamp $t_k$, and the factors are mutually independent, forming the constraint terms of the tight-coupling optimization; this guarantees at most ten keyframes in the window. If the count would exceed ten, each factor attached to the keyframe to be marginalized is first modified, all raw data in the marginalized keyframe are discarded directly, the modified factors are coupled with the most recently received factors, and the tight-coupling optimization is run again on the variables to be optimized, yielding the drone's pose and map information.
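The window bookkeeping can be sketched as follows: a fixed-size keyframe queue whose evicted frame is folded into a prior instead of being re-optimized. The class and the placeholder prior structure are illustrative only; a real system would Schur-complement the old states out of the Hessian:

```python
from collections import deque

WINDOW_SIZE = 10  # the method keeps at most ten keyframes per window

class SlidingWindow:
    """Minimal keyframe window with a marginalization hook (sketch)."""

    def __init__(self):
        self.frames = deque()
        self.prior = None  # stands in for the marginalization constraint

    def add_keyframe(self, frame):
        self.frames.append(frame)
        if len(self.frames) > WINDOW_SIZE:
            old = self.frames.popleft()
            self.prior = self._marginalize(old, self.prior)

    def _marginalize(self, frame, prior):
        # placeholder: record which frame was absorbed into the prior;
        # the absorbed frame's raw data are dropped, only the constraint remains
        return {"absorbed": frame, "previous": prior}

win = SlidingWindow()
for k in range(12):
    win.add_keyframe(k)
```

After twelve insertions the window holds frames 2 through 11, while frames 0 and 1 survive only through the chained prior: earlier information constrains the optimization without growing the number of optimized poses.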
The invention has the characteristics and beneficial effects that:
the method has very important significance for the research of the positioning and navigation algorithm of the quad-rotor unmanned aerial vehicle in the GNSS rejection environment. The method is stable and reliable, is not influenced by weather, light and the like, does not need to know the position of the anchor point in advance, calibrates the position of the anchor point by an online method, saves computing resources by an algorithm, and has lower requirements on hardware. The invention has high theoretical and engineering value, and the advantages of the invention are summarized as follows:
(1) The invention provides IMU pre-integration and sliding-window strategies, which save computing resources and improve the real-time performance of the system. The IMU pre-integration unit avoids recomputing the predicted pose every time the acceleration and angular-velocity zero biases are optimized, saving time in each sub-optimization of the tight-coupling optimization unit. After the drone takes off, the number of variables to be optimized would otherwise grow without bound as time accumulates; the windowing strategy reduces the number of optimized poses while earlier information is not lost, so accuracy is not affected.
(2) The invention designs an algorithm for calibrating the anchor position online, saving labor and improving the practicality of UWB-enhanced positioning. Traditional methods require manual measurement and calculation in advance to obtain the relative pose between the drone's initial moment and the anchor, so every experiment needs a fresh calibration; the operating conditions are demanding indoors and even more restrictive outdoors. The online anchor calibration algorithm designed here couples the odometry information obtained from IMU and camera data with the ranging information, and after the drone has flown for a period of time it yields the anchor coordinate in the coordinate system of the drone's initial moment. Once calibration is complete, the UWB information enters the tight-coupling optimization unit as a constraint.
(3) The invention designs an optimization algorithm based on tight coupling. In loose coupling, a pose is first computed from the raw data of one sensor as an initial value, then fused with the other sensors, and the optimization continues from that initial value. In tight coupling, the raw data of all sensors are used: the observations of each sensor yield their own residuals, and one large optimization function obtains the optimized parameter information. Accordingly, this method processes the IMU, UWB and binocular camera information with a tightly-coupled optimization algorithm to obtain the drone's pose and the zero biases of acceleration and angular velocity.
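The loose-versus-tight distinction above can be made concrete in one line: tight coupling means a single objective accumulates the residuals of every sensor factor over the same shared state. A toy one-dimensional sketch (all names ours):

```python
def total_cost(state, factors):
    # Tight coupling: one objective sums the squared residuals of every
    # sensor factor, all evaluated on the SAME shared state vector.
    return sum(e * e for f in factors for e in f(state))

# toy 1-D example: a scalar position state, one "UWB" and one "odometry" factor
uwb  = lambda s: [s[0] - 3.0]   # range measurement says the position should be 3
odom = lambda s: [s[0] - 2.0]   # odometry says the position should be 2

# the jointly optimal position, s = 2.5, balances both sensors at once
assert total_cost([2.5], [uwb, odom]) == 0.5
```

A loosely coupled pipeline would instead solve each factor separately and fuse the two pose estimates afterwards, discarding the cross-sensor information that the joint objective retains.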
(4) The drone positioning platform built here for GNSS-denied environments is highly extensible: besides sensor equipment such as the IMU, UWB modules and binocular camera, developers can add sensors such as lidar or altitude-holding radar according to their own needs, enabling secondary development.
Description of the drawings:
FIG. 1 is a photograph of the quad-rotor unmanned aerial vehicle system hardware.
FIG. 2 is a block diagram of the design structure of the visual-inertial tightly-coupled odometer based on UWB online anchor calibration.
FIG. 3 is a schematic diagram of the online anchor calibration unit.
FIG. 4 is a schematic diagram of four online anchor calibration results in the indoor scene.
FIG. 5 is a schematic diagram of four online anchor calibration results in the corridor scene.
FIG. 6 is a top view of the indoor-scene experiment in Rviz.
FIG. 7 is a side view of the indoor-scene experiment in Rviz.
FIG. 8 is a comparative error-analysis table for four experiments in the indoor scene.
FIG. 9 is a top view of the corridor-scene experiment in Rviz.
FIG. 10 is a side view of the corridor-scene experiment in Rviz.
FIG. 11 is a comparative error-analysis table for four experiments in the corridor scene.
FIG. 12 is the absolute-error plot of the Vins method (an advanced open-source visual-inertial odometer) on the Euroc dataset (a recognized public dataset).
FIG. 13 is the absolute-error plot of the present method on the Euroc dataset.
FIG. 14 is the trajectory plot of Vins, the present method and the ground truth on the Euroc dataset.
Detailed Description
The technical scheme adopted by the invention comprises two parts, hardware and software. The hardware comprises: a UWB ranging module, an IMU inertial measurement sensor, a binocular camera, an embedded onboard processor and an unmanned aerial vehicle platform. The software comprises: an IMU pre-integration unit, an outlier detection unit, an online anchor point calibration unit, a sliding window unit and a tight coupling optimization unit. The hardware of the system is shown in attached figure 1, and the overall algorithm flow is shown in attached figure 2. Each part is described in detail below.
Hardware part:
UWB refers to an electric-wave ranging module built on ultra-wideband technology. It generally measures the distance between two end points by emitting nanosecond or even sub-nanosecond pulse signals; the time resolution of these pulses is extremely high, which guarantees high-precision ranging and achieves centimeter-level accuracy under ideal conditions. The spectrum of the UWB signal is wider than that of most radio ranging technologies, and its abundant low-frequency components give it high penetrating power: it can easily pass through obstacles such as boards and glass, and can complete ranging in scenes where line-of-sight and non-line-of-sight conditions coexist. The UWB modules are deployed in two roles. First, as anchor points placed around the environment; an anchor point's position must be known, and it can be calculated by the online anchor point calibration unit. Second, as a tag placed on the unmanned aerial vehicle; the tag actively sends ranging requests to the anchor points in a preset sequence, and the distance between the two modules is obtained according to the two-way time-of-flight (TW-ToF) measurement principle. Because the raw ranging carries a certain error, the noise is partially filtered by the outlier detection unit before being sent as input to the tight coupling optimization unit. The IMU inertial measurement unit measures the acceleration and angular velocity of the unmanned aerial vehicle at high frequency; the IMU measurements are processed by a low-pass filter module and likewise sent as input to the tight coupling optimization unit.
The measurements of the binocular camera are processed by the visual odometer and then sent as input to the tight coupling optimization unit. The embedded onboard processor runs the algorithm in real time, resolving the position and attitude of the unmanned aerial vehicle through tightly coupled optimization of the inter-module UWB distance measurements, the IMU measurements and the binocular camera measurements. The unmanned aerial vehicle platform carries the above equipment for flight verification.
A software part:
(1) IMU pre-integration unit
The measurement frequency of the IMU is often above 100 Hz, an output rate much higher than that of the camera and the UWB. The IMU can provide fairly accurate acceleration and angular velocity information over a short time, but drifts over longer periods. On the one hand, to make full use of the IMU data, the IMU data must be processed while no UWB or image data has arrived, giving a rough prediction of the IMU pose. On the other hand, the state quantities to be estimated include, besides the pose of the unmanned aerial vehicle, the zero biases of the acceleration and the angular velocity, which would mean recomputing the IMU pose prediction at every optimization. To avoid this repeated computation, the IMU pre-integration unit updates the value of the pre-integration each time the acceleration and angular velocity biases are updated.
(2) Outlier detection unit
Even though UWB distance measurement can reach centimeter level in the ideal case, in practice the ranging often carries static errors as well as errors due to temperature, scene conditions, and the like. Furthermore, in scenes with dense obstacles, outliers may be introduced; an outlier denotes a measurement that plainly contradicts the facts, such as a negative reading or a sudden large jump, and it leads to erroneous state estimation. The outliers are detected by a thresholding method, and once a distance measurement is judged to be an outlier, it is rejected directly. Experimental tests show that outliers have a quite pronounced influence on state estimation: they degrade positioning accuracy and may even cause positioning to fail completely; introducing the outlier detection unit improves this situation.
(3) Online anchor point calibration unit
Conventional UWB-based positioning methods require knowing in advance the relative pose between the tag mounted on the drone and the anchor points deployed on the ground. The usual approach is either to arrange the ground anchor points perpendicular or parallel to the drone's initial heading and measure the distances manually, or to obtain the relative pose offline and then switch modes, treating the calibrated anchor positions as known information. Such methods are clearly deficient: they are inconvenient, cannot guarantee accuracy, and limit the practicality of UWB-enhanced positioning. As shown in fig. 3, the invention designs an online anchor point calibration unit which, assuming that the IMU and visual positioning are accurate enough during the period shortly after the unmanned aerial vehicle takes off, calibrates the coordinates of the anchor points in the coordinate system at the drone's initial takeoff moment using the output odometer information and the UWB ranging information.
(4) Sliding window unit
In optimization-based positioning, as the running time grows, the states to be optimized and the observation constraints accumulate, and the scale of the Hessian matrix H_l rises rapidly, so that the solving speed drops and the real-time requirement can no longer be met. To address this issue, a back-end optimization based on a sliding-window strategy is used here. Only the frames and constraints within the sliding window are considered when constructing the joint optimization problem. While the odometer runs, new frames are continuously added to the window and old frames are continuously removed from it, so that the number of frames to be optimized stays bounded and the solving speed does not degrade. A frame removed from the sliding window may still share constraints with frames inside the window; discarding it outright would lose those constraints and reduce the accuracy of the odometer, so the sliding window is normally paired with a marginalization strategy. The marginalization strategy treats the frame moved out of the window as a marginalized frame, and the constraints involving it as marginalized constraints. The marginalized frame is no longer updated, but the marginalized constraints become prior information for the normal frames still being optimized, which is equivalent to changing from solving a joint probability distribution to solving a conditional probability distribution with a prior. This keeps the scale of the optimization problem bounded without losing constraints.
(5) Tightly coupled optimization unit
The tight coupling optimization unit is the core unit of this unmanned aerial vehicle positioning technique and contains IMU factors, UWB factors and visual factors. Within the window, the IMU factors yield pose residuals between every two adjacent data frames via pre-integration, the UWB factors yield distance residuals between the unmanned aerial vehicle and the anchor points at each frame time, and the visual factors yield pose residuals at each frame via reprojection. After the residuals of all factors are obtained, the parameters of each frame in the window are optimized with Ceres (a solver for modeling and solving large, complex optimization problems, used here with its automatic differentiation function) to obtain the pose information and the zero biases of the acceleration and the angular velocity.
The invention provides a design of a visual inertia tight coupling odometer based on UWB (ultra wide band) online anchor point calibration, which mainly comprises a hardware part and an algorithm part.
The following describes a design of a visual inertia tightly-coupled odometer based on UWB online anchor point calibration in a GNSS rejection environment in further detail with reference to the accompanying drawings.
A hardware embodiment of the quad-rotor unmanned aerial vehicle system designed around the visual inertia tightly-coupled odometer with UWB online anchor point calibration in a GNSS-denied environment is shown in attached figure 1. First, the hardware configuration of the system is described in detail: the hardware comprises the quad-rotor unmanned aerial vehicle, a UWB ultra-wideband ranging sensor, a binocular camera, an IMU inertial measurement unit and an embedded onboard processor. In view of system reliability and positioning accuracy, high-accuracy sensor equipment is required; the UWB ultra-wideband ranging sensor employs the P440 module manufactured by the American company Time Domain. The P440 is an ultra-wideband radio transceiver operating between 3.1 GHz and 4.8 GHz. It is a coherent radio transceiver, i.e., the energy of each transmitted pulse can be summed to increase the signal-to-noise ratio (SNR) of the received signal. The P440 mainly performs two-way time-of-flight ranging among 2 or more modules, with a refresh rate of up to 125 Hz, and communication is possible among two or more modules. It provides a networking function that optimizes two-way time-of-flight ranging, where network ranging can use the ALOHA (random access) or TDMA (time division multiple access) protocol; the measurement accuracy reaches 10 cm at extremely low transmit power, and four functions (ranging, data transmission, monostatic radar and multistatic radar) can be executed simultaneously. Ranging includes a single-point mode and a networked mode; the invention mainly uses the single-point ranging function.
For the binocular camera, the invention adopts the RealSense D435i depth camera manufactured by Intel Corporation, which has left and right imagers, an IR projector, an RGB module, a built-in IMU, etc.; however, the method uses only the left and right imagers as a binocular camera. The RealSense D435i is low-cost, lightweight and powerful, with a shooting range of up to 10 meters; it integrates easily into various platforms and provides cross-platform support.
For the IMU inertial measurement unit, the invention employs the MTi-300 manufactured by Xsens Technologies of the Netherlands, an industrial-grade IMU based on micro inertial sensing technology that can output raw angular rate and linear acceleration information at a frequency of 1000 Hz. The sensor information is transmitted directly to the host computer for processing via a USB 3.0 interface.
The embedded onboard processor is selected in view of the need to process large amounts of data while being mountable on the drone; the embedded development platform must combine excellent performance with small volume and light weight. The invention uses an Intel eighth-generation NUC as the onboard processor. The NUC is Intel's lightest-weight computer; carrying an eighth-generation i7 processor and 32 GB of memory, it makes a good system platform for an embedded onboard processor.
In hardware, the quad-rotor unmanned aerial vehicle platform consists of a carbon fiber frame, a Pixhawk open-source flight controller, DJI E300 motors and electronic speed controllers, a 6S high-voltage battery, and the above sensors. The algorithm programs are developed under the Linux operating system on the ROS (Robot Operating System) open-source robot framework. Under the ROS environment, the data collected by the camera, the IMU sensor and the UWB ranging module are received and processed by the visual inertia tight coupling odometer based on UWB online anchor point calibration designed by the invention, realizing real-time autonomous positioning and environment sensing of the unmanned aerial vehicle. The key algorithms are all written in the C++ language.
(1) IMU pre-integration unit
The measurement frequency of the IMU is often above 100 Hz, an output rate much higher than that of the camera and the UWB. The IMU can provide fairly accurate acceleration and angular velocity information over a short time, but drifts over longer periods. On the one hand, to make full use of the IMU data, the IMU data must be processed while no UWB or image data has arrived, giving a rough prediction of the IMU pose. On the other hand, the state quantities to be estimated include, besides the pose of the unmanned aerial vehicle, the zero biases of the acceleration and the angular velocity, which would mean recomputing the IMU pose prediction at every optimization. To avoid this repeated computation, the IMU pre-integration unit updates the value of the pre-integration each time the acceleration and angular velocity biases are updated. The method not only uses the IMU to predict the pose at the moment of each image frame, but also applies IMU pre-integration in the UWB preprocessing part by relating the UWB timestamps to the IMU timestamps; in this way the pose of the unmanned aerial vehicle at the moment it receives UWB information can be predicted.
Integrating all IMU measurement data between the kth and (k+1)th camera frames, using the known position, velocity and attitude of the kth camera frame, yields the position, velocity and attitude corresponding to the (k+1)th frame. The continuous form of the IMU integration is:

$$
\begin{aligned}
p_{b_{k+1}}^{w} &= p_{b_k}^{w} + v_{b_k}^{w}\,\Delta t_k + \iint_{t\in[t_k,t_{k+1}]}\left(R_t^{w}(\hat{a}_t - b_{a_t}) - g^{w}\right)dt^2\\
v_{b_{k+1}}^{w} &= v_{b_k}^{w} + \int_{t\in[t_k,t_{k+1}]}\left(R_t^{w}(\hat{a}_t - b_{a_t}) - g^{w}\right)dt\\
q_{b_{k+1}}^{w} &= q_{b_k}^{w} \otimes \int_{t\in[t_k,t_{k+1}]}\tfrac{1}{2}\,\Omega(\hat{\omega}_t - b_{w_t})\,q_t^{b_k}\,dt
\end{aligned}
\tag{1}
$$

with

$$
\Omega(\omega) = \begin{bmatrix} -\lfloor\omega\rfloor_{\times} & \omega \\ -\omega^{T} & 0 \end{bmatrix},\qquad
\lfloor\omega\rfloor_{\times} = \begin{bmatrix} 0 & -\omega_z & \omega_y \\ \omega_z & 0 & -\omega_x \\ -\omega_y & \omega_x & 0 \end{bmatrix}
$$

wherein $p_{b_k}^{w}$, $v_{b_k}^{w}$ and $q_{b_k}^{w}$ denote the position, velocity and quaternion rotation of the body $b$ at the time of the kth frame in the world coordinate system $w$; $p_{b_{k+1}}^{w}$, $v_{b_{k+1}}^{w}$ and $q_{b_{k+1}}^{w}$ are the position, velocity and quaternion rotation of the body $b$ at the time of the (k+1)th frame in the world coordinate system $w$; $\Delta t_k$ is the time difference between the two IMU measurements; $\hat{a}_t$ and $\hat{\omega}_t$ are the acceleration and angular velocity measured by the IMU at time $t$; $q_t^{w}$ is the quaternion at time $t$ in the world coordinate system $w$; $b_{a_t}$ and $b_{w_t}$ are the measured biases of the acceleration and angular velocity respectively; $g^{w}$ is the gravitational acceleration in world coordinates; and $\omega_x$, $\omega_y$, $\omega_z$ are the three components of $\omega$.
Because the IMU samples discretely in practice, the median (midpoint) discrete form that solves the IMU state at time i+1 from the IMU state at time i is:

$$
\begin{aligned}
p_{i+1} &= p_i + v_i\,\delta t + \tfrac{1}{2}\,\bar{a}_i\,\delta t^2\\
v_{i+1} &= v_i + \bar{a}_i\,\delta t\\
q_{i+1} &= q_i \otimes \begin{bmatrix} 1 \\ \tfrac{1}{2}\,\bar{\omega}_i\,\delta t \end{bmatrix}
\end{aligned}
\tag{2}
$$

where

$$
\bar{a}_i = \tfrac{1}{2}\left[q_i(\hat{a}_i - b_{a_i}) - g^{w} + q_{i+1}(\hat{a}_{i+1} - b_{a_i}) - g^{w}\right],\qquad
\bar{\omega}_i = \tfrac{1}{2}(\hat{\omega}_i + \hat{\omega}_{i+1}) - b_{w_i}
$$

are, respectively, the measured acceleration and angular velocity in the IMU frame converted to the body acceleration and angular velocity in the world frame; the other notation is as explained above.
As can be seen from the continuous form of the IMU integration, solving the state at time k+1 depends on the attitude $q_{b_k}^{w}$ of the kth frame. Whenever the back-end optimization updates $q_{b_k}^{w}$, the IMU measurements between the frames must be integrated again, causing many repeated integrations. To address this problem, the optimization variables are separated from the IMU pre-integration terms from frame k to frame k+1:

$$
\begin{aligned}
R_w^{b_k}\,p_{b_{k+1}}^{w} &= R_w^{b_k}\left(p_{b_k}^{w} + v_{b_k}^{w}\Delta t_k - \tfrac{1}{2}g^{w}\Delta t_k^2\right) + \alpha_{b_{k+1}}^{b_k}\\
R_w^{b_k}\,v_{b_{k+1}}^{w} &= R_w^{b_k}\left(v_{b_k}^{w} - g^{w}\Delta t_k\right) + \beta_{b_{k+1}}^{b_k}\\
q_w^{b_k} \otimes q_{b_{k+1}}^{w} &= \gamma_{b_{k+1}}^{b_k}
\end{aligned}
\tag{3}
$$

wherein:

$$
\begin{aligned}
\alpha_{b_{k+1}}^{b_k} &= \iint_{t\in[t_k,t_{k+1}]} R_t^{b_k}(\hat{a}_t - b_{a_t})\,dt^2\\
\beta_{b_{k+1}}^{b_k} &= \int_{t\in[t_k,t_{k+1}]} R_t^{b_k}(\hat{a}_t - b_{a_t})\,dt\\
\gamma_{b_{k+1}}^{b_k} &= \int_{t\in[t_k,t_{k+1}]} \tfrac{1}{2}\,\Omega(\hat{\omega}_t - b_{w_t})\,\gamma_t^{b_k}\,dt
\end{aligned}
$$

where $R_w^{b_k}$ is the rotation matrix from the world coordinate system to the body coordinate system of the kth frame. Converting formula (3) into the corresponding discrete form, the IMU pre-integration state at time i+1 is solved from the state at time i as:

$$
\begin{aligned}
\alpha_{i+1} &= \alpha_i + \beta_i\,\delta t + \tfrac{1}{2}\,\bar{a}_i\,\delta t^2\\
\beta_{i+1} &= \beta_i + \bar{a}_i\,\delta t\\
\gamma_{i+1} &= \gamma_i \otimes \begin{bmatrix} 1 \\ \tfrac{1}{2}\,\bar{\omega}_i\,\delta t \end{bmatrix}
\end{aligned}
\tag{4}
$$

Here $\alpha_{b_{k+1}}^{b_k}$, $\beta_{b_{k+1}}^{b_k}$ and $\gamma_{b_{k+1}}^{b_k}$ are the IMU increment information relative to the $b_k$ frame time and are related only to the $b_k$ frame. Through the iterative solution of this formula, the relative pose between two frames can be pre-integrated.
(2) Outlier detection unit
Even though UWB distance measurement can reach centimeter level in the ideal case, in practice the ranging often carries static errors as well as errors due to temperature, scene conditions, and the like. Furthermore, in scenes with dense obstacles, outliers may be introduced; an outlier denotes a measurement that plainly contradicts the facts, such as a negative reading or a sudden large jump, and it leads to erroneous state estimation. The outliers are detected by a thresholding method, and once a distance measurement is judged to be an outlier, it is rejected directly. Experimental tests show that outliers have a quite pronounced influence on state estimation: they degrade positioning accuracy and may even cause positioning to fail completely; introducing the outlier detection unit improves this situation.
When the program is in the online anchor point calibration stage, the outlier detection unit filters the received UWB ranging information. Assuming that the measured value of UWB is d, we collect n sample points $d_i\,(i = 1,\ldots,n)$ in the current scene; the ith distance measurement $d_i$ is judged to be an outlier if it satisfies the following condition:

$$ d_i \le 0 \quad \text{or} \quad \left| d_i - \bar{d} \right| > \gamma\,\frac{v_{\max}}{f}, \qquad \bar{d} = \frac{1}{n}\sum_{j=1}^{n} d_j $$

where $\bar{d}$ is the average of the n ranging values, $\gamma$ is an adjustable threshold, $v_{\max}$ is the maximum flight speed of the unmanned aerial vehicle, and $f$ is the ranging update rate of the UWB module. If the distance is judged to be an outlier, it is discarded directly; otherwise the distance $d_i$ is sent as input to the online anchor point calibration unit.
After the online anchor point calibration stage has finished, each UWB ranging datum is processed as soon as it is received, so the current measurement cannot be judged against an average of historical UWB data. Instead, the two-norm of the difference between the current pose of the unmanned aerial vehicle and the calibrated anchor position is used as the criterion, and the ith distance measurement $d_i$ is judged to be an outlier if it satisfies:

$$ \left|\, d_i - \left\| P_k - P_a \right\|_2 \right| > \gamma\,\frac{v_{\max}}{f} $$

where $P_k$ is the current pose of the unmanned aerial vehicle, $P_a$ is the anchor position coordinate in the coordinate system of the drone's initial moment obtained by the online anchor point calibration unit, $\gamma$ is an adjustable threshold, $v_{\max}$ is the maximum flight speed of the drone, and $f$ is the ranging update rate of the UWB module. If the distance is judged to be an outlier, it is discarded directly; otherwise the distance $d_i$ is fed as input into the tight coupling unit.
(3) Online anchor point calibration unit
Conventional UWB-based positioning methods require knowing in advance the relative pose between the tag mounted on the drone and the anchor points deployed on the ground. The usual approach is either to arrange the ground anchor points perpendicular or parallel to the drone's initial heading and measure the distances manually, or to obtain the relative pose offline and then switch modes, treating the calibrated anchor positions as known information. Such methods are clearly deficient: they are inconvenient, cannot guarantee accuracy, and limit the practicality of UWB-enhanced positioning. As shown in fig. 3, the method designs an online anchor point calibration unit which, assuming that the IMU and visual positioning are accurate enough during the period shortly after the unmanned aerial vehicle takes off, calibrates the coordinates of the anchor points in the coordinate system at the drone's initial takeoff moment using the output odometer information and the UWB ranging information.
The online anchor point calibration unit starts working as soon as the unmanned aerial vehicle begins running. However, while the drone is stationary or moving slowly, the received UWB data may be highly repetitive; on the one hand this produces data redundancy and wastes space resources, and on the other hand the duplicated data would occupy too high a proportion of the samples, making it impossible to complete the anchor calibration task.
Therefore, the method screens the UWB data with the following criteria:

$$ \left| d_i - d_n \right| < \frac{\gamma}{f}, \qquad \left\| v^{w} \right\| < v_{thr} $$

where $d_i$ is the ith distance measurement, $d_n$ is the nth ($i > n$) retained distance measurement, $\gamma$ is an adjustable threshold, $f$ is the UWB module ranging update rate, $v^{w}$ is the velocity of the current unmanned aerial vehicle in the world coordinate system, and $v_{thr}$ is a preset speed threshold (the method takes 0.2 m/s). If the conditions are met, the ith distance measurement is discarded and the (i+1)th distance measurement is substituted into the formula for judgment; this repeats until the mth distance measurement fails the condition, whereupon n is assigned the value m.
Thus, for the ith distance measurement, a residual $r_i$ about the ranging information can be obtained:

$$ r_i = d_i - \left\| P_{b_i}^{w} - P_a^{w} \right\|_2 $$

where $d_i$ is the ith distance measurement, $P_{b_i}^{w}$ is the position of the unmanned aerial vehicle in the world coordinate system at the same timestamp as the UWB ranging information, obtained from IMU pre-integration and the visual odometer, and $P_a^{w}$ is the three-dimensional coordinate of the anchor point to be estimated in the world coordinate system. This gives the optimization cost function $E_r$:

$$ E_r = \sum_{i} \rho\!\left( \left\| r_i \right\|^{2} \right) $$

where $\rho(\cdot)$ is a pseudo-Huber loss function. The method uses the automatic differentiation function of Ceres: after writing the cost function, the variable to be optimized $P_a^{w}$ is declared, the parameters $d_i$ and $P_{b_i}^{w}$ are passed in through the interface, and the optimization can be carried out. The method takes as the termination condition that the values of $P_a^{w}$ obtained by two successive optimizations differ by less than 0.1 in the two-norm.
(4) Sliding window unit
In optimization-based positioning, as the running time grows, the states to be optimized and the observation constraints accumulate, and the scale of the matrix H_l rises rapidly, so that the solving speed drops and the real-time requirement can no longer be met. To address this issue, a back-end optimization based on a sliding-window strategy is used here. Only the frames and constraints within the sliding window are considered when constructing the joint optimization problem. While the odometer runs, new frames are continuously added to the window and old frames are continuously removed from it, so that the number of frames to be optimized stays bounded and the solving speed does not degrade. A frame removed from the sliding window may still share constraints with frames inside the window; discarding it outright would lose those constraints and reduce the accuracy of the odometer, so the sliding window is normally paired with a marginalization strategy. The marginalization strategy treats the frame moved out of the window as a marginalized frame, and the constraints involving it as marginalized constraints. The marginalized frame is no longer updated, but the marginalized constraints become prior information for the normal frames still being optimized, which is equivalent to changing from solving a joint probability distribution to solving a conditional probability distribution with a prior. This keeps the scale of the optimization problem bounded without losing constraints.
Each window holds ten key frames. The output of the tight coupling optimization unit module is taken as this module's input; each frame contains the IMU factor, UWB factor and visual factor corresponding to the same time $t_k$, and these factors, mutually independent, constitute the constraint terms in the tight coupling optimization unit module. The module ensures that there are at most ten key frames in the window; if there are more than ten, the factors in the key frame to be marginalized are first modified so that the constraints among the above factors are not lost, and then all data in the marginalized key frame are discarded directly. After this processing, the modified factors are coupled with the most recently received factors and input into the tight coupling optimization unit module again to optimize the variables to be optimized, yielding the pose and map information of the unmanned aerial vehicle.
(5) Tightly coupled optimization unit
The tight coupling optimization unit is the core unit of this unmanned aerial vehicle positioning technique and contains IMU factors, UWB factors and visual factors. Within the window, the IMU factors yield pose residuals between every two adjacent data frames via pre-integration, the UWB factors yield distance residuals between the unmanned aerial vehicle and the anchor points at each frame time, and the visual factors yield pose residuals at each frame via reprojection. After the residuals of all factors are obtained, the parameters of each frame in the window are optimized with Ceres (a solver for modeling and solving large, complex optimization problems, used here with its automatic differentiation function) to obtain the pose information and the zero biases of the acceleration and the angular velocity. Let $t_k$ be the timestamp corresponding to the (k+1)th frame image in the window, $t_{k+1}$ the timestamp corresponding to the (k+2)th frame image in the window, $t_{ki}$ the timestamp of the ith IMU datum between the (k+1)th and (k+2)th frame images in the window, and $t_{ku}$ the timestamp of the uth UWB datum between the (k+1)th and (k+2)th frame images in the window.
1) UWB factor

For the UWB ranging information $d_u$ at each time within the window, a residual $r_u$ about the ranging information and a cost function $\Gamma_{\alpha}$ can be obtained:

$$ r_u = d_u - \left\| P_{b_k}^{w} + v_{b_k}^{w}\,\Delta t_{uk} - \tfrac{1}{2}\,g^{W}\Delta t_{uk}^{2} + R_{b_k}^{w}\,\hat{\alpha}_{t_{ku}}^{b_k} - P_a^{w} \right\|_2, \qquad \Delta t_{uk} = t_{ku} - t_k $$

$$ \Gamma_{\alpha} = \sum_{u} \rho\!\left( \left\| r_u \right\|^{2} \right) $$

where $P_{b_k}^{w}$ is the position coordinate of the unmanned aerial vehicle in the world coordinate system at time $t_k$, $v_{b_k}^{w}$ is the velocity of the unmanned aerial vehicle in the world coordinate system at that time, $\Delta t_{uk}$ is the difference between the UWB data timestamp and the image timestamp, $g^{W}$ is the three-dimensional representation of gravitational acceleration in the world coordinate system, $R_{b_k}^{w}$ is the rotation matrix of the unmanned aerial vehicle in the world coordinate system at time $t_k$, $\hat{\alpha}_{t_{ku}}^{b_k}$ is the pre-integrated position component derived from the IMU information, $P_a^{w}$ is the three-dimensional coordinate of the anchor point in the world coordinate system obtained by the online anchor point calibration unit, and $\rho(\cdot)$ is a pseudo-Huber loss function.
2) IMU factor

For the IMU measurement information between two adjacent frames, a cost function $\Gamma_{\beta}$ about the IMU measurement information can be obtained:

$$
\Gamma_{\beta} = \left\| \begin{bmatrix}
R_w^{b_k}\!\left(p_{b_{k+1}}^{w} - p_{b_k}^{w} - v_{b_k}^{w}\Delta t_k + \tfrac{1}{2}g^{w}\Delta t_k^{2}\right) - \hat{\alpha}_{b_{k+1}}^{b_k}\\[2pt]
R_w^{b_k}\!\left(v_{b_{k+1}}^{w} - v_{b_k}^{w} + g^{w}\Delta t_k\right) - \hat{\beta}_{b_{k+1}}^{b_k}\\[2pt]
2\left[(\hat{\gamma}_{b_{k+1}}^{b_k})^{-1} \otimes (q_{b_k}^{w})^{-1} \otimes q_{b_{k+1}}^{w}\right]_{xyz}\\[2pt]
b_{a_{k+1}} - b_{a_k}\\[2pt]
b_{w_{k+1}} - b_{w_k}
\end{bmatrix} \right\|^{2}
$$

where $\hat{\alpha}_{b_{k+1}}^{b_k}$, $\hat{\beta}_{b_{k+1}}^{b_k}$ and $\hat{\gamma}_{b_{k+1}}^{b_k}$ are the pre-integrated increments of position, velocity and attitude from the kth frame to the (k+1)th frame obtained by the odometer, $b_{w_{k+1}}$ and $b_{w_k}$ are the gyroscope biases in the world coordinate system at the (k+1)th and kth frame times respectively, and $b_{a_{k+1}}$ and $b_{a_k}$ are the accelerometer biases in the world coordinate system at the (k+1)th and kth frame times respectively.
3) Visual factor

The visual factor is defined as the reprojection error of the tracked features; the invention compares the reprojection of every feature in the current frame with its first observation, obtaining a cost function $\Gamma_{\gamma}$ about the reprojection error:

$$ \Gamma_{\gamma} = \sum_{(l,j)} \rho\!\left( \left\| \hat{P}_{l}^{c_j} - \pi\!\left( R_{c_i}^{c_j} P_{l}^{c_i} + t_{c_i}^{c_j} \right) \right\|^{2} \right) $$

where $\hat{P}_{l}^{c_j}$ is the observed coordinate of the lth feature in the jth camera frame, $\pi$ is the projection function converting homogeneous coordinates to image coordinates, $R_{c_i}^{c_j}$ and $t_{c_i}^{c_j}$ are the rotation matrix and translation from the ith camera frame to the jth camera frame, and $P_{l}^{c_i}$ is the 3D position of the lth feature in its first observation frame i. The visual factors are iterated continuously over all frames and all tracked features in the estimation state.
The results of the experiment are given below:
fig. 4 is a schematic diagram of the results of four online anchor point calibrations in the indoor scene, and fig. 5 is a schematic diagram of the results of four online anchor point calibrations in the corridor scene. It can be seen from the results that the maximum relative error of the indoor online anchor point calibration is 4.48%, while the maximum relative error in the corridor scene is 7.62%; the calibration results are sufficient for the subsequent use of the odometer.
As shown in fig. 6 and 7, the experimental scene is indoors, with the start and end points at the same location. Trajectory ① is the trajectory obtained by Vins, trajectory ② the one obtained by the proposed method, and trajectory ③ the one obtained by motion capture (which can be taken as ground truth). Curve ② lies between curves ① and ③, showing that the method's trajectory estimate is closer to the truth than that of Vins. Fig. 8 tabulates the error comparison of four indoor experiments; the method improves accuracy over Vins by about 15% in the indoor scene.
Similarly, as shown in fig. 9 and 10, the experimental scene is a corridor, again with the start and end points at the same location. Because the corridor turns cause large yaw changes and a sudden drop in the number of feature points, the trajectory obtained by Vins shows very obvious burrs and a large deviation between its end point and start point; the trajectory obtained by the proposed method also shows burrs, but the deviation between its end and start points is very small. Fig. 11 tabulates the error comparison of four corridor experiments; the method improves accuracy over Vins by about 20% in the corridor scene.
Fig. 12 and 13 show the absolute error curves of Vins and the proposed method on the EuRoC dataset. The method improves on Vins in both the maximum and the mean absolute error. Fig. 14 plots the trajectories of Vins, the method, and the ground truth on EuRoC; the method's trajectory lies between the truth and Vins, again showing an improvement over Vins.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (8)

1. A visual-inertial tightly-coupled odometer based on UWB online anchor points, characterized by the following structure: ultra-wideband (UWB) radio ranging modules are deployed in two roles: some are deployed around the environment as anchor points, whose positions are calculated by an online anchor point calibration unit; another is mounted on the unmanned aerial vehicle as a tag, which actively sends ranging requests to the anchor points in a preset order; the distance between the two modules is obtained according to the TW-ToF measurement principle and, after part of the noise is filtered by an outlier detection unit, is sent as input to a tightly-coupled optimization unit; an IMU inertial measurement unit measures the acceleration and angular velocity of the unmanned aerial vehicle at high frequency, and the IMU measurements, after processing by a low-pass filter module, are likewise sent as input to the tightly-coupled optimization unit; the measurements of a binocular camera are processed by a visual odometer and then sent as input to the tightly-coupled optimization unit; the online anchor point calibration unit, the outlier detection unit and the tightly-coupled optimization unit are all modules called by an embedded onboard processor, which uses the tightly-coupled optimization unit to process the inter-module UWB distance measurements, the IMU measurements and the binocular camera measurements, and resolves the position and attitude information of the unmanned aerial vehicle through tightly-coupled optimization.
2. A visual-inertial tightly-coupled odometry method based on UWB online anchor points, characterized by being realized by the following devices: ultra-wideband (UWB) radio ranging modules deployed in two roles: some around the environment as anchor points, whose positions are calculated by online anchor point calibration, and one mounted on the unmanned aerial vehicle as a tag; the system further comprises an onboard binocular camera and an IMU unit; the tag actively sends ranging requests to the anchor points in a preset order, the distance between the two modules is obtained according to the TW-ToF measurement principle and, after part of the noise is filtered by outlier detection, is used as input for tightly-coupled optimization; the IMU inertial measurement unit measures the acceleration and angular velocity of the unmanned aerial vehicle at high frequency, and the IMU measurements, after low-pass filtering, are likewise used as input for tightly-coupled optimization; the binocular camera measurements are processed by a visual odometer and then used as input for tightly-coupled optimization; the embedded onboard processor processes the inter-module UWB distance measurements, the IMU measurements and the binocular camera measurements, and resolves the position and attitude information of the unmanned aerial vehicle through tightly-coupled optimization.
3. The UWB online anchor point based visual-inertial tightly-coupled odometry method of claim 2, wherein the IMU pre-integration is as follows: to make full use of the IMU data, while no UWB data or image data has been received, the IMU data are processed to give a rough estimate, the predicted IMU pose; on the other hand, to avoid repeated computation, the pre-integration value is updated after each update of the acceleration and angular velocity zero biases.
4. The method of claim 3, wherein the IMU pre-integration comprises the following steps: using the known position, velocity and attitude of the camera at frame k, all IMU measurement data between camera frames k and k+1 are integrated to obtain the position, velocity and attitude state corresponding to frame k+1; the continuous form of the IMU integration is:
$$p^w_{b_{k+1}}=p^w_{b_k}+v^w_{b_k}\Delta t_k+\iint_{t\in[t_k,t_{k+1}]}\left(R^w_t(\hat a_t-b_{a_t})-g^w\right)dt^2$$
$$v^w_{b_{k+1}}=v^w_{b_k}+\int_{t\in[t_k,t_{k+1}]}\left(R^w_t(\hat a_t-b_{a_t})-g^w\right)dt \tag{1}$$
$$q^w_{b_{k+1}}=q^w_{b_k}\otimes\int_{t\in[t_k,t_{k+1}]}\frac{1}{2}\Omega(\hat\omega_t-b_{\omega_t})\,q^{b_k}_t\,dt$$

there are:

$$\Omega(\omega)=\begin{bmatrix}-\lfloor\omega\rfloor_\times&\omega\\-\omega^T&0\end{bmatrix},\qquad\lfloor\omega\rfloor_\times=\begin{bmatrix}0&-\omega_z&\omega_y\\\omega_z&0&-\omega_x\\-\omega_y&\omega_x&0\end{bmatrix} \tag{2}$$

wherein $p^w_{b_k}$, $v^w_{b_k}$ and $q^w_{b_k}$ are the position and velocity of the body b and the quaternion representation of its rotation at frame k in the world coordinate system w; $p^w_{b_{k+1}}$, $v^w_{b_{k+1}}$ and $q^w_{b_{k+1}}$ are the corresponding quantities at frame k+1; $\Delta t_k$ is the time difference between the two IMU measurement data; $\hat a_t$ and $\hat\omega_t$ are the acceleration and angular velocity measured by the IMU at time t; $q^w_t$ is the quaternion representation at time t in the world coordinate system w; $b_{a_t}$ and $b_{\omega_t}$ are the corresponding measurement biases of acceleration and angular velocity; $g^w$ is the gravitational acceleration in world coordinates; and $\omega_x$, $\omega_y$, $\omega_z$ are the three components of $\omega$;
because the IMU samples discretely in practice, the mid-point discrete form solving the IMU state at time i+1 from the state at time i is:

$$p^w_{i+1}=p^w_i+v^w_i\,\delta t+\frac{1}{2}\bar a_i\,\delta t^2,\qquad v^w_{i+1}=v^w_i+\bar a_i\,\delta t,\qquad q^w_{i+1}=q^w_i\otimes\begin{bmatrix}1\\\frac{1}{2}\bar\omega_i\,\delta t\end{bmatrix}$$

wherein $\bar a_i$ and $\bar\omega_i$ are the measured acceleration and angular velocity, transformed from the IMU frame to the body acceleration and angular velocity of the world frame;
to avoid repeated computation, the optimization variables are separated from the IMU pre-integration terms from frame k to frame k+1:

$$R^{b_k}_w p^w_{b_{k+1}}=R^{b_k}_w\left(p^w_{b_k}+v^w_{b_k}\Delta t_k-\frac{1}{2}g^w\Delta t_k^2\right)+\alpha^{b_k}_{b_{k+1}}$$
$$R^{b_k}_w v^w_{b_{k+1}}=R^{b_k}_w\left(v^w_{b_k}-g^w\Delta t_k\right)+\beta^{b_k}_{b_{k+1}} \tag{3}$$
$$q^{b_k}_w\otimes q^w_{b_{k+1}}=\gamma^{b_k}_{b_{k+1}}$$

wherein:

$$\alpha^{b_k}_{b_{k+1}}=\iint_{t\in[t_k,t_{k+1}]}R^{b_k}_t(\hat a_t-b_{a_t})\,dt^2,\quad\beta^{b_k}_{b_{k+1}}=\int_{t\in[t_k,t_{k+1}]}R^{b_k}_t(\hat a_t-b_{a_t})\,dt,\quad\gamma^{b_k}_{b_{k+1}}=\int_{t\in[t_k,t_{k+1}]}\frac{1}{2}\Omega(\hat\omega_t-b_{\omega_t})\,q^{b_k}_t\,dt \tag{4}$$

where $R^{b_k}_w$ is the rotation matrix from the world coordinate system of frame k to the body coordinate system, and $\alpha^{b_k}_{b_{k+1}}$, $\beta^{b_k}_{b_{k+1}}$, $\gamma^{b_k}_{b_{k+1}}$ are the pre-integration terms between the two frames;
converting equation (3) into the corresponding discrete form, the relation solving the IMU state at time i+1 from the state at time i is:

$$\alpha^{b_k}_{i+1}=\alpha^{b_k}_i+\beta^{b_k}_i\,\delta t+\frac{1}{2}\bar a_i\,\delta t^2,\qquad\beta^{b_k}_{i+1}=\beta^{b_k}_i+\bar a_i\,\delta t,\qquad\gamma^{b_k}_{i+1}=\gamma^{b_k}_i\otimes\begin{bmatrix}1\\\frac{1}{2}\bar\omega_i\,\delta t\end{bmatrix} \tag{5}$$

here $\alpha^{b_k}_i$, $\beta^{b_k}_i$, $\gamma^{b_k}_i$ are the states of the IMU increment information at time i relative to frame $b_k$; they are related only to frame $b_k$, and the relative pose between the two frames is pre-integrated by iteratively solving equation (5).
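The mid-point propagation of the pre-integration terms can be sketched as follows. This is an illustrative sketch, not the patent's code; the quaternion convention ([w, x, y, z], Hamilton product) and the function names are assumptions:

```python
import numpy as np

def quat_mult(q, r):
    """Hamilton product of two quaternions [w, x, y, z]."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2])

def quat_rotate(q, v):
    """Rotate 3-vector v by unit quaternion q (q * [0, v] * q^-1)."""
    qv = np.concatenate(([0.0], v))
    return quat_mult(quat_mult(q, qv), q * np.array([1, -1, -1, -1]))[1:]

def midpoint_step(alpha, beta, gamma, a0, w0, a1, w1, b_a, b_w, dt):
    """One mid-point update of the pre-integration terms (alpha, beta, gamma)
    between consecutive IMU samples (a0, w0) and (a1, w1), given the biases."""
    w_mid = 0.5 * (w0 + w1) - b_w
    dq = np.concatenate(([1.0], 0.5 * w_mid * dt))
    gamma1 = quat_mult(gamma, dq)
    gamma1 /= np.linalg.norm(gamma1)
    # average the bias-corrected accelerations rotated into the b_k frame
    a_mid = 0.5 * (quat_rotate(gamma, a0 - b_a) + quat_rotate(gamma1, a1 - b_a))
    alpha1 = alpha + beta * dt + 0.5 * a_mid * dt**2
    beta1 = beta + a_mid * dt
    return alpha1, beta1, gamma1
```

Iterating this step over all IMU samples between two image frames yields the relative-pose pre-integration of equation (5).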
5. The method of claim 2, wherein the outlier detection for filtering part of the noise comprises the following steps: while the program is in the online anchor point calibration stage, the outlier detection unit filters the received UWB ranging information; let the UWB measurement be d, and collect n sample points $d_i\,(i=1,\dots,n)$ in the current scene; the i-th distance measurement $d_i$ is judged an outlier if the following conditions are satisfied:

$$\left|d_i-\bar d\right|>\gamma$$
$$\left|d_i-d_{i-1}\right|>\frac{v_{max}}{f}$$

wherein $\bar d$ is the average of the n ranging values, $\gamma$ is an adjustable threshold, $v_{max}$ is the maximum flight speed of the unmanned aerial vehicle, and f is the ranging update rate of the UWB module; if judged an outlier, the distance is discarded directly, and in the normal case the distance $d_i$ is sent as input to the online anchor point calibration unit;
when the online anchor point calibration stage has finished, the two-norm of the difference between the current pose of the unmanned aerial vehicle and the calibrated anchor position is used as the criterion, and the i-th distance measurement $d_i$ is judged an outlier if:

$$\left|\left\|P_k-P_a\right\|_2-d_i\right|>\gamma+\frac{v_{max}}{f}$$

wherein $P_k$ is the current pose of the unmanned aerial vehicle, $P_a$ is the anchor position coordinate in the coordinate system of the unmanned aerial vehicle's initial moment obtained by the online anchor point calibration unit, $\gamma$ is an adjustable threshold, $v_{max}$ is the maximum flight speed of the unmanned aerial vehicle, and f is the ranging update rate of the UWB module; if judged an outlier, the distance is discarded directly; in the normal case, the distance $d_i$ is fed as input into the tightly-coupled unit.
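The two-stage outlier test above can be sketched as follows. Since the claim's exact inequalities are reproduced from image placeholders, the specific thresholds combined here (deviation from the running mean, maximum per-update range change $v_{max}/f$, and a range-vs-geometry check) are assumptions:

```python
import math

def is_outlier_calibration(d_i, d_prev, samples, gamma, v_max, f):
    """Outlier test during online anchor calibration: reject a range that
    deviates too far from the sample mean, or that changes faster than the
    UAV's maximum speed allows between two UWB updates (assumed criteria)."""
    d_mean = sum(samples) / len(samples)
    return abs(d_i - d_mean) > gamma or abs(d_i - d_prev) > v_max / f

def is_outlier_positioning(d_i, p_uav, p_anchor, gamma, v_max, f):
    """After calibration: compare the measured range with the two-norm of
    the UAV-to-anchor vector predicted from the current pose estimate."""
    predicted = math.dist(p_uav, p_anchor)
    return abs(predicted - d_i) > gamma + v_max / f
```

Ranges flagged by either test are dropped; the rest are forwarded to the calibration or tightly-coupled optimization unit.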
6. The UWB online anchor point based visual-inertial tightly-coupled odometry method of claim 2, wherein the online anchor calibration step comprises: when positioning with UWB, the anchor coordinates in the coordinate system of the unmanned aerial vehicle's initial takeoff moment are calibrated using the output odometer information and the UWB ranging information, specifically: the i-th distance measurement is discarded if

$$\left|d_i-d_n\right|<\gamma$$
$$\left\|P^w_{b_i}-P^w_{b_n}\right\|_2<\frac{v_{thr}\,(i-n)}{f}$$

wherein $d_i$ is the i-th distance measurement, $d_n$ is the n-th (i > n) distance measurement, $\gamma$ is an adjustable threshold, f is the UWB module ranging update rate, $P^w_{b_i}$ is the pose of the current unmanned aerial vehicle in the world coordinate system, and $v_{thr}$ is a preset speed threshold; if discarded, the (i+1)-th distance measurement is substituted into the formula for judgment, and so on; n is assigned m as soon as the m-th distance measurement does not meet the discard condition;
thus, for the i-th distance measurement, a residual $r_{d_i}$ on the ranging information is obtained:

$$r_{d_i}=\left\|\hat P^w_b-P^w_a\right\|_2-d_i$$

wherein $d_i$ is the i-th distance measurement, $\hat P^w_b$ is the pose of the unmanned aerial vehicle in the world coordinate system under the same timestamp as the UWB ranging information, obtained by IMU pre-integration and the visual odometer, and $P^w_a$ is the three-dimensional coordinate of the anchor point to be estimated in the world coordinate system; thus the optimized cost function $E_r$ is:

$$E_r=\sum_i\rho\!\left(\left\|r_{d_i}\right\|^2\right)$$
wherein $\rho()$ is a pseudo-Huber loss function; after the cost function is written using the automatic differentiation facility of Ceres, the variable to be optimized $P^w_a$ is determined; the parameters $d_i$ and $\hat P^w_b$ are passed through the interface and the optimization can be carried out; the method takes as the termination condition of the optimization that the two-norm of the difference between the $P^w_a$ obtained by two successive optimizations is less than 0.1.
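The anchor-position estimation above can be sketched as a small nonlinear least-squares problem. This is an assumed, simplified stand-in: the patent uses Ceres with a pseudo-Huber loss and automatic differentiation, while the sketch below uses plain Gauss-Newton with an analytic Jacobian and no robust loss:

```python
import numpy as np

def calibrate_anchor(positions, ranges, x0, iters=20):
    """Estimate the anchor position x from UAV positions p_i and ranges d_i
    by least squares on the residuals r_i = ||p_i - x|| - d_i (Gauss-Newton)."""
    positions = np.asarray(positions, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        diff = positions - x                      # (n, 3)
        dist = np.linalg.norm(diff, axis=1)       # (n,)
        r = dist - ranges                         # residuals
        J = -diff / dist[:, None]                 # d r_i / d x
        dx = np.linalg.lstsq(J, -r, rcond=None)[0]
        x += dx
        if np.linalg.norm(dx) < 1e-8:             # converged
            break
    return x
```

Four or more UAV positions with good geometric spread are needed for the 3D anchor coordinate to be uniquely determined, which is exactly why the claim discards measurements taken while the UAV has barely moved.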
7. The method as claimed in claim 2, wherein the tight coupling optimization comprises an IMU factor, a UWB factor and a visual factor: the IMU factor is the pose residual between two adjacent frames of data obtained by pre-integration, the UWB factor is the distance residual between the unmanned aerial vehicle and the anchor at each frame time, and the visual factor is the pose residual at each frame time obtained by reprojection; after the residuals of the factors are obtained, a solver modeling and solving the optimization problem is used, and the parameters of each frame in the window are optimized using automatic differentiation to obtain the pose information and the zero biases of acceleration and angular velocity; assume $t_k$ is the timestamp corresponding to the (k+1)-th frame image in the window, $t_{k+1}$ is the timestamp corresponding to the (k+2)-th frame image in the window, $t_{ki}$ is the timestamp of the i-th IMU datum between the (k+1)-th and (k+2)-th frame images in the window, and $t_{ku}$ is the timestamp of the u-th UWB datum between the (k+1)-th and (k+2)-th frame images in the window;
UWB factor

for the UWB ranging information $d_i$ at each time within the window, a residual $r_u$ on the ranging information and a cost function $\Gamma_\alpha$ are obtained:

$$\Delta t_{uk}=t_{ku}-t_k$$
$$r_u=\left\|p^w_{b_k}+v^w_{b_k}\Delta t_{uk}-\frac{1}{2}g^w\Delta t_{uk}^2+R^w_{b_k}\hat\alpha^{b_k}_{t_{ku}}-P^w_a\right\|_2-d_i$$
$$\Gamma_\alpha=\sum_i\rho\!\left(\left\|r_u\right\|^2\right)$$

wherein $p^w_{b_k}$ is the position coordinate of the unmanned aerial vehicle in the world coordinate system at time $t_k$, $v^w_{b_k}$ is its velocity in the world coordinate system at that time, $\Delta t_{uk}$ is the difference between the UWB data timestamp and the image timestamp, $g^w$ is the three-dimensional representation of the gravitational acceleration in the world coordinate system, $R^w_{b_k}$ is the rotation matrix of the unmanned aerial vehicle in the world coordinate system at time $t_k$, $\hat\alpha^{b_k}_{t_{ku}}$ is the pre-integrated position component derived from the IMU information, $P^w_a$ is the three-dimensional coordinate of the anchor point in the world coordinate system obtained after the online anchor point calibration unit, and $\rho()$ is a pseudo-Huber loss function;
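A hedged sketch of this UWB residual (not the patent's code; the names are illustrative): the frame-k state is propagated to the UWB timestamp with the pre-integrated position component, and the predicted UAV-anchor distance is compared with the measured range:

```python
import numpy as np

def uwb_residual(p_bk, v_bk, R_w_bk, alpha_hat, g_w, dt_uk, p_anchor, d_i):
    """UWB range residual at a UWB timestamp inside the window: propagate
    the frame-k position to t_ku (constant velocity, gravity, plus the
    IMU pre-integrated increment rotated into the world frame), then
    compare the predicted UAV-anchor distance with the measurement d_i."""
    p_pred = (p_bk + v_bk * dt_uk - 0.5 * g_w * dt_uk**2
              + R_w_bk @ alpha_hat)
    return np.linalg.norm(p_pred - p_anchor) - d_i
```

Accumulating $\rho(\|r_u\|^2)$ over all UWB measurements in the window yields the UWB cost term $\Gamma_\alpha$.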
IMU factor

for two consecutive IMU measurements, a cost function $\Gamma_\beta$ over the IMU measurement information is obtained:

$$\Gamma_\beta=\left\|r_B\!\left(\hat z^{b_k}_{b_{k+1}},\mathcal X\right)\right\|^2$$

wherein $\hat\alpha^{b_k}_{b_{k+1}}$, $\hat\beta^{b_k}_{b_{k+1}}$, $\hat\gamma^{b_k}_{b_{k+1}}$ are the position, velocity and attitude increments from frame k to frame k+1 obtained by the odometer, $b_{\omega_{k+1}}$, $b_{\omega_k}$ are the gyroscope biases in the world coordinate systems at frames k+1 and k, and $b_{a_{k+1}}$, $b_{a_k}$ are the accelerometer biases in the world coordinate systems at frames k+1 and k;
visual factor

vision is defined as the reprojection error of the tracked features; the reprojection of every feature in the current frame is compared with its first observation, giving a cost function $\Gamma_\gamma$ for the reprojection error:

$$\Gamma_\gamma=\sum_{(k,\,l)}\left\|\hat P^{c_l}_k-\pi\!\left(R^{c_l}_{c_i}P^{c_i}_k+t^{c_l}_{c_i}\right)\right\|^2$$

wherein $\hat P^{c_l}_k$ is the coordinate of the k-th feature in the l-th image frame of the camera, $\pi$ is the projection function converting homogeneous coordinates to image coordinates, $R^{c_j}_{c_i}$ and $t^{c_j}_{c_i}$ are the rotation matrix and translation vector from the i-th to the j-th camera frame, and $P^{c_i}_k$ is the 3D position of the k-th feature in its first observation frame i; the visual factor iterates continuously over all frames and all tracked features in the estimated state.
8. The method of claim 2, further comprising using a sliding window together with a marginalization strategy: frames moved out of the sliding window are taken as marginalized frames, and the constraints associated with a marginalized frame are referred to as marginalization constraints; a marginalized frame is no longer updated by the sliding window, but its marginalization constraint becomes prior information for the normal frames to be optimized, which is equivalent to changing from solving a joint probability distribution to solving a conditional probability distribution with prior information; one window selects ten keyframes and takes the result of the tightly-coupled optimization as input, each frame containing the factors of the same time $t_k$, which are independent of each other and form the constraint terms of the tightly-coupled optimization, so that at most ten keyframes exist in the window; if the number of keyframes exceeds ten, each factor in the keyframe to be marginalized is first modified, all data in the marginalized keyframe are discarded directly, the modified factors are coupled with the most recently received factors, the tightly-coupled optimization is carried out again, and the variables to be optimized are optimized to obtain the pose and map information of the unmanned aerial vehicle.
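The bookkeeping side of this sliding window can be sketched as follows. This is only an illustration of the window management; real marginalization folds the removed frame's constraints into a prior via the Schur complement, which is represented here by an opaque `prior` list:

```python
from collections import deque

class SlidingWindow:
    """Keyframe window of at most `size` frames (the claim uses ten).
    When a new keyframe overflows the window, the oldest frame is
    marginalized: its data are dropped and its constraints are folded
    into a prior for the remaining frames (stand-in representation)."""

    def __init__(self, size=10):
        self.size = size
        self.frames = deque()
        self.prior = []  # stands in for the marginalization prior

    def add_keyframe(self, frame):
        if len(self.frames) == self.size:
            marginalized = self.frames.popleft()
            # fold the removed frame's constraints into the prior
            self.prior.append(marginalized["id"])
        self.frames.append(frame)
```

After every insertion the window holds at most ten keyframes, and the prior accumulates the influence of every frame that has been slid out.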
CN202110558941.7A 2021-05-21 2021-05-21 Visual inertia tight coupling odometer based on UWB (ultra wide band) online anchor point and metering method Active CN113124856B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110558941.7A CN113124856B (en) 2021-05-21 2021-05-21 Visual inertia tight coupling odometer based on UWB (ultra wide band) online anchor point and metering method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110558941.7A CN113124856B (en) 2021-05-21 2021-05-21 Visual inertia tight coupling odometer based on UWB (ultra wide band) online anchor point and metering method

Publications (2)

Publication Number Publication Date
CN113124856A true CN113124856A (en) 2021-07-16
CN113124856B CN113124856B (en) 2023-03-14

Family

ID=76782363

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110558941.7A Active CN113124856B (en) 2021-05-21 2021-05-21 Visual inertia tight coupling odometer based on UWB (ultra wide band) online anchor point and metering method

Country Status (1)

Country Link
CN (1) CN113124856B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170205903A1 (en) * 2015-03-05 2017-07-20 Magic Leap, Inc. Systems and methods for augmented reality
CN111091587A (en) * 2019-11-25 2020-05-01 武汉大学 Low-cost motion capture method based on visual markers
WO2020087846A1 (en) * 2018-10-31 2020-05-07 东南大学 Navigation method based on iteratively extended kalman filter fusion inertia and monocular vision
CN111890373A (en) * 2020-09-29 2020-11-06 常州唯实智能物联创新中心有限公司 Sensing and positioning method of vehicle-mounted mechanical arm
CN111983660A (en) * 2020-07-06 2020-11-24 天津大学 System and method for positioning quad-rotor unmanned aerial vehicle in GNSS rejection environment
CN112378396A (en) * 2020-10-29 2021-02-19 江苏集萃未来城市应用技术研究所有限公司 Hybrid high-precision indoor positioning method based on robust LM visual inertial odometer and UWB
CN112484725A (en) * 2020-11-23 2021-03-12 吉林大学 Intelligent automobile high-precision positioning and space-time situation safety method based on multi-sensor fusion
WO2021048500A1 (en) * 2019-09-12 2021-03-18 Dronisos Method and system for automatically positioning drones in a swarm
CN112525197A (en) * 2020-11-23 2021-03-19 中国科学院空天信息创新研究院 Ultra-wideband inertial navigation fusion pose estimation method based on graph optimization algorithm
CN112762929A (en) * 2020-12-24 2021-05-07 华中科技大学 Intelligent navigation method, device and equipment


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
KAI WEN,等: "A New Quaternion Kalman Filter Based Foot-Mounted IMU and UWB Tightly-Coupled Method for Indoor Pedestrian Navigation", 《 IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY》 *
王文博,等: "基于超宽带、里程计、RGB-D融合的室内定位方法", 《计算机科学》 *

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113589266A (en) * 2021-07-21 2021-11-02 中国煤炭科工集团太原研究院有限公司 Mining multi-mode wireless ranging system based on UWB technology
CN113589266B (en) * 2021-07-21 2023-11-24 中国煤炭科工集团太原研究院有限公司 Mining multi-mode wireless ranging system based on UWB technology
CN114222366B (en) * 2021-08-06 2023-08-01 深圳技术大学 Indoor positioning method and device based on single base station
CN114222366A (en) * 2021-08-06 2022-03-22 深圳技术大学 Indoor positioning method and device based on single base station
CN113865584A (en) * 2021-08-24 2021-12-31 知微空间智能科技(苏州)有限公司 UWB three-dimensional object finding method and device based on visual inertial odometer
CN113865584B (en) * 2021-08-24 2024-05-03 知微空间智能科技(苏州)有限公司 UWB three-dimensional object searching method and device based on visual inertial odometer
CN113625774A (en) * 2021-09-10 2021-11-09 天津大学 Multi-unmanned aerial vehicle cooperative positioning system and method for local map matching and end-to-end distance measurement
CN114264297A (en) * 2021-12-01 2022-04-01 清华大学 Positioning and mapping method and system for UWB and visual SLAM fusion algorithm
CN114323002A (en) * 2021-12-27 2022-04-12 浙江大学 AGV positioning navigation method based on binocular vision, IMU and UWB fusion
CN114323002B (en) * 2021-12-27 2023-12-08 浙江大学 AGV positioning navigation method based on binocular vision, IMU and UWB fusion
CN114485623A (en) * 2022-02-16 2022-05-13 东南大学 Camera-IMU-UWB fusion accurate positioning method for focusing distance
CN114485623B (en) * 2022-02-16 2024-02-23 东南大学 Focusing distance camera-IMU-UWB fusion accurate positioning method
CN114459506A (en) * 2022-02-28 2022-05-10 清华大学深圳国际研究生院 Method and system for calibrating external parameters between global navigation satellite system receiver and visual inertial odometer on line
CN114459506B (en) * 2022-02-28 2023-08-08 清华大学深圳国际研究生院 Method and system for calibrating external parameters between global navigation satellite system receiver and visual inertial odometer on line
CN115540854A (en) * 2022-12-01 2022-12-30 成都信息工程大学 Active positioning method, equipment and medium based on UWB assistance

Also Published As

Publication number Publication date
CN113124856B (en) 2023-03-14

Similar Documents

Publication Publication Date Title
CN113124856B (en) Visual inertia tight coupling odometer based on UWB (ultra wide band) online anchor point and metering method
CN113945206B (en) Positioning method and device based on multi-sensor fusion
US11218689B2 (en) Methods and systems for selective sensor fusion
CN112347840B (en) Vision sensor laser radar integrated unmanned aerial vehicle positioning and image building device and method
CN108225302B (en) Petrochemical plant inspection robot positioning system and method
CN109887057B (en) Method and device for generating high-precision map
Li et al. LIDAR/MEMS IMU integrated navigation (SLAM) method for a small UAV in indoor environments
CN112505737B (en) GNSS/INS integrated navigation method
Stromback et al. Foot-mounted inertial navigation and cooperative sensor fusion for indoor positioning
CN111426320B (en) Vehicle autonomous navigation method based on image matching/inertial navigation/milemeter
CN112130579A (en) Tunnel unmanned aerial vehicle inspection method and system
CN106289235A (en) Autonomous computational accuracy controllable chamber inner position air navigation aid based on architecture structure drawing
CN111983660A (en) System and method for positioning quad-rotor unmanned aerial vehicle in GNSS rejection environment
Steiner et al. A vision-aided inertial navigation system for agile high-speed flight in unmapped environments: Distribution statement a: Approved for public release, distribution unlimited
Yun et al. IMU/Vision/Lidar integrated navigation system in GNSS denied environments
CN112923919B (en) Pedestrian positioning method and system based on graph optimization
CN114019552A (en) Bayesian multi-sensor error constraint-based location reliability optimization method
CN110968113A (en) Unmanned aerial vehicle autonomous tracking take-off and landing system and tracking positioning method
Wang et al. Micro aerial vehicle navigation with visual-inertial integration aided by structured light
CN104792336B (en) A kind of state of flight measurement method and device
CN110514226B (en) Course correction method for pedestrian inertial navigation
Sanna et al. A novel ego-motion compensation strategy for automatic target tracking in FLIR video sequences taken from UAVs
Cao et al. Research on Key Technologies of Multi-rotor UAV High-precision Positioning and Navigation System
RU2264598C1 (en) Method for deterination of coordinates of flight vehicle
CN116380057B (en) Unmanned aerial vehicle autonomous landing positioning method under GNSS refusing environment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Tian Bailian

Inventor after: Li Haisong

Inventor after: Chen Hongming

Inventor after: Zong Qun

Inventor before: Tian Bailian

Inventor before: Li Haisong

Inventor before: Chen Hongming

Inventor before: Zong Qun

Inventor before: Wang Cong

Inventor before: He Lei

GR01 Patent grant