CN117629187A - Tightly coupled odometer method and system based on UWB and vision fusion - Google Patents


Info

Publication number
CN117629187A
Authority
CN
China
Prior art keywords
pose, unmanned aerial vehicle, UWB, global
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311710254.8A
Other languages
Chinese (zh)
Inventor
何雷
王聪
马文峰
田辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Army Engineering University of PLA
Original Assignee
Army Engineering University of PLA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Army Engineering University of PLA
Priority to CN202311710254.8A
Publication of CN117629187A
Legal status: Pending (Current)

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00: Road transport of goods or passengers
    • Y02T 10/10: Internal combustion engine [ICE] based vehicles
    • Y02T 10/40: Engine management systems

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides a tightly coupled odometer method and system based on UWB and vision fusion. Loose coupling is adopted between modules: UWB positioning and visual autonomous positioning run independently, and their estimation results are finally fused optimally through a global pose graph, so that the positioning algorithm maintains stable state estimation even if visual tracking fails temporarily or outliers appear in the UWB measurements. The UWB positioning part fuses UWB, IMU and fixed-height radar (radar altimeter) data through an unscented Kalman filter and estimates the pose of the unmanned aerial vehicle under the global coordinate system. Meanwhile, the IMU and the binocular camera are fused into a visual-inertial odometer that estimates the pose of the unmanned aerial vehicle under the inertial system; the data fusion algorithm of the visual part is a tightly coupled nonlinear optimization based on a sliding window. The global fusion part fuses the outputs of the two positioning methods and solves a specially designed pose graph optimization problem, thereby estimating a high-precision, drift-free globally optimal pose of the unmanned aerial vehicle. A quadrotor platform was designed and the positioning algorithm run on it; the unmanned aerial vehicle can be positioned quickly and accurately, which is of great application value.

Description

Tightly coupled odometer method and system based on UWB and vision fusion
Technical Field
The invention relates to a tightly coupled odometer method and system based on UWB and vision fusion, and belongs to the technical field of autonomous positioning.
Background
Quadrotor unmanned aerial vehicles have developed rapidly in recent decades owing to their small size, vertical take-off and landing, hovering capability, strong maneuverability and ease of control. In current production and daily life, quadrotors are increasingly applied in aerial photography, plant protection, exploration of unknown environments, power-line inspection, disaster monitoring and other fields. Conventional unmanned aerial vehicles rely on satellite positioning information such as the Global Positioning System (GPS) for flight; however, GPS signals are unstable or lost in underground spaces, forests, complex built environments and similar scenes, which greatly limits the use of unmanned aerial vehicles there. Autonomous positioning and navigation of unmanned aerial vehicles in GPS-denied environments is therefore currently a popular research subject in the field.
Autonomous positioning and navigation in a GPS-denied environment requires two capabilities of the unmanned aerial vehicle: it must estimate its own motion in real time from on-board sensor information in an unknown environment, and it must build a map of the surrounding environment so that it can perceive obstacles and the available airspace. With these two capabilities, combined with other necessary algorithms such as control and trajectory planning, the unmanned aerial vehicle can achieve autonomous positioning and navigation. Simultaneous Localization and Mapping (SLAM) refers to the technique by which a body carrying specific sensors builds a map of the environment during motion and estimates its own motion, without prior information about the environment. Because of its motion-estimation function, a SLAM algorithm is also simply called an odometer (odometry). Applying SLAM on an unmanned aerial vehicle platform satisfies both capabilities required for autonomous positioning and navigation, so SLAM research and applications based on quadrotor unmanned aerial vehicles have gradually attracted wide attention.
Ultra-wideband (UWB) is a radio ranging technology of recent years that calculates the distance between two modules by measuring the time of arrival, time difference of arrival, or angle of arrival of radio waves. Because the emitted band lies between 3.1 GHz and 4.8 GHz, interference from other radio signals can be effectively overcome. In addition, the high bandwidth easily overcomes multipath effects and weakens the influence of non-line-of-sight measurements.
A UWB positioning method can complete position estimation in a GPS-denied environment with only low-cost sensors to deploy, is unaffected by illumination and weather, and, more importantly, has a global reference frame, which is convenient for the direct application of multiple unmanned aerial vehicles. The unmanned aerial vehicle positioning method of this invention fuses UWB and vision, exploiting the complementary characteristics of the two: UWB positioning has a global reference frame but yields trajectory estimates that are not smooth and is easily disturbed by radio interference, while visual autonomous positioning is prone to drift but has high short-term estimation accuracy.
Disclosure of Invention
The invention aims to overcome the defects in the prior art and provides a tightly coupled odometer method and system based on UWB and vision fusion, so as to improve the positioning performance of an unmanned aerial vehicle in a GPS-denied environment.
In order to achieve the above purpose, the invention is realized by adopting the following technical scheme:
in a first aspect, the invention provides a tightly coupled odometer method based on UWB and vision fusion, comprising the following steps:
the Utility model unmanned aerial vehicle pose under the global system is estimated by fusing UWB, IMU and constant Gao Leida data through an unscented Kalman filter;
data fusion is carried out to form a visual inertial odometer, and the pose of the unmanned aerial vehicle under an inertial system is estimated; the data fusion algorithm is tightly coupled nonlinear optimization based on a sliding window;
and fusing the pose of the unmanned aerial vehicle under the global system with the pose of the unmanned aerial vehicle under the global system to obtain the high-precision and drift-free global optimal pose of the unmanned aerial vehicle.
Further, fusing the pose of the unmanned aerial vehicle under the global system with its pose under the inertial system to obtain a high-precision, drift-free globally optimal pose comprises:
constructing a pose optimization graph in which the poses of the unmanned aerial vehicle under the global system are the corner points (the variables to be optimized); the pose results under the inertial system measured by the visual-inertial odometer are expressed in pose-increment form and drawn in the graph as binary incremental edges; the UWB positioning measurements are represented as unary global pose edges; the cost function of the graph optimization problem is then defined, and its minimum is computed to obtain the optimal estimate of the pose state vector in the sliding window, thereby obtaining the high-precision, drift-free globally optimal pose of the unmanned aerial vehicle.
Further, fusing IMU and binocular camera data to form a visual-inertial odometer and estimating the pose of the unmanned aerial vehicle under the inertial system, with a data fusion algorithm that is a tightly coupled nonlinear optimization based on a sliding window, comprises:
pre-integrating the IMU, the IMU pre-integration term being used as a residual term when constructing the bundle adjustment equation during back-end optimization;
preprocessing the latest frame of visual observation: tracking existing features with the KLT sparse optical flow method while detecting new features, screening out keyframes, and sending the keyframes to the back-end optimizer for state estimation and depth recovery; the visual measurement residual, a reprojection residual, is defined on the unit sphere;
using tightly coupled fusion to guarantee solution accuracy, adopting the idea of marginalization to convert constraint information into a state prior added to the optimization, and performing nonlinear least-squares optimization by constructing the bundle adjustment equation;
and constructing a graph with the poses of the unmanned aerial vehicle under the global system as corner points, i.e. with the global poses of the unmanned aerial vehicle as the variables to be optimized, the graph containing unary and binary edges, and solving the optimization on this graph.
Further, pre-integrating the IMU to reduce the amount of computation, with the IMU pre-integration term used as a residual term when constructing the bundle adjustment equation during back-end optimization, comprises:
since there are many IMU measurements in the time interval between two image frames, IMU pre-integration is adopted to avoid repeated propagation:
for the IMU measurements between the k-th frame image and its adjacent (k+1)-th frame image in the sliding window, the residual term of the IMU pre-integration can be defined as

$$ r_{\mathcal B}\big(\hat z^{b_k}_{b_{k+1}},\mathcal X\big)=\begin{bmatrix} R^{b_k}_w\big(p^w_{b_{k+1}}-p^w_{b_k}-v^w_{b_k}\Delta t_k+\tfrac12 g^w\Delta t_k^2\big)-\hat\alpha^{b_k}_{b_{k+1}}\\ R^{b_k}_w\big(v^w_{b_{k+1}}-v^w_{b_k}+g^w\Delta t_k\big)-\hat\beta^{b_k}_{b_{k+1}}\\ 2\big[(q^w_{b_k})^{-1}\otimes q^w_{b_{k+1}}\otimes(\hat\gamma^{b_k}_{b_{k+1}})^{-1}\big]_{xyz}\\ b_{a,k+1}-b_{a,k}\\ b_{\omega,k+1}-b_{\omega,k} \end{bmatrix} $$

where $(\hat\alpha^{b_k}_{b_{k+1}},\hat\beta^{b_k}_{b_{k+1}},\hat\gamma^{b_k}_{b_{k+1}})$ is the IMU pre-integration observation obtained between two consecutive frames using the noisy gyroscope and accelerometer measurements, the quaternion term represents the three-dimensional error state of the quaternion, and $[\,\cdot\,]_{xyz}$ extracts the imaginary part of the quaternion $q$. The accelerometer and gyroscope zero biases are also included in the residual term for online correction.
Further, preprocessing the latest frame of visual observation, tracking existing features with the KLT sparse optical flow method, screening out keyframes to be sent to the back-end optimizer for state estimation and depth recovery, and defining the visual measurement residual (a reprojection residual) on the unit sphere, comprises:
when the latest frame captured by the camera arrives, visual-observation preprocessing is performed first: existing features are tracked with the KLT sparse optical flow method and, to keep the number of feature points at a specific level, new features are detected at the same time on the basis of the tracked ones; this step is also responsible for screening out keyframes, which are sent to the back-end optimizer for state estimation and depth recovery. Two keyframe criteria are used: first, average parallax, i.e. if the average parallax between the latest frame and the previous keyframe exceeds a threshold $\kappa_1$, the latest frame is set as a new keyframe; second, tracking quality, i.e. if the number of tracked features falls below a threshold $\kappa_2$, the latest frame is set as a new keyframe.
The visual measurement residual is a reprojection residual. Suppose the $l$-th feature point is first observed in the $i$-th frame image with pixel coordinates $\hat u^{c_i}_l$; as the unmanned aerial vehicle moves, feature $l$ is observed again in the $j$-th frame image. A visual measurement residual can thus be defined, here on the unit sphere:

$$ r_{\mathcal C}\big(\hat z^{c_j}_l,\mathcal X\big)=\big[b_1\;\;b_2\big]^{T}\!\left(\pi_c^{-1}\big(\hat u^{c_j}_l\big)-\frac{\mathcal P^{c_j}_l}{\lVert \mathcal P^{c_j}_l\rVert}\right) $$

where $\hat u^{c_i}_l$ are the pixel coordinates of feature $l$ first observed in the $i$-th frame, $\hat u^{c_j}_l$ are its pixel coordinates in the $j$-th frame image, $\pi_c^{-1}$ is the back-projection function containing the camera intrinsic parameters, and $\mathcal P^{c_j}_l$ is the feature position predicted in frame $j$ from its first observation.
Since the visual measurement residual has only two degrees of freedom while the residual vector is three-dimensional, it is projected onto the tangent plane, which guarantees the correct number of degrees of freedom. $b_1, b_2$ are two arbitrarily selected orthogonal bases spanning the tangent plane of the unit sphere.
Further, constructing a graph with the poses of the unmanned aerial vehicle under the global system as corner points, i.e. with the global poses of the unmanned aerial vehicle as the variables to be optimized, and solving the optimization on this graph, comprises:
taking the poses of the unmanned aerial vehicle under the global system as the corner points of the graph; the edges of the graph include unary edges and binary edges.
The transformation-matrix form of SE(3) is adopted:

$$ T_k=\begin{bmatrix} R_k & p_k\\ 0^{T} & 1 \end{bmatrix}\in SE(3) $$

where $R_k$ represents the attitude of the unmanned aerial vehicle at time $k$ and $p_k$ its position at time $k$.
(3) Incremental edges
The visual-inertial odometer measures the pose of the unmanned aerial vehicle under the inertial system, $T^{v}_{k}$, whereas the variable to be optimized is the pose $T_k$ of the unmanned aerial vehicle under the global system. To remove the dependence on the reference frame, the pose-increment form is used here: $\hat T^{k-l}_{k}$ denotes the increment between the pose at time $k$ and the pose at time $k-l$, expressed in the body frame at time $k-l$. The pose increment can be derived from the estimation result of the visual-inertial odometer:

$$ \hat T^{k-l}_{k}=\big(T^{v}_{k-l}\big)^{-1}\,T^{v}_{k} $$

Therefore, an incremental edge in the graph connects the pose corner points at two times, represents the pose increment between them, and is a binary edge.
(4) Global pose edges
A global pose edge comes from the UWB positioning result and represents a global pose measurement of the unmanned aerial vehicle at a certain time; it is a unary edge.
In the constructed graph $G=\{V,E\}$, a corner point at time $k$ is connected to corner points at earlier times by incremental edges, while the black connecting lines represent global pose edges, which are unary edges.
The system state vector in the sliding window is defined as:

$$ X=[T_0,T_1,\dots,T_n] \tag{5} $$

At a time $k$ in the sliding window, after the unmanned aerial vehicle acquires a new measurement, the cost function of the graph optimization problem is defined as:

$$ \min_{X}\Big\{\sum_{k\in T}\big\lVert r^{G}_{k}\big\rVert^{2}+\sum_{(k-l,\,k)\in S}\big\lVert r^{S}_{k-l,k}\big\rVert^{2}\Big\} \tag{6} $$

In formula (6), $T$ denotes the set of global pose edges in the sliding window and $r^{G}_{k}$ the global pose residual:

$$ r^{G}_{k}=\hat T^{G}_{k}\boxminus T_{k} \tag{7} $$

$S$ denotes the set of all incremental edges in the sliding window and $r^{S}_{k-l,k}$ the pose-increment residual:

$$ r^{S}_{k-l,k}=\hat T^{k-l}_{k}\boxminus\big(T_{k-l}^{-1}T_{k}\big) \tag{8} $$

where the symbol $\boxminus$ denotes a generalized subtraction on SE(3). Collecting the terms of cost function (6) gives the full problem (9); minimizing cost function (9) yields the optimal estimate $\hat X$ of the state vector in the sliding window.
A robust kernel function is added to the residual terms when defining the cost function. The Huber kernel is used to suppress the influence of outliers; it is expressed as:

$$ \rho(s)=\begin{cases} s, & \sqrt{s}\le\delta\\ 2\delta\sqrt{s}-\delta^{2}, & \sqrt{s}>\delta \end{cases} \tag{10} $$
Further, fusing UWB, IMU and fixed-height radar data through an unscented Kalman filter to estimate the pose of the unmanned aerial vehicle under the global system comprises:
step 1: acquiring the UWB raw data and the IMU data;
step 2: performing distance calibration on the UWB raw data to obtain calibrated UWB data;
step 3: judging whether the anchor positions are known; if not, performing anchor calibration according to the calibrated UWB data and then proceeding to step 4; if yes, proceeding directly to step 4;
step 4: performing outlier detection on the calibrated UWB data to obtain screened UWB data;
step 5: low-pass filtering the IMU data to obtain filtered IMU data;
step 6: inputting the screened UWB data and the filtered IMU data into a state estimator to obtain the position, velocity and attitude information of the unmanned aerial vehicle. The state estimator takes the unscented Kalman filter as its core and approximates the probability distribution of the state, thereby estimating its mean and covariance.
The inputs of the state estimator are the UWB ranging data after calibration and outlier rejection and the IMU data after low-pass noise filtering; its outputs are the position, velocity and attitude information of the unmanned aerial vehicle.
Further, step 2, performing distance calibration on the UWB raw data, comprises:
modeling the UWB distance measurement $\tilde d$ as a linear function of the distance truth value $d$:

$$ \tilde d = a\,d + b + n_d \tag{1} $$

where $n_d$ is zero-mean white noise.
A linear regression method is adopted: by placing two UWB modules at different positions and measuring, a number of truth-measurement samples $\{(d_i,\tilde d_i)\}$ are obtained, where the measurement in each sample is the UWB range reading and the range truth value is provided by a high-precision indoor positioning system. Fitting the model over a large number of samples determines the scale factor $a$ and the static bias $b$ of the current model:

$$ a=\frac{\sum_i(d_i-\bar d)(\tilde d_i-\bar{\tilde d})}{\sum_i(d_i-\bar d)^2},\qquad b=\bar{\tilde d}-a\,\bar d \tag{2} $$

where $\bar d$ is the mean of the distance truth values and $\bar{\tilde d}$ is the mean of the distance measurements.
Further, in step 3, performing anchor point calibration according to the calibrated UWB data comprises:
first establishing a global coordinate system: viewed from above, the position of anchor $A_1$ is the origin of the coordinate system, the line from $A_1$ to $A_2$ is the direction of the x-axis, the height direction of $A_1$ is the z-axis, and the y-axis direction is determined by the right-hand rule.
$A_3$ is placed in a direction approximating the y-axis, and $A_4$ can be placed at any position.
During the calibration stage the unmanned aerial vehicle is kept static; the tag periodically sends ranging requests to the anchors according to the configured time-slot table, and the distance from the tag to each anchor is resolved according to the TW-TOF principle.
The echo-ranging function of the UWB module is enabled so that the tag can also receive the inter-anchor distances. Ranging is likewise carried out between the anchors, and the distance measurements are stored in each anchor's buffer. After the tag sends a ranging request to an anchor, the echo-ranging value is attached to the anchor's reply, so the tag receives the distance measurement between the two anchors.
The values to be estimated in the anchor calibration stage are the tag position $p_T$ under the global system and the anchor positions $p_{A_i}$.
A nonlinear least-squares problem is constructed:

$$ \min_{p_T,\{p_{A_i}\}}\ \sum_{i\in C}\big(\lVert p_T-p_{A_i}\rVert-\tilde d_i\big)^{2}+\sum_{i,j\in C,\,i\ne j}\big(\lVert p_{A_i}-p_{A_j}\rVert-\tilde d_{ij}\big)^{2} \tag{7} $$

In formula (7), $C$ is the set of anchors; solving this cost function calibrates the positions of the tag and of each anchor. Since the offset of the tag from the origin of the body frame is known a priori, the position $p$ of the unmanned aerial vehicle under the global system can be deduced, which serves as the initial iteration value $p_0$ of the state estimator.
In a second aspect, the invention provides a tightly coupled odometer system based on UWB and vision fusion, comprising:
a UWB module: used for fusing UWB, IMU and fixed-height radar data through an unscented Kalman filter and estimating the pose of the unmanned aerial vehicle under the global system;
an inertial-system module: used for performing data fusion to form a visual-inertial odometer and estimating the pose of the unmanned aerial vehicle under the inertial system, the data fusion algorithm being a tightly coupled nonlinear optimization based on a sliding window;
a pose fusion module: used for fusing the pose of the unmanned aerial vehicle under the global system with its pose under the inertial system to obtain the high-precision, drift-free globally optimal pose of the unmanned aerial vehicle.
Compared with the prior art, the invention has the following beneficial effects:
research on positioning and navigation algorithms for quadrotor unmanned aerial vehicles in GNSS-denied environments is of great significance. The method is stable and reliable, is unaffected by weather, light and the like, saves computing resources, has low hardware requirements, and has high theoretical and practical value. The invention has the following characteristics and advantages:
(1) The method and system use the P440 module produced by the US company Time Domain to realize the tightly coupled odometer based on UWB and vision fusion; this scheme offers high positioning accuracy, low cost and a wide application range. Sensors such as lidars used in conventional autonomous-navigation schemes for GNSS-denied environments are expensive and their stability cannot be guaranteed. With the visual-inertial odometer unit, high-precision autonomous positioning and mapping can be achieved in various indoor and outdoor scenes.
(2) The invention provides a positioning algorithm based on UWB and vision fusion, assisted by IMU pre-integration constraints, and realizes high-precision, stable autonomous positioning and mapping of the unmanned aerial vehicle through factor-graph-based optimization. The high update rate of the IMU and the visual constraints provided by the binocular camera alleviate the small-range jitter common in UWB positioning. The method improves positioning accuracy and mitigates the sensitivity to environmental features, the tendency to drift, and the lack of a global reference frame that afflict purely visual positioning.
(3) The pose-solving framework adopted by the invention is factor-graph optimization, which is easy to extend: besides the IMU, UWB, binocular camera and radar altimeter, developers may add further sensors such as lidar, enabling secondary development. Compared with filtering, this optimization approach improves odometer accuracy through tight coupling.
Drawings
FIG. 1 is a flow chart of the tightly coupled odometer method and system based on UWB and vision fusion;
FIG. 2 is a diagram of the hardware module connection relationships;
FIG. 3 is a diagram of the hardware architecture of the tightly coupled odometer method and system based on UWB and vision fusion;
FIG. 4 is a flowchart of the algorithm of the tightly coupled odometer method and system based on UWB and vision fusion;
FIG. 5 is a flow chart of the tightly coupled visual-inertial odometry method;
FIG. 6 is a schematic diagram of system states and sensor measurements in the sliding window;
FIG. 7 is a factor graph optimization schematic;
FIG. 8 shows the indoor trajectory tracking effect of the tightly coupled odometer based on UWB and vision fusion;
FIG. 9 is the Absolute Position Error (APE) curve of the tightly coupled odometer based on UWB and vision fusion;
FIG. 10 shows the outdoor trajectory tracking effect of the tightly coupled odometer based on UWB and vision fusion.
Detailed Description
The invention is further described below with reference to the accompanying drawings. The following examples are only intended to illustrate the technical solution of the invention more clearly and are not intended to limit its scope of protection.
Embodiment one:
The invention provides a tightly coupled odometer method based on UWB and vision fusion for GNSS-denied environments.
The UWB-vision fusion positioning method adopts loose coupling between modules: UWB positioning and visual autonomous positioning run independently, and their estimation results are finally fused optimally through a global pose graph.
UWB positioning is realized with a robust autonomous positioning and mapping technique based on UWB-IMU fusion: UWB, IMU and fixed-height radar data are fused through an unscented Kalman filter to estimate the pose of the unmanned aerial vehicle under the global coordinate system.
The IMU and the binocular camera are fused into the visual autonomous positioning part, commonly called a visual-inertial odometer, which estimates the pose of the unmanned aerial vehicle under the inertial system; the data fusion algorithm of the visual part is a tightly coupled nonlinear optimization based on a sliding window.
The global fusion part fuses the outputs of the two positioning methods and solves a specially designed pose graph optimization problem, thereby estimating a high-precision, drift-free globally optimal pose of the unmanned aerial vehicle.
The implementation of the visual odometer and of the global pose-graph optimization part is analyzed and designed in detail below. A quadrotor platform was designed and the positioning algorithm run on it; the unmanned aerial vehicle can be positioned quickly and accurately, which is of great application value.
The technical scheme adopted by the invention comprises two parts: hardware and software.
The software comprises: an IMU residual constraint unit, a visual residual constraint unit, a UWB ranging residual unit, a marginalization unit and a graph-optimization-based global fusion algorithm unit. The overall algorithm flow is shown in FIG. 4. Each part is described in detail below.
Hardware part:
The unmanned aerial vehicle is a quadrotor with a 380 mm wheelbase, equipped with a DJI E300 propulsion system and a 4S high-voltage battery. Its sensors comprise an Intel RealSense D435 binocular camera, a 3DM-GX5-25 inertial measurement unit (IMU), P440 UWB ranging modules and a TFmini laser ranging module (the fixed-height radar). Attitude control is performed by a Pixhawk flight controller, and the on-board computer is an Intel eighth-generation NUC (Core i7-8559U at 2.7 GHz, four cores and eight threads, 32 GB of memory). The positioning program is developed on the ROS (Robot Operating System) open-source framework under Linux; all algorithms are implemented in C++. The program receives the sensor data in real time via ROS messages, and the position and attitude it resolves are sent over a serial port to the Pixhawk flight controller using the MAVLink protocol for attitude control.
Considering system reliability, the UWB module must offer high ranging accuracy and robust resistance to multipath interference, so the platform carries the P440 module produced by the US company Time Domain. The P440 is an ultra-wideband wireless transceiver operating between 3.1 GHz and 4.8 GHz. It mainly adopts the TW-ToF principle and can perform four functions simultaneously (ranging, data transmission, monostatic radar and multistatic radar); only the ranging function is used here. Ranging includes single-point and networked modes; in the experiments a ranging network is formed by one tag and the anchors, and networked ranging can adopt the ALOHA (random) protocol or the TDMA (time-division multiple access) protocol. The radio transmission power of the UWB module is extremely low (50 µW); the tag is powered from the on-board battery after voltage conversion, and the anchors are powered directly from portable power banks.
UWB ranging module: an ultra-wideband wireless transceiver operating between 3.1 GHz and 4.8 GHz that can perform four functions simultaneously (ranging, data transmission, monostatic radar and multistatic radar); mainly its ranging function is used, which calculates the distance between two modules by measuring the time of arrival, time difference of arrival or angle of arrival of radio waves.
IMU inertial measurement sensor: realizes data processing in two respects. On the one hand, its data are fused with the UWB ranging module and the fixed-height radar through the unscented Kalman filter to estimate the pose of the unmanned aerial vehicle under the global coordinate system, realizing the UWB positioning function; on the other hand, together with the binocular camera it supports the detection, tracking and preprocessing of visually observed environmental feature points, realizing visual autonomous positioning.
Binocular camera: collects environment information and transmits the data back to the on-board processor, which extracts environmental feature points; together with the inertial measurement unit (IMU) it realizes visual positioning.
Radar altimeter: measures the height of the unmanned aerial vehicle; the UWB positioning result is corrected and adjusted according to the measured height data.
Embedded on-board processor: the on-board Intel eighth-generation high-performance processor processes the data of the various sensors and computes the pose of the unmanned aerial vehicle.
Unmanned aerial vehicle platform: comprises the quadrotor airframe, motors, battery and flight control components.
The UWB ranging module, combined with the IMU inertial measurement sensor and the radar altimeter, realizes the positioning function of the robust autonomous positioning and mapping technique based on UWB-IMU fusion; the IMU inertial measurement sensor and the binocular camera realize visual positioning by sensing and measuring the flight environment; the algorithms of the two positioning techniques run on the embedded on-board processor, their outputs are fused, and the minimum of the cost function is computed on the constructed pose optimization graph, yielding the high-precision, drift-free globally optimal pose of the unmanned aerial vehicle. The connection relationships of the hardware modules are shown in FIG. 2.
Software part:
(1) IMU residual constraint unit
In the time interval between two image frames there are many IMU measurements; to avoid repeated propagation and reduce the amount of computation, IMU pre-integration is adopted. The IMU pre-integration term is used as a residual term when constructing the bundle adjustment equation during back-end optimization.
For the IMU measurements between the k-th frame image and its adjacent (k+1)-th frame image in the sliding window, the residual term of the IMU pre-integration can be defined as

$$ r_{\mathcal B}\big(\hat z^{b_k}_{b_{k+1}},\mathcal X\big)=\begin{bmatrix} R^{b_k}_w\big(p^w_{b_{k+1}}-p^w_{b_k}-v^w_{b_k}\Delta t_k+\tfrac12 g^w\Delta t_k^2\big)-\hat\alpha^{b_k}_{b_{k+1}}\\ R^{b_k}_w\big(v^w_{b_{k+1}}-v^w_{b_k}+g^w\Delta t_k\big)-\hat\beta^{b_k}_{b_{k+1}}\\ 2\big[(q^w_{b_k})^{-1}\otimes q^w_{b_{k+1}}\otimes(\hat\gamma^{b_k}_{b_{k+1}})^{-1}\big]_{xyz}\\ b_{a,k+1}-b_{a,k}\\ b_{\omega,k+1}-b_{\omega,k} \end{bmatrix} $$

where $(\hat\alpha^{b_k}_{b_{k+1}},\hat\beta^{b_k}_{b_{k+1}},\hat\gamma^{b_k}_{b_{k+1}})$ is the IMU pre-integration observation obtained between two consecutive frames using the noisy gyroscope and accelerometer measurements, the quaternion term represents the three-dimensional error state of the quaternion, and $[\,\cdot\,]_{xyz}$ extracts the imaginary part of the quaternion $q$. The accelerometer and gyroscope zero biases are also included in the residual term for online correction.
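As an illustration of how the pre-integration observations are accumulated, here is a minimal Python sketch (hypothetical helper names; simple Euler integration rather than the midpoint scheme typically used):

```python
import numpy as np

def quat_mult(q, r):
    """Hamilton product of quaternions [w, x, y, z]."""
    w0, x0, y0, z0 = q
    w1, x1, y1, z1 = r
    return np.array([
        w0*w1 - x0*x1 - y0*y1 - z0*z1,
        w0*x1 + x0*w1 + y0*z1 - z0*y1,
        w0*y1 - x0*z1 + y0*w1 + z0*x1,
        w0*z1 + x0*y1 - y0*x1 + z0*w1,
    ])

def quat_to_rot(q):
    w, x, y, z = q
    return np.array([
        [1-2*(y*y+z*z), 2*(x*y-w*z),   2*(x*z+w*y)],
        [2*(x*y+w*z),   1-2*(x*x+z*z), 2*(y*z-w*x)],
        [2*(x*z-w*y),   2*(y*z+w*x),   1-2*(x*x+y*y)],
    ])

def preintegrate(imu_samples, dt, b_a, b_w):
    """Accumulate pre-integration terms alpha (position), beta (velocity)
    and gamma (rotation) between frames k and k+1."""
    alpha = np.zeros(3)
    beta = np.zeros(3)
    gamma = np.array([1.0, 0.0, 0.0, 0.0])   # identity quaternion
    for a_m, w_m in imu_samples:
        a = a_m - b_a                         # bias-corrected accelerometer reading
        w = w_m - b_w                         # bias-corrected gyroscope reading
        R = quat_to_rot(gamma)
        alpha += beta*dt + 0.5*(R @ a)*dt**2  # position pre-integration
        beta  += (R @ a)*dt                   # velocity pre-integration
        dq = np.concatenate([[1.0], 0.5*w*dt])
        gamma = quat_mult(gamma, dq)          # rotation pre-integration
        gamma /= np.linalg.norm(gamma)
    return alpha, beta, gamma
```

These accumulated terms depend only on the IMU samples between the two frames, which is why the residual above can be re-evaluated for any candidate states without re-propagating the raw measurements.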
(2) Visual residual constraint unit
The IMU and the binocular camera are fused into the visual autonomous positioning part, commonly called a visual-inertial odometer, which can estimate the pose of the unmanned aerial vehicle under the inertial system.
A tightly coupled odometer framework based on visual inertia is shown in FIG. 5. When the latest frame captured by the camera arrives, visual-observation preprocessing is performed first: existing features are tracked with the KLT sparse optical flow method and, to keep the number of feature points at a specific level, new features are detected at the same time on the basis of the tracked ones. This step is also responsible for screening out keyframes, which are sent to the back-end optimizer for state estimation and depth recovery. Two keyframe criteria are used: first, average parallax, i.e. if the average parallax between the latest frame and the previous keyframe exceeds a threshold $\kappa_1$, the latest frame is set as a new keyframe; second, tracking quality, i.e. if the number of tracked features falls below a threshold $\kappa_2$, the latest frame is set as a new keyframe. The two criteria are sketched below.
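A minimal sketch of the two keyframe criteria described above (the threshold values and the track data layout are assumptions for illustration):

```python
import numpy as np

KAPPA_1 = 10.0   # assumed parallax threshold, in pixels
KAPPA_2 = 50     # assumed minimum number of tracked features

def is_keyframe(tracks_prev_kf, tracks_latest):
    """tracks_*: dict feature_id -> (u, v) pixel coordinates.
    Criterion 2: the number of successfully tracked features drops below kappa_2.
    Criterion 1: mean parallax w.r.t. the previous keyframe exceeds kappa_1."""
    common = tracks_prev_kf.keys() & tracks_latest.keys()
    if len(common) < KAPPA_2:                 # tracking quality degraded
        return True
    parallax = [np.linalg.norm(np.subtract(tracks_latest[f], tracks_prev_kf[f]))
                for f in common]
    return np.mean(parallax) > KAPPA_1        # sufficient motion since last keyframe
```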
The visual measurement residual is a reprojection residual. Suppose the $l$-th feature point is first observed in the $i$-th frame image with pixel coordinates $\hat u^{c_i}_l$; as the unmanned aerial vehicle moves, feature $l$ is observed again in the $j$-th frame image. A visual measurement residual can thus be defined, here on the unit sphere:

$$ r_{\mathcal C}\big(\hat z^{c_j}_l,\mathcal X\big)=\big[b_1\;\;b_2\big]^{T}\!\left(\pi_c^{-1}\big(\hat u^{c_j}_l\big)-\frac{\mathcal P^{c_j}_l}{\lVert \mathcal P^{c_j}_l\rVert}\right) $$

where $\hat u^{c_i}_l$ are the pixel coordinates of feature $l$ first observed in the $i$-th frame, $\hat u^{c_j}_l$ are its pixel coordinates in the $j$-th frame image, $\pi_c^{-1}$ is the back-projection function containing the camera intrinsic parameters, and $\mathcal P^{c_j}_l$ is the feature position predicted in frame $j$ from its first observation.
Since the visual measurement residual has only two degrees of freedom while the residual vector is three-dimensional, it is projected onto the tangent plane, which guarantees the correct number of degrees of freedom. $b_1, b_2$ are two arbitrarily selected orthogonal bases spanning the tangent plane of the unit sphere.
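A sketch of evaluating this unit-sphere residual, assuming a pinhole back-projection for $\pi_c^{-1}$ (the intrinsics and helper names are illustrative):

```python
import numpy as np

def back_project(u, v, fx, fy, cx, cy):
    """pi_c^{-1}: pixel coordinates -> bearing vector on the unit sphere."""
    p = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    return p / np.linalg.norm(p)

def tangent_basis(n):
    """Two orthonormal vectors b1, b2 spanning the tangent plane at unit vector n."""
    a = np.array([0.0, 0.0, 1.0]) if abs(n[2]) < 0.9 else np.array([1.0, 0.0, 0.0])
    b1 = np.cross(n, a); b1 /= np.linalg.norm(b1)
    b2 = np.cross(n, b1)
    return b1, b2

def visual_residual(uv_j, P_j, intrinsics):
    """2-DoF reprojection residual on the unit sphere.
    uv_j: observed pixel of feature l in frame j.
    P_j: feature position predicted in camera frame j (from the first
    observation in frame i and the relative pose)."""
    obs = back_project(*uv_j, *intrinsics)   # unit bearing of the observation
    pred = P_j / np.linalg.norm(P_j)         # predicted unit bearing
    b1, b2 = tangent_basis(pred)
    e = obs - pred                            # 3-D residual
    return np.array([b1 @ e, b2 @ e])        # projected to the tangent plane
```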
(3) Marginalization unit
To guarantee qualities such as speed and light weight of the unmanned aerial vehicle, strict requirements are placed on the size and weight of the on-board computer, so the processor's computing resources are limited. As shown in FIG. 6, the visual odometer adopts tightly coupled fusion to guarantee solution accuracy, i.e. nonlinear least-squares optimization by constructing a bundle adjustment equation. If the equation were left unmodified, its scale would grow without bound over time and real-time operation could not be guaranteed, so the scale of the optimization problem must be limited to an acceptable range without significantly reducing accuracy. The naive approach, when the latest keyframe $KF_n$ arrives, is to simply discard the oldest image frame $KF_0$ in the window and add the latest frame to the sliding window. However, $KF_0$ may share constraints with other image frames $KF_k$ in the window, produced by IMU pre-integration or by common visual feature observations; directly discarding $KF_0$ would lose part of the observation constraints and affect the pose estimation result for a long time. Therefore, the idea of marginalization is adopted to convert these constraints into a state prior $\{r_p, H_p\}$ that is added to the optimization.
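Marginalization of this kind is commonly implemented as a Schur complement on the Gauss-Newton normal equations; the following minimal Python sketch illustrates that operation under that assumption (block sizes and damping are illustrative):

```python
import numpy as np

def marginalize(H, b, m):
    """Marginalize the first m state variables out of the normal equations
    H * dx = b, returning the prior (H_p, b_p) on the remaining variables.
    Partition: x = [x_m (to drop), x_r (to keep)]."""
    Hmm = H[:m, :m]
    Hmr = H[:m, m:]
    Hrm = H[m:, :m]
    Hrr = H[m:, m:]
    bm, br = b[:m], b[m:]
    Hmm_inv = np.linalg.inv(Hmm + 1e-8 * np.eye(m))  # small damping for stability
    H_p = Hrr - Hrm @ Hmm_inv @ Hmr                  # Schur complement
    b_p = br - Hrm @ Hmm_inv @ bm
    return H_p, b_p                                   # acts as the prior {r_p, H_p}
```

The returned pair enters later optimizations as an extra quadratic cost, so the information contributed by the dropped keyframe is retained without keeping its variables.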
(4) Graph-optimization-based global fusion algorithm unit
Graph optimization is a way of representing an optimization problem as a graph. A graph consists of several corner points (vertices) and the edges connecting them, written mathematically as $G=\{V,E\}$, where $V$ is the set of corner points and $E$ is the set of edges. Corner points represent the variables to be optimized, and edges represent the connections between them; an edge may connect one corner point (unary edge), two corner points (binary edge) or several corner points (hyper-edge).
The graph constructed here takes the poses of the unmanned aerial vehicle under the global system as corner points, i.e. the variables to be optimized are the global poses of the unmanned aerial vehicle. The edges include unary edges and binary edges.
For convenience, position and attitude are not written separately; the transformation-matrix form of SE(3) is adopted:

$$ T_k=\begin{bmatrix} R_k & p_k\\ 0^{T} & 1 \end{bmatrix}\in SE(3) $$

where $R_k$ is the attitude of the unmanned aerial vehicle at time $k$ and $p_k$ its position at time $k$.
(5) Incremental edges
The visual-inertial odometer measures the pose of the unmanned aerial vehicle under the inertial system, $T^{v}_{k}$, whereas the variable to be optimized is the pose $T_k$ under the global system. To remove the dependence on the reference frame, the pose-increment form is used here: $\hat T^{k-l}_{k}$ denotes the increment between the poses at times $k$ and $k-l$, expressed in the body frame at time $k-l$. The pose increment can be derived from the estimation result of the visual-inertial odometer:

$$ \hat T^{k-l}_{k}=\big(T^{v}_{k-l}\big)^{-1}\,T^{v}_{k} $$

Therefore, an incremental edge in the graph connects the pose corner points at two times, represents the pose increment between them, and is a binary edge. In this example $l=3$, i.e. a pose corner point is connected by incremental edges to at most the three preceding corner points.
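Deriving an incremental-edge measurement from two visual-inertial odometer poses is a one-line matrix operation; a sketch with 4x4 homogeneous transforms (names are illustrative):

```python
import numpy as np

def se3_inv(T):
    """Inverse of a 4x4 homogeneous transform."""
    R, p = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ p
    return Ti

def increment_edge(T_vio_km_l, T_vio_k):
    """Pose increment between times k-l and k, expressed in the body frame
    at time k-l: hat(T)^{k-l}_k = (T^v_{k-l})^{-1} @ T^v_k."""
    return se3_inv(T_vio_km_l) @ T_vio_k
```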
(6) Global pose edges
A global pose edge comes from the UWB positioning result and represents a global pose measurement of the unmanned aerial vehicle at a certain time; it is a unary edge.
FIG. 7 shows the graph $G=\{V,E\}$ constructed here: circles represent the pose corner points of the unmanned aerial vehicle to be optimized at each time, gray connecting lines represent incremental edges connecting a corner point at time $k$ with corner points at earlier times, and black connecting lines represent global pose edges, which are unary edges.
The system state vector in the sliding window is defined as:

$$ X=[T_0,T_1,\dots,T_n] \tag{5} $$

At a time $k$ in the sliding window, after the unmanned aerial vehicle acquires a new measurement, the cost function of the graph optimization problem is defined as:

$$ \min_{X}\Big\{\sum_{k\in T}\big\lVert r^{G}_{k}\big\rVert^{2}+\sum_{(k-l,\,k)\in S}\big\lVert r^{S}_{k-l,k}\big\rVert^{2}\Big\} \tag{6} $$

In formula (6), $T$ denotes the set of global pose edges in the sliding window and $r^{G}_{k}$ the global pose residual:

$$ r^{G}_{k}=\hat T^{G}_{k}\boxminus T_{k} \tag{7} $$

$S$ denotes the set of all incremental edges in the sliding window and $r^{S}_{k-l,k}$ the pose-increment residual:

$$ r^{S}_{k-l,k}=\hat T^{k-l}_{k}\boxminus\big(T_{k-l}^{-1}T_{k}\big) \tag{8} $$

where $\boxminus$ denotes a generalized subtraction on SE(3). Collecting the terms of cost function (6) gives the full problem (9); minimizing cost function (9) yields the optimal estimate $\hat X$ of the state vector in the sliding window.
To avoid interference from outliers in the optimization as much as possible, a robust kernel function is added to the residual terms when defining the cost function. The main idea of a robust kernel is to suppress the growth rate of the squared residual of outliers; the Huber kernel is used here to suppress the influence of outliers, expressed as:

$$ \rho(s)=\begin{cases} s, & \sqrt{s}\le\delta\\ 2\delta\sqrt{s}-\delta^{2}, & \sqrt{s}>\delta \end{cases} \tag{10} $$
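A sketch of the Huber kernel of equation (10) applied to a squared residual (the parameter delta is a tuning value):

```python
import numpy as np

def huber(s, delta=1.0):
    """Huber kernel applied to a squared residual s = ||r||^2."""
    return s if np.sqrt(s) <= delta else 2.0 * delta * np.sqrt(s) - delta**2

# Usage: robustified cost of one residual vector r
r = np.array([0.3, -2.5, 0.1])
cost = huber(float(r @ r), delta=1.0)  # grows linearly, not quadratically, for outliers
```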
the experimental results are given below
The case carries out a plurality of groups of high-precision autonomous positioning experiments based on UWB and vision fusion under the GNSS refusing environment based on multi-sensor fusion on the positioning system.
Fig. 8 shows an indoor track following effect in a GNSS rejection environment, where the experimental scenario is indoor and the start point and the end point are the same. From the results, it can be seen that the trajectory based on the method is smoother and more approximate to the true value than the trajectory obtained by positioning only using UWB and only using vision.
Fig. 9 is an Absolute Position Error (APE) curve in a GNSS rejection environment, which quantitatively analyzes the error of autonomous positioning of the unmanned aerial vehicle, and it is known from the analysis that the method is improved in both maximum and average absolute errors compared to the UWB-based and IMU-based fusion positioning.
FIG. 10 is a close-coupled odometer outdoor trajectory tracking effect based on UWB and visual fusion in a GNSS rejection environment. The experiment shows that the drift correction function of the visual autonomous positioning method after UWB fusion is carried out, the difference between the tail end of the visual estimated track and the tail end of the UWB estimated track is about 3 meters, the visual drift problem is corrected after UWB and vision fusion, and the smoothness of the track estimation is better than that of the UWB positioning method.
In this embodiment, the specific method of realizing UWB positioning with the robust autonomous positioning and mapping technique based on UWB-IMU fusion is as follows:
the UWB positioning part is realized with the robust autonomous positioning and mapping technique based on UWB-IMU fusion; meanwhile, the IMU and the binocular camera are fused into the visual autonomous positioning part, commonly called a visual-inertial odometer, which estimates the pose of the unmanned aerial vehicle under the inertial system, with a sliding-window tightly coupled nonlinear optimization as the data fusion algorithm of the visual part. UWB positioning and visual autonomous positioning are then combined loosely: the two run independently, and their estimation results are finally fused optimally through the global pose graph.
Preprocessing comprises three parts: ranging calibration of the UWB raw data, outlier rejection, and IMU noise processing. The processed IMU data effectively reduce noise pollution. The processed results are fed as input to the Kalman filter unit.
The accuracy of UWB distance measurements can ideally reach centimeter level, but in practice the measurements often contain static bias and errors for various reasons. To reduce the factors affecting positioning accuracy as much as possible, a distance calibration step is added before the measurements are sent to the Kalman filter; after distance calibration the measurements can be considered closer to the truth.
The UWB distance measurement $\tilde d$ is modeled as a linear function of the distance truth value $d$:

$$ \tilde d = a\,d + b + n_d \tag{1} $$

where $n_d$ is zero-mean white noise. The distance calibration step adopts linear regression: by placing two UWB modules at different positions and measuring, a number of truth-measurement samples $\{(d_i,\tilde d_i)\}$ are obtained, where the measurement in each sample is the UWB range reading and the range truth value is provided by a high-precision indoor positioning system. Fitting the model over a large number of samples determines the scale factor $a$ and the static bias $b$ of the current model:

$$ a=\frac{\sum_i(d_i-\bar d)(\tilde d_i-\bar{\tilde d})}{\sum_i(d_i-\bar d)^2},\qquad b=\bar{\tilde d}-a\,\bar d \tag{2} $$

where $\bar d$ is the mean of the distance truth values and $\bar{\tilde d}$ is the mean of the distance measurements.
Whenever UWB sensor measurement data are acquired, the linear-regression distance calibration is applied by default, and a more reasonable distance value is computed.
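A sketch of the linear-regression calibration of equation (2) and its use to correct a raw range (sample data are synthetic):

```python
import numpy as np

def calibrate(d_true, d_meas):
    """Fit d_meas ~ a * d_true + b by ordinary least squares."""
    d_bar = d_true.mean()
    m_bar = d_meas.mean()
    a = np.sum((d_true - d_bar) * (d_meas - m_bar)) / np.sum((d_true - d_bar) ** 2)
    b = m_bar - a * d_bar
    return a, b

def correct(d_meas, a, b):
    """Invert the fitted model to recover a calibrated distance."""
    return (d_meas - b) / a

# Example with synthetic samples (truth from a high-precision positioning system)
d_true = np.linspace(1.0, 20.0, 50)
d_meas = 1.02 * d_true + 0.15 + np.random.normal(0, 0.02, 50)
a, b = calibrate(d_true, d_meas)
d_cal = correct(d_meas, a, b)
```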
Although the excellent characteristics of UWB avoid the influence of multipath and non-line-of-sight propagation to the greatest extent, outliers may still be introduced in scenes with dense obstacles. An outlier here denotes a measurement contrary to fact, such as a negative reading or a sudden large jump, and introducing outliers leads to erroneous state estimation. Outliers are detected with a threshold test, and once a distance measurement is judged an outlier it is rejected directly.
Let $p_k$ be the position of the unmanned aerial vehicle output by the estimator at time $k$, and let the tag on the unmanned aerial vehicle receive at time $k+1$ the distance measurement $\tilde d^{A_i}_{k+1}$ to anchor $A_i$. If it satisfies

$$ \big|\,\tilde d^{A_i}_{k+1}-\lVert p_k-p_{A_i}\rVert\,\big| > \gamma+\frac{v_{\max}}{f} \tag{3} $$

the current distance measurement is judged an outlier. Here $\gamma$ is an adjustable threshold, $v_{\max}$ is the maximum flight speed of the unmanned aerial vehicle, and $f$ is the ranging update rate of the UWB module. If the distance measurement is judged an outlier it is discarded directly; if it is normal, $\tilde d^{A_i}_{k+1}$ is sent to the estimator as a distance measurement for the state update.
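A sketch of this gating test following equation (3) (the threshold, speed and rate values are assumptions):

```python
import numpy as np

GAMMA = 0.5     # assumed base threshold [m]
V_MAX = 2.0     # assumed maximum flight speed [m/s]
F_UWB = 20.0    # UWB ranging update rate [Hz]

def is_outlier(d_meas, p_k, p_anchor):
    """Reject a range whose innovation exceeds what the drone could have
    moved in one ranging interval, plus a margin; negative readings are
    always outliers."""
    predicted = np.linalg.norm(p_k - p_anchor)
    return d_meas < 0.0 or abs(d_meas - predicted) > GAMMA + V_MAX / F_UWB
```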
Because the IMU is fixed directly to the airframe in strapdown fashion, the propeller vibration of the unmanned aerial vehicle during flight subjects the measured data to considerable noise pollution. Since the main source of IMU measurement noise is the high-frequency vibration of the airframe, an IIR digital low-pass filter with a cut-off frequency of about 30 Hz is designed; IMU data processed by this low-pass filter effectively reduce the noise pollution. The frequency-domain representation of the IIR low-pass filter is shown in equation (4):

$$ H(z)=\frac{a_0+a_1z^{-1}+\dots+a_mz^{-m}}{1+b_1z^{-1}+\dots+b_nz^{-n}} \tag{4} $$

where $m$, $n$, $a_i$ ($i=0,\dots,m$) and $b_j$ ($j=1,\dots,n$) are the filter parameters. The time-domain form of the low-pass filter is:

$$ y(t)=a_0u(t)+a_1u(t-1)+\dots+a_mu(t-m)-b_1y(t-1)-\dots-b_ny(t-n) \tag{5} $$

where $u$ denotes the raw IMU measurement data and $y$ the low-pass filtering result; the low-pass-filtered data are used as the IMU measurements for multi-sensor fusion.
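A sketch of such a low-pass filter; a Butterworth design is assumed here for the coefficients of equation (4), with the 30 Hz cut-off from the text and an assumed IMU sample rate:

```python
import numpy as np
from scipy.signal import butter, lfilter

FS = 400.0      # assumed IMU sample rate [Hz]
FC = 30.0       # cut-off frequency from the text [Hz]

# 2nd-order Butterworth low-pass; scipy's numerator/denominator coefficients
# play the roles of the a_i and b_j in equation (4)
num, den = butter(2, FC / (FS / 2.0))

def lowpass(u):
    """Apply the difference equation (5) to one raw IMU channel u."""
    return lfilter(num, den, u)

# Example: filter a noisy accelerometer channel
t = np.arange(0, 1, 1 / FS)
u = np.sin(2 * np.pi * 5 * t) + 0.3 * np.random.randn(t.size)  # 5 Hz signal + noise
y = lowpass(u)
```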
(2) UWB anchor online calibration unit
An anchor calibration algorithm is designed using nonlinear least-squares theory: while the global frame is created, the positions of the anchors around the environment are calibrated, and the calibration results are sent to the state estimator for storage so that the subsequent algorithm can proceed smoothly. The anchor calibration step is performed only once in the whole process.
Anchor calibration: first a global coordinate system is established: viewed from above, the position of anchor $A_1$ is the origin, the line from $A_1$ to $A_2$ is the direction of the x-axis, and the height direction of $A_1$ is the z-axis, establishing two axes of a right-handed system; the y-axis direction follows from the right-hand rule. For simplicity, $A_3$ may be placed in a direction approximating the y-axis, and $A_4$ may be placed at any position. During the calibration stage the unmanned aerial vehicle is kept static; the tag periodically sends ranging requests to the anchors according to the configured time-slot table, and the distance from the tag to each anchor is resolved according to the TW-TOF principle. In addition, the echo-ranging function of the UWB module is enabled, so the tag can receive the inter-anchor distances. The principle is as follows: ranging is likewise carried out between the anchors, and the distance measurements are stored in each anchor's buffer; after the tag sends a ranging request to an anchor, the echo-ranging value is attached to the anchor's reply, so the tag receives the distance measurement between the two anchors.
The values to be estimated in the anchor calibration stage are the tag position $p_T$ under the global system and the anchor positions $p_{A_i}$.
A nonlinear least-squares problem can be constructed:

$$ \min_{p_T,\{p_{A_i}\}}\ \sum_{i\in C}\big(\lVert p_T-p_{A_i}\rVert-\tilde d_i\big)^{2}+\sum_{i,j\in C,\,i\ne j}\big(\lVert p_{A_i}-p_{A_j}\rVert-\tilde d_{ij}\big)^{2} \tag{7} $$

In formula (7), $C$ is the set of anchors; solving this cost function calibrates the positions of the tag and of each anchor. Since the offset of the tag from the origin of the body frame is known a priori, the position $p$ of the unmanned aerial vehicle under the global system can be deduced, which serves as the initial iteration value $p_0$ of the state estimator.
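A sketch of solving problem (7) with SciPy's nonlinear least-squares solver (measurement values are illustrative; the gauge constraints that pin $A_1$ and $A_2$ to the coordinate axes are omitted for brevity):

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(x, d_tag, d_anchor_pairs, n_anchors):
    """x stacks [p_T (3), p_A1, ..., p_An (3 each)].
    Residuals: tag-to-anchor ranges and anchor-to-anchor echo ranges."""
    p_T = x[:3]
    p_A = x[3:].reshape(n_anchors, 3)
    res = [np.linalg.norm(p_T - p_A[i]) - d for i, d in d_tag.items()]
    res += [np.linalg.norm(p_A[i] - p_A[j]) - d
            for (i, j), d in d_anchor_pairs.items()]
    return np.array(res)

# Example: 4 anchors; tag-anchor and anchor-anchor ranges (illustrative values)
d_tag = {0: 4.1, 1: 3.7, 2: 5.0, 3: 4.4}
d_pairs = {(0, 1): 6.0, (0, 2): 5.1, (1, 2): 7.2,
           (0, 3): 8.0, (1, 3): 5.5, (2, 3): 6.6}
x0 = np.random.randn(3 + 4 * 3)              # initial guess
sol = least_squares(residuals, x0, args=(d_tag, d_pairs, 4))
p_T_hat = sol.x[:3]                          # calibrated tag position
```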
(3) Unscented Kalman filter unit
R. E. Kalman first proposed the Kalman filter (KF) in 1960. The KF is a model-based linear minimum-variance estimator whose standard form has advantages such as recursive computation. The KF applies only when both the system equation and the observation equation are linear, i.e. the system is linear-Gaussian: in a linear-Gaussian system the state and the noise can be propagated separately while remaining linear-Gaussian after propagation, which completes the recursive estimation, but this conclusion does not hold for nonlinear systems. The extended Kalman filter (EKF) solves the estimation problem when the system and observation are nonlinear by linearly approximating the equations, but the linearization discards terms of second order and above, so for strongly nonlinear systems the state estimation accuracy is insufficient.
The state estimator designed here takes the unscented Kalman filter (UKF) as its core. Unlike the EKF, the UKF does not linearly approximate the system but instead approximates the probability distribution of the state. Its core idea is to sample the state before propagation, propagate the sampled point set (the sigma points) through the nonlinear system and observation equations, and then fit the distribution of the propagated sigma points with a Gaussian, thereby estimating the mean and covariance. This avoids the error introduced by EKF linearization while, by increasing the number of sampling points, bringing the distribution of the propagated state closer to the true distribution, albeit still under a Gaussian assumption. The UKF has almost the same time complexity as the EKF but higher estimation accuracy, and its implementation is simpler.
The unscented Kalman filter is a model-based state estimation method, so before sensor fusion and state estimation the system equation and observation equation must be determined and the estimator's system state vector selected. In general the system equation and observation equation take the form:

$$ x_{k+1}=f(x_k,u_k)+w_k,\qquad y_k=h(x_k)+\eta_k $$

where $u_k$ is the input of the system and $w_k$ zero-mean system noise; $y_k$ is the observation of the state and $\eta_k$ zero-mean observation noise. $w_k$ and $\eta_k$ are uncorrelated, with variances $Q_k$ and $R_k$ respectively. Let the dimensions of $x_k$, $w_k$, $\eta_k$ be $n$, $p$, $q$ respectively.
The estimator is mainly used to estimate the position, velocity and attitude of the unmanned aerial vehicle, so the system state vector is selected as:

$$ x=\big[p^{T}\;\;v^{T}\;\;q^{T}\;\;b_a^{T}\;\;b_\omega^{T}\big]^{T} $$

where $p$ is the position of the unmanned aerial vehicle under the global system, $v$ its velocity, the unit quaternion $q$ its attitude, and $b_a$ and $b_\omega$ the IMU zero biases.
With the IMU sensor added, the system equation can be written in the standard strapdown kinematic form:

$$ \dot p=v,\qquad \dot v=R(q)\,(a_m-b_a-n_a)+g,\qquad \dot q=\tfrac12\,q\otimes\big(\omega_m-b_\omega-n_\omega\big) $$

Here the IMU readings $(a_m,\omega_m)$ are taken as the input of the system equation, and the system noise vector is $w=\big[n_a^{T}\;\;n_\omega^{T}\;\;n_{b_a}^{T}\;\;n_{b_\omega}^{T}\big]^{T}$.
When $k=0$ the state estimator is started, and the initial iteration value of the system state is first initialized: 400 groups of IMU accelerometer and gyroscope measurements are collected in a static state and averaged, and the zero-bias initial values of acceleration $\hat b_{a,0}$ and angular velocity $\hat b_{\omega,0}$ are estimated from these averages.
The initial position value $p_0$ is deduced from the result of the anchor calibration step, the other state variables are initialized to zero, and the covariance matrix is initialized from empirical values.
when IMU data is acquired at time k after the unmanned aerial vehicle takes off, predicting the system state at time k+1 by using the data. Suppose at this time the system state x k Mean and covariance of (2) areAnd P k System noise w k Mean and covariance of (2) are/>And->The augmented state and corresponding covariance of the l=n+p dimension is:
obtaining an aggregate matrix of Sigma points by applying UT transformation to the system state:
where λ is a scalar parameter, λ=α 2 (L+k) -L, constant α characterizes the degree of dispersion of Sigma points around the state, 1e -4 Alpha is more than or equal to 1, and k is an auxiliary scaling parameter. (. Cndot. i The i-th column is shown in the matrix.
Instead of linearly approximating the system, the Sigma points are then propagated through the nonlinear system equation to obtain new Sigma points:
$$\chi_{i,k+1|k}=f\big(\chi^x_{i,k},u_k,\chi^w_{i,k}\big)$$
where $\chi^x_{i,k}$ is the state part and $\chi^w_{i,k}$ the system-noise part of the i-th Sigma point. The Sigma point set $\chi_{i,k+1|k}$ propagated through the nonlinear system equation can be used to predict the state mean and covariance after propagation:
$$\hat x_{k+1|k}=\sum_{i=0}^{2L}W_i^m\,\chi_{i,k+1|k},\qquad P_{k+1|k}=\sum_{i=0}^{2L}W_i^c\,\big(\chi_{i,k+1|k}-\hat x_{k+1|k}\big)\big(\chi_{i,k+1|k}-\hat x_{k+1|k}\big)^T$$
where $W_i^m,W_i^c$ are scalar weights:
$$W_0^m=\frac{\lambda}{L+\lambda},\qquad W_0^c=\frac{\lambda}{L+\lambda}+(1-\alpha^2+\beta),\qquad W_i^m=W_i^c=\frac{1}{2(L+\lambda)},\quad i=1,\dots,2L.$$
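A minimal sketch of the Sigma point generation and UT weights described above (the augmented mean and covariance are assumed to be already formed; the parameter defaults are illustrative):

```python
import numpy as np

def sigma_points_and_weights(x_a, P_a, alpha=1e-3, kappa=0.0, beta=2.0):
    """Generate 2L+1 Sigma points and UT weights (standard UKF scheme)."""
    L = x_a.size
    lam = alpha**2 * (L + kappa) - L
    # Matrix square root of (L+lam)*P_a via Cholesky factorization.
    S = np.linalg.cholesky((L + lam) * P_a)
    chi = np.zeros((2 * L + 1, L))
    chi[0] = x_a
    for i in range(L):
        chi[1 + i] = x_a + S[:, i]          # positive perturbation
        chi[1 + L + i] = x_a - S[:, i]      # negative perturbation
    Wm = np.full(2 * L + 1, 1.0 / (2 * (L + lam)))
    Wc = Wm.copy()
    Wm[0] = lam / (L + lam)
    Wc[0] = lam / (L + lam) + (1.0 - alpha**2 + beta)
    return chi, Wm, Wc
```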
The above is the prediction step of the estimator. When an observation $y_{k+1}$ from UWB or the fixed-height radar arrives at time k+1, an update step is performed. First, the state mean obtained in the prediction step and the observation noise $\eta_k$ are combined into an augmented state of dimension $L=n+q$:
$$\hat x^a_{k+1|k}=\begin{bmatrix}\hat x_{k+1|k}\\ 0\end{bmatrix},\qquad P^a_{k+1|k}=\begin{bmatrix}P_{k+1|k} & 0\\ 0 & R\end{bmatrix}$$
A new Sigma point set $\chi_{i,k+1|k}$ is then generated according to the UT transform; $\chi_{i,k+1|k}$ is propagated through the nonlinear observation equation $h(x_k,\eta_k)$ to generate a point set $y_{i,k+1|k}$, which is used to generate the estimated observation:
$$\hat y_{k+1|k}=\sum_{i=0}^{2L}W_i^m\,y_{i,k+1|k}$$
From this the Kalman gain K can be calculated and the state updated to obtain the posterior state mean $\hat x_{k+1}$ and covariance $P_{k+1}$:
$$
\begin{aligned}
P_{yy}&=\sum_{i=0}^{2L}W_i^c\,\big(y_{i,k+1|k}-\hat y_{k+1|k}\big)\big(y_{i,k+1|k}-\hat y_{k+1|k}\big)^T\\
P_{xy}&=\sum_{i=0}^{2L}W_i^c\,\big(\chi_{i,k+1|k}-\hat x_{k+1|k}\big)\big(y_{i,k+1|k}-\hat y_{k+1|k}\big)^T\\
K&=P_{xy}P_{yy}^{-1},\qquad \hat x_{k+1}=\hat x_{k+1|k}+K\big(y_{k+1}-\hat y_{k+1|k}\big),\qquad P_{k+1}=P_{k+1|k}-K\,P_{yy}\,K^T
\end{aligned}
$$
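The update step might look as follows; this sketch assumes the propagated Sigma points and their predicted observations are available, with the measurement noise already carried in the augmented Sigma points:

```python
import numpy as np

def ukf_update(x_pred, P_pred, chi_pred, y_sig, Wm, Wc, y_meas):
    """Correct the predicted state with a new observation (UKF update).

    chi_pred: (2L+1, n) propagated Sigma points
    y_sig:    (2L+1, q) Sigma points pushed through the observation model
    """
    y_hat = Wm @ y_sig                      # predicted observation mean
    dy = y_sig - y_hat
    dx = chi_pred - x_pred
    P_yy = (Wc[:, None] * dy).T @ dy        # innovation covariance
    P_xy = (Wc[:, None] * dx).T @ dy        # state/observation cross-covariance
    K = P_xy @ np.linalg.inv(P_yy)          # Kalman gain
    x_post = x_pred + K @ (y_meas - y_hat)  # posterior mean
    P_post = P_pred - K @ P_yy @ K.T        # posterior covariance
    return x_post, P_post
```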
The multi-sensor fusion mainly refers to the fusion of the UWB ranging module and the IMU inertial measurement sensor, with a radar altimeter added to assist measurement of height along the z-axis, making the state estimation of the unmanned aerial vehicle more accurate; the fusion process is introduced in step six.
The UWB positioning method can accomplish position estimation in a GPS-denied environment with only low-cost sensors, is unaffected by illumination and weather, and, more importantly, has a global reference frame, which is convenient for direct multi-UAV application. However, the positioning performance of the UWB sensor acting alone is not ideal. First, UWB signals are easily interfered with: in obstacle-dense, non-line-of-sight propagation scenes the positioning accuracy degrades and stability is mediocre. Second, the constant-velocity assumption limits the accuracy of the UWB positioning method. Third, the update rate of the position estimate is limited by the UWB measurement frequency, about 20 Hz, whereas keeping the unmanned aerial vehicle stable requires a position update rate of at least 100 Hz, which clearly does not meet the controller's requirement on state estimation. The IMU can measure the three-axis acceleration and angular velocity of the body at high frequency, which exactly compensates for the shortcomings of UWB acting alone. Therefore, UWB and IMU measurements can be fused to improve the positioning performance of the unmanned aerial vehicle in GPS-denied environments.
(4) Multi-sensor redundancy unit
This unit improves the z-axis estimation accuracy by fusing a fixed-height radar (radar altimeter); the sensor is mounted on the underside of the frame.
To correct the z-axis component of the position p in the system state using the fixed-height radar, the radar measurement data needs one processing step before being sent to the estimator for the update. When a radar measurement $\tilde d^r_k$ is read, the attitude of the unmanned aerial vehicle is used to project $\tilde d^r_k$ onto the z-axis of the global system for the state update. The projection equation and observation equation of the fixed-height radar are given in equations (24) and (25):
$$\tilde z_{k+1}=\Big(R_k\,[0,\ 0,\ \tilde d^r_k]^T\Big)_z\qquad(24)$$
$$y^r_{k+1}=(p_{k+1})_z+\eta^r_{k+1}\qquad(25)$$
Equation (24) is the projection equation: $R_k$ is the rotation-matrix form of the attitude quaternion and expresses the rotation from the body system to the global system at time k, and $(\cdot)_z$ denotes the z-axis component of a vector. Equation (25) is the observation equation: $y^r_{k+1}$ is the observation of the z-axis component of the global-system position $p_{k+1}$, and $\eta^r_{k+1}$ is the measurement noise of the fixed-height radar.
The measurement principle of the fixed-height radar is similar to UWB: the radar's laser emitter emits a beam toward the ground, and the sensor calculates the straight-line distance to the ground from the emission and reception times. When the ground is level, the fixed-height radar measurement can be sent directly to the estimator for the state update after passing through the projection equation; when the terrain changes (for example, ground unevenness at some location), the radar measurement also contains terrain information. To this end, this section proposes a multi-sensor redundancy strategy for terrain estimation: when the terrain changes, the unmanned aerial vehicle can estimate the terrain and adjust its flight height appropriately to avoid accidents. The principle of the terrain estimator is as follows:
When a fixed-height radar measurement $\tilde d^r$ is obtained, it is projected onto the z-axis of the global system according to the current attitude using equation (24), giving a height measurement $\tilde z$.
When $\tilde z<0.2$ m, the z-axis height is not updated. According to its datasheet, the optimal operating range of the fixed-height radar is 0.2 to 20 m, and since the sensor is mounted on the underside of the frame, its reading is about 0.2 m when the aircraft rests on the ground; readings below 0.2 m are therefore considered outliers and rejected directly.
When $\tilde z\ge 0.2$ m, the relative deviation $l_{err}$ between the height component of the current position estimate and the height measurement $\tilde z$ is calculated. When $|l_{err}|>0.3$, a terrain jump is considered to have occurred: $\tilde z$ is not used to update the state quantity $p_k$; instead the terrain is estimated directly:
$$h^t_k=(\hat p_k)_z-\tilde z_k\qquad(26)$$
When $\tilde z\ge 0.2$ m but $|l_{err}|\le 0.3$, no terrain jump has occurred, though the terrain may be changing slowly; in this case $\tilde z$ is used to update the altitude, and the terrain is then estimated according to equation (26).
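The redundancy strategy can be summarized as follows; the thresholds 0.2 m and 0.3 come from the description, while the relative-deviation formula and the terrain expression follow equation (26) as reconstructed above and should be treated as assumptions:

```python
def handle_radar_measurement(z_meas, z_est):
    """Decide how a projected radar height measurement is used.

    z_meas: radar reading projected onto the global z-axis, eq. (24)
    z_est:  z component of the current position estimate
    Returns (use_for_update, terrain_height).
    """
    if z_meas < 0.2:
        # Below the sensor's minimum range: treat as an outlier.
        return False, None
    l_err = (z_est - z_meas) / z_meas   # relative deviation (illustrative form)
    if abs(l_err) > 0.3:
        # Terrain jump: skip the state update, estimate terrain directly.
        return False, z_est - z_meas    # eq. (26), as reconstructed
    # Slowly varying terrain: update the altitude, then estimate terrain.
    return True, z_est - z_meas
```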
Embodiment two:
the embodiment provides a close-coupling odometer system based on UWB and vision fusion, which comprises:
UWB module: used for fusing UWB, IMU and fixed-height radar data through an unscented Kalman filter to estimate the pose of the unmanned aerial vehicle under the global system;
Inertial system module: used for performing data fusion to form a visual-inertial odometer and estimating the pose of the unmanned aerial vehicle under the inertial system; the data fusion algorithm is a tightly coupled nonlinear optimization based on a sliding window;
Pose fusion module: used for fusing the pose of the unmanned aerial vehicle under the global system with the pose of the unmanned aerial vehicle under the inertial system, obtaining the high-precision, drift-free global optimal pose of the unmanned aerial vehicle.
The specific hardware comprises:
UWB ranging module: an ultra-wideband wireless transceiver operating between 3.1 GHz and 4.8 GHz that can simultaneously perform four functions: ranging, data transmission, monostatic radar, and multistatic radar. The system mainly uses the ranging function: the distance between two modules is calculated by measuring the time of arrival, time difference of arrival, or angle of arrival of the radio waves.
IMU inertial measurement sensor: mainly implements data processing in two respects. On the one hand, its data is fused with that of the UWB ranging module and the fixed-height radar through the unscented Kalman filter to estimate the pose of the unmanned aerial vehicle under the global coordinate system, realizing the UWB positioning function; on the other hand, it cooperates with the binocular camera in the detection, tracking and preprocessing of visually observed environmental feature points, realizing visual autonomous positioning.
Binocular camera: and collecting environment information, transmitting the data back to an onboard processor, extracting environment characteristic points, and matching with an Inertial Measurement Unit (IMU) to realize visual positioning.
Radar altimeter: and measuring the height data of the unmanned aerial vehicle, and correcting and adjusting the UWB positioning result according to the measured height data.
Embedded onboard processor: an onboard eighth-generation Intel processor with high computing power that can process the data of the various sensors and calculate the pose of the unmanned aerial vehicle.
Unmanned aerial vehicle platform: a quadrotor unmanned aerial vehicle comprising the airframe, motors, a battery, and flight control components.
The UWB ranging module, combined with the IMU inertial measurement sensor and the radar altimeter, realizes the positioning function of the robust autonomous positioning and mapping technique based on UWB and IMU fusion; the IMU inertial measurement sensor and the binocular camera realize visual positioning by sensing and measuring flight-environment information. The algorithms of the two positioning techniques run on the embedded onboard processor; their output results are fused, and the minimum of a cost function is calculated by constructing a pose optimization graph, yielding the high-precision, drift-free global optimal pose of the unmanned aerial vehicle.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The foregoing is merely a preferred embodiment of the present invention, and it should be noted that modifications and variations could be made by those skilled in the art without departing from the technical principles of the present invention, and such modifications and variations should also be regarded as being within the scope of the invention.

Claims (10)

1. The tightly coupled odometer method based on UWB and vision fusion is characterized by comprising the following steps:
fusing UWB, IMU and fixed-height radar data through an unscented Kalman filter, and estimating the pose of the unmanned aerial vehicle under the global system;
data fusion is carried out to form a visual inertial odometer, and the pose of the unmanned aerial vehicle under an inertial system is estimated; the data fusion algorithm is tightly coupled nonlinear optimization based on a sliding window;
and fusing the pose of the unmanned aerial vehicle under the global system with the pose of the unmanned aerial vehicle under the inertial system to obtain the high-precision, drift-free global optimal pose of the unmanned aerial vehicle.
2. The tightly coupled odometer method based on UWB and vision fusion of claim 1, wherein fusing the pose of the unmanned aerial vehicle under the global system with the pose of the unmanned aerial vehicle under the inertial system to obtain a high-precision, drift-free global optimal pose of the unmanned aerial vehicle comprises:
constructing a pose optimization graph in which the poses of the unmanned aerial vehicle under the global system are the corner points, i.e., the variables to be optimized; representing the pose results of the unmanned aerial vehicle under the inertial system measured by the visual-inertial odometer in pose-increment form and drawing them in the graph as binary incremental edges; representing the UWB positioning measurement results with global unary pose edges; and defining the cost function of the graph optimization problem and calculating its minimum to obtain the optimal estimate of the pose state vector in the sliding window, thereby obtaining the high-precision, drift-free global optimal pose of the unmanned aerial vehicle.
3. The tightly coupled odometer method based on UWB and vision fusion of claim 1, wherein data fusion is carried out to form a visual-inertial odometer and the pose of the unmanned aerial vehicle under the inertial system is estimated, the data fusion algorithm being a tightly coupled nonlinear optimization based on a sliding window, comprising the following steps:
pre-integrating the IMU, the IMU pre-integration term serving as a residual term for constructing the bundle adjustment equation during back-end optimization;
preprocessing the latest frame of visual observations: tracking existing features with the KLT sparse optical flow method while detecting new features, and screening out key frames, which are sent to the back-end optimizer for state estimation and depth recovery; the visual measurement residual is defined on a unit sphere and is a re-projection residual;
using a tightly coupled fusion method to guarantee solution accuracy: adopting the idea of marginalization, constraint information is converted into a state prior and added to the optimization, and nonlinear least-squares optimization is performed by constructing a bundle adjustment equation;
and constructing a graph in which the poses of the unmanned aerial vehicle under the global system are the corner points, i.e., the variables to be optimized are the global poses of the unmanned aerial vehicle, the edges including unary edges and binary edges, and performing optimization solving based on the graph.
4. The tightly coupled odometer method based on UWB and vision fusion of claim 3, wherein the IMU is pre-integrated to reduce the amount of calculation, the IMU pre-integration term serving as a residual term for constructing the bundle adjustment equation during back-end optimization, comprising:
for the time interval between two image frames there are multiple IMU measurement values, and to avoid repeated propagation a pre-integration of the IMU is adopted:
for the IMU measurements between the k-th frame image and its adjacent (k+1)-th frame image in the sliding window, the residual term of the IMU pre-integration can be defined as:
$$
r_{\mathcal B}\big(\hat z^{b_k}_{b_{k+1}},X\big)=
\begin{bmatrix}
R^{b_k}_w\big(p^w_{b_{k+1}}-p^w_{b_k}-v^w_{b_k}\Delta t_k+\tfrac12 g^w\Delta t_k^2\big)-\hat\alpha^{b_k}_{b_{k+1}}\\[2pt]
R^{b_k}_w\big(v^w_{b_{k+1}}-v^w_{b_k}+g^w\Delta t_k\big)-\hat\beta^{b_k}_{b_{k+1}}\\[2pt]
2\Big[\big(q^w_{b_k}\big)^{-1}\otimes q^w_{b_{k+1}}\otimes\big(\hat\gamma^{b_k}_{b_{k+1}}\big)^{-1}\Big]_{xyz}\\[2pt]
b_{a,k+1}-b_{a,k}\\
b_{\omega,k+1}-b_{\omega,k}
\end{bmatrix}
$$
where the IMU pre-integration observations $\hat\alpha^{b_k}_{b_{k+1}},\hat\beta^{b_k}_{b_{k+1}},\hat\gamma^{b_k}_{b_{k+1}}$ are obtained from the noisy gyroscope and accelerometer measurements between two consecutive frames; the quaternion row is the three-dimensional error state of the quaternion, with $[\cdot]_{xyz}$ extracting the imaginary part of a quaternion q; the accelerometer and gyroscope zero-offsets are also included in the residual term for online correction.
5. The tightly coupled odometer method based on UWB and vision fusion of claim 4, wherein preprocessing the latest frame of visual observations, tracking existing features by the KLT sparse optical flow method while detecting new features, and screening out key frames that are sent to a back-end optimizer for state estimation and depth recovery, the visual measurement residual being defined on a unit sphere as a re-projection residual, comprises:
when the latest frame captured by the camera arrives, preprocessing for visual observation is carried out first. The steps include tracking existing features with the KLT sparse optical flow method; to keep the number of feature points at a specific level, new features must be detected at the same time as the original features are tracked. In addition, this step is responsible for screening out key frames, which are sent to the back-end optimizer for state estimation and depth recovery. Two criteria are used for key-frame selection: first, the average parallax from the previous key frame: if the average parallax between the latest frame and the previous key frame is larger than a threshold $\kappa_1$, the latest frame is set as a new key frame; second, the quality of feature tracking: if the number of tracked features falls below a threshold $\kappa_2$, the latest frame is set as a new key frame;
the visual measurement residual is a re-projection residual. Suppose the l-th feature point is first observed in the i-th frame image at pixel $\hat u^{c_i}_l$; with the movement of the drone, feature point l is observed again in the j-th frame image. The visual measurement residual, defined here on a unit sphere, is then:
$$r_{\mathcal C}\big(\hat z^{c_j}_l,X\big)=\begin{bmatrix}b_1 & b_2\end{bmatrix}^T\Big(\hat{\mathcal P}^{c_j}_l-\frac{\mathcal P^{c_j}_l}{\|\mathcal P^{c_j}_l\|}\Big),\qquad \hat{\mathcal P}^{c_j}_l=\pi_c^{-1}\big(\hat u^{c_j}_l\big)$$
where $\hat u^{c_i}_l$ is the pixel coordinate of the l-th feature point when first observed in the i-th frame, $\hat u^{c_j}_l$ is its pixel coordinate in the j-th frame image, $\pi_c^{-1}$ is the back-projection function containing the camera intrinsic parameters, and $\mathcal P^{c_j}_l$ is the predicted position of the feature in the j-th camera frame. Since the visual measurement residual has only two degrees of freedom while the residual vector is three-dimensional, the residual vector is projected onto a tangent plane to ensure the correct degrees of freedom; $b_1,b_2$ are two arbitrarily selected orthogonal bases spanning the tangent plane of $\hat{\mathcal P}^{c_j}_l$.
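A sketch of the unit-sphere residual with tangent-plane projection; `back_project` stands in for the back-projection function $\pi_c^{-1}$ and is a hypothetical name:

```python
import numpy as np

def tangent_basis(n):
    """Two orthonormal vectors b1, b2 spanning the tangent plane of unit vector n."""
    a = np.array([1.0, 0.0, 0.0]) if abs(n[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    b1 = np.cross(n, a)
    b1 /= np.linalg.norm(b1)
    b2 = np.cross(n, b1)
    return b1, b2

def reprojection_residual(u_obs, P_cj, back_project):
    """2-DoF visual residual on the unit sphere.

    u_obs: observed pixel of the feature in frame j
    P_cj:  predicted 3D position of the feature in camera frame j
    """
    obs_dir = back_project(u_obs)            # unit bearing of the observation
    pred_dir = P_cj / np.linalg.norm(P_cj)   # unit bearing of the prediction
    b1, b2 = tangent_basis(obs_dir)
    e = obs_dir - pred_dir                   # 3D residual with only 2 DoF
    return np.array([b1 @ e, b2 @ e])        # projection onto the tangent plane
```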
6. The tightly coupled odometer method based on UWB and vision fusion of claim 5, wherein a graph is constructed in which the poses of the unmanned aerial vehicle under the global system are the corner points, i.e., the variables to be optimized in the graph are the global poses of the unmanned aerial vehicle, the edges including unary edges and binary edges, comprising the following steps:
taking the poses of the unmanned aerial vehicle under the global system as corner points, i.e., the variables to be optimized in the graph are the global poses of the unmanned aerial vehicle; the edges in the graph include unary edges and binary edges;
the transformation-matrix form of SE(3) is adopted:
$$T_k=\begin{bmatrix}R_k & p_k\\ 0 & 1\end{bmatrix}\in SE(3)$$
where $R_k$ represents the attitude of the unmanned aerial vehicle at time k and $p_k$ represents its position at time k;
the method comprises the steps that a constructed graph takes the pose of an unmanned aerial vehicle under a global system as a corner point, namely, a variable to be optimized in the graph is the global pose of the unmanned aerial vehicle; edges in the figure include a unitary edge and a binary edge;
(1) Incremental edge
since the visual-inertial odometer measures the pose of the unmanned aerial vehicle under the inertial system, while the variable to be optimized is the pose $T_k$ of the unmanned aerial vehicle under the global system, the form of a pose increment is used here to get rid of the limitation of the reference frame: $\Delta T_{k-l,k}$ denotes the increment between the pose at time k and the pose at time k-l of the unmanned aerial vehicle, expressed in the body system at time k-l. The pose increment can be derived from the estimation result of the visual-inertial odometer:
$$\Delta T_{k-l,k}=\big(T^{v}_{k-l}\big)^{-1}T^{v}_{k}$$
where $T^v$ denotes a pose estimated by the visual-inertial odometer;
therefore an incremental edge in the graph connects the pose corner points at two times and represents the pose increment between them; incremental edges are binary edges, as in the sketch below;
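A minimal sketch of the pose-increment computation on 4x4 homogeneous matrices:

```python
import numpy as np

def pose_increment(T_vio_prev, T_vio_curr):
    """Relative pose between times k-l and k, expressed in the body frame
    at time k-l: Delta T = T_{k-l}^{-1} @ T_k (4x4 SE(3) matrices)."""
    return np.linalg.inv(T_vio_prev) @ T_vio_curr
```

Because the increment cancels the common reference frame of the two odometer poses, it remains meaningful even though the odometer drifts in the inertial frame.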
(2) Global pose edge
the global pose edge comes from the UWB positioning result and represents a global pose measurement of the unmanned aerial vehicle at a certain time; a global pose edge is a unary edge;
in the constructed graph G = {V, E}, a corner point at time k is connected to a corner point at an earlier time by an incremental edge, while a global pose edge, which is unary, attaches to a single corner point;
the system state vector in the sliding window is now defined as:
$$X=[T_0,\ T_1,\ \dots,\ T_n]\qquad(5)$$
at a certain time k in the sliding window, after the unmanned aerial vehicle acquires a new measurement, the cost function of the graph optimization problem is defined as:
$$F_{TS}(X)=\sum_{k\in T}\rho\big(\big\|r^T_k\big\|^2\big)+\sum_{(k-l,k)\in S}\rho\big(\big\|r^S_{k-l,k}\big\|^2\big)\qquad(6)$$
t in the formula (6) represents a set of global pose edges in the sliding window,representing global pose residuals:
S denotes the set of all incremental edges in the sliding window, and $r^S_{k-l,k}$ represents the pose increment residual:
$$r^S_{k-l,k}=\Delta\hat T_{k-l,k}\boxminus\big(T_{k-l}^{-1}T_k\big)$$
where the symbol $\boxminus$ represents generalized subtraction on SE(3); by arranging each term, the cost function (6) can be written in the standard nonlinear least-squares form (9);
minimizing the cost function (9) yields the optimal estimate $\hat X$ of the state vector in the sliding window:
$$\hat X=\arg\min_X F_{TS}(X)\qquad(10)$$
when defining the cost function, a robust kernel function (Robust Kernel) is added to the residual terms; the Huber kernel function is used to suppress the effect of outliers and can be expressed as:
$$\rho(s)=\begin{cases}s, & s\le\delta^2\\[2pt] 2\delta\sqrt{s}-\delta^2, & s>\delta^2\end{cases}$$
where $\delta$ is the kernel threshold.
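A sketch of the Huber kernel as reconstructed above, applied to a squared residual norm s:

```python
import math

def huber(s, delta=1.0):
    """Huber robust kernel on a squared error s = ||r||^2.

    Quadratic near zero and linear for large errors, which suppresses
    the influence of outlier measurements on the optimization.
    """
    if s <= delta**2:
        return s
    return 2.0 * delta * math.sqrt(s) - delta**2
```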
7. The tightly coupled odometer method based on UWB and vision fusion of claim 1, wherein the pose of the unmanned aerial vehicle under the global system is estimated by fusing UWB, IMU and fixed-height radar data through an unscented Kalman filter, comprising the following steps:
Step 1: acquiring UWB original data and IMU data;
step 2: performing distance calibration on the UWB original data to obtain calibrated UWB data;
step 3: judging whether the anchor point positions are known; if not, performing anchor point calibration according to the calibrated UWB data and then proceeding to step 4; if so, proceeding directly to step 4;
step 4: performing outlier detection on the calibrated UWB data to obtain screened UWB data;
step 5: performing low-pass filtering on the IMU data to obtain filtered IMU data;
step 6: inputting the screened UWB data and the filtered IMU data into a state estimator to obtain the position, speed and attitude information of the unmanned aerial vehicle; the state estimator approximates probability distribution of states by taking unscented Kalman filtering as a core, so that mean and covariance are estimated;
the input of the state estimator is UWB ranging data after information calibration and outlier rejection and IMU data after noise processing through low-pass filtering; the output is the position, speed and attitude information of the unmanned aerial vehicle.
8. The tightly coupled odometer method based on UWB and vision fusion of claim 7, wherein step 2, performing distance calibration on the UWB original data, comprises:
modeling the UWB distance measurement $\tilde d$ as a linear function of the distance truth value d, namely:
$$\tilde d=a\,d+b+n_d$$
wherein $n_d$ is zero-mean white noise;
a linear regression method is adopted: two UWB modules are placed at different positions for distance measurement to obtain a number of truth-value/measurement sample pairs $(d_i,\tilde d_i)$; the measured value in each sample is the UWB distance reading, and the distance truth value is obtained from a high-precision indoor positioning system. A large number of samples are used to fit the model and determine the scale factor a and static offset b of the current model:
$$a=\frac{\sum_i\big(d_i-\bar d\big)\big(\tilde d_i-\bar{\tilde d}\big)}{\sum_i\big(d_i-\bar d\big)^2},\qquad b=\bar{\tilde d}-a\,\bar d$$
where $\bar d$ is the mean of the distance truth values and $\bar{\tilde d}$ is the mean of the distance measurements.
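A minimal least-squares fit of the scale factor a and static offset b, assuming paired arrays of truth and measured distances:

```python
import numpy as np

def calibrate_uwb(d_true, d_meas):
    """Fit d_meas = a * d_true + b by ordinary least squares."""
    d_bar, m_bar = d_true.mean(), d_meas.mean()
    a = np.sum((d_true - d_bar) * (d_meas - m_bar)) / np.sum((d_true - d_bar) ** 2)
    b = m_bar - a * d_bar
    return a, b

def correct_range(d_meas, a, b):
    """Invert the fitted model to recover a calibrated distance."""
    return (d_meas - b) / a
```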
9. The tightly coupled odometer method based on UWB and vision fusion of claim 7, wherein in step 3 the anchor point calibration is performed according to the calibrated UWB data, comprising:
first, a global coordinate system is established: in top view, the position of anchor $A_1$ is the origin of the coordinate system, the line from $A_1$ to $A_2$ gives the direction of the x-axis, the height direction of $A_1$ is the z-axis, and the y-axis direction is determined by the right-hand rule;
$A_3$ is placed in a direction approximating the y-axis, and $A_4$ can be placed at any position;
in the calibration stage the unmanned aerial vehicle is kept stationary; the tag periodically sends ranging requests to the anchors according to a preset time-slot table, and the distance from the tag to each anchor is obtained according to the TW-TOF (two-way time-of-flight) principle;
the echo ranging function of the UWB module is enabled so that the tag can receive the inter-anchor distance information: the anchors range each other and store the measured distances in their caches; after the tag sends a ranging request to an anchor, the anchor includes the echo ranging value in its response, so that the tag receives the distance measurements between pairs of anchors;
the quantities to be estimated in the anchor point calibration stage are the position $p^G_T$ of the tag under the global system and the anchor positions $p^G_{A_i}$;
a nonlinear least-squares problem is constructed:
$$\big\{\hat p^G_T,\hat p^G_{A_i}\big\}=\arg\min\ \sum_{i\in C}\Big(\tilde d_i-\big\|p^G_T-p^G_{A_i}\big\|\Big)^2+\sum_{i,j\in C,\ i<j}\Big(\tilde d_{ij}-\big\|p^G_{A_i}-p^G_{A_j}\big\|\Big)^2\qquad(7)$$
c in the formula (7) is a set of anchor points, and the positions of the labels and each anchor point can be calibrated by solving a cost function; offset of tag from origin of body systemKnown a priori information, the position p of the drone under the global system at this time can be deduced, which will be the initial iteration value p of the state estimator 0
10. A tightly coupled odometer system based on UWB and vision fusion, comprising:
UWB module: used for fusing UWB, IMU and fixed-height radar data through an unscented Kalman filter to estimate the pose of the unmanned aerial vehicle under the global system;
Inertial system module: used for performing data fusion to form a visual-inertial odometer and estimating the pose of the unmanned aerial vehicle under the inertial system; the data fusion algorithm is a tightly coupled nonlinear optimization based on a sliding window;
Pose fusion module: used for fusing the pose of the unmanned aerial vehicle under the global system with the pose of the unmanned aerial vehicle under the inertial system to obtain the high-precision, drift-free global optimal pose of the unmanned aerial vehicle.