CN114485623B - Focusing distance camera-IMU-UWB fusion accurate positioning method - Google Patents

Focusing distance camera-IMU-UWB fusion accurate positioning method

Info

Publication number
CN114485623B
CN114485623B
Authority
CN
China
Prior art keywords
uwb
imu
data
base station
vio
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210142487.1A
Other languages
Chinese (zh)
Other versions
CN114485623A (en)
Inventor
王庆
李明
杨高朝
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southeast University
Original Assignee
Southeast University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southeast University filed Critical Southeast University
Priority to CN202210142487.1A priority Critical patent/CN114485623B/en
Publication of CN114485623A publication Critical patent/CN114485623A/en
Application granted granted Critical
Publication of CN114485623B publication Critical patent/CN114485623B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 - Navigation with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C21/20 - Instruments for performing navigational calculations
    • G01C21/206 - Instruments for performing navigational calculations specially adapted for indoor navigation

Abstract

The invention relates to a distance-focused camera-IMU-UWB fusion accurate positioning method. The method uses the propagation data calculated by a VIO method to process UWB measurements from a distance-focused perspective, which fully resolves the time offset between the UWB and camera sensors and allows all available UWB data to be used. It provides a tightly coupled fusion scheme for a monocular camera, a six-degree-of-freedom IMU and a single ultra-wideband base station, yields drift-reduced odometry, and embeds a UWB base-station positioning module that estimates the position of an unknown base station.

Description

Focusing distance camera-IMU-UWB fusion accurate positioning method
Technical Field
The invention relates to the technical field of indoor positioning, and in particular to a distance-focused camera-IMU-UWB fusion accurate positioning method.
Background
Reliable and globally consistent positioning remains an open research problem for many indoor positioning applications. In recent years, visual-inertial odometry (VIO) and visual-inertial simultaneous localization and mapping (VI-SLAM) have become popular approaches for this purpose, owing to the complementarity of camera and inertial measurement unit (IMU) sensors. While the most advanced methods achieve very accurate, high-rate pose and velocity estimates, sensor noise and computational errors make such systems prone to cumulative drift over time. A popular remedy is to include an additional global sensor, such as GPS. For situations where GPS is unavailable (indoors, tunnels, corridors, etc.), ultra-wideband (UWB) is an alternative option for small-scale operation.
Some studies have proposed methods of fusing UWB data and VIO for various applications and scenarios. These methods work in a loosely coupled manner: the UWB ranges and the camera-IMU data are first processed in separate positioning subsystems, and the position estimates obtained by the UWB and camera-IMU subsystems are then aligned and fused together. While such methods are straightforward to construct, the results could be improved if all sensor data were fused jointly, exploiting the correlations among the available information. Furthermore, they require multiple UWB base stations at known positions for range-based positioning, which can be costly and limits applicability in many space-constrained scenarios (e.g., indoor, tunnel, corridor).
Some studies have introduced methods that use only a single UWB base station at an unknown location. Such a system enjoys drift-free distance measurements for accurate positioning and is easy to use in practical applications, since no setup time is needed to calibrate the base-station position. The results show that by tightly coupling UWB, camera, and lidar or IMU measurements in a joint optimization problem, more accurate and robust positioning can be achieved. However, these methods process UWB data in a position-centric manner: each camera pose is paired with one distance measurement, and any other ranges between two consecutive camera frames are ignored. This view does not reflect a real-life sensor system, for several reasons: the actual UWB sensor operates independently of the camera/IMU sensors, so there is always a time offset between distance and image messages; UWB ranging rates do not conform to standard camera or IMU rates, and UWB data rates are often several times higher than the camera's; and due to non-line-of-sight conditions, the UWB data rate may vary during actual operation, meaning the amount of available UWB data may also vary.
There are many ways to introduce ultra-wideband into existing positioning systems. UWB ranging can be used for stand-alone positioning or combined with a monocular camera and IMU, an RGB-D camera and IMU, LiDAR, and so on, improving the accuracy and robustness of the SLAM system. To obtain a unique 3D range-based position fix, one of two configurations is required:
1) UWB base stations at a minimum of four known locations; or
2) three known base-station positions plus carrier height data.
This assumption limits the applicable scenarios of such systems, because the operating area must be instrumented with UWB base stations, and each new environment requires additional time and effort to calibrate the base-station locations. To relax this requirement, some methods attempt to estimate the base-station map during operation, using a metric-scale odometer together with additional inter-base-station ranges, a metric-scale odometer alone, or even an odometer of unknown scale. However, these solutions still handle UWB data in a suboptimal manner.
Disclosure of Invention
To solve the above technical problems, the invention provides a distance-focused camera-IMU-UWB fusion accurate positioning method that fuses visual, inertial and ultra-wideband measurements more effectively. In essence, a UWB error term is formulated for each individual ranging datum using the state-propagation process that already exists in the VIO method.
A camera-IMU-UWB fusion accurate positioning method of focusing distance includes:
s1, initializing a visual inertial odometer VIO, and waiting for the stable state of the VIO;
s2, calculating by using accurate mileage measurement provided by the VIO in the initial stage, and acquiring position information of the carrier in a coordinate system of the VIO;
s3, estimating the position of the UWB base station by utilizing distance information between the carrier and the UWB base station, which is obtained by the carrier at a plurality of positions. When UWB base station position uncertainty is below a certain threshold, the position of the base station is considered fixed;
s4, after the fixed base station position is obtained in the step S3, UWB measurement data and vision and inertia data can be fused together tightly, and accurate and drift-reduced long-term mileage measurement results are obtained through optimization based on the joint key frames.
As a further improvement of the present invention, in the step S3, the step of acquiring the position of the UWB base station is as follows:
to estimate the UWB base-station position a_p in the world frame, the method uses short-term accurate VIO data (pose and velocity) and creates an optimization problem over a data window consisting of distance data d_j and the corresponding IMU-integrated positions p̂_j output by the VIO method. The cost function to be minimized is

E(a_p) = Σ_j ρ( || e_r,j ||² )    (1)

where ρ is the Huber loss. The UWB residual e_r,j is calculated with the IMU-integrated position:

e_r,j = || p̂_j − a_p || − d_j    (2)

For a low-cost system with noisy IMU data, the state propagation of the IMU integration may be unreliable; in that case the camera-rate state estimate p_j can be used instead of p̂_j for a more stable camera state estimation;
the estimation result depends on the trajectory covering all three axes and on the distance of the base station relative to the radius of motion. Denote the sample variance of the position on the x-axis by σ_x² (the y- and z-axes are analogous); it is updated recursively with each new position datum as the carrier moves. To guarantee the performance of the optimization, the following conditions are checked when starting or skipping the optimization process:

1) check whether the carrier is moving, i.e. ||v|| > v_min: if the carrier is static, the new data will be the same and the optimization result will not improve;

2) check whether the sample variance of each axis position is large enough, i.e. min(σ_x², σ_y², σ_z²) > σ_min², so that the movement of the carrier covers all directions.    (3)

Here v_min and σ_min² are user-defined parameters. The first condition is checked for each new VIO datum until the termination condition is met; the second is checked until the first is satisfied. By enforcing these conditions, a satisfactory estimate â_p can be obtained without a laborious initial guess of the location of the base station. Nonetheless, a good initial guess shortens convergence time and may be considered in practice to improve performance;
once started, the cost function is optimized with the standard Levenberg-Marquardt algorithm using the Ceres solver. Since the system runs online, a termination criterion is introduced based on the uncertainty of the solution:

σ_max < σ_p    (4)

where σ_max is the maximum singular value of the covariance matrix and σ_p is a given threshold. Once the criterion is met, the UWB base-station positioning operation is complete and â_p is fixed. Because computing σ_max requires a rank-deficiency check and inversion of the Jacobian matrix, which can be time-consuming, this termination check runs at a slower rate in a separate thread; the optimization may therefore run a few extra times, but the processing time is not affected.
As a further improvement of the present invention, in the step S4, the method for fusing UWB measurement data with visual and inertial data is as follows:
once the UWB base station location is found, the camera-IMU-UWB sensor can be tightly fused for a more reliable, accurate and drift-reducing odometer, for which purpose a cost function E using only VI is added VI (x) Forming a total cost function with range focused UWB residuals
E VIR (x)=E VI (x)+E R (x)#(5)
Wherein the method comprises the steps ofFor t in the sliding window k And t k+1 The distance dataset between two keyframes and the corresponding predicted position change are calculated by equation (6):
J_k denotes the UWB factors connected to state x_k. The UWB residuals are re-weighted by a predefined factor γ_r to amplify their influence on the optimization; secondly, the ultra-wideband residual e_r is improved as follows;
the distance-focused UWB factor associates the position at time t_j with the state x_k by one of the following models:

p(t_j) = p_k + Δp̃_j    (8)

p(t_j) = p_k + v_k Δt_j    (9)

Using formula (8), the position propagated from the VIO method based on IMU data alone (equation (6)) predicts the position at any timestamp. However, since a single position state p_k is directly connected to multiple UWB measurements, the solution may overfit the ultra-wideband noise and the IMU data. On the other hand, (9) couples both the position p_k and velocity v_k states with the distance measurement; however, the constant-velocity model may not be applicable in agile maneuvers, particularly when Δt_j is longer than the time between two frames;
it is proposed to combine the IMU-data-based prediction with the constant-velocity motion model, using (8) and (9) jointly:
following the VINS-Mono marginalization strategy, the UWB factors, along with the visual and inertial factors, connected to the first keyframe are translated to a linear prior when the keyframe is marginalized.
The beneficial effects are that:
(1) A tightly coupled fusion scheme of a monocular camera, a six-degree-of-freedom IMU and a single UWB base station is provided, reducing the drift error of the visual odometry;
(2) The base-station position is estimated by exploiting the short-term high accuracy of the visual odometry, so the system can localize the base station directly when no prior UWB base-station position information exists and can quickly adapt to unfamiliar environments;
(3) UWB measurements are handled with a distance-centric formulation using the propagation data computed by the VIO scheme, which fully resolves the time offset between the UWB and camera sensors, makes full use of all available UWB data, and yields a higher-accuracy positioning system.
Drawings
FIG. 1 is a system workflow diagram;
FIG. 2 is a diagram of reference frames and measurements;
FIG. 3 is a diagram of keyframes and sensor measurements over time;
FIG. 4 is a system UWB base station positioning flow chart;
FIG. 5 is a factor graph of the "position focus" and "range focus" methods, respectively;
fig. 6 is a complete system overview.
Detailed Description
The invention is described in further detail below with reference to the attached drawings and detailed description:
the general workflow of the invention is shown in figure 1, and the camera-IMU-UWB fusion accurate positioning method for focusing distance comprises the following steps:
s1, initializing a Visual Inertial Odometer (VIO), and waiting for the stable state of the VIO;
s2, calculating by using accurate mileage measurement provided by the VIO in the initial stage, and acquiring position information of the carrier in a coordinate system of the VIO;
s3, estimating the position of the UWB base station by utilizing distance information between the carrier and the UWB base station, which is obtained by the carrier at a plurality of positions. When UWB base station position uncertainty is below a certain threshold, the position of the base station is considered fixed;
s4, after the fixed base station position is obtained in the step S3, UWB measurement data and vision and inertia data can be fused together tightly, and accurate and drift-reduced long-term mileage measurement results are obtained through optimization based on the joint key frames.
Position focusing and distance focusing UWB residual
Figures 2 and 3 show examples of the time offset between camera and ultra-wideband measurements. Let e_r be the UWB residual. The formulation of e_r in (1) is called position-focused: it associates each position state p_k in the sliding window with the single distance measurement of the nearest timestamp:

e_r = || p_k − a_p || − d_j    (1)

However, this approach presents some practical problems. The present method uses a distance-focused formulation that relates each distance datum to a position at its own timestamp:

e_r = || p(t_j) − a_p || − d_j    (2)

i.e., the position is adjusted to the timestamp t_j of each distance datum d_j.
With Δt_j = t_j − t_k, formulations (1) and (2) can be considered the same only if the following conditions hold simultaneously:
1) there is no time offset between p_k and d_j (Δt_j = 0), i.e. the camera and UWB sensors are synchronized;
2) the UWB and camera sensors have the same data rate, or all other additional distance measurements between two camera frames are ignored.
However, in reality, the camera and the ultra-wideband sensor always operate independently of each other, so the time-offset problem is inherent and not negligible. Furthermore, standard UWB data rates are typically many times higher than the camera's data rate. Thus, previous approaches always discard a large amount of available range data, meaning the UWB sensor is still underutilized.
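The data-utilization gap between the two formulations can be illustrated with a small timestamp-only sketch (helper names are illustrative): position-focused pairing keeps one range per keyframe, while distance-focused pairing attaches every range to the keyframe interval containing it.

```python
import bisect

def pair_position_focused(kf_times, uwb_times):
    # One range per keyframe: the measurement with the nearest timestamp;
    # all other ranges between frames are discarded.
    return {t_k: min(uwb_times, key=lambda t: abs(t - t_k)) for t_k in kf_times}

def pair_distance_focused(kf_times, uwb_times):
    # Every range is attached to the keyframe interval that contains it,
    # so no measurement is thrown away.
    out = {t_k: [] for t_k in kf_times}
    for t in uwb_times:
        i = bisect.bisect_right(kf_times, t) - 1
        if i >= 0:
            out[kf_times[i]].append(t)
    return out
```

With a 100 Hz UWB stream and 10 Hz keyframes, the position-focused scheme uses only one range per keyframe, while the distance-focused scheme uses all of them.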
In step S1, the scheme of the visual odometer is as follows:
monocular vision inertial odometer based on optimization:
the sliding window X is composed of visual characteristic state X L And IMU state X B The composition is as follows:
X=X L +X B
X E =[x 1 …,x k ,…,x K ]
where K is the number of key frames. Each IMU at t k State x of time k Consisting of the position, direction, velocity of the IMU in the world coordinate system and the deviation of the accelerometer and gyroscope in the body coordinate system. The VIO method uses a binding adjustment formula to minimize cost:
including visual residual errorsIMU residual->And edge residual e p . C is the set of odometers observed in the nth and m key frames. The euclidean norm of the vector. The Huber norm ρ is applied to the visual residual to reduce the effect of tracking outliers.
IMU state update:
The IMU pre-integration method integrates the high-frequency linear accelerations and angular velocities measured by the IMU between t_i and t_{i+1} into pseudo-measurements; these terms correspond to the pre-integrated inertial measurements and the relative orientation measurement, respectively. Whenever a new keyframe is created at t_k, all propagated states are reset:

p̂(t_k) = p_k,  v̂(t_k) = v_k

For each new IMU measurement at t_{i+1}, the state is propagated recursively:

v̂(t_{i+1}) = v̂(t_i) + a_w(t_i) Δt
p̂(t_{i+1}) = p̂(t_i) + v̂(t_i) Δt + ½ a_w(t_i) Δt²

where a_w is the world-frame (gravity-compensated) acceleration, producing an IMU-rate state estimate. With this operation, when a new distance measurement is received at t_j (t_j > t_k), the propagated state p̂(t_j) is readily available; the prediction of the position change from t_k to t_j in the world coordinate system then becomes simply

Δp̃_j = p̂(t_j) − p(t_k)    (6)
in step S3, the scheme of acquiring the position of the UWB base station and performing UWB, camera, IMU fusion is as follows:
ultra-wideband base station position location based on VIO data
To estimate the UWB base-station position a_p in the world frame using short-term accurate VIO data (pose and velocity), the method creates an optimization problem over a data window consisting of distance data d_j and the corresponding IMU-rate positions p̂_j output by the VIO method. Fig. 4 outlines the flow of UWB base-station position determination. The cost function to be minimized is

E(a_p) = Σ_j ρ( || e_r,j ||² )    (9)

where ρ is the Huber loss. The UWB residual e_r,j may be calculated with the IMU-rate position:

e_r,j = || p̂_j − a_p || − d_j    (10)

For a low-cost system with noisy IMU data, IMU-rate state propagation may be unreliable. In that case, the camera-rate state estimate p_j can be used instead of p̂_j for a more stable camera-rate state estimation.
Sufficient conditions: the carrier should not move directly toward the base station. In practice, the estimate depends on the trajectory covering all three axes and on the distance of the base station relative to the radius of motion. Denote the sample variance of the position on the x-axis by σ_x² (the y- and z-axes are analogous); it is updated recursively with each new position datum as the carrier moves. To ensure the performance of the optimization, the following conditions are checked when starting or skipping the optimization process:
1) check whether the carrier is moving (||v|| > v_min): if the carrier is static, the new data will be the same and the optimization result will not improve;
2) check whether the sample variance of each axis position is large enough (min(σ_x², σ_y², σ_z²) > σ_min²), i.e. whether the movement of the carrier covers all directions.
Here v_min and σ_min² are user-defined parameters. The first condition is checked for each new VIO datum until the termination condition is met; the second is checked until the first is satisfied. By enforcing these conditions, we found that a satisfactory estimate â_p can be obtained without a laborious initial guess of the anchor position; this was the case in all our experimental setups. Nevertheless, a good initial guess may shorten convergence time and can be considered in practice to improve performance.
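The two start conditions can be sketched with a recursive (Welford-style) per-axis variance tracker; `v_min` and `var_min` stand in for the user-defined thresholds v_min and σ_min² from the text, with illustrative default values:

```python
import numpy as np

class MotionGate:
    """Recursively tracks per-axis sample variance of the carrier position and
    decides when the anchor optimization may start (thresholds are hypothetical)."""

    def __init__(self, v_min=0.1, var_min=0.5):
        self.v_min, self.var_min = v_min, var_min
        self.n = 0
        self.mean = np.zeros(3)
        self.m2 = np.zeros(3)  # running sum of squared deviations (Welford)

    def add(self, p):
        """Update the recursive mean/variance with a new position datum."""
        self.n += 1
        d = p - self.mean
        self.mean += d / self.n
        self.m2 += d * (p - self.mean)

    def variance(self):
        return self.m2 / (self.n - 1) if self.n > 1 else np.zeros(3)

    def ready(self, speed):
        # Condition 1: the carrier is actually moving.
        # Condition 2: the motion excites all three axes.
        return speed > self.v_min and bool(np.all(self.variance() > self.var_min))
```

The recursive update means no position history needs to be stored, matching the "updated recursively as the carrier moves" description.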
Termination criteria: once started, the cost function (9) can be optimized using a standard Levenberg-Marquardt algorithm and a Ceres solver. Since the system is running online, a termination criterion is introduced to determine the uncertainty of the solution:
σ max <σ p #(11)
σ max is the maximum singular value, σ, of the covariance matrix p Is a given threshold. Once the criteria are met, the UWB base station positioning operation is complete,fixing. Due to sigma max This termination check may run at a slower speed in a separate thread (e.g., every 10 new UWB ranges) because of the rank deficiency check and inversion of the jacobian matrix, which may be time consuming. Thus, the optimization may run several times for additional time, but the processing time is not affected.
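The termination test can be sketched by forming the Gauss-Newton approximation of the solution covariance from the range-residual Jacobian (unit measurement noise assumed; this mirrors, but is not, the Ceres covariance computation):

```python
import numpy as np

def anchor_uncertainty(anchor, positions):
    """Return sigma_max: the largest singular value of cov = (J^T J)^-1,
    where J stacks the range-residual Jacobians d(||p_i - a|| - d_i)/da.
    Returns inf when the geometry is rank-deficient (uninformative)."""
    diff = np.asarray(positions, float) - np.asarray(anchor, float)
    J = -diff / np.linalg.norm(diff, axis=1)[:, None]
    H = J.T @ J
    if np.linalg.matrix_rank(H) < 3:  # rank-deficiency check from the text
        return np.inf
    cov = np.linalg.inv(H)
    return np.linalg.svd(cov, compute_uv=False)[0]

def localisation_done(sigma_max, sigma_p=0.05):
    """Termination criterion sigma_max < sigma_p (sigma_p is user-chosen)."""
    return sigma_max < sigma_p
```

A trajectory confined to a line or plane yields a rank-deficient Jacobian and an infinite σ_max, so the criterion correctly refuses to fix the anchor under degenerate motion.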
In step S4, the specific fusion scheme of UWB and visual inertial odometer is as follows:
visual inertial odometer positioning and mapping
Visual inertial distance measurement based on keyframes
Once the UWB base-station location is found, the camera-IMU-UWB measurements can be tightly fused for a more reliable, accurate and drift-reduced odometry. To this end, the vision-inertial-only cost function E_VI(x) is augmented with the distance-focused UWB residuals, forming the total cost function

E_VIR(x) = E_VI(x) + E_R(x)    (12)

where E_R(x) is built from D_k, the set of distance data between the two keyframes at t_k and t_{k+1} in the sliding window, and the corresponding predicted position changes Δp̃_j:
Fig. 5 depicts the factor graph of the proposed system compared with the previous position-focused approach. J_k denotes the UWB factors connected to state x_k. The UWB residuals are re-weighted by a predefined factor γ_r to amplify their impact on the optimization. Secondly, following the main idea of (2), the ultra-wideband residual e_r is improved to boost performance.
The distance-focused UWB factor can associate the position at time t_j with the state x_k by one of the following models:

p(t_j) = p_k + Δp̃_j    (15)

p(t_j) = p_k + v_k Δt_j    (16)

Using equation (15), the position propagated from the VIO method based on IMU data alone (equation (6)) predicts the position at any timestamp. However, since a single position state p_k is directly connected to multiple UWB measurements, the solution may overfit the ultra-wideband noise and the IMU data. On the other hand, (16) couples both the position p_k and velocity v_k states with the distance measurement. However, the constant-velocity model may not be applicable in agile maneuvers, particularly when Δt_j is longer than the time between two frames.
In the present approach, we propose to combine the IMU-data-based prediction with the constant-velocity motion model, using (15) and (16) jointly:
following the VINS-Mono marginalization strategy, the UWB factors, along with the visual and inertial factors, connected to the first keyframe are translated to a linear prior when the keyframe is marginalized.
While most current methods use VIO for on-board positioning and UWB separately for distance-based relative positioning, some studies show that visual, inertial and UWB data can be fused while simultaneously obtaining a base-station position estimate and an improved pose estimate, proposing EKF solutions or adopting a pose-graph optimization framework. Although these methods differ in their final objectives and fusion schemes, they share the same rationale for using ultra-wideband data: residuals are expressed from the perspective of a position in the state vector. This view leads to two problems: 1) each position is paired with the one nearest UWB measurement, ignoring the time offset between the camera frame and the range data; 2) all other rangings between two consecutive camera frames are discarded. In contrast, the present system computes the ultra-wideband residual at the timestamp of each distance measurement, so the distance data can be used at the sensor's own rate. By using the results of the IMU state-propagation process in the VIO method, a UWB residual can be formed for every range measurement, so that all available ranges are utilized while the time-offset problem is accounted for.
Overview of the System
Fig. 6 shows an overview of the system proposed by the present method. The mobile carrier is equipped with a monocular camera, a six-degree-of-freedom IMU and an ultra-wideband sensor fixed to the body frame, with all intrinsic and extrinsic parameters calibrated. A single UWB base station placed at an unknown location provides ranging. In this work, a two-way time-of-flight (TW-ToF) ultra-wideband sensor is employed, since it does not require clock synchronization between sensors and is therefore better suited to many application scenarios.
The above description is only of the preferred embodiment of the present invention, and is not intended to limit the present invention in any other way, but is intended to cover any modifications or equivalent variations according to the technical spirit of the present invention, which fall within the scope of the present invention as defined by the appended claims.

Claims (2)

1. A camera-IMU-UWB fusion accurate positioning method of focusing distance includes the following steps:
s1, initializing a visual inertial odometer VIO, and waiting for the stable state of the VIO;
s2, calculating by using accurate mileage measurement provided by the VIO in the initial stage, and acquiring position information of the carrier in a coordinate system of the VIO;
s3, estimating the position of the UWB base station by utilizing distance information from the carrier to the UWB base station, which is obtained by the carrier at a plurality of positions, wherein when the uncertainty of the position of the UWB base station is lower than a certain threshold value, the position of the base station is considered to be fixed;
in the step S3, the step of acquiring the position of the UWB base station is as follows:
in order to estimate the position a_p of the UWB base station in the world frame, using short-term accurate VIO data including pose and velocity, the method creates an optimization problem over a data window consisting of distance data d_j and the corresponding IMU-integrated positions p̂_j output by the VIO method; the cost function to be minimized is

E(a_p) = Σ_j ρ( || e_r,j ||² )    (1)

where ρ is the Huber loss, and the UWB residual e_r,j is calculated with the IMU-integrated position:

e_r,j = || p̂_j − a_p || − d_j    (2)

for a low-cost system with noisy IMU data, the state propagation of the IMU integration may be unreliable; therefore, the camera-rate state estimate p_j can be used instead of p̂_j for a more stable camera state estimation;
the estimation result depends on the trajectory covering all three axes and the distance of the base station relative to the radius of motion; denoting the sample variance of the position on the x-axis by σ_x² (the y- and z-axes are analogous), which is updated recursively with each new position datum as the carrier moves, the following conditions need to be checked when starting or skipping the optimization process in order to guarantee its performance:
checking whether the carrier is moving (||v|| > v_min): if the carrier is static, the new data will be the same and the optimization result will not improve;
checking whether the sample variance of each axis position is large enough (min(σ_x², σ_y², σ_z²) > σ_min²), i.e. whether the movement of the carrier covers all directions;    (3)
v_min and σ_min² are user-defined parameters; the first condition is checked for each new VIO datum until the termination condition is met, and the second condition is checked until the first is met; by enforcing these conditions, a satisfactory estimate â_p is obtained without a laborious initial guess of the location of the base station; nonetheless, a good initial guess shortens convergence time and may be considered in practice to improve performance;
once started, the cost function is optimized using the standard Levenberg-Marquardt algorithm and the Ceres solver, and since the system is running online, a termination criterion is introduced to determine the uncertainty of the solution:

σ_max < σ_p    (4)

where σ_max is the maximum singular value of the covariance matrix and σ_p is a given threshold; once the criterion is met, the UWB base-station positioning operation is complete and â_p is fixed; because the rank-deficiency check and inversion of the Jacobian matrix needed for σ_max can be time-consuming, this termination check runs at a slower rate in a separate thread, so the optimization may run a few extra times, but the processing time is not affected;
s4, after the fixed base station position is obtained in the step S3, UWB measurement data and vision and inertia data can be fused together tightly, and accurate and drift-reduced long-term mileage measurement results are obtained through optimization based on the joint key frames.
2. The camera-IMU-UWB fusion accurate positioning method of focusing distance according to claim 1, wherein: in the step S4, the method for fusing UWB measurement data with visual and inertial data is as follows:
once the UWB base-station location is found, the camera-IMU-UWB sensors are tightly fused to provide a more reliable, accurate and drift-reduced odometry; for this purpose, the vision-inertial-only cost function E_VI(x) is augmented with the distance-focused UWB residuals, forming the total cost function

E_VIR(x) = E_VI(x) + E_R(x)    (5)

where E_R(x) is built from D_k, the set of distance data between the two keyframes at t_k and t_{k+1} in the sliding window, together with the corresponding predicted position changes, calculated by equation (6):

Δp̃_j = p̂(t_j) − p(t_k)    (6)

J_k denotes the UWB factors connected to state x_k; the UWB residuals are re-weighted by a predefined factor γ_r to amplify their impact on the optimization, and, secondly, the ultra-wideband residual e_r is improved;
the distance-focused UWB factor associates the position at time t_j with the state x_k by one of the following models:

p(t_j) = p_k + Δp̃_j    (8)

p(t_j) = p_k + v_k Δt_j    (9)

using equation (8), the position propagated from the VIO method based on IMU data alone, calculated from (6), is used to predict the position at any timestamp; however, since a single position state p_k is directly connected to multiple UWB measurements, the solution may overfit the ultra-wideband noise and the IMU data; on the other hand, (9) couples both the position p_k and velocity v_k states with the distance measurement; however, the constant-velocity model may not be applicable in agile maneuvers, in particular when Δt_j is longer than the time between two frames;
it is proposed to combine the IMU-data-based prediction with the constant-velocity motion model, using (8) and (9) jointly:
following the VINS-Mono marginalization strategy, the UWB factors, along with the visual and inertial factors, connected to the first keyframe are translated to a linear prior when the keyframe is marginalized.
CN202210142487.1A 2022-02-16 2022-02-16 Focusing distance camera-IMU-UWB fusion accurate positioning method Active CN114485623B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210142487.1A CN114485623B (en) 2022-02-16 2022-02-16 Focusing distance camera-IMU-UWB fusion accurate positioning method

Publications (2)

Publication Number Publication Date
CN114485623A (en) 2022-05-13
CN114485623B (en) 2024-02-23

Family

ID=81481175

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210142487.1A Active CN114485623B (en) 2022-02-16 2022-02-16 Focusing distance camera-IMU-UWB fusion accurate positioning method

Country Status (1)

Country Link
CN (1) CN114485623B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115540854A (en) * 2022-12-01 2022-12-30 成都信息工程大学 Active positioning method, equipment and medium based on UWB assistance

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110514225A (en) * 2019-08-29 2019-11-29 中国矿业大学 The calibrating external parameters and precise positioning method of Multi-sensor Fusion under a kind of mine
CN111024066A (en) * 2019-12-10 2020-04-17 中国航空无线电电子研究所 Unmanned aerial vehicle vision-inertia fusion indoor positioning method
CN112378396A (en) * 2020-10-29 2021-02-19 江苏集萃未来城市应用技术研究所有限公司 Hybrid high-precision indoor positioning method based on robust LM visual inertial odometer and UWB
CN112529962A (en) * 2020-12-23 2021-03-19 苏州工业园区测绘地理信息有限公司 Indoor space key positioning technical method based on visual algorithm
CN112837374A (en) * 2021-03-09 2021-05-25 中国矿业大学 Space positioning method and system
CN113124856A (en) * 2021-05-21 2021-07-16 天津大学 Visual inertia tight coupling odometer based on UWB online anchor point and metering method
CN113758488A (en) * 2021-09-27 2021-12-07 同济大学 Indoor positioning method and equipment based on UWB and VIO
CN113790726A (en) * 2021-09-07 2021-12-14 中国科学院合肥物质科学研究院 Robot indoor positioning method fusing camera, wheel speed meter and single UWB information
CN113865584A (en) * 2021-08-24 2021-12-31 知微空间智能科技(苏州)有限公司 UWB three-dimensional object finding method and device based on visual inertial odometer
PL434503A1 (en) * 2020-06-30 2022-01-03 Vistom Spółka Z Ograniczoną Odpowiedzialnością Mobile measuring system


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Research progress of GIS-aided indoor positioning technology; Li Qingquan; Zhou Baoding; Ma Wei; Xue Weixing; Acta Geodaetica et Cartographica Sinica (12); full text *
PLS-VINS: Visual Inertial State Estimator With Point-Line Features Fusion and Structural Constraints; Gaochao Yang; IEEE; Vol. 21 (No. 24); full text *
High-precision indoor positioning with hybrid robust-LM visual-inertial odometry and pseudolites; Yang Gaochao; Acta Geodaetica et Cartographica Sinica; Vol. 51 (No. 1); full text *
Quantitative evaluation of sugar beet plant height based on UAV visible-light and LiDAR data; Wang Qing; Transactions of the Chinese Society for Agricultural Machinery; Vol. 52 (No. 3); full text *


Similar Documents

Publication Publication Date Title
CN111207774B (en) Method and system for laser-IMU external reference calibration
US10295365B2 (en) State estimation for aerial vehicles using multi-sensor fusion
Georgiev et al. Localization methods for a mobile robot in urban environments
US9071829B2 (en) Method and system for fusing data arising from image sensors and from motion or position sensors
CN111220153B (en) Positioning method based on visual topological node and inertial navigation
EP3454008A1 (en) Survey data processing device, survey data processing method, and survey data processing program
CN110702091B (en) High-precision positioning method for moving robot along subway rail
Cai et al. Mobile robot localization using gps, imu and visual odometry
CN110412596A (en) A kind of robot localization method based on image information and laser point cloud
CN114019552A (en) Bayesian multi-sensor error constraint-based location reliability optimization method
CN112254729A (en) Mobile robot positioning method based on multi-sensor fusion
CN114485623B (en) Focusing distance camera-IMU-UWB fusion accurate positioning method
CN112797985A (en) Indoor positioning method and indoor positioning system based on weighted extended Kalman filtering
Xian et al. Fusing stereo camera and low-cost inertial measurement unit for autonomous navigation in a tightly-coupled approach
CN114690229A (en) GPS-fused mobile robot visual inertial navigation method
CN112762929B (en) Intelligent navigation method, device and equipment
CN112731503B (en) Pose estimation method and system based on front end tight coupling
CN112113564B (en) Positioning method and system based on image sensor and inertial sensor
CN113358117A (en) Visual inertial indoor positioning method using map
Berefelt et al. Collaborative gps/ins navigation in urban environment
Huai et al. Segway DRIVE benchmark: Place recognition and SLAM data collected by a fleet of delivery robots
CN116380070A (en) Visual inertial positioning method based on time stamp optimization
CN116242372A (en) UWB-laser radar-inertial navigation fusion positioning method under GNSS refusing environment
CN115930977A (en) Method and system for positioning characteristic degradation scene, electronic equipment and readable storage medium
Gang et al. Robust tightly coupled pose estimation based on monocular vision, inertia, and wheel speed

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant