CN113758488B - Indoor positioning method and equipment based on UWB and VIO - Google Patents

Indoor positioning method and equipment based on UWB and VIO

Info

Publication number
CN113758488B
CN113758488B (application CN202111134020.4A)
Authority
CN
China
Prior art keywords
positioning
vio
uwb
positioning result
state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111134020.4A
Other languages
Chinese (zh)
Other versions
CN113758488A (en)
Inventor
申炳琦
张志明
舒少龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tongji University
Original Assignee
Tongji University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tongji University filed Critical Tongji University
Priority to CN202111134020.4A priority Critical patent/CN113758488B/en
Publication of CN113758488A publication Critical patent/CN113758488A/en
Application granted granted Critical
Publication of CN113758488B publication Critical patent/CN113758488B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 - Instruments for performing navigational calculations
    • G01C21/206 - Instruments for performing navigational calculations specially adapted for indoor navigation
    • G01C21/10 - Navigation by using measurements of speed or acceleration
    • G01C21/12 - Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 - Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 - Inertial navigation combined with non-inertial navigation instruments
    • G01C21/1652 - Inertial navigation combined with ranging devices, e.g. LIDAR or RADAR
    • G01C21/1656 - Inertial navigation combined with passive imaging devices, e.g. cameras
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00 - Reducing energy consumption in communication networks
    • Y02D30/70 - Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)
  • Numerical Control (AREA)

Abstract

The invention relates to an indoor positioning method and equipment based on UWB and VIO. The positioning method comprises the following steps: obtaining a UWB positioning result by adopting a DS-TWR algorithm and a Kalman filtering algorithm; obtaining a VIO positioning result by adopting an S-MSCKF algorithm; performing time synchronization processing on the UWB positioning result and the VIO positioning result; and performing data fusion on the time-synchronized UWB and VIO positioning results by using an ES-EKF algorithm to obtain the optimal position estimate of the robot. Compared with the prior art, the invention achieves flexible and accurate positioning in complex indoor environments.

Description

Indoor positioning method and equipment based on UWB and VIO
Technical Field
The invention belongs to the field of SLAM and indoor positioning, relates to an indoor positioning method and equipment, and particularly relates to an indoor positioning method and equipment based on UWB and VIO.
Background
In recent years, indoor mobile robots, typified by service robots, have rapidly entered daily life. Locating flexibly and accurately in a complex indoor environment and modeling that environment comprehensively are the primary problems a mobile robot must solve before it can complete indoor tasks.
UWB (ultra-wideband) is a carrierless communication technology with low power consumption and a high data rate. It transmits data through high-frequency non-sinusoidal narrow pulses, so a UWB system occupies a wide frequency band; because the transmitted pulses are of very short duration, the transmit power spectral density is also low. Compared with wireless technologies such as WLAN and Bluetooth, positioning systems based on UWB offer low cost, strong penetration, insensitivity to channel fading, high timestamp precision and high ranging and positioning accuracy, and are suitable for high-speed wireless data communication and wireless positioning in indoor places. For example, Prorok et al., in "Accurate indoor localization with ultra-wideband using spatial models and collaboration", first proposed a positioning model based on TDOA (Time Difference of Arrival); although the method is simple in terms of UWB base-station deployment and scales well, its positioning accuracy needs improvement and the positioning results are not robust. Chinese patent publication No. CN107566065A discloses a UWB positioning technique based on a TOF algorithm; it responds quickly, but environmental interference with the TOF algorithm causes large errors in the positioning result.
UWB positioning alone thus still suffers from limited accuracy and large positioning errors, and needs improvement. VIO (visual-inertial odometry) is an algorithm that fuses camera and IMU data to implement SLAM; depending on the fusion framework, it is divided into tightly coupled and loosely coupled variants. In loose coupling, visual motion estimation and inertial navigation are two independent modules whose output results are fused; in tight coupling, a single set of variables is estimated jointly from the raw data of both sensors, so the sensor noises influence each other. Tightly coupled algorithms are more complex, but they make full use of the sensor data and can achieve better results. Chinese patent publication No. CN110487267A discloses a positioning method based on loose coupling of UWB and VIO, a framework that fuses UWB information, visual information and inertial data by Kalman filtering. Because of the loose coupling, however, its positioning error is unsatisfactory.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provide an indoor positioning method and equipment based on UWB and VIO that can position flexibly and accurately in a complex indoor environment.
The aim of the invention can be achieved by the following technical scheme:
the invention provides an indoor positioning method based on UWB and VIO, which comprises the following steps:
obtaining a UWB positioning result by adopting a DS-TWR algorithm and a Kalman filtering algorithm;
obtaining a VIO positioning result by adopting an S-MSCKF algorithm;
performing time synchronization processing on the UWB positioning result and the VIO positioning result;
and performing data fusion on the UWB positioning result and the VIO positioning result after time synchronization by using an ES-EKF algorithm to obtain the optimal position estimation of the robot.
Further, the UWB positioning result is obtained by:
performing base station tag ranging by adopting a DS-TWR algorithm to obtain ranging data;
and smoothing the ranging data by adopting Kalman filtering to obtain optimal coordinate estimation as a UWB positioning result.
Further, the VIO location result is obtained by:
detecting feature points in the image with the S-MSCKF front end, tracking the extracted feature points, and rejecting outliers;
and performing extended Kalman filtering at the S-MSCKF back end on the IMU data and the feature-point coordinates passed from the front end, so as to obtain an optimal estimate of the current camera pose as the VIO positioning result.
Further, the outlier rejection includes rejecting binocular mismatched points and mismatched points between consecutive frames.
Further, the data fusion specifically includes:
the independent coordinate system used in the UWB positioning process is used as a world coordinate system of VIO positioning, and the position and the speed of the mobile robot are used as system state vectors;
predicting the error state and covariance of the k moment by taking the VIO positioning result as the nominal value of the system state vector;
using the difference value of the positioning information of the UWB positioning result and the VIO positioning result as a system observation vector to update the error state and the covariance;
and obtaining an optimal estimated value of the real state at the moment k according to the nominal value and the updated error state, and taking the optimal estimated value as the optimal position estimation of the robot.
The invention also provides an indoor positioning device based on UWB and VIO, comprising:
the UWB positioning module is used for obtaining a UWB positioning result by adopting a DS-TWR algorithm and a Kalman filtering algorithm;
the VIO positioning module is used for obtaining a VIO positioning result by adopting an S-MSCKF algorithm;
and the fusion module is used for carrying out time synchronization processing on the UWB positioning result and the VIO positioning result, and carrying out data fusion on the UWB positioning result and the VIO positioning result after time synchronization by using an ES-EKF algorithm to obtain the optimal position estimation of the robot.
Further, the UWB positioning module includes a plurality of UWB base stations, a UWB tag attached to the mobile robot, and a UWB result acquisition unit configured to:
performing base station tag ranging by adopting a DS-TWR algorithm to obtain ranging data;
and smoothing the ranging data by adopting Kalman filtering to obtain optimal coordinate estimation as a UWB positioning result.
Further, the VIO location module includes a camera, an IMU, and a VIO result acquisition unit configured to:
detecting feature points in the image obtained by the camera with the S-MSCKF front end, tracking the extracted feature points, and rejecting outliers;
and performing extended Kalman filtering at the S-MSCKF back end on the IMU data and the feature-point coordinates passed from the front end, so as to obtain an optimal estimate of the current camera pose as the VIO positioning result.
Further, the outlier rejection includes rejecting binocular mismatched points and mismatched points between consecutive frames.
Further, the fusion module is configured to:
the independent coordinate system used in the UWB positioning process is used as a world coordinate system of VIO positioning, and the position and the speed of the mobile robot are used as system state vectors;
predicting the error state and covariance of the k moment by taking the VIO positioning result as the nominal value of the system state vector;
using the difference value of the positioning information of the UWB positioning result and the VIO positioning result as a system observation vector to update the error state and the covariance;
and obtaining an optimal estimated value of the real state at the moment k according to the nominal value and the updated error state, and taking the optimal estimated value as the optimal position estimation of the robot.
Compared with the prior art, the invention has the following beneficial effects:
1. Data fusion of the UWB and VIO positioning results overcomes the defects that UWB positioning is easily affected by non-line-of-sight errors and cannot provide stable positioning information; at the same time it solves the problems that VIO positioning error accumulates over time so that accurate positioning information cannot be sustained, that VIO is strongly limited by lighting conditions, and that it cannot work in dark environments. The combination provides long-term, stable and accurate positioning results and effectively improves the positioning accuracy and robustness of the mobile robot.
2. The invention realizes the UWB positioning function with the DS-TWR algorithm, which has high ranging accuracy and strong robustness.
3. The final experimental results show that, in an indoor environment with obstacles, the method and equipment reduce the maximum positioning error by about 4.4 percent and the root-mean-square error by about 6.3 percent, overcoming the limitations of positioning with UWB or VIO alone.
4. The invention uses UWB and VIO positioning technologies and fuses their positioning data through an ES-EKF algorithm to provide a positioning result closer to the real trajectory, which can serve the indoor mobile-robot field in the future.
Drawings
FIG. 1 is a schematic overall diagram of the present invention;
FIG. 2 is a schematic view of UWB positioning;
FIG. 3 is a schematic diagram of the DS-TWR algorithm of the present invention;
FIG. 4 is a graph comparing UWB ranging data before and after Kalman filtering, wherein (4a) is the overall comparison, (4b) is an enlarged view at point 1 in (4a), and (4c) is an enlarged view at point 2 in (4a);
FIG. 5 is a UWB positioning flow chart;
FIG. 6 is a front-end visual flow chart of the S-MSCKF algorithm;
FIG. 7 is a flowchart of first-frame initialization in the visual front end of the S-MSCKF algorithm;
FIG. 8 is a flow chart of the S-MSCKF algorithm visual front end tracking feature points;
FIG. 9 is a back-end filtering flow chart of the S-MSCKF algorithm;
FIG. 10 is a flowchart of the ES-EKF algorithm;
FIG. 11 is a comparison of experimental positioning results at the obstacle-free small site;
FIG. 12 is a comparison of experimental positioning results at the large site with obstacles.
Detailed Description
The invention will now be described in detail with reference to the drawings and specific examples. The present embodiment is implemented on the premise of the technical scheme of the present invention, and a detailed implementation manner and a specific operation process are given, but the protection scope of the present invention is not limited to the following examples.
Example 1
Referring to fig. 1, the present embodiment provides an indoor positioning method based on UWB and VIO, including: obtaining a UWB positioning result by adopting a DS-TWR algorithm and a Kalman filtering algorithm; obtaining a VIO positioning result by adopting an S-MSCKF algorithm; performing time synchronization processing on the UWB positioning result and the VIO positioning result; and performing data fusion on the UWB positioning result and the VIO positioning result after time synchronization by using an ES-EKF algorithm to obtain the optimal position estimation of the robot.
1. UWB positioning
UWB positioning is an active positioning mode. On the premise that the UWB base-station coordinates are known, the positioning process mainly comprises two steps: ranging between the base stations and the tag, and solving the tag coordinates. A schematic of the UWB positioning part is shown in fig. 2. After the base stations and the tag are deployed, the system measures the distance between each base station and the tag with the DS-TWR algorithm, smooths the distances with a Kalman filtering algorithm, and then obtains the tag coordinates in the current coordinate system by solving the range equations.
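As an illustration of the coordinate-solving step (the exact solving equation is not written out here), the following is a minimal least-squares trilateration sketch in Python; the anchor layout and the linearization below are standard assumptions, not taken verbatim from the patent:

```python
import numpy as np

def trilaterate_2d(anchors, ranges):
    """Least-squares 2-D tag position from three or more anchor distances.

    anchors: (N, 2) array of known base-station coordinates
    ranges:  (N,)  array of filtered DS-TWR distances to the tag
    Subtracting the first range equation from the others removes the
    quadratic terms, leaving a linear system A p = b solved by least squares.
    """
    anchors = np.asarray(anchors, dtype=float)
    r = np.asarray(ranges, dtype=float)
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (r[0] ** 2 - r[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1)
         - np.sum(anchors[0] ** 2))
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p  # estimated (x, y) of the tag

# e.g. three anchors at known positions and three smoothed ranges:
# trilaterate_2d([[0, 0], [5, 0], [0, 5]], [3.0, 4.2, 4.1])
```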
(1) Base station tag ranging
In this embodiment, base-station-to-tag ranging is implemented with the DS-TWR (Double-Sided Two-Way Ranging) algorithm, a wireless ranging algorithm in which the UWB base station and the tag exchange signals twice, establishing equations from which the distance between them is solved. Its main advantages are high ranging accuracy and strong robustness.
The principle of the DS-TWR algorithm is shown in fig. 3: device A actively transmits TX data while recording a transmission timestamp; device B records a reception timestamp when it receives the data; after a delay T_reply, device B transmits data, again recording a transmission timestamp; when device A receives this data, it records the reception timestamp, and a second exchange is added on this basis. From these timestamps the time of flight T_prop of the UWB signal between devices A and B is calculated, and multiplying by the speed of light c gives the base-station-to-tag distance.
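The patent does not reproduce the DS-TWR equations themselves; the sketch below uses the widely published asymmetric DS-TWR time-of-flight expression as a stand-in, so treat the exact formula as an assumption rather than the patent's own derivation:

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def ds_twr_distance(t_round1, t_reply1, t_round2, t_reply2):
    """Distance from one double-sided two-way ranging exchange.

    t_round1: device A's TX-to-RX interval in the first exchange
    t_reply1: device B's RX-to-TX turnaround in the first exchange
    t_round2, t_reply2: the same pair measured in the second exchange
    Each device uses only its own clock, so A and B never need to be
    synchronized; clock drift largely cancels between the two exchanges.
    """
    t_prop = ((t_round1 * t_round2 - t_reply1 * t_reply2)
              / (t_round1 + t_round2 + t_reply1 + t_reply2))
    return SPEED_OF_LIGHT * t_prop
```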
(2) Label coordinate solution
Considering that the raw UWB ranging data contains many jump points, which would affect the subsequent coordinate calculation, this embodiment smooths the ranging data with Kalman filtering. The Kalman filter is a data-processing method that takes the system observations as the filter input and the estimate of the system state as the filter output, and performs optimal estimation using the statistical characteristics of the system noise and observation noise. In the nonlinear case, its extended form is the extended Kalman filter (EKF, Extended Kalman Filter); depending on how the system state is formulated, Kalman filtering also gives rise to the error-state Kalman filter (ESKF, Error-State Kalman Filter).
Let $x_k$ denote the true coordinates of the tag and $z_k$ the coordinates of the tag observed through UWB. The state transition matrix $A_k$ and observation matrix $C_k$ are both identity matrices, and the system input $u_k$ is a zero matrix. From the optimal estimates $\hat{x}_{k-1}$ and $P_{k-1}$ of the tag coordinate state and covariance at time $k-1$, the state and covariance at the current time $k$ are predicted by formula (1):

$$\hat{x}_{k|k-1} = A_k \hat{x}_{k-1} + u_k, \qquad P_{k|k-1} = A_k P_{k-1} A_k^T + Q \tag{1}$$

On this basis, the Kalman gain $K_k$ is calculated according to formula (2), and the tag coordinate state and covariance are updated according to formula (3) to obtain the optimal estimates:

$$K_k = P_{k|k-1} C_k^T \left( C_k P_{k|k-1} C_k^T + R \right)^{-1} \tag{2}$$

$$\hat{x}_k = \hat{x}_{k|k-1} + K_k \left( z_k - C_k \hat{x}_{k|k-1} \right), \qquad P_k = \left( I - K_k C_k \right) P_{k|k-1} \tag{3}$$

where $I$ is an identity matrix, $Q$ is the process noise covariance and $R$ is the observation noise covariance.
The result of the Kalman filtering process is shown in FIG. 4.
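Concretely, with $A_k$ and $C_k$ identity and zero input, formulas (1)-(3) reduce per range channel to a scalar filter. A minimal sketch follows; the noise variances q and r are illustrative tuning assumptions, not values from the patent:

```python
import numpy as np

def kalman_smooth_ranges(z, q=1e-3, r=5e-2, p0=1.0):
    """Scalar Kalman smoothing of a raw DS-TWR range sequence.

    With A = C = 1 and zero input, the prediction step keeps the previous
    estimate and inflates its covariance (formula (1)); the update step
    pulls it toward each new measurement (formulas (2)-(3)).
    """
    z = np.asarray(z, dtype=float)
    x, p = z[0], p0
    out = np.empty_like(z)
    for k, zk in enumerate(z):
        p = p + q               # predict: covariance grows by process noise
        K = p / (p + r)         # Kalman gain, formula (2)
        x = x + K * (zk - x)    # state update, formula (3)
        p = (1.0 - K) * p       # covariance update, formula (3)
        out[k] = x
    return out
```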
2. VIO positioning
Considering that data fusion places strict requirements on time synchronization, this embodiment adopts S-MSCKF, the binocular version of the MSCKF algorithm, which is computationally efficient and fast and is suited to running on embedded platforms with limited computing resources, and performs VIO indoor positioning based on image data and IMU data.
The S-MSCKF algorithm is divided into a front-end vision part and a back-end filtering part. The vision front end mainly detects feature points in the image and publishes data such as the feature-point coordinates in the camera coordinate system to the back end; in this process, the extracted feature points are tracked, and binocular mismatched points and mismatched points between consecutive frames are rejected. The main function of the S-MSCKF back end is to perform extended Kalman filtering on the IMU state with the feature-point coordinates passed from the vision front end, so as to obtain the optimal estimate of the current camera pose.
(1) Front end vision
The algorithm flow of front-end vision is shown in fig. 6, which includes:
(1) Initializing the first frame: after the binocular camera acquires its first frame, FAST feature points are detected in the left image, and the left-eye feature points are then projected into the right eye according to the calibrated parameters between the left and right cameras for binocular feature matching. LK optical flow tracking, epipolar geometric constraints and related processing are then applied through the image pyramid, and feature points tracked outside the image are removed. Finally, the feature points are divided into grids and the FAST feature-point information is output. This process is shown in fig. 7.
(2) Tracking feature points: as the camera moves, successive frames are received; the feature points extracted in the previous frame are tracked into the next frame, and new feature points are extracted. This requires continuously rejecting left-right mismatched feature points and feature points beyond the image boundary using epipolar geometry constraints, LK optical flow tracking and the two-point RANSAC method (a simplified sketch follows this list). This process is shown in fig. 8.
(3) Adding new feature points: if only the originally extracted feature points were tracked, some features would disappear over time and others would accumulate error, so new features must be added continuously to keep the program running.
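The sketch below illustrates steps (1) and (2) with OpenCV in a simplified, monocular form; it is not the patent's stereo front end. It keeps FAST detection and pyramidal LK tracking, but substitutes fundamental-matrix RANSAC for the two-point RANSAC and stereo checks described above:

```python
import cv2
import numpy as np

def detect_fast(img, max_pts=200):
    """FAST corners as an (N, 1, 2) float32 array, ready for LK tracking."""
    kps = cv2.FastFeatureDetector_create(threshold=20).detect(img)
    kps = sorted(kps, key=lambda k: -k.response)[:max_pts]
    return np.float32([k.pt for k in kps]).reshape(-1, 1, 2)

def track_features(prev_img, next_img, prev_pts):
    """Tracks points between frames and rejects mismatches."""
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(
        prev_img, next_img, prev_pts, None,
        winSize=(21, 21), maxLevel=3)          # pyramidal LK optical flow
    ok = status.ravel() == 1                   # drop tracks LK lost
    p0, p1 = prev_pts[ok], nxt[ok]
    if len(p0) >= 8:                           # geometric outlier rejection
        _, inliers = cv2.findFundamentalMat(
            p0, p1, cv2.FM_RANSAC, 1.0, 0.99)
        if inliers is not None:
            keep = inliers.ravel() == 1
            p0, p1 = p0[keep], p1[keep]
    return p0, p1                              # matched point pairs
```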
(2) Back-end filtering
The back-end filtering algorithm is shown in fig. 9: the front end's image feature-point information is received by subscribing to the relevant topics, and quantities such as gravity and biases are initialized. The IMU data are then processed to construct the state matrix F and input matrix G of the differential equation system, and the state transition matrix Φ is found. State prediction, state augmentation and state update are then carried out in sequence, historical camera states are removed, and the current camera state is obtained. Finally, the camera pose topic is published. A sketch of the covariance propagation step follows.
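This is a sketch of the propagation step only (the full MSCKF error-state layout, augmentation and update are beyond a short example); using scipy's matrix exponential for Φ is an assumption standing in for whatever discretization the actual implementation uses:

```python
import numpy as np
from scipy.linalg import expm

def propagate(F, G, Q, P, dt):
    """One IMU-driven covariance propagation step.

    F, G: continuous-time state and noise-input matrices built from the
    current IMU reading; Q: continuous IMU noise covariance; P: current
    error covariance. Returns the transition matrix and the predicted P.
    """
    Phi = expm(F * dt)                      # state transition matrix
    Qd = Phi @ G @ Q @ G.T @ Phi.T * dt     # discretized process noise
    return Phi, Phi @ P @ Phi.T + Qd
```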
3. Data fusion
To obtain long-term accurate and reliable positioning data, the UWB and VIO positioning data are fused. The main information fusion methods include weighted averaging, Kalman filtering, Bayesian inference and neural network algorithms. The weighted-average method is simple and intuitive, but the optimal weights are difficult to obtain and expensive to compute. In Bayesian inference, the multi-sensor information is described as probability distributions, requiring prior probabilities and likelihood functions, and the analysis and computation are complex. Neural network algorithms train and adjust the network weights from input data samples, but they need large amounts of data and their real-time performance is poor.
The present method takes nonlinear factors into account and, to avoid possible problems such as covariance-matrix singularity and gimbal lock, adopts the ES-EKF algorithm, which handles nonlinear problems, to perform the information fusion.
In the combined positioning, the independent coordinate system used in the UWB positioning process serves as the world coordinate system of the VIO positioning. The position and velocity of the mobile robot are then used as the state vector of the system; the state vector at time $k$ is

$$X_k = \begin{bmatrix} x_k & y_k & \dot{x}_k & \dot{y}_k \end{bmatrix}^T$$

where $(x_k, y_k)$ represents the position coordinates of the combined system on the two-dimensional plane at time $k$, and $\dot{x}_k$ and $\dot{y}_k$ represent the velocities of the combined system in the $x$ and $y$ directions. Let the nominal value of the state vector at time $k$ be $\bar{X}_k$ and the error value be $\delta X_k$, so that $X_k = \bar{X}_k + \delta X_k$. The error-state model of the system is:

$$\delta X_k = F\,\delta X_{k-1} + G\,a_{k-1},\qquad
F = \begin{bmatrix} 1 & 0 & t & 0 \\ 0 & 1 & 0 & t \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix},\qquad
G = \begin{bmatrix} t^2/2 & 0 \\ 0 & t^2/2 \\ t & 0 \\ 0 & t \end{bmatrix} \tag{4}$$

where $t$ is the sampling interval of the combined system, and $a_{k-1} = [a^x_{k-1}\;\; a^y_{k-1}]^T$ contains the accelerations of the combined system in the $x$ and $y$ directions at time $k-1$. The ES-EKF algorithm flow is shown in fig. 10. The VIO positioning data received by the combined system is taken as the nominal value $\bar{X}_k$ of the state vector, after which the error state is updated. Rearranging formula (4), the prediction of the error state and covariance at time $k$ is given by formula (5):

$$\delta \hat{X}_{k|k-1} = F\,\delta \hat{X}_{k-1},\qquad P_{k|k-1} = F P_{k-1} F^T + G Q G^T \tag{5}$$

where $P_{k-1}$ is the optimal estimate of the error-state covariance at time $k-1$ and $Q$ is the noise covariance. Since the state vector of the combined system only tracks the position and velocity of the system, the acceleration is treated as the random noise of the system.

The difference between the positioning information obtained by UWB and by VIO is used as the observation vector $Y_k$ of the system; the observation equation of the combined system is:

$$Y_k = H\,\delta X_k + V_k \tag{6}$$

where $V_k$ is the observation noise, and $Y_k$ and $H$ take the values:

$$Y_k = \begin{bmatrix} x^{UWB}_k - x^{VIO}_k \\ y^{UWB}_k - y^{VIO}_k \end{bmatrix},\qquad
H = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{bmatrix}$$

The error state and covariance matrix can then be updated, which comprises calculating the Kalman gain $K_k$, updating the error state $\delta X_{k|k}$ and updating the covariance matrix $P_{k|k}$, as in formula (7):

$$K_k = P_{k|k-1} H^T \left( H P_{k|k-1} H^T + R \right)^{-1},\qquad
\delta \hat{X}_{k|k} = \delta \hat{X}_{k|k-1} + K_k \left( Y_k - H\,\delta \hat{X}_{k|k-1} \right),\qquad
P_{k|k} = \left( I - K_k H \right) P_{k|k-1} \tag{7}$$

where $R$ is the covariance of the observation noise $V_k$ and $I$ is an identity matrix. The optimal estimate of the true state at time $k$ is then obtained as $\hat{X}_k = \bar{X}_k + \delta \hat{X}_{k|k}$, which serves as the optimal position estimate of the robot.
The fusion process can be summarized in pseudocode; the Python sketch below is a minimal reconstruction consistent with formulas (4)-(7), and the noise parameters q_acc and r_uwb are illustrative assumptions rather than values from the patent:
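```python
import numpy as np

def es_ekf_fuse(vio_xy, uwb_xy, dt, q_acc=0.5, r_uwb=0.05):
    """Error-state EKF over time-synchronized VIO and UWB position tracks.

    vio_xy, uwb_xy: (T, 2) positions in the shared UWB world frame.
    The 4-D error state is [dx, dy, dvx, dvy]; VIO supplies the nominal
    trajectory and the UWB-minus-VIO difference is the observation Y_k.
    """
    vio_xy = np.asarray(vio_xy, dtype=float)
    uwb_xy = np.asarray(uwb_xy, dtype=float)
    F = np.eye(4); F[0, 2] = F[1, 3] = dt            # formula (4) dynamics
    G = np.array([[dt**2 / 2, 0.0], [0.0, dt**2 / 2],
                  [dt, 0.0], [0.0, dt]])             # acceleration as noise
    Q = q_acc * np.eye(2)
    H = np.hstack([np.eye(2), np.zeros((2, 2))])     # formula (6) observation
    R = r_uwb * np.eye(2)

    dx, P = np.zeros(4), np.eye(4)
    fused = np.empty_like(vio_xy)
    for k in range(len(vio_xy)):
        dx = F @ dx                                   # predict, formula (5)
        P = F @ P @ F.T + G @ Q @ G.T
        y = uwb_xy[k] - vio_xy[k]                     # observation Y_k
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # gain, formula (7)
        dx = dx + K @ (y - H @ dx)                    # error-state update
        P = (np.eye(4) - K @ H) @ P                   # covariance update
        fused[k] = vio_xy[k] + dx[:2]                 # nominal + error
    return fused
```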
in order to verify the effectiveness of the method provided by the invention, positioning experiments are respectively carried out on two places of a small place and a large place, wherein no obstacle exists in the scene of the small place; indoor struts are used as obstacle interference in the positioning process of the large field; the experimental time was chosen at 14 pm: around 00, the illumination environment is good at this time, and the experiment is facilitated to develop. The experiment adopts three base stations with known coordinates and a label to be positioned. The three UWB base stations are arranged at equal intervals as two by two as far as possible, so that the accuracy of UWB positioning is ensured. The indoor mobile robot moves along a known fixed trajectory in a nearly uniform motion.
(1) Obstacle-free small-site positioning experiment:
In the obstacle-free small-site positioning experiment, the robot encounters no obstacles while moving. Taking the data of one positioning run as an example, the comparison of the three positioning methods is shown in fig. 11. In the figure, line segment a is the actual trajectory of the mobile robot; line segment b is the UWB-based positioning trajectory; line segment c is the VIO-based positioning trajectory; and line segment d is the positioning trajectory after ES-EKF fusion of the data from the two methods. The positioning-error indices, averaged over 10 experiments, are compared in table 1, where MAX denotes the maximum error relative to the true trajectory and RMSE denotes the root-mean-square error.
TABLE 1. Positioning-error statistics for the obstacle-free small-site experiment
It can be seen that when the indoor space is unobstructed, positioning relying on UWB alone achieves centimeter-level accuracy. The VIO positioning result is satisfactory in the initial stage, but as positioning proceeds, accumulated error makes the second half of the trajectory drift, so its positioning error becomes excessive. The accuracy of the combined positioning result is slightly below that of UWB alone but better than that of VIO alone.
(2) Large-site positioning experiment with obstacles:
In the large-site positioning experiment with obstacles, the robot's trajectory includes an obstacle, an indoor pillar, which causes a certain amount of interference to UWB positioning. The comparison of the three positioning methods is shown in fig. 12. In this case, large jumps and errors appear in the UWB positioning, as indicated by the dotted oval region. The visual-inertial odometry result is smooth overall and satisfactory in the initial stage, but accumulated error makes the second half of the trajectory drift, and the large scene change seen by the camera at corners further degrades the result. Here, compared with either single-sensor method, the ES-EKF-fused result has higher positioning accuracy and reliability and is closer to the actual trajectory. Both the maximum error and the root-mean-square error relative to the actual trajectory improve, as shown in table 2.
TABLE 2. Positioning-error statistics for the large-site experiment with obstacles
In summary, the method fuses UWB and VIO positioning data with an ES-EKF algorithm; with a ground mobile robot as the test object, a UWB/VIO hybrid positioning system was built on Ubuntu and the ROS platform. Two sets of comparison experiments show that the proposed method improves the accuracy and robustness of mobile-robot positioning in indoor environments containing obstacles, demonstrating the algorithm's prospects for practical application.
Example 2
The embodiment provides indoor positioning equipment based on UWB and VIO, which comprises a UWB positioning module, a VIO positioning module and a fusion module, wherein the UWB positioning module obtains UWB positioning results by adopting a DS-TWR algorithm and a Kalman filtering algorithm; the VIO positioning module obtains a VIO positioning result by adopting an S-MSCKF algorithm; and the fusion module performs time synchronization processing on the UWB positioning result and the VIO positioning result, and performs data fusion on the UWB positioning result and the VIO positioning result after time synchronization by using an ES-EKF algorithm to obtain the optimal position estimation of the robot.
In this embodiment, the UWB positioning module mainly includes three UWB base stations, a UWB tag attached to the robot, and a UWB result obtaining unit.
In this embodiment, the VIO positioning module includes a camera, an IMU and a VIO result acquisition unit, and mainly fuses the visual odometry information and the inertial measurement data to obtain the current coordinates. The hardware of the VIO positioning module is an Intel T265 camera, which includes a pair of wide-field-of-view binocular fisheye lenses (OV9282) with a circular field of view of about 165 degrees, an integrated Bosch BMI055 IMU (200 Hz gyroscope, 62.5 Hz accelerometer) hardware-synchronized with the cameras, and a powerful Intel Movidius Myriad VPU (Video Processing Unit), so that it can run on an embedded processor.
IMU stands for Inertial Measurement Unit, whose main elements are gyroscopes, accelerometers and magnetometers. The gyroscope measures the angular velocity about each axis, the accelerometer measures the acceleration along the x, y and z directions, and the magnetometer senses the surrounding magnetic field. The IMU's main job is to fuse the data of these three sensors to obtain more accurate attitude information.
The specific process of realizing positioning by the indoor positioning device based on UWB and VIO in the embodiment is as described in the positioning method of embodiment 1.
The foregoing describes in detail preferred embodiments of the present invention. It should be understood that numerous modifications and variations can be made in accordance with the concepts of the invention by one of ordinary skill in the art without undue burden. Therefore, all technical solutions which can be obtained by logic analysis, reasoning or limited experiments based on the prior art by the person skilled in the art according to the inventive concept shall be within the scope of protection defined by the claims.

Claims (10)

1. An indoor positioning method based on UWB and VIO is characterized by comprising the following steps:
obtaining a UWB positioning result by adopting a DS-TWR algorithm and a Kalman filtering algorithm;
obtaining a VIO positioning result by adopting an S-MSCKF algorithm;
performing time synchronization processing on the UWB positioning result and the VIO positioning result;
performing data fusion on the UWB positioning result and the VIO positioning result after time synchronization by using an ES-EKF algorithm to obtain the optimal position estimation of the robot;
in the combined positioning, the independent coordinate system used in the UWB positioning process is used as the world coordinate system of the VIO positioning, and the position and velocity of the mobile robot are used as the state vector of the system, the state vector at time $k$ being

$$X_k = \begin{bmatrix} x_k & y_k & \dot{x}_k & \dot{y}_k \end{bmatrix}^T$$

wherein $(x_k, y_k)$ represents the position coordinates of the combined system on the two-dimensional plane at time $k$, and $\dot{x}_k$ and $\dot{y}_k$ represent the velocities of the combined system in the $x$ and $y$ directions at time $k$;

assume that the nominal value of the state vector at time $k$ is $\bar{X}_k$ and the error value is $\delta X_k$, with $X_k = \bar{X}_k + \delta X_k$; the error-state model of the system is:

$$\delta X_k = F\,\delta X_{k-1} + G\,a_{k-1},\qquad
F = \begin{bmatrix} 1 & 0 & t & 0 \\ 0 & 1 & 0 & t \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix},\qquad
G = \begin{bmatrix} t^2/2 & 0 \\ 0 & t^2/2 \\ t & 0 \\ 0 & t \end{bmatrix}$$

wherein $t$ is the sampling interval of the combined system, and $a_{k-1} = [a^x_{k-1}\;\; a^y_{k-1}]^T$ represents the accelerations of the combined system in the $x$ and $y$ directions at time $k-1$; the VIO positioning data received by the combined system is taken as the nominal value $\bar{X}_k$ of the state vector, after which the error state is updated;

when the above equations are rearranged, the prediction of the error state and covariance at time $k$ is:

$$\delta \hat{X}_{k|k-1} = F\,\delta \hat{X}_{k-1},\qquad P_{k|k-1} = F P_{k-1} F^T + G Q G^T$$

wherein $P_{k-1}$ is the optimal estimate of the error-state covariance at time $k-1$, and $Q$ is the noise covariance; since the state vector of the combined system is only used to track the position and velocity of the system, the acceleration is taken as the random noise of the system;

the difference between the positioning information obtained by UWB and VIO is used as the observation vector $Y_k$ of the system, and the observation equation of the combined system is:

$$Y_k = H\,\delta X_k + V_k$$

wherein $V_k$ is the observation noise, and $Y_k$ and $H$ take the values:

$$Y_k = \begin{bmatrix} x^{UWB}_k - x^{VIO}_k \\ y^{UWB}_k - y^{VIO}_k \end{bmatrix},\qquad
H = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{bmatrix}$$

the error state and covariance matrix are then updated, including calculating the Kalman gain $K_k$, updating the error state $\delta X_{k|k}$ and updating the covariance matrix $P_{k|k}$:

$$K_k = P_{k|k-1} H^T \left( H P_{k|k-1} H^T + R \right)^{-1},\qquad
\delta \hat{X}_{k|k} = \delta \hat{X}_{k|k-1} + K_k \left( Y_k - H\,\delta \hat{X}_{k|k-1} \right),\qquad
P_{k|k} = \left( I - K_k H \right) P_{k|k-1}$$

wherein $R$ is the covariance of the observation noise $V_k$ and $I$ is an identity matrix.
2. The indoor positioning method based on UWB and VIO according to claim 1, wherein the UWB positioning result is obtained by:
performing base station tag ranging by adopting a DS-TWR algorithm to obtain ranging data;
and smoothing the ranging data by adopting Kalman filtering to obtain optimal coordinate estimation as a UWB positioning result.
3. The indoor positioning method based on UWB and VIO according to claim 1, wherein the VIO positioning result is obtained by:
detecting feature points in the image with the S-MSCKF front end, tracking the extracted feature points, and rejecting outliers;
and performing extended Kalman filtering at the S-MSCKF back end on the IMU data and the feature-point coordinates passed from the front end, so as to obtain an optimal estimate of the current camera pose as the VIO positioning result.
4. The indoor positioning method based on UWB and VIO according to claim 3, wherein the outlier rejection includes rejecting binocular mismatched points and mismatched points between consecutive frames.
5. The indoor positioning method based on UWB and VIO according to claim 1, wherein the data fusion specifically comprises:
the independent coordinate system used in the UWB positioning process is used as a world coordinate system of VIO positioning, and the position and the speed of the mobile robot are used as system state vectors;
predicting the error state and covariance of the k moment by taking the VIO positioning result as the nominal value of the system state vector;
using the difference value of the positioning information of the UWB positioning result and the VIO positioning result as a system observation vector to update the error state and the covariance;
and obtaining an optimal estimated value of the real state at the moment k according to the nominal value and the updated error state, and taking the optimal estimated value as the optimal position estimation of the robot.
6. An indoor positioning device based on UWB and VIO, comprising:
the UWB positioning module is used for obtaining a UWB positioning result by adopting a DS-TWR algorithm and a Kalman filtering algorithm;
the VIO positioning module is used for obtaining a VIO positioning result by adopting an S-MSCKF algorithm;
the fusion module is used for carrying out time synchronization processing on the UWB positioning result and the VIO positioning result, and carrying out data fusion on the UWB positioning result and the VIO positioning result after time synchronization by using an ES-EKF algorithm to obtain the optimal position estimation of the robot;
in the combined positioning, the independent coordinate system used in the UWB positioning process is used as the world coordinate system of the VIO positioning, and the position and velocity of the mobile robot are used as the state vector of the system, the state vector at time $k$ being

$$X_k = \begin{bmatrix} x_k & y_k & \dot{x}_k & \dot{y}_k \end{bmatrix}^T$$

wherein $(x_k, y_k)$ represents the position coordinates of the combined system on the two-dimensional plane at time $k$, and $\dot{x}_k$ and $\dot{y}_k$ represent the velocities of the combined system in the $x$ and $y$ directions at time $k$;

assume that the nominal value of the state vector at time $k$ is $\bar{X}_k$ and the error value is $\delta X_k$, with $X_k = \bar{X}_k + \delta X_k$; the error-state model of the system is:

$$\delta X_k = F\,\delta X_{k-1} + G\,a_{k-1},\qquad
F = \begin{bmatrix} 1 & 0 & t & 0 \\ 0 & 1 & 0 & t \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix},\qquad
G = \begin{bmatrix} t^2/2 & 0 \\ 0 & t^2/2 \\ t & 0 \\ 0 & t \end{bmatrix}$$

wherein $t$ is the sampling interval of the combined system, and $a_{k-1} = [a^x_{k-1}\;\; a^y_{k-1}]^T$ represents the accelerations of the combined system in the $x$ and $y$ directions at time $k-1$; the VIO positioning data received by the combined system is taken as the nominal value $\bar{X}_k$ of the state vector, after which the error state is updated;

when the above equations are rearranged, the prediction of the error state and covariance at time $k$ is:

$$\delta \hat{X}_{k|k-1} = F\,\delta \hat{X}_{k-1},\qquad P_{k|k-1} = F P_{k-1} F^T + G Q G^T$$

wherein $P_{k-1}$ is the optimal estimate of the error-state covariance at time $k-1$, and $Q$ is the noise covariance; since the state vector of the combined system is only used to track the position and velocity of the system, the acceleration is taken as the random noise of the system;

the difference between the positioning information obtained by UWB and VIO is used as the observation vector $Y_k$ of the system, and the observation equation of the combined system is:

$$Y_k = H\,\delta X_k + V_k$$

wherein $V_k$ is the observation noise, and $Y_k$ and $H$ take the values:

$$Y_k = \begin{bmatrix} x^{UWB}_k - x^{VIO}_k \\ y^{UWB}_k - y^{VIO}_k \end{bmatrix},\qquad
H = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{bmatrix}$$

the error state and covariance matrix are then updated, including calculating the Kalman gain $K_k$, updating the error state $\delta X_{k|k}$ and updating the covariance matrix $P_{k|k}$:

$$K_k = P_{k|k-1} H^T \left( H P_{k|k-1} H^T + R \right)^{-1},\qquad
\delta \hat{X}_{k|k} = \delta \hat{X}_{k|k-1} + K_k \left( Y_k - H\,\delta \hat{X}_{k|k-1} \right),\qquad
P_{k|k} = \left( I - K_k H \right) P_{k|k-1}$$

wherein $R$ is the covariance of the observation noise $V_k$ and $I$ is an identity matrix.
7. The UWB and VIO based indoor positioning device of claim 6 wherein the UWB positioning module comprises a plurality of UWB base stations, a UWB tag attached to the mobile robot, and a UWB result acquisition unit configured to:
performing base station tag ranging by adopting a DS-TWR algorithm to obtain ranging data;
and smoothing the ranging data by adopting Kalman filtering to obtain optimal coordinate estimation as a UWB positioning result.
8. The indoor UWB and VIO-based positioning device of claim 6, wherein the VIO positioning module comprises a camera, an IMU, and a VIO result acquisition unit configured to:
detecting feature points in the image obtained by the camera with the S-MSCKF front end, tracking the extracted feature points, and rejecting outliers;
and performing extended Kalman filtering at the S-MSCKF back end on the IMU data and the feature-point coordinates passed from the front end, so as to obtain an optimal estimate of the current camera pose as the VIO positioning result.
9. The UWB and VIO based indoor positioning device of claim 8, wherein the outlier rejection includes rejecting binocular mismatched points and mismatched points between consecutive frames.
10. The UWB and VIO based indoor positioning device of claim 6, wherein the fusion module is configured to:
the independent coordinate system used in the UWB positioning process is used as a world coordinate system of VIO positioning, and the position and the speed of the mobile robot are used as system state vectors;
predicting the error state and covariance of the k moment by taking the VIO positioning result as the nominal value of the system state vector;
using the difference value of the positioning information of the UWB positioning result and the VIO positioning result as a system observation vector to update the error state and the covariance;
and obtaining an optimal estimated value of the real state at the moment k according to the nominal value and the updated error state, and taking the optimal estimated value as the optimal position estimation of the robot.
CN202111134020.4A 2021-09-27 2021-09-27 Indoor positioning method and equipment based on UWB and VIO Active CN113758488B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111134020.4A CN113758488B (en) 2021-09-27 2021-09-27 Indoor positioning method and equipment based on UWB and VIO

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111134020.4A CN113758488B (en) 2021-09-27 2021-09-27 Indoor positioning method and equipment based on UWB and VIO

Publications (2)

Publication Number Publication Date
CN113758488A (en) 2021-12-07
CN113758488B (en) 2023-08-29

Family

ID=78797670

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111134020.4A Active CN113758488B (en) 2021-09-27 2021-09-27 Indoor positioning method and equipment based on UWB and VIO

Country Status (1)

Country Link
CN (1) CN113758488B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114485623B (en) * 2022-02-16 2024-02-23 东南大学 Focusing distance camera-IMU-UWB fusion accurate positioning method
CN114701544B (en) * 2022-03-16 2023-09-26 中国矿业大学 Method and system for accurately positioning multi-source information fusion of underground monorail crane of coal mine
CN115711616B (en) * 2022-06-09 2024-08-30 同济大学 Smooth positioning method and device for indoor and outdoor traversing unmanned aerial vehicle
CN116222556B (en) * 2023-01-13 2024-03-26 浙江大学 Indoor positioning method and system based on multi-source sensor fusion
CN118310523B (en) * 2024-04-01 2024-09-10 广东经纬天地科技有限公司 Indoor positioning method, system, equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107389063A (en) * 2017-07-26 2017-11-24 重庆邮电大学 The indoor fusion and positioning method of high accuracy based on GSM/MEMS fusions
CN110315540A (en) * 2019-07-15 2019-10-11 南京航空航天大学 One kind being based on the tightly coupled robot localization method and system of UWB and binocular VO
CN110487267A (en) * 2019-07-10 2019-11-22 湖南交工智能技术有限公司 A kind of UAV Navigation System and method based on VIO&UWB pine combination
CN112378396A (en) * 2020-10-29 2021-02-19 江苏集萃未来城市应用技术研究所有限公司 Hybrid high-precision indoor positioning method based on robust LM visual inertial odometer and UWB
CN113074739A (en) * 2021-04-09 2021-07-06 重庆邮电大学 UWB/INS fusion positioning method based on dynamic robust volume Kalman

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10495763B2 (en) * 2016-02-09 2019-12-03 Qualcomm Incorporated Mobile platform positioning using satellite positioning system and visual-inertial odometry

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107389063A (en) * 2017-07-26 2017-11-24 重庆邮电大学 The indoor fusion and positioning method of high accuracy based on GSM/MEMS fusions
CN110487267A (en) * 2019-07-10 2019-11-22 湖南交工智能技术有限公司 A kind of UAV Navigation System and method based on VIO&UWB pine combination
CN110315540A (en) * 2019-07-15 2019-10-11 南京航空航天大学 One kind being based on the tightly coupled robot localization method and system of UWB and binocular VO
CN112378396A (en) * 2020-10-29 2021-02-19 江苏集萃未来城市应用技术研究所有限公司 Hybrid high-precision indoor positioning method based on robust LM visual inertial odometer and UWB
CN113074739A (en) * 2021-04-09 2021-07-06 重庆邮电大学 UWB/INS fusion positioning method based on dynamic robust volume Kalman

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on indoor positioning algorithms fusing UWB and IMU technologies; 王嘉欣; 李桂林; 曹海东; Microcontrollers & Embedded Systems Applications (No. 08); full text *

Also Published As

Publication number Publication date
CN113758488A (en) 2021-12-07

Similar Documents

Publication Publication Date Title
CN113758488B (en) Indoor positioning method and equipment based on UWB and VIO
Zhao et al. A robust laser-inertial odometry and mapping method for large-scale highway environments
US11953910B2 (en) Autonomous platform guidance systems with task planning and obstacle avoidance
US20240202938A1 (en) Fault-tolerance to provide robust tracking for autonomous and non-autonomous positional awareness
US10832056B1 (en) Visual-inertial positional awareness for autonomous and non-autonomous tracking
US10571926B1 (en) Autonomous platform guidance systems with auxiliary sensors and obstacle avoidance
US10571925B1 (en) Autonomous platform guidance systems with auxiliary sensors and task planning
US10390003B1 (en) Visual-inertial positional awareness for autonomous and non-autonomous device
EP3447448B1 (en) Fault-tolerance to provide robust tracking for autonomous and non-autonomous positional awareness
US20180364731A1 (en) Monocular Modes for Autonomous Platform Guidance Systems with Auxiliary Sensors
CN111288989B (en) Visual positioning method for small unmanned aerial vehicle
EP3428760B1 (en) Mapping optimization in autonomous and non-autonomous platforms
CN110726406A (en) Improved nonlinear optimization monocular inertial navigation SLAM method
CN109855621A (en) A kind of composed chamber's one skilled in the art's navigation system and method based on UWB and SINS
Nguyen et al. Flexible and resource-efficient multi-robot collaborative visual-inertial-range localization
CN106767791A (en) A kind of inertia/visual combination air navigation aid using the CKF based on particle group optimizing
CN112529962A (en) Indoor space key positioning technical method based on visual algorithm
US11774983B1 (en) Autonomous platform guidance systems with unknown environment mapping
US20230314548A1 (en) Unmanned aerial vehicle and localization method for unmanned aerial vehicle
Zachariah et al. Self-motion and wind velocity estimation for small-scale UAVs
CN113066129A (en) Visual positioning and mapping system based on target detection in dynamic environment
Gao et al. Localization of mobile robot based on multi-sensor fusion
CN113763548A (en) Poor texture tunnel modeling method and system based on vision-laser radar coupling
Stuckey et al. An optical spatial localization system for tracking unmanned aerial vehicles using a single dynamic vision sensor
Kong et al. An accurate and reliable positioning methodology for land vehicles in tunnels based on UWB/INS integration

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant