CN112702699A - Indoor positioning method fusing UWB and LiDAR - Google Patents

Indoor positioning method fusing UWB and LiDAR Download PDF

Info

Publication number
CN112702699A
CN112702699A (application CN202011520518.XA; granted publication CN112702699B)
Authority
CN
China
Prior art keywords
positioning
uwb
lidar
information
base station
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011520518.XA
Other languages
Chinese (zh)
Other versions
CN112702699B (en)
Inventor
申富饶
高可攀
刘小亮
李俊
赵健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Heguang Intelligent Manufacturing Research Institute Co ltd
Nanjing University
Original Assignee
Nanjing Heguang Intelligent Manufacturing Research Institute Co ltd
Nanjing University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Heguang Intelligent Manufacturing Research Institute Co ltd, Nanjing University filed Critical Nanjing Heguang Intelligent Manufacturing Research Institute Co ltd
Priority to CN202011520518.XA priority Critical patent/CN112702699B/en
Publication of CN112702699A publication Critical patent/CN112702699A/en
Application granted granted Critical
Publication of CN112702699B publication Critical patent/CN112702699B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • H04W4/029Location-based management or tracking services
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/02Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
    • G01S5/0252Radio frequency fingerprinting
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/02Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
    • G01S5/14Determining absolute distances from a plurality of spaced points of known location
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30Services specially adapted for particular environments, situations or purposes
    • H04W4/33Services specially adapted for particular environments, situations or purposes for indoor environments, e.g. buildings
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W64/00Locating users or terminals or network equipment for network management purposes, e.g. mobility management

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)
  • Navigation (AREA)

Abstract

The invention discloses an indoor positioning method fusing UWB and LiDAR, comprising the following steps: deploying UWB and LiDAR and taking relative measurements; acquiring the input of the UWB sensor and resolving its approximate positioning information; acquiring the input of the LiDAR and obtaining its approximate positioning information; calculating the coordinate conversion and converting the positioning information of the different sensors into the same coordinate system; fitting the motion states of the two devices to obtain the orientation of the UWB positioning and supplementing the UWB positioning information with it; judging positioning abnormality; relocating when the positioning is abnormal; and outputting a real-time positioning result. Compared with the prior art, the method can improve positioning accuracy, solve the robot kidnapping problem, and guarantee positioning efficiency.

Description

Indoor positioning method fusing UWB and LiDAR
Technical Field
The invention relates to the technical field of indoor positioning, in particular to an indoor positioning method fusing UWB (Ultra Wideband) and LiDAR (Light Detection and Ranging).
Background
In recent years, with the development of the information age, Location Based Services (LBS) have become an indispensable part of people's daily lives, and in real life, people need to perform positioning in places such as parking lots, airports, and shopping malls. The positioning technology is also an important part of the robot technology, and various service robots can perform accurate navigation, movement and fine operation only by acquiring pose information with higher precision.
The existing positioning schemes can be divided into outdoor positioning and indoor positioning. Outdoor positioning services are very popular: the positioning accuracy of a Global Navigation Satellite System (GNSS) can reach the meter level, and people can conveniently obtain positioning services through intelligent terminals such as mobile phones. However, GNSS signals are strongly attenuated indoors, which seriously reduces positioning accuracy, so for indoor positioning other technical means must be found to obtain high-accuracy positioning. Currently, indoor positioning technologies commonly used at home and abroad include infrared, Bluetooth, camera, Wireless Fidelity (WiFi), Radio Frequency Identification (RFID), Ultra Wide Band (UWB), and Light Detection and Ranging (LiDAR). Compared with other technologies, UWB has the advantages of high positioning accuracy, low power consumption, and a signal with a certain penetrability, but it is easily influenced by factors such as non-line-of-sight conditions that produce positioning errors; LiDAR has the advantages of high resolution, a large amount of information, and strong anti-interference capability, but it depends heavily on environmental features and accumulates large errors in environments with no features or only a single repeated feature.
In addition, the robot kidnapping problem is also a great challenge for indoor positioning: during operation, the robot is disturbed from outside or its positioning fails, losing the previous positioning information, so fast and accurate relocation is needed to solve the robot kidnapping problem.
For the indoor positioning problem there are many positioning methods based on a single sensor, but all of them are limited by the line-of-sight range and other shortcomings of that specific sensor, so the positioning accuracy is poor or the robot kidnapping problem cannot be solved.
Disclosure of Invention
The invention provides an indoor positioning method fusing UWB and LiDAR, aiming at solving the problem that existing indoor positioning methods, limited by the line-of-sight range and other shortcomings of a specific sensor, either position poorly or cannot solve the robot kidnapping problem.
In order to solve the technical problem, the invention discloses an indoor positioning method fusing UWB and LiDAR, which comprises the following steps:
step 1, deploying a UWB base station indoors, deploying LiDAR on a main body to be positioned, and acquiring coordinate information of the UWB base station in a UWB map coordinate system and a LiDAR map coordinate system respectively;
step 2, obtaining TDOA information output by the UWB base station, and obtaining first positioning information of the main body to be positioned by adopting a resolving method;
step 3, acquiring input information of the LiDAR, and acquiring second positioning information of the main body to be positioned by adopting AMCL;
step 4, calculating coordinate conversion according to the coordinate information, and converting the first positioning information calculated by the UWB base station into a LiDAR map coordinate system;
step 5, fitting a motion track generated by the first positioning information and a motion track generated by the second positioning information within a period of time in the past to obtain orientation information of the positioning of the UWB base station, and completing the first positioning information according to the orientation information;
step 6, judging whether the positioning of the main body to be positioned is abnormal or not, and adjusting the positioning weight according to the judgment result;
step 7, if the positioning of the main body to be positioned is abnormal, initializing the AMCL according to the first positioning information, and repositioning the main body to be positioned;
and 8, calculating and outputting a real-time positioning result of the main body to be positioned according to the first positioning information, the second positioning information and the positioning weight.
Further, in one implementation, the step 1 includes:
step 1-1, deploying at least 3 UWB base stations located on the same plane indoors, selecting any position in the plane as an origin, selecting any direction in the plane as a coordinate axis, establishing a two-dimensional UWB map coordinate system, and measuring coordinate information of each UWB base station in the UWB map coordinate system;
step 1-2, a UWB tag and a LiDAR are deployed on the main body to be positioned, so that when the main body to be positioned moves, the relative position between the main body to be positioned and the UWB tag and the relative position between the main body to be positioned and the LiDAR are kept unchanged;
step 1-3, drawing a map of the place where the main body to be positioned is located, selecting any position in the map as an origin, selecting any direction in the map as a coordinate axis, establishing a two-dimensional LiDAR map coordinate system, and marking the position of each UWB base station in the map, namely acquiring the coordinate information of the UWB base stations in the LiDAR map coordinate system.
Further, in one implementation, the step 2 includes:
step 2-1, obtaining the TDOA information output by the UWB base stations, establishing a TDOA fingerprint map of the indoor area based on that information, matching the received TDOA information against the TDOA fingerprint map with the k-Nearest Neighbor algorithm (knn), and taking the k-nearest-neighbor matching result (x_v, y_v) as the approximate positioning result;
step 2-2, refining the approximate positioning result by Taylor iteration; a nonlinear equation of the position to be measured is constructed from the TDOA data:

f_i(x, y) = √((x − x_i)² + (y − y_i)²) − √((x − x_1)² + (y − y_1)²) = d_i,

where d_i is the range difference measured from the TDOA between base station i and base station 1; taking the k-nearest-neighbor matching result (x_v, y_v) as the expansion point, the equation is expanded to first order by the Taylor iteration method:

f_i(x, y) ≈ f_i(x_v, y_v) + a_{1,i}·δ_x + a_{2,i}·δ_y,

wherein i denotes the number of the UWB base station, (x_i, y_i) the coordinates of the i-th UWB base station, r_i = √((x_v − x_i)² + (y_v − y_i)²) the Euclidean distance from the k-nearest-neighbor matching result to the i-th UWB base station, a_{1,i} = (x_v − x_i)/r_i − (x_v − x_1)/r_1 the first-order coefficient in the x direction, a_{2,i} = (y_v − y_i)/r_i − (y_v − y_1)/r_1 the first-order coefficient in the y direction, and (δ_x, δ_y) the correction of the k-nearest-neighbor matching result;
solving these equations for the correction (δ_x, δ_y) of the k-nearest-neighbor matching result (x_v, y_v) and iterating this process continuously yields a more accurate positioning result, i.e. the first positioning information (x, y) = (x_v, y_v) + (δ_x, δ_y).
TDOA signals are adopted because they reduce the requirement of time synchronization between the different sensors; at least 3 base stations need to be deployed in an actual scene.
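To make the Taylor-iteration step concrete, the following sketch implements the refinement in Python under the assumption of a reference-anchor TDOA model (range differences measured against base station 0 in the array); the function name and array layout are illustrative, not part of the patent.

```python
import numpy as np

def taylor_refine(anchors, tdoa_meas, p0, iters=10, tol=1e-6):
    """Refine a coarse (e.g. kNN-fingerprint) 2-D position with Taylor-series
    (Gauss-Newton) iteration on TDOA range differences, as in step 2-2.

    anchors   : (N, 2) UWB base-station coordinates; station 0 is the reference
    tdoa_meas : (N-1,) measured range differences d_i = r_i - r_0, i = 1..N-1
    p0        : (2,)  initial guess (x_v, y_v) from the fingerprint matching
    """
    anchors = np.asarray(anchors, dtype=float)
    p = np.asarray(p0, dtype=float)
    for _ in range(iters):
        r = np.linalg.norm(anchors - p, axis=1)   # distances r_i to each station
        f = r[1:] - r[0]                          # predicted range differences f_i
        # first-order Taylor coefficients a_{1,i}, a_{2,i} (the Jacobian of f)
        J = (p - anchors[1:]) / r[1:, None] - (p - anchors[0]) / r[0]
        delta, *_ = np.linalg.lstsq(J, tdoa_meas - f, rcond=None)  # (dx, dy)
        p = p + delta                             # (x, y) = (x_v, y_v) + (dx, dy)
        if np.linalg.norm(delta) < tol:
            break
    return p
```

With at least 3 base stations the system has at least 2 equations for the 2 unknowns, so the least-squares step is well posed when the geometry is non-degenerate.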
Further, in one implementation, the step 3 includes:
step 3-1, acquiring the input information of the LiDAR; if particle initialization has not been performed, performing particle initialization by scattering particles at random positions in space, the total number of particles being M; if particle initialization has been performed, updating all particle poses: from the posterior pose x_{t−1}^{[i]} of the i-th particle at the previous time t−1 and the motion model, the prior pose x̄_t^{[i]} of the i-th particle at the current time t is obtained by sampling; the motion model follows the odometry motion model of Probabilistic Robotics, section 5.4.
Step 3-2, updating the weight w_t^{[i]} of each particle according to the likelihood-field model, and calculating the mean w_avg of the particle weights w_t^{[i]}; the likelihood-field model follows the range-finder likelihood field of Probabilistic Robotics, section 6.4.
Step 3-3, according to the mean w_avg of the particle weights w_t^{[i]}, updating the short-term smoothed estimate w_fast and the long-term smoothed estimate w_slow:

w_fast = w_fast + α_fast·(w_avg − w_fast),
w_slow = w_slow + α_slow·(w_avg − w_slow);

wherein α_fast is the decay rate of the short-term smoothed estimate w_fast and α_slow is the decay rate of the long-term smoothed estimate w_slow;
in particular, α_fast takes the value 0.1 and α_slow the value 0.001.
Step 3-4, for each particle, with probability max{0, 1 − w_fast/w_slow} replacing its pose with a randomly sampled pose; then calculating the weighted sum Σ_i w_t^{[i]}·x̄_t^{[i]} of all the particle poses and taking it as the second positioning information.
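The weight-smoothing bookkeeping of steps 3-2 to 3-4 can be sketched as follows; only the w_fast/w_slow logic and the resulting random-injection probability are shown, the motion and measurement models being out of scope, and the class name is an assumption.

```python
class AugmentedMCL:
    """Minimal sketch of the augmented-MCL bookkeeping of steps 3-2 to 3-4:
    smooth the mean particle weight with fast/slow exponential filters and
    derive the probability max{0, 1 - w_fast/w_slow} of replacing a particle
    with a randomly sampled pose.  Decay rates 0.1 / 0.001 follow the
    embodiment; everything else here is a placeholder."""

    def __init__(self, alpha_fast=0.1, alpha_slow=0.001):
        self.alpha_fast, self.alpha_slow = alpha_fast, alpha_slow
        self.w_fast = self.w_slow = 1e-9  # tiny seed avoids division by zero

    def update(self, weights):
        """Feed one measurement update's particle weights; return the
        random-injection probability for this cycle."""
        w_avg = sum(weights) / len(weights)
        self.w_fast += self.alpha_fast * (w_avg - self.w_fast)
        self.w_slow += self.alpha_slow * (w_avg - self.w_slow)
        return max(0.0, 1.0 - self.w_fast / self.w_slow)
```

Because w_fast tracks sudden drops in measurement likelihood faster than w_slow, the injection probability rises exactly when the filter has likely diverged (e.g. after a kidnapping), which is what makes the relocation of step 7 possible.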
Further, in one implementation, step 4 includes calculating the rotation and translation of a coordinate transformation that converts the first positioning information into the LiDAR map coordinate system in which the second positioning information lies. Specifically, since manually calibrating the coordinates of the UWB base stations may introduce errors, the calculation is performed multiple times to reduce the influence of those errors on the transformation between the UWB map coordinate system and the LiDAR map coordinate system:
step 4-1, each time selecting two different UWB base stations as a base-station pair consisting of a first UWB base station and a second UWB base station; the coordinates of the first and second UWB base stations in the UWB map coordinate system are p_u1 and p_u2, and their coordinates in the LiDAR map coordinate system are p_l1 and p_l2; the rotation (yaw, pitch, roll) is calculated from the coordinates of the base-station pair;
for indoor 2D positioning only the yaw angle is needed, the pitch angle and the roll angle both being 0, so the calculation simplifies to

yaw = atan2(p_{2,y}, p_{2,x}) − atan2(p_{1,y}, p_{1,x}),

wherein the first coordinate difference p_1 = p_u1 − p_u2 and the second coordinate difference p_2 = p_l1 − p_l2;
Step 4-2, selecting different base-station pairs and performing the operation of step 4-1 multiple times, and taking the mean of the resulting rotations as the rotation of the coordinate transformation; in this embodiment, the number of repetitions is chosen according to the number of UWB base stations.
step 4-3, selecting a calibration base station whose coordinate in the UWB map coordinate system is p_u and whose coordinate in the LiDAR map coordinate system is p_l; rotating p_u by the rotation (yaw, pitch, roll) to obtain p_u′, and taking p_l − p_u′ as the translation (x, y, z);
step 4-4, selecting different calibration base stations to perform the operation of step 4-3 multiple times, and taking the mean of the resulting translations as the translation of the coordinate transformation.
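Steps 4-1 to 4-4 amount to averaging a pair-wise yaw estimate and a per-station translation estimate. A minimal sketch, assuming 2-D points, all base-station pairs, and a circular mean for averaging the angles (implementation details the patent leaves open):

```python
import math
import numpy as np

def uwb_to_lidar_transform(uwb_pts, lidar_pts):
    """Estimate the 2-D yaw and translation taking UWB-map coordinates into
    the LiDAR map, in the spirit of steps 4-1 to 4-4: average the yaw over
    base-station pairs, then average the translation over the stations.

    uwb_pts, lidar_pts : (N, 2) coordinates of the same stations in both maps.
    """
    uwb_pts = np.asarray(uwb_pts, float)
    lidar_pts = np.asarray(lidar_pts, float)
    n = len(uwb_pts)
    yaws = []
    for i in range(n):
        for j in range(i + 1, n):
            p1 = uwb_pts[i] - uwb_pts[j]      # baseline in the UWB map
            p2 = lidar_pts[i] - lidar_pts[j]  # same baseline in the LiDAR map
            yaws.append(math.atan2(p2[1], p2[0]) - math.atan2(p1[1], p1[0]))
    # circular mean keeps angles near +/-pi from cancelling out
    yaw = math.atan2(np.mean(np.sin(yaws)), np.mean(np.cos(yaws)))
    c, s = math.cos(yaw), math.sin(yaw)
    R = np.array([[c, -s], [s, c]])
    t = np.mean(lidar_pts - uwb_pts @ R.T, axis=0)  # average of p_l - p_u'
    return yaw, t
```

Averaging over several pairs and stations damps the effect of a single badly calibrated station, which is the stated motivation for repeating the calculation.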
Further, in one implementation, since the low-cost UWB positioning scheme cannot obtain orientation information by solving alone, the orientation of the UWB positioning is obtained by fitting the motion states output by the UWB and the LiDAR, and step 5 includes:
step 5-1, recording the positioning coordinates of the main body to be positioned acquired by the UWB and the LiDAR at the current time and the past n historical times; let the positioning coordinate output by the UWB at the current time be p_u0, the positioning coordinate output by the LiDAR at the current time be p_l0, the positioning coordinate output by the UWB at the i-th time before the current time be p_ui, and the positioning coordinate output by the LiDAR at the i-th time before the current time be p_li; a coordinate with fewer than 3 dimensions is padded with 0 up to 3 dimensions;
step 5-2, obtaining the rotation matrix

R = argmin_R Σ_{i=0}^{n} ‖R·p_li − p_ui‖²,

i.e. the rotation under whose action the positioning coordinates output by the LiDAR best fit the positioning coordinates output by the UWB;
and 5-3, converting the pose output by the LiDAR at the current moment according to the rotation matrix R, taking the orientation part of the conversion result as the orientation information of the UWB positioning at the current moment, and supplementing the first positioning information in the step 2 according to the orientation information.
Further, in one implementation, the step 6 includes:
step 6-1, judging whether the positioning of the main body to be positioned is abnormal or not according to the pose difference of the UWB positioning and the LiDAR positioning;
step 6-2, if the judgment result is abnormal positioning, decreasing the LiDAR positioning weight w_l by 0.1 (w_l ∈ [0, 1]) and continuing to step 7 for relocation, wherein if w_l is already 0 it is not decreased and step 7 is still executed for relocation; if the positioning is normal, increasing the LiDAR positioning weight w_l by 0.1 and executing step 8, wherein if w_l is already 1 it is not increased and step 8 is executed.
Further, in one implementation, the step 6-1 includes:
step 6-1-1, if the particles in the AMCL have not converged, calculating the Euclidean distance between the result of converting the first positioning information into the LiDAR coordinate system and the second positioning information as the pose difference;
if the particles in the AMCL have converged, calculating the Euler angles (θ_x, θ_y, θ_z) corresponding to the rotation matrix R obtained in step 5 and taking the norm of the vector they form as the pose difference;
the conversion between the rotation matrix R and the Euler angles is as follows: letting r_jk denote the entry of R in row j and column k,

θ_x = atan2(r_32, r_33),
θ_y = atan2(−r_31, √(r_32² + r_33²)),
θ_z = atan2(r_21, r_11);
step 6-1-2, if the pose difference is larger than a manually set threshold diff_threshold, the positioning is deemed abnormal and relocation is required.
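The Euler-angle conversion of step 6-1-1 can be sketched directly from the atan2 formulas, assuming the standard Z-Y-X convention (which is what those formulas correspond to); the helper for the pose-difference norm is illustrative.

```python
import math

def rotation_to_euler(R):
    """Euler angles (theta_x, theta_y, theta_z) of a 3x3 rotation matrix under
    the Z-Y-X convention, matching the atan2 formulas of step 6-1-1.
    R is indexable as R[row][col] with 0-based indices."""
    theta_x = math.atan2(R[2][1], R[2][2])                      # atan2(r32, r33)
    theta_y = math.atan2(-R[2][0], math.hypot(R[2][1], R[2][2]))  # atan2(-r31, .)
    theta_z = math.atan2(R[1][0], R[0][0])                      # atan2(r21, r11)
    return theta_x, theta_y, theta_z

def pose_difference(euler):
    """Norm of the Euler-angle vector, to be compared with diff_threshold
    in step 6-1-2 once the particles have converged."""
    return math.sqrt(sum(a * a for a in euler))
```

When R is close to the identity (UWB and LiDAR trajectories agree), all three angles are near 0 and the pose difference stays below the threshold.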
Further, in one implementation, step 7 includes: initializing the particles in the AMCL with the result of converting the first positioning information into the LiDAR coordinate system, and collecting the input of the LiDAR as the body to be positioned moves so that the particles in the AMCL keep converging. Specifically, during relocation all particles in the AMCL are initialized from a Gaussian distribution whose mean is the pose currently output by the UWB, with a variance of 0.5 on each coordinate and a variance of π/12 on the orientation; as the positioning subject moves and the LiDAR receives new input, the particles gradually converge and step 3 can output a positioning result with higher precision.
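The Gaussian re-initialization just described can be sketched as follows; the (x, y, theta) pose layout and the function name are assumptions, while the mean and the variances 0.5 and π/12 follow the embodiment.

```python
import math
import random

def init_particles_for_relocation(uwb_pose, m):
    """Relocation initialization of step 7: draw all m particles from a
    Gaussian centered on the pose currently output by UWB (converted to the
    LiDAR map), variance 0.5 per coordinate and pi/12 on the orientation.

    uwb_pose : (x, y, theta)
    Returns a list of (pose, weight) pairs with uniform initial weight 1/m.
    """
    sx = sy = math.sqrt(0.5)          # std dev from variance 0.5
    st = math.sqrt(math.pi / 12)      # std dev from variance pi/12
    x, y, theta = uwb_pose
    return [((random.gauss(x, sx), random.gauss(y, sy), random.gauss(theta, st)),
             1.0 / m)
            for _ in range(m)]
```

Seeding the whole particle cloud near the UWB estimate is what lets the AMCL recover quickly after a kidnapping instead of re-exploring the entire map.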
Further, in one implementation, step 8 includes: as the positioning subject moves, calculating and outputting the real-time positioning result from the positioning information and positioning weight of the UWB and the LiDAR, the real-time positioning result being expressed as the pose:

pose = (1 − w_l)·pose_uwb + w_l·pose_lidar,

wherein pose_uwb is the completed first positioning information obtained in step 5 and pose_lidar is the second positioning information obtained in step 3. In this embodiment, the above steps 2 to 8 are performed continuously at a frequency the hardware supports, outputting a real-time positioning result, the hardware including the UWB, the LiDAR, the computing device, and the like.
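The fusion of step 8 and the weight adjustment of step 6-2 reduce to a few lines; this is a minimal sketch with assumed function names, treating a pose as a numeric tuple.

```python
def adjust_weight(w_l, abnormal):
    """Step 6-2's update of the LiDAR positioning weight: decrease by 0.1 on
    abnormal positioning, increase by 0.1 otherwise, clamped to [0, 1]."""
    step = -0.1 if abnormal else 0.1
    return min(1.0, max(0.0, w_l + step))

def fuse_pose(pose_uwb, pose_lidar, w_l):
    """Step 8's weighted fusion: pose = (1 - w_l)*pose_uwb + w_l*pose_lidar,
    applied component-wise to numeric pose tuples."""
    w_l = min(1.0, max(0.0, w_l))
    return tuple((1.0 - w_l) * u + w_l * l
                 for u, l in zip(pose_uwb, pose_lidar))
```

With w_l driven toward 0 during LiDAR anomalies, the output degrades gracefully to pure UWB positioning and shifts back to LiDAR once the particles reconverge.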
According to the technical scheme, the embodiment of the invention provides an indoor positioning method fusing UWB and LiDAR, which comprises the following steps: the method comprises the steps that an indoor UWB base station is deployed, LiDAR is deployed on a main body to be positioned, and coordinate information of the UWB base station in a UWB map coordinate system and a LiDAR map coordinate system is obtained; acquiring TDOA information output by a UWB base station, and acquiring first positioning information of a main body to be positioned by adopting a resolving method; acquiring input information of the LiDAR, and acquiring second positioning information of a main body to be positioned by adopting AMCL; calculating coordinate conversion according to the coordinate information, and converting the first positioning information calculated by the UWB base station into a LiDAR map coordinate system; fitting a motion track generated by the first positioning information and a motion track generated by the second positioning information within a period of time in the past to obtain orientation information of UWB base station positioning, and completing the first positioning information according to the orientation information; judging whether the positioning of the main body to be positioned is abnormal or not, and adjusting the positioning weight according to the judgment result; if the positioning of the main body to be positioned is abnormal, initializing the AMCL according to the first positioning information, and repositioning the main body to be positioned; and finally, calculating and outputting a real-time positioning result of the main body to be positioned according to the first positioning information, the second positioning information and the positioning weight.
In the prior art, the line-of-sight range and other shortcomings of a specific sensor lead to poor positioning accuracy or an inability to solve the robot kidnapping problem. By adopting the method, the two low-cost sensors UWB and LiDAR are fused for positioning, real-time indoor positioning is performed on inexpensive embedded equipment of modest performance, and the complementary advantages of UWB and LiDAR positioning are exploited to overcome their respective shortcomings; compared with the prior art, the positioning accuracy can be improved, the robot kidnapping problem can be solved, and the positioning efficiency is guaranteed.
Drawings
In order to more clearly illustrate the technical solution of the present invention, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious to those skilled in the art that other drawings can be obtained based on these drawings without creative efforts.
FIG. 1 is a schematic workflow diagram of an indoor positioning method fusing UWB and LiDAR provided by an embodiment of the present invention;
FIG. 2 is a schematic workflow diagram of processing input LiDAR information with the adaptive Monte Carlo localization method in an indoor positioning method fusing UWB and LiDAR provided by an embodiment of the present invention;
FIG. 3a is a schematic diagram of an experimental scene in an indoor positioning method fusing UWB and LiDAR provided by an embodiment of the present invention;
FIG. 3b is a schematic plan view of the experimental scene in an indoor positioning method fusing UWB and LiDAR provided by an embodiment of the present invention;
FIG. 4a is a schematic diagram of the preset path in a general motion scene of an indoor positioning method fusing UWB and LiDAR provided by an embodiment of the present invention;
FIG. 4b is a schematic diagram of the first positioning information and the real-time positioning result in a general motion scene of an indoor positioning method fusing UWB and LiDAR provided by an embodiment of the present invention;
FIG. 5a is a schematic diagram of the preset robot motion and manual-carrying path in a robot kidnapping simulation scene of an indoor positioning method fusing UWB and LiDAR provided by an embodiment of the present invention;
FIG. 5b is a schematic diagram of the first positioning information and the real-time positioning result before the kidnapping in a robot kidnapping simulation scene of an indoor positioning method fusing UWB and LiDAR provided by an embodiment of the present invention;
FIG. 5c is a schematic diagram of the first positioning information and the real-time positioning result at the end of the kidnapping in a robot kidnapping simulation scene of an indoor positioning method fusing UWB and LiDAR provided by an embodiment of the present invention;
FIG. 5d is a schematic diagram of the first positioning information and the real-time positioning result after the kidnapping in a robot kidnapping simulation scene of an indoor positioning method fusing UWB and LiDAR provided by an embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
The embodiment of the invention discloses an indoor positioning method fusing UWB and LiDAR, applied to stable indoor positioning in complex scenes. Obstacles between a UWB base station and a tag often cause NLOS errors in UWB signals, for example inaccurate signal timing due to refraction or reflection; during long-term continuous positioning, temporary sensor faults and rapid movement of the positioning subject can cause the robot kidnapping problem, which defeats common positioning methods.
As shown in fig. 1, the indoor positioning method fusing UWB and LiDAR according to this embodiment includes the following steps:
step 1, deploying a UWB base station indoors, deploying LiDAR on a main body to be positioned, and acquiring coordinate information of the UWB base station in a UWB map coordinate system and a LiDAR map coordinate system respectively; in this embodiment, it must be ensured that the relative position between the UWB tag and the subject to be positioned and the relative position between the LiDAR and the subject to be positioned do not change greatly with the movement of the positioning subject.
Step 2, obtaining TDOA information output by the UWB base station, and obtaining first positioning information of the main body to be positioned by adopting a resolving method;
In this embodiment, the UWB base station provides multiple ranging signals, including Time Difference of Arrival (TDOA) signals and Time of Arrival (TOA) signals; the present invention adopts the TDOA signal, which reduces the requirement of time synchronization between different sensors, and specifically at least 3 base stations need to be deployed in a practical scenario.
Step 3, acquiring input information of the LiDAR, and acquiring second positioning information of the main body to be positioned by adopting AMCL;
step 4, calculating coordinate conversion according to the coordinate information, and converting the first positioning information calculated by the UWB base station into a LiDAR map coordinate system;
step 5, fitting a motion track generated by the first positioning information and a motion track generated by the second positioning information within a period of time in the past to obtain orientation information of the positioning of the UWB base station, and completing the first positioning information according to the orientation information;
step 6, judging whether the positioning of the main body to be positioned is abnormal or not, and adjusting the positioning weight according to the judgment result;
step 7, if the positioning of the main body to be positioned is abnormal, initializing the AMCL according to the first positioning information, and repositioning the main body to be positioned;
and 8, calculating and outputting a real-time positioning result of the main body to be positioned according to the first positioning information, the second positioning information and the positioning weight.
In the indoor positioning method fusing UWB and LiDAR according to this embodiment, the step 1 includes:
step 1-1, deploying at least 3 UWB base stations located on the same plane indoors, selecting any position in the plane as an origin, selecting any direction in the plane as a coordinate axis, establishing a two-dimensional UWB map coordinate system, and measuring coordinate information of each UWB base station in the UWB map coordinate system;
step 1-2, a UWB tag and a LiDAR are deployed on the main body to be positioned, so that when the main body to be positioned moves, the relative position between the main body to be positioned and the UWB tag and the relative position between the main body to be positioned and the LiDAR are kept unchanged;
step 1-3, drawing a map of the place where the main body to be positioned is located, selecting any position in the map as an origin, selecting any direction in the map as a coordinate axis, establishing a two-dimensional LiDAR map coordinate system, and marking the position of each UWB base station in the map, namely acquiring the coordinate information of the UWB base stations in the LiDAR map coordinate system.
In the indoor positioning method fusing UWB and LiDAR according to this embodiment, the step 2 includes:
step 2-1, obtaining TDOA information output by the UWB base station, establishing a TDOA fingerprint map in an indoor area based on the TDOA information, matching the received TDOA information on the TDOA fingerprint map by using the k-Nearest Neighbor (kNN) algorithm, and taking the k nearest neighbor matching result (x_v, y_v) as the approximate positioning result;
step 2-2, Taylor iteration is performed to make the approximate positioning result more accurate. A system of nonlinear equations f_i(x, y) = 0 in the position (x, y) to be measured is constructed from the TDOA data.
Taking the k nearest neighbor matching result (x_v, y_v) as the expansion point, each equation is expanded to first order by the Taylor iteration method:
f_i(x, y) ≈ f_i(x_v, y_v) + a_{1,i} δ_x + a_{2,i} δ_y,
wherein i denotes the number of the UWB base station, (x_i, y_i) denotes the coordinates of the ith UWB base station, r_i = √((x_v − x_i)² + (y_v − y_i)²) denotes the Euclidean distance from the k nearest neighbor matching result to the ith UWB base station, a_{1,i} = (x_v − x_i)/r_i represents the first-order approximation in the x direction, a_{2,i} = (y_v − y_i)/r_i represents the first-order approximation in the y direction, and (δ_x, δ_y) represents the correction amount of the k nearest neighbor matching result;
solving the resulting linear equations yields the correction amount (δ_x, δ_y) of the k nearest neighbor matching result (x_v, y_v); continuously iterating this process gives a more accurate positioning result, i.e. the first positioning information (x, y) = (x_v, y_v) + (δ_x, δ_y).
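The Taylor-iteration refinement above can be sketched as follows. This is an illustrative simplification, not the patent's exact solver: it linearises plain range residuals f_i = r_i(x, y) − d_i rather than TDOA differences, the function name `taylor_refine` is ours, and the linear system is solved through 2×2 normal equations.

```python
import math

def taylor_refine(anchors, measured, x0, y0, iters=10):
    """Gauss-Newton (Taylor-series) refinement of an initial kNN fix.

    Simplification: linearises range residuals f_i = r_i(x, y) - d_i;
    the patent linearises TDOA differences, but the expansion-point
    machinery (a_{1,i}, a_{2,i}, delta corrections) is the same.
    """
    x, y = x0, y0
    for _ in range(iters):
        # Accumulate the 2x2 normal equations A^T A d = -A^T f.
        a11 = a12 = a22 = b1 = b2 = 0.0
        for (xi, yi), di in zip(anchors, measured):
            ri = math.hypot(x - xi, y - yi)   # distance to ith base station
            a1 = (x - xi) / ri                # first-order term, x direction
            a2 = (y - yi) / ri                # first-order term, y direction
            fi = ri - di                      # residual at the expansion point
            a11 += a1 * a1; a12 += a1 * a2; a22 += a2 * a2
            b1 -= a1 * fi;  b2 -= a2 * fi
        det = a11 * a22 - a12 * a12
        if abs(det) < 1e-12:
            break
        dx = (b1 * a22 - b2 * a12) / det      # correction delta_x
        dy = (a11 * b2 - a12 * b1) / det      # correction delta_y
        x, y = x + dx, y + dy                 # (x, y) = (x_v, y_v) + (dx, dy)
        if math.hypot(dx, dy) < 1e-9:
            break
    return x, y
```

With noise-free measurements the iteration converges in a handful of steps from any reasonable kNN starting point.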
As shown in fig. 2, in the indoor positioning method fusing UWB and LiDAR according to this embodiment, the step 3 includes:
step 3-1, acquiring input information of the LiDAR; if particle initialization has not been performed, performing particle initialization by scattering a number of particles at random positions in space, the total number of particles being M; if particle initialization has been performed, updating all particle poses: according to the posterior pose x_{t-1}^{[i]} of the ith particle at the last moment, namely at time t−1, and the motion model, obtaining by sampling the prior pose x̄_t^{[i]} of the ith particle at the current time t. Specifically, the motion model refers to the odometry motion model of Section 5.4 of Probabilistic Robotics.
Step 3-2, updating the weight w_t^{[i]} of each particle according to the likelihood field model, and calculating the mean w_avg = (1/M) Σ_{i=1}^{M} w_t^{[i]} of the particle weights. Specifically, the likelihood field model refers to the range-finder likelihood field model of Section 6.4 of Probabilistic Robotics.
Step 3-3, according to the mean w_avg of the particle weights w_t^{[i]}, updating the short-term smoothed estimation parameter w_fast and the long-term smoothed estimation parameter w_slow:
w_fast = w_fast + α_fast(w_avg − w_fast),
w_slow = w_slow + α_slow(w_avg − w_slow);
wherein α_fast is the decay rate of the short-term smoothed estimation parameter w_fast, and α_slow is the decay rate of the long-term smoothed estimation parameter w_slow;
specifically, α_fast takes the value 0.1 and α_slow takes the value 0.001.
Step 3-4, for each particle, with probability max(0, 1 − w_fast/w_slow), replacing its pose with a randomly sampled pose; then calculating the weighted sum Σ_{i=1}^{M} w_t^{[i]} x_t^{[i]} of all particle poses and taking this weighted sum as the second positioning information.
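The weight-smoothing bookkeeping of steps 3-2 to 3-4 can be sketched as follows. This is a minimal illustration; the function name `amcl_update` and the `state` dictionary are our own scaffolding, not part of the patent.

```python
def amcl_update(weights, state, a_fast=0.1, a_slow=0.001):
    """One augmented-MCL bookkeeping step for steps 3-2/3-3/3-4.

    `state` carries the running w_fast / w_slow estimates between scans;
    the return value is the probability with which a particle's pose is
    replaced by a freshly sampled random pose during resampling.
    """
    w_avg = sum(weights) / len(weights)          # mean particle weight
    state["w_fast"] += a_fast * (w_avg - state["w_fast"])
    state["w_slow"] += a_slow * (w_avg - state["w_slow"])
    # Random particles are injected when the short-term likelihood drops
    # below the long-term average, signalling a possible kidnapping.
    return max(0.0, 1.0 - state["w_fast"] / state["w_slow"])
```

Because α_fast ≫ α_slow, a sudden drop in measurement likelihood pulls w_fast down quickly while w_slow lags, so the injection probability rises exactly when the filter has likely diverged.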
Because the coordinates of the UWB base stations are calibrated manually, errors arise. To reduce the influence of these errors on the coordinate conversion between the UWB map coordinate system and the LiDAR map coordinate system, the calculation is performed multiple times and averaged. Accordingly, in the indoor positioning method fusing UWB and LiDAR according to this embodiment, the step 4 includes calculating the rotation amount and the translation amount of the coordinate conversion that transforms the first positioning information into the LiDAR coordinate system in which the second positioning information lies:
4-1, selecting two different UWB base stations each time as a base station pair comprising a first UWB base station and a second UWB base station; the coordinate of the first UWB base station in the UWB map coordinate system is p_u1, and the coordinate of the second UWB base station in the UWB map coordinate system is p_u2; the coordinate of the first UWB base station in the LiDAR map coordinate system is p_l1, and the coordinate of the second UWB base station in the LiDAR map coordinate system is p_l2; calculating a rotation amount (yaw, pitch, roll) from the coordinates of the base station pair;
for indoor 2D positioning, only the yaw angle is required; the pitch angle and the roll angle are both 0, and the calculation simplifies to
yaw = atan2(p_2y, p_2x) − atan2(p_1y, p_1x),
wherein the first coordinate difference p_1 = p_u1 − p_u2 and the second coordinate difference p_2 = p_l1 − p_l2.
Step 4-2, selecting different base station pairs and performing the operation of step 4-1 multiple times, taking the average of the obtained rotation amounts as the rotation amount of the coordinate conversion; specifically, in this embodiment, the number of selections is the number of UWB base stations.
4-3, selecting a calibration base station, whose coordinate in the UWB map coordinate system is p_u and whose coordinate in the LiDAR map coordinate system is p_l; rotating p_u to p_u′ by the rotation amount (yaw, pitch, roll), and taking p_l − p_u′ as the translation amount (x, y, z);
And 4-4, selecting different calibration base stations to execute the operation of the step 4-3 for multiple times, and averaging the obtained different translation amounts to be used as the translation amount of coordinate conversion.
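Steps 4-1 to 4-4 can be sketched in code as follows. This is a sketch under assumptions: we enumerate all base-station pairs for the yaw average and reuse every station as a calibration station for the translation average, and the atan2-difference yaw formula is our reconstruction of the simplified 2D case; the function names are ours.

```python
import math

def _wrap(a):
    # Normalise an angle to (-pi, pi] so that angles can be averaged safely.
    return math.atan2(math.sin(a), math.cos(a))

def uwb_to_lidar_transform(uwb_pts, lidar_pts):
    """Estimate the 2D rotation (yaw) and translation mapping UWB map
    coordinates onto LiDAR map coordinates, averaged over base-station
    pairs (rotation, steps 4-1/4-2) and over calibration stations
    (translation, steps 4-3/4-4)."""
    n = len(uwb_pts)
    yaws = []
    for i in range(n):
        for j in range(i + 1, n):
            # First coordinate difference (UWB) and second (LiDAR).
            p1 = (uwb_pts[i][0] - uwb_pts[j][0], uwb_pts[i][1] - uwb_pts[j][1])
            p2 = (lidar_pts[i][0] - lidar_pts[j][0], lidar_pts[i][1] - lidar_pts[j][1])
            yaws.append(_wrap(math.atan2(p2[1], p2[0]) - math.atan2(p1[1], p1[0])))
    yaw = sum(yaws) / len(yaws)
    c, s = math.cos(yaw), math.sin(yaw)
    # Translation: rotate each UWB station, compare with its LiDAR coordinate.
    tx = sum(lx - (c * ux - s * uy)
             for (ux, uy), (lx, ly) in zip(uwb_pts, lidar_pts)) / n
    ty = sum(ly - (s * ux + c * uy)
             for (ux, uy), (lx, ly) in zip(uwb_pts, lidar_pts)) / n
    return yaw, (tx, ty)
```

Averaging over several pairs and stations dilutes the manual-calibration error of any single base station, which is the stated purpose of repeating the calculation.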
Because a low-cost UWB positioning scheme is adopted, the orientation information cannot be obtained by the solving method alone; instead, the orientation information of the UWB positioning is obtained by fitting the motion states output by UWB and LiDAR. Accordingly, in the indoor positioning method fusing UWB and LiDAR according to this embodiment, the step 5 includes:
step 5-1, recording the positioning coordinates of the main body to be positioned acquired by UWB and LiDAR at the current time and the past n historical times; let the positioning coordinate output by the UWB at the current time be p_u0, the positioning coordinate output by the LiDAR at the current time be p_l0, the positioning coordinate output by the UWB at the ith time before the current time be p_ui, and the positioning coordinate output by the LiDAR at the ith time before the current time be p_li, wherein p_ui, p_li ∈ ℝ³; when a coordinate has fewer than 3 dimensions, it is padded with 0;
step 5-2, obtaining the rotation matrix R such that the positioning coordinates output by the LiDAR, under the action of R, fit the positioning coordinates output by the UWB, i.e. the R minimizing the sum of squared residuals between the rotated LiDAR coordinates and the UWB coordinates over the recorded times;
and 5-3, converting the pose output by the LiDAR at the current moment according to the rotation matrix R, taking the orientation part of the conversion result as the orientation information of the UWB positioning at the current moment, and supplementing the first positioning information in the step 2 according to the orientation information.
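A minimal 2D sketch of the trajectory fit in step 5-2. Assumptions are ours: we fit a single heading angle rather than a full 3×3 rotation matrix (sufficient for planar indoor motion), we compare displacement vectors between consecutive fixes so any constant offset between the two coordinate systems cancels, and the rotation is computed from the UWB track onto the LiDAR track (if the patent's R acts in the opposite direction, the sign of the result flips).

```python
import math

def fit_heading(uwb_traj, lidar_traj):
    """Least-squares 2D rotation angle aligning two tracks.

    Uses displacement vectors between consecutive fixes, which cancels
    any constant translation between the two coordinate systems.
    """
    num = den = 0.0
    for (u0, u1), (l0, l1) in zip(zip(uwb_traj, uwb_traj[1:]),
                                  zip(lidar_traj, lidar_traj[1:])):
        ux, uy = u1[0] - u0[0], u1[1] - u0[1]   # UWB displacement
        lx, ly = l1[0] - l0[0], l1[1] - l0[1]   # LiDAR displacement
        num += ux * ly - uy * lx                # cross terms ~ sin(theta)
        den += ux * lx + uy * ly                # dot terms   ~ cos(theta)
    return math.atan2(num, den)
```

Summing cross and dot products before the single atan2 call weights each segment by its length, so short, noisy displacements contribute little to the fitted heading.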
In the indoor positioning method fusing UWB and LiDAR according to this embodiment, the step 6 includes:
step 6-1, judging whether the positioning of the main body to be positioned is abnormal or not according to the pose difference of the UWB positioning and the LiDAR positioning;
step 6-2, if the judgment result is abnormal positioning, the LiDAR positioning weight w_l is decreased by 0.1, with w_l ∈ [0, 1], and step 7 is executed for relocation; if the LiDAR positioning weight w_l is already 0, it is not decreased, and step 7 is still executed for relocation. If the positioning is normal, the LiDAR positioning weight w_l is increased by 0.1, and step 8 is executed; if the LiDAR positioning weight w_l is already 1, it is not increased, and step 8 is executed.
In the indoor positioning method fusing UWB and LiDAR according to this embodiment, the step 6-1 includes:
step 6-1-1, if the particles in the AMCL have not converged, calculating the Euclidean distance between the result of converting the first positioning information into the LiDAR coordinate system and the second positioning information as the pose difference;
if the particles in the AMCL have converged, calculating the Euler angles (θ_x, θ_y, θ_z) corresponding to the rotation matrix R obtained in step 5, and taking the norm of the vector formed by these angles as the pose difference;
the conversion relation between the rotation matrix R and the Euler angles is as follows: let
R = [[r_11, r_12, r_13], [r_21, r_22, r_23], [r_31, r_32, r_33]],
then θ_x = atan2(r_32, r_33), θ_y = atan2(−r_31, √(r_32² + r_33²)), θ_z = atan2(r_21, r_11);
And 6-1-2, if the pose difference is larger than a manually set threshold diff_threshold, the positioning is determined to be abnormal and relocation is required.
Specifically, in this embodiment, when the particles in the AMCL have not converged, diff_threshold is set to 1; when the particles in the AMCL have converged, diff_threshold is set to π/6.
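The Euler-angle extraction of step 6-1-1 can be sketched as follows. Note an assumption: the printed formula θx = atan2(r33, r33) appears to contain a typo, so we use the standard ZYX convention θx = atan2(r32, r33), which is consistent with the θy and θz formulas given; the function names are ours.

```python
import math

def euler_from_matrix(R):
    """Euler angles (theta_x, theta_y, theta_z) of a 3x3 rotation matrix
    in the ZYX convention."""
    r11, r21 = R[0][0], R[1][0]
    r31, r32, r33 = R[2][0], R[2][1], R[2][2]
    tx = math.atan2(r32, r33)                                # roll
    ty = math.atan2(-r31, math.sqrt(r32 * r32 + r33 * r33))  # pitch
    tz = math.atan2(r21, r11)                                # yaw
    return tx, ty, tz

def pose_diff_norm(R):
    """Pose difference of step 6-1-1: norm of the Euler-angle vector."""
    tx, ty, tz = euler_from_matrix(R)
    return math.sqrt(tx * tx + ty * ty + tz * tz)
```

For an identity-like R the norm is near 0, and it grows with the residual misalignment between the UWB and LiDAR tracks, which is what the diff_threshold comparison relies on.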
In the indoor positioning method fusing UWB and LiDAR according to this embodiment, the step 7 includes: initializing the particles in the AMCL with the result of converting the first positioning information into the LiDAR coordinate system, and collecting the input of the LiDAR as the main body to be positioned moves, so that the particles in the AMCL continuously converge. Specifically, in this embodiment, all particles in the AMCL are initialized at relocation with a Gaussian distribution whose mean is the pose currently output by the UWB, with a variance of 0.5 on each coordinate and a variance of π/12 on the orientation. As the main body to be positioned moves and the LiDAR receives new input, the particles gradually converge, and a positioning result of higher accuracy can again be output in step 3.
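The relocation initialisation can be sketched as below. Assumptions are ours: the particle count M, the use of Python's Gaussian sampler, and the reading of "0.5" and "π/12" as variances (hence standard deviations √0.5 and √(π/12)).

```python
import math
import random

def reinit_particles(uwb_pose, m=500, pos_var=0.5, yaw_var=math.pi / 12):
    """Scatter M particles around the current UWB pose for relocation.

    Each coordinate is drawn from a Gaussian centred on the UWB pose
    with the stated variance (standard deviation = sqrt(variance)).
    """
    x, y, yaw = uwb_pose
    sx = sy = math.sqrt(pos_var)
    syaw = math.sqrt(yaw_var)
    return [(random.gauss(x, sx), random.gauss(y, sy), random.gauss(yaw, syaw))
            for _ in range(m)]
```

Seeding the cloud from the absolute UWB fix is what distinguishes this recovery from plain AMCL random injection: the particles start near the true pose, so convergence after a kidnapping takes seconds rather than requiring a tour of the whole map.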
In the indoor positioning method fusing UWB and LiDAR according to this embodiment, the step 8 includes: with the movement of the main body to be positioned, calculating and outputting a real-time positioning result according to the positioning information and the positioning weights of UWB and LiDAR, wherein the real-time positioning result is expressed as a pose:
pose = (1 − w_l) * pose_uwb + w_l * pose_lidar,
wherein pose_uwb is the completed first positioning information obtained in step 5, and pose_lidar is the second positioning information obtained in step 3.
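Steps 6-2 and 8 combine into the following sketch. The function names are ours; the 0.1 step and the [0, 1] clamp follow step 6-2, and the fusion formula follows step 8, applied component-wise to the pose vector.

```python
def adjust_weight(w_l, abnormal, step=0.1):
    """Step 6-2: decrease the LiDAR weight on abnormal positioning,
    increase it otherwise, clamped to [0, 1]."""
    w_l = w_l - step if abnormal else w_l + step
    return min(1.0, max(0.0, w_l))

def fuse(pose_uwb, pose_lidar, w_l):
    """Step 8: pose = (1 - w_l) * pose_uwb + w_l * pose_lidar,
    applied component-wise."""
    return tuple((1.0 - w_l) * u + w_l * l
                 for u, l in zip(pose_uwb, pose_lidar))
```

While the particle filter is recovering, repeated abnormal judgments drive w_l toward 0 and the output leans on the UWB fix; once LiDAR positioning is trusted again, w_l climbs back toward 1.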
In this embodiment, the above steps 2 to 8 are performed continuously at the frequency the hardware supports, outputting real-time positioning results, wherein the hardware includes the UWB base stations, the LiDAR, a computing device, and the like.
Examples
In order to verify the effectiveness of the indoor positioning method fusing UWB and LiDAR described in this embodiment, verification is performed in an actual scene. The experimental scene is a room 6 meters long and 4 meters wide with some obstacles arranged in it, as shown in fig. 3a and fig. 3b: fig. 3a shows the layout and grid labeling of the experimental scene, wherein each grid cell is 1 meter by 1 meter, and fig. 3b is a schematic plan view of the experimental scene with the obstacles marked by black blocks. A mobile robot is deployed in the scene, and indoor positioning is performed according to the following steps:
step 1, in this embodiment, UWB base stations are deployed at the four corners of the scene and a LiDAR is installed on the main body to be positioned, namely the mobile robot; the positions of the UWB base stations are determined and the coordinates of each UWB base station are marked in a pre-drawn two-dimensional map of the scene, thereby obtaining the coordinate information of the UWB base stations in the UWB map coordinate system and the LiDAR map coordinate system respectively; the mobile robot is made to travel along the preset path in fig. 4a by sending motion instructions;
step 2, obtaining TDOA information output by UWB, and obtaining first positioning information of the mobile robot by adopting a resolving method;
step 3, acquiring input information of the LiDAR, and acquiring second positioning information of the mobile robot by using adaptive Monte Carlo localization (AMCL);
step 4, calculating coordinate conversion according to a UWB map coordinate system CS _ UWB and a LiDAR map coordinate system CS _ LiDAR, and converting the positioning information of different sensors to be in the same coordinate system, namely converting the first positioning information calculated by the UWB base station to be in the LiDAR map coordinate system;
step 5, respectively summarizing historical positioning information of the UWB and the LiDAR to obtain respective motion state information of the UWB and the LiDAR, fitting the motion states of the UWB and the LiDAR to obtain orientation information of UWB positioning, namely fitting a motion track generated by first positioning information and a motion track generated by second positioning information in a past period of time, and supplementing the first positioning information according to the orientation information;
step 6, judging whether the mobile robot is abnormally positioned or not according to the pose difference of UWB positioning and LiDAR positioning, and reducing the LiDAR positioning weight w if the mobile robot is abnormally positionedlContinuing to execute the step 7 for repositioning, otherwise increasing the LiDAR positioning weight wlExecuting step 8;
step 7, if the mobile robot is abnormally positioned, initializing the AMCL according to the first positioning information, repositioning the mobile robot, namely initializing particles in the AMCL, and collecting LiDAR input along with the movement of the mobile robot to enable the particles in the AMCL to be continuously converged;
and 8, with the motion of the mobile robot, calculating the current pose according to the first positioning information, the second positioning information and the positioning weight, and outputting a real-time positioning result of the mobile robot.
The fused positioning results during the walking of the mobile robot are connected to draw the walking route of the mobile robot, shown as the smooth line in fig. 4b. The line with large deviation and jitter in fig. 4b is the raw positioning route calculated from UWB signals alone; by manual measurement, the error of the fused positioning result is less than 10 cm.
Meanwhile, the invention is verified against the robot kidnapping problem, with the following specific procedure: during the verification, when the mobile robot moves to the "move away" position shown in fig. 5a, an instruction is sent to stop the motion of the mobile robot, the mobile robot is manually carried to the "place" position shown in fig. 5a, and the positioning process is entered again; after the relocation of step 7 is completed, the robot continues to move along the predetermined route. The manual carrying simulates a positioning failure of the robot, and after the temporary failure the invention regains accurate positioning.
In the verification of the robot kidnapping problem, the positioning result before the carrying is shown in fig. 5b, the positioning result from the carrying to the completion of relocation is shown in fig. 5c, and the positioning result from relocation to the end of the run is shown in fig. 5d; the smooth lines in the figures represent the fused positioning result, and the relocation process is completed within 5 seconds. The results show that the mobile robot can be quickly and accurately relocated after being carried to a new position, effectively solving the robot kidnapping problem.
According to the technical scheme, the embodiment of the invention provides an indoor positioning method fusing UWB and LiDAR, which comprises the following steps: the method comprises the steps that an indoor UWB base station is deployed, LiDAR is deployed on a main body to be positioned, and coordinate information of the UWB base station in a UWB map coordinate system and a LiDAR map coordinate system is obtained; acquiring TDOA information output by a UWB base station, and acquiring first positioning information of a main body to be positioned by adopting a resolving method; acquiring input information of the LiDAR, and acquiring second positioning information of a main body to be positioned by adopting AMCL; calculating coordinate conversion according to the coordinate information, and converting the first positioning information calculated by the UWB base station into a LiDAR map coordinate system; fitting a motion track generated by the first positioning information and a motion track generated by the second positioning information within a period of time in the past to obtain orientation information of UWB base station positioning, and completing the first positioning information according to the orientation information; judging whether the positioning of the main body to be positioned is abnormal or not, and adjusting the positioning weight according to the judgment result; if the positioning of the main body to be positioned is abnormal, initializing the AMCL according to the first positioning information, and repositioning the main body to be positioned; and finally, calculating and outputting a real-time positioning result of the main body to be positioned according to the first positioning information, the second positioning information and the positioning weight.
In the prior art, limitations of a single sensor, such as a restricted visual range, lead to poor positioning accuracy or an inability to solve the robot kidnapping problem. By adopting the method, two low-cost sensors, UWB and LiDAR, are fused for positioning, and real-time indoor positioning is performed on low-cost embedded devices with limited performance; the advantages of UWB and LiDAR positioning complement each other and overcome their respective shortcomings. Compared with the prior art, the positioning accuracy is improved, the robot kidnapping problem is solved, and positioning efficiency is ensured.
In particular implementations, the present invention also provides a computer storage medium, where the computer storage medium may store a program that, when executed, may include some or all of the steps in embodiments of a method for UWB and LiDAR fused indoor positioning provided by the present invention. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM) or a Random Access Memory (RAM).
Those skilled in the art will readily appreciate that the techniques of the embodiments of the present invention may be implemented as software plus a required general purpose hardware platform. Based on such understanding, the technical solutions in the embodiments of the present invention may be essentially or partially implemented in the form of a software product, which may be stored in a storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method according to the embodiments or some parts of the embodiments.
The same and similar parts in the various embodiments in this specification may be referred to each other. The above-described embodiments of the present invention should not be construed as limiting the scope of the present invention.

Claims (10)

1. An indoor positioning method fusing UWB and LiDAR is characterized by comprising the following steps:
step 1, deploying a UWB base station indoors, deploying LiDAR on a main body to be positioned, and acquiring coordinate information of the UWB base station in a UWB map coordinate system and a LiDAR map coordinate system respectively;
step 2, obtaining TDOA information output by the UWB base station, and obtaining first positioning information of the main body to be positioned by adopting a resolving method;
step 3, acquiring input information of the LiDAR, and acquiring second positioning information of the main body to be positioned by adopting AMCL;
step 4, calculating coordinate conversion according to the coordinate information, and converting the first positioning information calculated by the UWB base station into a LiDAR map coordinate system;
step 5, fitting a motion track generated by the first positioning information and a motion track generated by the second positioning information within a period of time in the past to obtain orientation information of the positioning of the UWB base station, and completing the first positioning information according to the orientation information;
step 6, judging whether the positioning of the main body to be positioned is abnormal or not, and adjusting the positioning weight according to the judgment result;
step 7, if the positioning of the main body to be positioned is abnormal, initializing the AMCL according to the first positioning information, and repositioning the main body to be positioned;
and 8, calculating and outputting a real-time positioning result of the main body to be positioned according to the first positioning information, the second positioning information and the positioning weight.
2. The method of claim 1, wherein step 1 comprises:
step 1-1, deploying at least 3 UWB base stations located on the same plane indoors, selecting any position in the plane as an origin, selecting any direction in the plane as a coordinate axis, establishing a two-dimensional UWB map coordinate system, and measuring coordinate information of each UWB base station in the UWB map coordinate system;
step 1-2, a UWB tag and a LiDAR are deployed on the main body to be positioned, so that when the main body to be positioned moves, the relative position between the main body to be positioned and the UWB tag and the relative position between the main body to be positioned and the LiDAR are kept unchanged;
step 1-3, drawing a map of the place where the main body to be positioned is located, selecting any position in the map as an origin, selecting any direction in the map as a coordinate axis, establishing a two-dimensional LiDAR map coordinate system, and marking the position of each UWB base station in the LiDAR map coordinate system in the map, namely acquiring the coordinate information of the UWB base station in the LiDAR map coordinate system.
3. The method of claim 2, wherein step 2 comprises:
step 2-1, obtaining TDOA information output by the UWB base station, establishing a TDOA fingerprint map in an indoor area based on the TDOA information, matching the received TDOA information on the TDOA fingerprint map by using the k nearest neighbor (kNN) algorithm, and taking the k nearest neighbor matching result (x_v, y_v) as the approximate positioning result;
step 2-2, Taylor iteration is performed to make the approximate positioning result more accurate. A system of nonlinear equations f_i(x, y) = 0 in the position (x, y) to be measured is constructed from the TDOA data.
Taking the k nearest neighbor matching result (x_v, y_v) as the expansion point, each equation is expanded to first order by the Taylor iteration method:
f_i(x, y) ≈ f_i(x_v, y_v) + a_{1,i} δ_x + a_{2,i} δ_y,
wherein i denotes the number of the UWB base station, (x_i, y_i) denotes the coordinates of the ith UWB base station, r_i = √((x_v − x_i)² + (y_v − y_i)²) denotes the Euclidean distance from the k nearest neighbor matching result to the ith UWB base station, a_{1,i} = (x_v − x_i)/r_i represents the first-order approximation in the x direction, a_{2,i} = (y_v − y_i)/r_i represents the first-order approximation in the y direction, and (δ_x, δ_y) represents the correction amount of the k nearest neighbor matching result;
solving the resulting linear equations yields the correction amount (δ_x, δ_y) of the k nearest neighbor matching result (x_v, y_v); continuously iterating this process gives a more accurate positioning result, i.e. the first positioning information (x, y) = (x_v, y_v) + (δ_x, δ_y).
4. The method of claim 3, wherein step 3 comprises:
step 3-1, acquiring input information of the LiDAR; if particle initialization has not been performed, performing particle initialization by scattering a number of particles at random positions in space, the total number of particles being M; if particle initialization has been performed, updating all particle poses: according to the posterior pose x_{t-1}^{[i]} of the ith particle at the last moment, namely at time t−1, and the motion model, obtaining by sampling the prior pose x̄_t^{[i]} of the ith particle at the current time t;
Step 3-2, updating the weight w_t^{[i]} of each particle according to the likelihood field model, and calculating the mean w_avg = (1/M) Σ_{i=1}^{M} w_t^{[i]} of the particle weights;
Step 3-3, according to the mean w_avg of the particle weights w_t^{[i]}, updating the short-term smoothed estimation parameter w_fast and the long-term smoothed estimation parameter w_slow:
w_fast = w_fast + α_fast(w_avg − w_fast),
w_slow = w_slow + α_slow(w_avg − w_slow);
wherein α_fast is the decay rate of the short-term smoothed estimation parameter w_fast, and α_slow is the decay rate of the long-term smoothed estimation parameter w_slow;
step 3-4, for each of said particles, of
Figure FDA0002849318460000037
Adding a randomly sampled pose to the pose of each of the particles, calculating a weighted sum of all the particles
Figure FDA0002849318460000038
And taking the weighted sum of all the particle poses as second positioning information.
5. The method of claim 4, wherein step 4 comprises calculating an amount of rotation and an amount of translation of a coordinate transformation used to transform the first positioning information to the LiDAR coordinate system in which the second positioning information is located:
4-1, selecting two different UWB base stations each time as a base station pair comprising a first UWB base station and a second UWB base station; the coordinate of the first UWB base station in the UWB map coordinate system is p_u1, and the coordinate of the second UWB base station in the UWB map coordinate system is p_u2; the coordinate of the first UWB base station in the LiDAR map coordinate system is p_l1, and the coordinate of the second UWB base station in the LiDAR map coordinate system is p_l2; calculating a rotation amount (yaw, pitch, roll) from the coordinates of the base station pair;
for indoor 2D positioning, only the yaw angle is required; the pitch angle and the roll angle are both 0, and the calculation simplifies to
yaw = atan2(p_2y, p_2x) − atan2(p_1y, p_1x),
wherein the first coordinate difference p_1 = p_u1 − p_u2 and the second coordinate difference p_2 = p_l1 − p_l2.
Step 4-2, selecting different base stations to execute the operation of the step 4-1 for multiple times, and taking the average of the obtained different rotation amounts as the rotation amount of coordinate conversion;
4-3, selecting a calibration base station, whose coordinate in the UWB map coordinate system is p_u and whose coordinate in the LiDAR map coordinate system is p_l; rotating p_u to p_u′ by the rotation amount (yaw, pitch, roll), and taking p_l − p_u′ as the translation amount (x, y, z);
And 4-4, selecting different calibration base stations to execute the operation of the step 4-3 for multiple times, and averaging the obtained different translation amounts to be used as the translation amount of coordinate conversion.
6. The method of claim 5, wherein the step 5 comprises:
step 5-1, recording the positioning coordinates of the main body to be positioned acquired by UWB and LiDAR at the current time and the past n historical times; let the positioning coordinate output by the UWB at the current time be p_u0, the positioning coordinate output by the LiDAR at the current time be p_l0, the positioning coordinate output by the UWB at the ith time before the current time be p_ui, and the positioning coordinate output by the LiDAR at the ith time before the current time be p_li, wherein p_ui, p_li ∈ ℝ³; when a coordinate has fewer than 3 dimensions, it is padded with 0;
step 5-2, obtaining the rotation matrix R such that the positioning coordinates output by the LiDAR, under the action of R, fit the positioning coordinates output by the UWB, i.e. the R minimizing the sum of squared residuals between the rotated LiDAR coordinates and the UWB coordinates over the recorded times;
and 5-3, converting the pose output by the LiDAR at the current moment according to the rotation matrix R, taking the orientation part of the conversion result as the orientation information of the UWB positioning at the current moment, and supplementing the first positioning information in the step 2 according to the orientation information.
7. The method of claim 6, wherein the step 6 comprises:
step 6-1, judging whether the positioning of the main body to be positioned is abnormal or not according to the pose difference of the UWB positioning and the LiDAR positioning;
step 6-2, if the judgment result is abnormal positioning, the LiDAR positioning weight w_l is decreased by 0.1, with w_l ∈ [0, 1], and step 7 is executed for relocation; if the LiDAR positioning weight w_l is already 0, it is not decreased, and step 7 is still executed for relocation. If the positioning is normal, the LiDAR positioning weight w_l is increased by 0.1, and step 8 is executed; if the LiDAR positioning weight w_l is already 1, it is not increased, and step 8 is executed.
8. The method of claim 7, wherein step 6-1 comprises:
step 6-1-1, if the particles in the AMCL have not converged, calculating the Euclidean distance between the result of converting the first positioning information into the LiDAR coordinate system and the second positioning information as the pose difference;
if the particles in the AMCL have converged, calculating the Euler angles (θ_x, θ_y, θ_z) corresponding to the rotation matrix R obtained in step 5, and taking the norm of the vector formed by these angles as the pose difference;
the conversion relation between the rotation matrix R and the Euler angles is as follows: let
R = [[r_11, r_12, r_13], [r_21, r_22, r_23], [r_31, r_32, r_33]],
then θ_x = atan2(r_32, r_33), θ_y = atan2(−r_31, √(r_32² + r_33²)), θ_z = atan2(r_21, r_11);
And 6-1-2, if the pose difference is larger than a manually set threshold diff_threshold, the positioning is determined to be abnormal and relocation is required.
9. The method of claim 8, wherein the step 7 comprises: and initializing the particles in the AMCL by using the result of converting the first positioning information into a LiDAR coordinate system, and collecting the input of the LiDAR along with the movement of the body to be positioned so as to continuously converge the particles in the AMCL.
10. The method of claim 9, wherein step 8 comprises: as the body to be positioned moves, calculating and outputting a real-time positioning result from the positioning information and positioning weights of UWB and LiDAR, the real-time positioning result being expressed as the pose:
pose = (1 − wl) * pose_uwb + wl * pose_lidar,
wherein pose_uwb is the completed first positioning information obtained in step 5, and pose_lidar is the second positioning information obtained in step 3.
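The fusion formula of claim 10 is a per-component convex combination of the two pose estimates. A minimal sketch (the function name is illustrative):

```python
def fuse_pose(pose_uwb, pose_lidar, w_l):
    """pose = (1 - w_l) * pose_uwb + w_l * pose_lidar, applied element-wise,
    so w_l = 0 returns the pure UWB pose and w_l = 1 the pure LiDAR pose."""
    return tuple((1.0 - w_l) * u + w_l * l
                 for u, l in zip(pose_uwb, pose_lidar))

# Equal trust in both sources yields the midpoint
print(fuse_pose((0.0, 0.0), (1.0, 2.0), 0.5))  # (0.5, 1.0)
```

Note that a naive linear blend of a heading angle breaks at the ±π wrap-around; a careful implementation would blend orientation on the circle, e.g. via atan2 of the blended sine and cosine.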
CN202011520518.XA 2020-12-21 2020-12-21 Indoor positioning method fusing UWB and LiDAR Active CN112702699B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011520518.XA CN112702699B (en) 2020-12-21 2020-12-21 Indoor positioning method fusing UWB and LiDAR


Publications (2)

Publication Number Publication Date
CN112702699A true CN112702699A (en) 2021-04-23
CN112702699B CN112702699B (en) 2021-12-03

Family

ID=75509742

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011520518.XA Active CN112702699B (en) 2020-12-21 2020-12-21 Indoor positioning method fusing UWB and LiDAR

Country Status (1)

Country Link
CN (1) CN112702699B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113238186A (en) * 2021-05-08 2021-08-10 珠海市一微半导体有限公司 Mobile robot repositioning method, system and chip
CN113342057A (en) * 2021-08-05 2021-09-03 上海特金信息科技有限公司 Track fusion method and device, unmanned aerial vehicle detection system, equipment and medium
CN114721001A (en) * 2021-11-17 2022-07-08 长春理工大学 Mobile robot positioning method based on multi-sensor fusion
CN115561703A (en) * 2022-09-30 2023-01-03 中国测绘科学研究院 Three-dimensional positioning method and system for single UWB (ultra wide band) base station assisted by laser radar in closed space

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106454722A (en) * 2016-09-09 2017-02-22 华南理工大学 Dynamic differential positioning method based on map matching for mobile communication terminals
CN109275093A (en) * 2018-10-08 2019-01-25 高子庆 Based on UWB positioning and the matched localization method of laser map and mobile terminal
US20200033857A1 (en) * 2018-07-24 2020-01-30 Huili Yu Autonomous target following method and device
CN110849367A (en) * 2019-10-08 2020-02-28 杭州电子科技大学 Indoor positioning and navigation method based on visual SLAM fused with UWB
CN111121754A (en) * 2019-12-31 2020-05-08 深圳市优必选科技股份有限公司 Mobile robot positioning navigation method and device, mobile robot and storage medium
CN111240341A (en) * 2020-02-14 2020-06-05 南京理工大学 Vehicle omnibearing following method based on UWB and laser radar sensor
KR20200082219A (en) * 2018-12-28 2020-07-08 한서대학교 산학협력단 Collision avoidance system of inner aerial drone and method thereof
KR20200082234A (en) * 2018-12-28 2020-07-08 한서대학교 산학협력단 Indoor Flight System for Unmanned Aerial Vehicle and Method Thereof
CN112082553A (en) * 2020-07-24 2020-12-15 广州易来特自动驾驶科技有限公司 Indoor positioning method and positioning device based on WIFI and laser radar and robot


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
YANG SONG; MINGYANG GUAN; WEE PENG TAY; CHOI LOOK LAW; CHANGYUN: "UWB/LiDAR Fusion For Cooperative Range-Only SLAM", 2019 International Conference on Robotics and Automation (ICRA) *
CHEN ZHIJIAN et al.: "Indoor UWB/LiDAR integrated positioning algorithm", Journal of Navigation and Positioning *


Also Published As

Publication number Publication date
CN112702699B (en) 2021-12-03

Similar Documents

Publication Publication Date Title
CN112702699B (en) Indoor positioning method fusing UWB and LiDAR
US20230077304A1 (en) Cooperative positioning method and apparatus, device, and storage medium
CN102395192B (en) Method and device for locating wireless sensor terminal
US20180329022A1 (en) Method, apparatus and system for locating an object using cluster-type magnetic field
CN109275093A (en) Based on UWB positioning and the matched localization method of laser map and mobile terminal
Wang et al. TOA-based NLOS error mitigation algorithm for 3D indoor localization
CN108759835B (en) Positioning method, positioning device, readable storage medium and mobile terminal
US11047708B2 (en) Method of estimating reliability of measurement distance of laser rangefinder, and localizating method of mobile robot using laser rangefinder
CN107255795A (en) Localization Approach for Indoor Mobile and device based on EKF/EFIR mixed filterings
US11353574B2 (en) System and method for tracking motion of target in indoor environment
US7268723B2 (en) System and method for locating targets using measurements from a space based radar
CN112367614B (en) LSTM-based Wi-Fi and geomagnetic field fusion indoor positioning algorithm
Xu et al. An improved indoor localization method for mobile robot based on WiFi fingerprint and AMCL
Lu et al. Robot indoor location modeling and simulation based on Kalman filtering
WO2020192182A1 (en) Indoor positioning method and system, and electronic device
US20200309896A1 (en) Indoor positioning method and system and electronic device
Tian et al. Application of a long short-term memory neural network algorithm fused with Kalman filter in UWB indoor positioning
CN111426321B (en) Positioning method and device for indoor robot
CN116805047A (en) Uncertainty expression method and device for multi-sensor fusion positioning and electronic equipment
Silva et al. An approach to improve location accuracy in non-line-of-sight scenarios using floor plans
Yang et al. Vision and UWB-based anchor self-localisation system for UAV in GPS-denied environment
Zhan et al. Fast Self-calibration Method for Massive UWB Anchors Aided by Odometry
Long et al. A Modified Hybrid PDA/FIR Localization Algorithm for Wireless Sensor Network
KR102239506B1 (en) Reliable precise ranging method and program for performing the analysis
CN116939815B (en) UWB positioning base station selection method based on laser point cloud map

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant