Disclosure of Invention
The invention provides an indoor positioning method fusing UWB and LiDAR, aiming at solving the problem that existing indoor positioning methods, limited by line-of-sight constraints and other defects of a single sensor, suffer from poor positioning accuracy or cannot solve the robot kidnapping problem.
In order to solve the technical problem, the invention discloses an indoor positioning method fusing UWB and LiDAR, which comprises the following steps:
step 1, deploying a UWB base station indoors, deploying LiDAR on a main body to be positioned, and acquiring coordinate information of the UWB base station in a UWB map coordinate system and a LiDAR map coordinate system respectively;
step 2, obtaining TDOA information output by the UWB base station, and obtaining first positioning information of the main body to be positioned by adopting a resolving method;
step 3, acquiring input information of the LiDAR, and acquiring second positioning information of the main body to be positioned by adopting AMCL;
step 4, calculating coordinate conversion according to the coordinate information, and converting the first positioning information calculated by the UWB base station into a LiDAR map coordinate system;
step 5, fitting the motion track generated by the first positioning information and the motion track generated by the second positioning information over a past period of time to obtain orientation information of the UWB positioning, and complementing the first positioning information with the orientation information;
step 6, judging whether the positioning of the main body to be positioned is abnormal or not, and adjusting the positioning weight according to the judgment result;
step 7, if the positioning of the main body to be positioned is abnormal, initializing the AMCL according to the first positioning information, and repositioning the main body to be positioned;
step 8, calculating and outputting a real-time positioning result of the main body to be positioned according to the first positioning information, the second positioning information, and the positioning weight.
Further, in one implementation, the step 1 includes:
step 1-1, deploying at least 3 UWB base stations located on the same plane indoors, selecting any position in the plane as an origin, selecting any direction in the plane as a coordinate axis, establishing a two-dimensional UWB map coordinate system, and measuring coordinate information of each UWB base station in the UWB map coordinate system;
step 1-2, a UWB tag and a LiDAR are deployed on the main body to be positioned, so that when the main body to be positioned moves, the relative position between the main body to be positioned and the UWB tag and the relative position between the main body to be positioned and the LiDAR are kept unchanged;
step 1-3, drawing a map of the place where the main body to be positioned is located, selecting any position in the map as an origin, selecting any direction in the map as a coordinate axis, establishing a two-dimensional LiDAR map coordinate system, and marking the position of each UWB base station in the map, thereby acquiring the coordinate information of each UWB base station in the LiDAR map coordinate system.
Further, in one implementation, the step 2 includes:
step 2-1, obtaining the TDOA information output by the UWB base stations, establishing a TDOA fingerprint map of the indoor area based on the TDOA information, matching the received TDOA information against the TDOA fingerprint map using the k-Nearest Neighbor algorithm (kNN), and taking the k nearest neighbor matching result (x_v, y_v) as the approximate positioning result;
step 2-2, Taylor iteration is carried out to make the approximate positioning result more accurate: a nonlinear equation of the position to be measured is constructed using the TDOA data, and with the k nearest neighbor matching result (x_v, y_v) as the expansion point, the equation is expanded to first order by the Taylor iteration method:

f_i(x, y) ≈ f_i(x_v, y_v) + a_{1,i}·δ_x + a_{2,i}·δ_y,

wherein i denotes the number of the UWB base station, (x_i, y_i) denotes the coordinates of the i-th UWB base station, r_i denotes the Euclidean distance from the k nearest neighbor matching result to the i-th UWB base station, a_{1,i} denotes the first-order coefficient in the x direction, a_{2,i} denotes the first-order coefficient in the y direction, and (δ_x, δ_y) denotes the correction amount of the k nearest neighbor matching result; solving the equation yields the correction amount (δ_x, δ_y), and continuously iterating this process gives a more accurate positioning result, namely the first positioning information (x, y) = (x_v, y_v) + (δ_x, δ_y).
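The Taylor-iteration refinement of step 2-2 can be sketched as follows. The anchor layout, the residual convention (range differences taken against base station 0), and all function names are illustrative assumptions, not part of the patent:

```python
import numpy as np

# Hypothetical anchor layout; the patent only requires >= 3 coplanar stations.
ANCHORS = np.array([[0.0, 0.0], [6.0, 0.0], [0.0, 4.0]])

def tdoa_residuals(p, anchors, tdoa):
    """f_i(p): measured range difference minus the predicted difference
    between anchor i and anchor 0 (an assumed pairing convention)."""
    d = np.linalg.norm(anchors - p, axis=1)
    return tdoa - (d[1:] - d[0])

def taylor_refine(p0, anchors, tdoa, iters=50, tol=1e-9):
    """Refine the kNN fingerprint match (x_v, y_v) by first-order Taylor
    iteration: linearize f_i at the current estimate, solve for the
    correction (dx, dy), and repeat until the correction vanishes."""
    p = np.asarray(p0, dtype=float)
    for _ in range(iters):
        d = np.linalg.norm(anchors - p, axis=1)
        grad = (anchors - p) / d[:, None]   # unit vectors toward each anchor
        J = grad[1:] - grad[0]              # a_{1,i}, a_{2,i} per TDOA pair
        f = tdoa_residuals(p, anchors, tdoa)
        delta, *_ = np.linalg.lstsq(J, -f, rcond=None)
        p = p + delta
        if np.linalg.norm(delta) < tol:
            break
    return p
```

With exact TDOA measurements the Gauss-Newton correction drives the estimate to the true position in a few iterations; in practice the kNN fingerprint match supplies the starting point.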
The TDOA signal is adopted because using TDOA signals reduces the requirement for time synchronization among different sensors; at least 3 base stations need to be deployed in a practical scene.
Further, in one implementation, the step 3 includes:
step 3-1, acquiring input information of the LiDAR; if particle initialization has not been performed, performing particle initialization by scattering particles at random positions in space, the total number of particles being M; if particle initialization has been performed, updating all particle poses: according to the posterior pose of the i-th particle at the previous time, namely time t-1, and the motion model, the prior pose of the i-th particle at the current time t is obtained by sampling. The motion model follows the odometry motion model of Probabilistic Robotics, Section 5.4.
step 3-2, updating the weight of each particle according to the likelihood field model, and calculating the mean w_avg of the particle weights. The likelihood field model follows the range-finder likelihood field of Probabilistic Robotics, Section 6.4.
step 3-3, according to the mean w_avg of the particle weights, updating the short-term smoothed estimation parameter w_fast and the long-term smoothed estimation parameter w_slow:

w_fast = w_fast + α_fast · (w_avg − w_fast),
w_slow = w_slow + α_slow · (w_avg − w_slow);

wherein α_fast is the decay rate of the short-term smoothed estimation parameter w_fast, and α_slow is the decay rate of the long-term smoothed estimation parameter w_slow; in particular, α_fast takes the value 0.1 and α_slow takes the value 0.001.
step 3-4, for each particle, adding a randomly sampled pose with probability max{0, 1 − w_fast/w_slow} (the random-particle injection of augmented MCL); then calculating the weighted sum of all particle poses and taking the weighted sum of all particle poses as the second positioning information.
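Steps 3-3 and 3-4 above follow the standard augmented-MCL recovery mechanism; a minimal sketch, in which the class name and the returned injection probability are illustrative:

```python
# alpha values follow the embodiment (0.1 and 0.001).
ALPHA_FAST, ALPHA_SLOW = 0.1, 0.001

class AugmentedMCL:
    def __init__(self):
        self.w_fast = 0.0   # short-term smoothed estimate
        self.w_slow = 0.0   # long-term smoothed estimate

    def update(self, weights):
        """Update w_fast / w_slow from the mean particle weight w_avg and
        return the probability of injecting a randomly sampled pose."""
        w_avg = sum(weights) / len(weights)
        self.w_fast += ALPHA_FAST * (w_avg - self.w_fast)
        self.w_slow += ALPHA_SLOW * (w_avg - self.w_slow)
        # Random particles are injected when the short-term likelihood drops
        # below the long-term average: probability max{0, 1 - w_fast/w_slow}.
        if self.w_slow <= 0.0:
            return 0.0
        return max(0.0, 1.0 - self.w_fast / self.w_slow)
```

Because α_fast is 100 times larger than α_slow, w_fast tracks sudden drops in observation likelihood quickly while w_slow reacts slowly, so a sharp drop yields a nonzero injection probability.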
Further, in one implementation, the step 4 includes calculating the rotation amount and the translation amount of a coordinate transformation, where the coordinate transformation converts the first positioning information into the LiDAR map coordinate system in which the second positioning information lies. Since manually calibrating the coordinates of the UWB base stations introduces errors, the calculation is performed multiple times to reduce the influence of these errors on the transformation between the UWB map coordinate system and the LiDAR map coordinate system:
step 4-1, selecting two different UWB base stations as a base station pair each time, the base station pair comprising a first UWB base station and a second UWB base station; the coordinate of the first UWB base station in the UWB map coordinate system is p_u1 and that of the second is p_u2; the coordinate of the first UWB base station in the LiDAR map coordinate system is p_l1 and that of the second is p_l2; a rotation amount (yaw, pitch, roll) is calculated from the coordinates of the base station pair; for indoor 2D positioning only the yaw angle is needed, the pitch angle and the roll angle are both 0, and the calculation simplifies to

yaw = atan2(p_2y, p_2x) − atan2(p_1y, p_1x),

wherein the first coordinate difference p_1 = p_u1 − p_u2 and the second coordinate difference p_2 = p_l1 − p_l2;
step 4-2, selecting different base station pairs and performing the operation of step 4-1 multiple times, and taking the average of the resulting rotation amounts as the rotation amount of the coordinate conversion; specifically, in this embodiment the number of executions is determined by the number of UWB base stations.
step 4-3, selecting a calibration base station whose coordinate in the UWB map coordinate system is p_u and whose coordinate in the LiDAR map coordinate system is p_l; rotating p_u by the rotation amount (yaw, pitch, roll) to obtain p_u′; the translation amount (x, y, z) is then p_l − p_u′;
step 4-4, selecting different calibration base stations, performing the operation of step 4-3 multiple times, and taking the average of the resulting translation amounts as the translation amount of the coordinate conversion.
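The pairwise averaging of steps 4-1 through 4-4 can be sketched as below. Taking the per-pair yaw as the angle of the LiDAR-frame difference vector minus that of the UWB-frame difference vector, and averaging the angles on the circle to avoid wrap-around, are implementation choices not spelled out in the patent:

```python
import math
from itertools import combinations

def fit_yaw(uwb_pts, lidar_pts):
    """Steps 4-1/4-2: for every base-station pair, yaw is the angle between
    the two coordinate-difference vectors; average the per-pair yaws."""
    sx = sy = 0.0
    for i, j in combinations(range(len(uwb_pts)), 2):
        p1 = (uwb_pts[i][0] - uwb_pts[j][0], uwb_pts[i][1] - uwb_pts[j][1])
        p2 = (lidar_pts[i][0] - lidar_pts[j][0], lidar_pts[i][1] - lidar_pts[j][1])
        yaw = math.atan2(p2[1], p2[0]) - math.atan2(p1[1], p1[0])
        sx += math.cos(yaw)   # accumulate on the unit circle
        sy += math.sin(yaw)
    return math.atan2(sy, sx)

def fit_translation(uwb_pts, lidar_pts, yaw):
    """Steps 4-3/4-4: rotate each UWB coordinate p_u to p_u', then average
    p_l - p_u' over all calibration base stations."""
    c, s = math.cos(yaw), math.sin(yaw)
    tx = ty = 0.0
    for (ux, uy), (lx, ly) in zip(uwb_pts, lidar_pts):
        tx += lx - (c * ux - s * uy)
        ty += ly - (s * ux + c * uy)
    n = len(uwb_pts)
    return tx / n, ty / n
```

With exact anchor coordinates every pair yields the same yaw; with manual calibration errors the averaging suppresses per-anchor noise, which is the stated purpose of the repeated calculation.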
Further, in one implementation, since the low-cost UWB positioning scheme cannot obtain orientation information through the solving method, the orientation information of the UWB positioning is obtained by fitting the motion states output by the UWB and the LiDAR, and the step 5 includes:
step 5-1, recording the positioning coordinates of the main body to be positioned acquired by the UWB and the LiDAR at the current time and the past n historical times; let the positioning coordinate output by the UWB at the current time be p_u0, the positioning coordinate output by the LiDAR at the current time be p_l0, the positioning coordinate output by the UWB at the i-th time before the current time be p_ui, and the positioning coordinate output by the LiDAR at the i-th time before the current time be p_li; when a coordinate has fewer than 3 dimensions, it is padded to 3 dimensions with 0;
step 5-2, solving for a rotation matrix R such that, under the action of R, the positioning coordinates output by the LiDAR fit the positioning coordinates output by the UWB;
step 5-3, converting the pose output by the LiDAR at the current time according to the rotation matrix R, taking the orientation part of the conversion result as the orientation information of the UWB positioning at the current time, and complementing the first positioning information of step 2 with the orientation information.
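The patent does not name a solver for the rotation matrix R of step 5-2; one standard choice is the Kabsch (SVD-based) least-squares fit, sketched here under that assumption:

```python
import numpy as np

def fit_rotation(lidar_traj, uwb_traj):
    """Least-squares rotation R such that R @ lidar_i fits uwb_i (Kabsch
    method). Trajectories are (n, 3); 2-D points are zero-padded to 3-D as
    in step 5-1. The SVD solver is an assumption -- the patent only states
    that the LiDAR trajectory is fitted to the UWB trajectory under R."""
    A = np.asarray(lidar_traj, dtype=float)
    B = np.asarray(uwb_traj, dtype=float)
    A = A - A.mean(axis=0)          # remove the common translation
    B = B - B.mean(axis=0)
    H = A.T @ B                     # cross-covariance of the two trajectories
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])      # guard against an improper reflection
    return Vt.T @ D @ U.T
```

The orientation part of R (its yaw for planar motion) then serves as the heading that the UWB solution alone cannot provide.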
Further, in one implementation, the step 6 includes:
step 6-1, judging whether the positioning of the main body to be positioned is abnormal or not according to the pose difference of the UWB positioning and the LiDAR positioning;
step 6-2, if the judgment result is abnormal positioning, decreasing the LiDAR positioning weight w_l by 0.1, where w_l ∈ [0, 1] (if w_l is already 0, it is not decreased), and continuing to step 7 for relocation; if the positioning is normal, increasing the LiDAR positioning weight w_l by 0.1 (if w_l is already 1, it is not increased) and executing step 8.
Further, in one implementation, the step 6-1 includes:
step 6-1-1, if the particles in the AMCL do not converge, calculating the Euclidean distance between the result of the conversion of the first positioning information to the LiDAR coordinate system and the second positioning information as a pose difference;
if the particles in the AMCL have converged, calculating the Euler angles (θ_x, θ_y, θ_z) corresponding to the rotation matrix R obtained in step 5, and taking the norm of the vector they form as the pose difference;
the conversion relation between the rotation matrix R = (r_ij) and the Euler angles is:

θ_x = atan2(r_32, r_33),
θ_y = atan2(−r_31, √(r_32² + r_33²)),
θ_z = atan2(r_21, r_11);
step 6-1-2, if the pose difference is larger than a manually set threshold diff_threshold, the positioning is determined to be abnormal and relocation is required.
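The anomaly test of steps 6-1-1 through 6-1-2 can be sketched as follows. The Z-Y-X Euler convention and the threshold values (1 before convergence, π/6 after, as stated in the embodiment) are assumptions made explicit here:

```python
import math

def pose_difference(R=None, p_uwb=None, p_lidar=None, converged=False):
    """Step 6-1-1: Euclidean distance between the converted first positioning
    and the second positioning before particle convergence; afterwards, the
    norm of the Euler angles of R (Z-Y-X extraction assumed)."""
    if not converged:
        return math.dist(p_uwb, p_lidar)
    r11, r21, r31, r32, r33 = R[0][0], R[1][0], R[2][0], R[2][1], R[2][2]
    tx = math.atan2(r32, r33)
    ty = math.atan2(-r31, math.hypot(r32, r33))
    tz = math.atan2(r21, r11)
    return math.sqrt(tx * tx + ty * ty + tz * tz)

def is_abnormal(diff, converged):
    # Thresholds from the embodiment: 1 (m) before convergence, pi/6 after.
    return diff > (math.pi / 6 if converged else 1.0)
```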
Further, in one implementation, the step 7 includes: initializing the particles in the AMCL with the result of converting the first positioning information into the LiDAR map coordinate system, and collecting the input of the LiDAR as the main body to be positioned moves, so that the particles in the AMCL continuously converge. Specifically, during relocation all particles in the AMCL are initialized from a Gaussian distribution whose mean is the pose currently output by the UWB, with a variance of 0.5 on each coordinate and a variance of π/12 on the orientation; as the main body moves and the LiDAR receives new input, the particles gradually converge, and step 3 can output a positioning result of higher precision.
Further, in one implementation, the step 8 includes: as the main body to be positioned moves, calculating and outputting a real-time positioning result according to the positioning information and positioning weights of the UWB and the LiDAR, the real-time positioning result being expressed as a pose:

pose = (1 − w_l) · pose_uwb + w_l · pose_lidar,

wherein pose_uwb is the complemented first positioning information obtained in step 5, and pose_lidar is the second positioning information obtained in step 3. In this embodiment, the above steps 2 to 8 are performed continuously at a frequency the hardware supports, and real-time positioning results are output, where the hardware includes the UWB devices, the LiDAR, the computing device, and the like.
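The weight adjustment of step 6-2 and the fusion of step 8 reduce to a few lines; clamping to [0, 1] mirrors the "not decreased below 0 / not increased above 1" rule, and the tuple pose representation is illustrative:

```python
def fuse(pose_uwb, pose_lidar, w_l):
    """Step 8: convex combination of the two pose estimates,
    pose = (1 - w_l) * pose_uwb + w_l * pose_lidar."""
    return tuple((1.0 - w_l) * u + w_l * l for u, l in zip(pose_uwb, pose_lidar))

def adjust_weight(w_l, abnormal):
    """Step 6-2: move w_l by 0.1 toward 0 on an abnormal judgment and toward
    1 on a normal one, clamped to [0, 1]."""
    return min(1.0, max(0.0, w_l - 0.1 if abnormal else w_l + 0.1))
```

When w_l reaches 0 the output falls back entirely on the UWB solution (the relocation case); when it reaches 1 the LiDAR solution dominates.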
According to the technical scheme, the embodiment of the invention provides an indoor positioning method fusing UWB and LiDAR, which comprises the following steps: the method comprises the steps that an indoor UWB base station is deployed, LiDAR is deployed on a main body to be positioned, and coordinate information of the UWB base station in a UWB map coordinate system and a LiDAR map coordinate system is obtained; acquiring TDOA information output by a UWB base station, and acquiring first positioning information of a main body to be positioned by adopting a resolving method; acquiring input information of the LiDAR, and acquiring second positioning information of a main body to be positioned by adopting AMCL; calculating coordinate conversion according to the coordinate information, and converting the first positioning information calculated by the UWB base station into a LiDAR map coordinate system; fitting a motion track generated by the first positioning information and a motion track generated by the second positioning information within a period of time in the past to obtain orientation information of UWB base station positioning, and completing the first positioning information according to the orientation information; judging whether the positioning of the main body to be positioned is abnormal or not, and adjusting the positioning weight according to the judgment result; if the positioning of the main body to be positioned is abnormal, initializing the AMCL according to the first positioning information, and repositioning the main body to be positioned; and finally, calculating and outputting a real-time positioning result of the main body to be positioned according to the first positioning information, the second positioning information and the positioning weight.
In the prior art, line-of-sight limitations and other defects of a single sensor lead to poor positioning accuracy or an inability to solve the robot kidnapping problem. With the method, two low-cost sensors, UWB and LiDAR, are fused for positioning, and real-time indoor positioning is performed on low-cost embedded devices of modest performance; the advantages of UWB and LiDAR positioning complement each other and overcome their respective defects. Compared with the prior art, the method improves positioning accuracy, solves the robot kidnapping problem, and maintains positioning efficiency.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
The embodiment of the invention discloses an indoor positioning method fusing UWB and LiDAR, applied to stable indoor positioning in complex scenes. Obstacles between a UWB base station and a tag often cause NLOS errors in the UWB signal, for example inaccurate signal timing due to refraction or reflection; during long-term continuous positioning, temporary sensor faults and rapid movement of the positioning main body can cause the robot kidnapping problem, invalidating common positioning methods.
As shown in fig. 1, the indoor positioning method fusing UWB and LiDAR according to this embodiment includes the following steps:
step 1, deploying UWB base stations indoors, deploying the LiDAR on the main body to be positioned, and acquiring coordinate information of the UWB base stations in the UWB map coordinate system and the LiDAR map coordinate system respectively; in this embodiment, it must be ensured that the relative position between the UWB tag and the main body to be positioned and the relative position between the LiDAR and the main body to be positioned do not change appreciably as the positioning main body moves.
Step 2, obtaining TDOA information output by the UWB base station, and obtaining first positioning information of the main body to be positioned by adopting a resolving method;
In this embodiment, the UWB base stations provide multiple ranging signals, including a Time Difference of Arrival (TDOA) signal and a Time of Arrival (TOA) signal; the invention adopts the TDOA signal, which reduces the requirement for time synchronization among different sensors; specifically, at least 3 base stations need to be deployed in a practical scenario.
Step 3, acquiring input information of the LiDAR, and acquiring second positioning information of the main body to be positioned by adopting AMCL;
step 4, calculating coordinate conversion according to the coordinate information, and converting the first positioning information calculated by the UWB base station into a LiDAR map coordinate system;
step 5, fitting a motion track generated by the first positioning information and a motion track generated by the second positioning information within a period of time in the past to obtain orientation information of the positioning of the UWB base station, and completing the first positioning information according to the orientation information;
step 6, judging whether the positioning of the main body to be positioned is abnormal or not, and adjusting the positioning weight according to the judgment result;
step 7, if the positioning of the main body to be positioned is abnormal, initializing the AMCL according to the first positioning information, and repositioning the main body to be positioned;
step 8, calculating and outputting a real-time positioning result of the main body to be positioned according to the first positioning information, the second positioning information, and the positioning weight.
In the indoor positioning method fusing UWB and LiDAR according to this embodiment, the step 1 includes:
step 1-1, deploying at least 3 UWB base stations located on the same plane indoors, selecting any position in the plane as an origin, selecting any direction in the plane as a coordinate axis, establishing a two-dimensional UWB map coordinate system, and measuring coordinate information of each UWB base station in the UWB map coordinate system;
step 1-2, a UWB tag and a LiDAR are deployed on the main body to be positioned, so that when the main body to be positioned moves, the relative position between the main body to be positioned and the UWB tag and the relative position between the main body to be positioned and the LiDAR are kept unchanged;
step 1-3, drawing a map of the place where the main body to be positioned is located, selecting any position in the map as an origin, selecting any direction in the map as a coordinate axis, establishing a two-dimensional LiDAR map coordinate system, and marking the position of each UWB base station in the map, thereby acquiring the coordinate information of each UWB base station in the LiDAR map coordinate system.
In the indoor positioning method fusing UWB and LiDAR according to this embodiment, the step 2 includes:
step 2-1, obtaining the TDOA information output by the UWB base stations, establishing a TDOA fingerprint map of the indoor area based on the TDOA information, matching the received TDOA information against the TDOA fingerprint map using the k-Nearest Neighbor algorithm (kNN), and taking the k nearest neighbor matching result (x_v, y_v) as the approximate positioning result;
step 2-2, Taylor iteration is carried out to make the approximate positioning result more accurate: a nonlinear equation of the position to be measured is constructed using the TDOA data, and with the k nearest neighbor matching result (x_v, y_v) as the expansion point, the equation is expanded to first order by the Taylor iteration method:

f_i(x, y) ≈ f_i(x_v, y_v) + a_{1,i}·δ_x + a_{2,i}·δ_y,

wherein i denotes the number of the UWB base station, (x_i, y_i) denotes the coordinates of the i-th UWB base station, r_i denotes the Euclidean distance from the k nearest neighbor matching result to the i-th UWB base station, a_{1,i} denotes the first-order coefficient in the x direction, a_{2,i} denotes the first-order coefficient in the y direction, and (δ_x, δ_y) denotes the correction amount of the k nearest neighbor matching result; solving the equation yields the correction amount (δ_x, δ_y), and continuously iterating this process gives a more accurate positioning result, namely the first positioning information (x, y) = (x_v, y_v) + (δ_x, δ_y).
As shown in fig. 2, in the indoor positioning method fusing UWB and LiDAR according to this embodiment, the step 3 includes:
step 3-1, acquiring input information of the LiDAR; if particle initialization has not been performed, performing particle initialization by scattering particles at random positions in space, the total number of particles being M; if particle initialization has been performed, updating all particle poses: according to the posterior pose of the i-th particle at the previous time, namely time t-1, and the motion model, the prior pose of the i-th particle at the current time t is obtained by sampling. Specifically, the motion model follows the odometry motion model of Probabilistic Robotics, Section 5.4.
step 3-2, updating the weight of each particle according to the likelihood field model, and calculating the mean w_avg of the particle weights. Specifically, the likelihood field model follows the range-finder likelihood field of Probabilistic Robotics, Section 6.4.
step 3-3, according to the mean w_avg of the particle weights, updating the short-term smoothed estimation parameter w_fast and the long-term smoothed estimation parameter w_slow:

w_fast = w_fast + α_fast · (w_avg − w_fast),
w_slow = w_slow + α_slow · (w_avg − w_slow);

wherein α_fast is the decay rate of the short-term smoothed estimation parameter w_fast, and α_slow is the decay rate of the long-term smoothed estimation parameter w_slow; in particular, α_fast takes the value 0.1 and α_slow takes the value 0.001.
step 3-4, for each particle, adding a randomly sampled pose with probability max{0, 1 − w_fast/w_slow} (the random-particle injection of augmented MCL); then calculating the weighted sum of all particle poses and taking the weighted sum of all particle poses as the second positioning information.
Because the coordinates of the UWB base stations are calibrated manually, errors are introduced; to reduce the influence of these errors on the coordinate conversion between the UWB map coordinate system and the LiDAR map coordinate system, the calculation needs to be carried out multiple times. Therefore, in the indoor positioning method fusing UWB and LiDAR according to this embodiment, the step 4 includes calculating the rotation amount and the translation amount of the coordinate transformation, which converts the first positioning information into the LiDAR map coordinate system in which the second positioning information lies:
step 4-1, selecting two different UWB base stations as a base station pair each time, the base station pair comprising a first UWB base station and a second UWB base station; the coordinate of the first UWB base station in the UWB map coordinate system is p_u1 and that of the second is p_u2; the coordinate of the first UWB base station in the LiDAR map coordinate system is p_l1 and that of the second is p_l2; a rotation amount (yaw, pitch, roll) is calculated from the coordinates of the base station pair; for indoor 2D positioning only the yaw angle is needed, the pitch angle and the roll angle are both 0, and the calculation simplifies to

yaw = atan2(p_2y, p_2x) − atan2(p_1y, p_1x),

wherein the first coordinate difference p_1 = p_u1 − p_u2 and the second coordinate difference p_2 = p_l1 − p_l2;
step 4-2, selecting different base station pairs and performing the operation of step 4-1 multiple times, and taking the average of the resulting rotation amounts as the rotation amount of the coordinate conversion; specifically, in this embodiment the number of executions is determined by the number of UWB base stations.
step 4-3, selecting a calibration base station whose coordinate in the UWB map coordinate system is p_u and whose coordinate in the LiDAR map coordinate system is p_l; rotating p_u by the rotation amount (yaw, pitch, roll) to obtain p_u′; the translation amount (x, y, z) is then p_l − p_u′;
step 4-4, selecting different calibration base stations, performing the operation of step 4-3 multiple times, and taking the average of the resulting translation amounts as the translation amount of the coordinate conversion.
Because a low-cost UWB positioning scheme is adopted, orientation information cannot be obtained through the solving method, so the orientation information of the UWB positioning is obtained by fitting the motion states output by the UWB and the LiDAR. Therefore, in the indoor positioning method fusing UWB and LiDAR according to this embodiment, the step 5 includes:
step 5-1, recording the positioning coordinates of the main body to be positioned acquired by the UWB and the LiDAR at the current time and the past n historical times; let the positioning coordinate output by the UWB at the current time be p_u0, the positioning coordinate output by the LiDAR at the current time be p_l0, the positioning coordinate output by the UWB at the i-th time before the current time be p_ui, and the positioning coordinate output by the LiDAR at the i-th time before the current time be p_li; when a coordinate has fewer than 3 dimensions, it is padded to 3 dimensions with 0;
step 5-2, solving for a rotation matrix R such that, under the action of R, the positioning coordinates output by the LiDAR fit the positioning coordinates output by the UWB;
step 5-3, converting the pose output by the LiDAR at the current time according to the rotation matrix R, taking the orientation part of the conversion result as the orientation information of the UWB positioning at the current time, and complementing the first positioning information of step 2 with the orientation information.
In the indoor positioning method fusing UWB and LiDAR according to this embodiment, the step 6 includes:
step 6-1, judging whether the positioning of the main body to be positioned is abnormal or not according to the pose difference of the UWB positioning and the LiDAR positioning;
step 6-2, if the judgment result is abnormal positioning, decreasing the LiDAR positioning weight w_l by 0.1, where w_l ∈ [0, 1] (if w_l is already 0, it is not decreased), and continuing to step 7 for relocation; if the positioning is normal, increasing the LiDAR positioning weight w_l by 0.1 (if w_l is already 1, it is not increased) and executing step 8.
In the indoor positioning method fusing UWB and LiDAR according to this embodiment, the step 6-1 includes:
step 6-1-1, if the particles in the AMCL do not converge, calculating the Euclidean distance between the result of the conversion of the first positioning information to the LiDAR coordinate system and the second positioning information as a pose difference;
if the particles in the AMCL have converged, calculating the Euler angles (θ_x, θ_y, θ_z) corresponding to the rotation matrix R obtained in step 5, and taking the norm of the vector they form as the pose difference;
the conversion relation between the rotation matrix R = (r_ij) and the Euler angles is:

θ_x = atan2(r_32, r_33),
θ_y = atan2(−r_31, √(r_32² + r_33²)),
θ_z = atan2(r_21, r_11);
step 6-1-2, if the pose difference is larger than a manually set threshold diff_threshold, the positioning is determined to be abnormal and relocation is required.
Specifically, in this embodiment, when the particles in the AMCL have not converged, diff_threshold is set to 1; when the particles in the AMCL have converged, diff_threshold is set to π/6.
In the indoor positioning method fusing UWB and LiDAR according to this embodiment, the step 7 includes: initializing the particles in the AMCL with the result of converting the first positioning information into the LiDAR map coordinate system, and collecting the input of the LiDAR as the main body to be positioned moves, so that the particles in the AMCL continuously converge. Specifically, in this embodiment, during relocation all particles in the AMCL are initialized from a Gaussian distribution whose mean is the pose currently output by the UWB, with a variance of 0.5 on each coordinate and a variance of π/12 on the orientation. As the main body moves and the LiDAR receives new input, the particles gradually converge, and step 3 can output a positioning result of higher precision.
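The relocation initialization above can be sketched as follows; M = 500 is an illustrative particle count, and a 2-D pose (x, y, θ) is assumed:

```python
import math
import random

def reinit_particles(uwb_pose, m=500):
    """Step 7 sketch: re-seed all M particles from a Gaussian centred on the
    current UWB pose. The patent gives 0.5 and pi/12 as variances, so the
    standard deviations passed to random.gauss are their square roots."""
    x, y, theta = uwb_pose
    sxy = math.sqrt(0.5)            # per-coordinate std dev
    sth = math.sqrt(math.pi / 12)   # orientation std dev
    return [(random.gauss(x, sxy), random.gauss(y, sxy), random.gauss(theta, sth))
            for _ in range(m)]
```

Subsequent LiDAR scans then re-weight and resample these particles via the AMCL update of step 3 until they converge around the true pose.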
In the indoor positioning method fusing UWB and LiDAR according to this embodiment, the step 8 includes: as the main body to be positioned moves, calculating and outputting a real-time positioning result according to the positioning information and positioning weights of the UWB and the LiDAR, the real-time positioning result being expressed as a pose:

pose = (1 − w_l) · pose_uwb + w_l · pose_lidar,

wherein pose_uwb is the complemented first positioning information obtained in step 5, and pose_lidar is the second positioning information obtained in step 3.
In this embodiment, the above steps 2 to 8 are performed continuously at a frequency the hardware supports, and real-time positioning results are output, where the hardware includes the UWB base stations, the LiDAR, the computing device, and the like.
Examples
In order to verify the effectiveness of the indoor positioning method fusing UWB and LiDAR described in this embodiment, verification is performed in an actual scene. The experimental scene is a room 6 meters long and 4 meters wide with some obstacles arranged in it, as shown in figs. 3a and 3b: fig. 3a shows the layout and grid labeling of the experimental scene, where each grid cell is 1 meter by 1 meter, and fig. 3b is a schematic plan view of the experimental scene in which the obstacles are marked by black blocks. A mobile robot is deployed in the scene, and indoor positioning is performed according to the following steps:
step 1, in this embodiment, UWB base stations are deployed at the four corners of the scene, and the LiDAR is installed on the main body to be positioned, namely the mobile robot. The positions of the UWB base stations are determined, and the coordinates of each UWB base station are marked in a pre-drawn two-dimensional map of the scene; that is, coordinate information of the UWB base stations in the UWB map coordinate system and the LiDAR map coordinate system, respectively, is obtained. The mobile robot is then made to travel along the preset path in fig. 4a by sending motion instructions;
step 2, obtaining TDOA information output by UWB, and obtaining first positioning information of the mobile robot by adopting a resolving method;
step 3, acquiring input information of the LiDAR, and obtaining second positioning information of the mobile robot by using adaptive Monte Carlo localization (AMCL);
step 4, calculating the coordinate conversion between the UWB map coordinate system CS_UWB and the LiDAR map coordinate system CS_LiDAR, and converting the positioning information of the different sensors into the same coordinate system, that is, converting the first positioning information calculated from the UWB base stations into the LiDAR map coordinate system;
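Since the base-station coordinates are known in both CS_UWB and CS_LiDAR, the conversion between the two map frames can be recovered as a 2D rigid transform (rotation plus translation) by a standard least-squares (Procrustes) fit. The sketch below is illustrative: the patent does not state which solver it uses, and the function names are assumptions.

```python
import math

def fit_rigid_2d(src, dst):
    """Least-squares 2D rigid transform mapping points in CS_UWB (src)
    onto CS_LiDAR (dst), from base-station coordinates known in both
    frames. Returns (theta, tx, ty). Illustrative sketch only.
    """
    n = len(src)
    cx_s = sum(p[0] for p in src) / n; cy_s = sum(p[1] for p in src) / n
    cx_d = sum(p[0] for p in dst) / n; cy_d = sum(p[1] for p in dst) / n
    sxx = sxy = syx = syy = 0.0
    for (xs, ys), (xd, yd) in zip(src, dst):
        xs -= cx_s; ys -= cy_s; xd -= cx_d; yd -= cy_d
        sxx += xs * xd; sxy += xs * yd
        syx += ys * xd; syy += ys * yd
    # closed-form optimal rotation for the 2D point-set registration problem
    theta = math.atan2(sxy - syx, sxx + syy)
    c, s = math.cos(theta), math.sin(theta)
    tx = cx_d - (c * cx_s - s * cy_s)
    ty = cy_d - (s * cx_s + c * cy_s)
    return theta, tx, ty

def apply_transform(theta, tx, ty, p):
    """Map a point from CS_UWB into CS_LiDAR with the fitted transform."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * p[0] - s * p[1] + tx, s * p[0] + c * p[1] + ty)
```

With four base stations at the room corners, the fit is overdetermined, which also averages out small surveying errors in the marked coordinates.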
step 5, summarizing the historical positioning information of UWB and LiDAR respectively to obtain their motion state information, and fitting these motion states to obtain the orientation information of the UWB positioning; that is, the motion track generated by the first positioning information and the motion track generated by the second positioning information within a past period of time are fitted, and the first positioning information is completed according to the orientation information;
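UWB trilateration yields position only, so the heading that completes the first positioning information must be inferred from the track. One simple stand-in for the fitting in step 5, shown purely as an assumption (the patent does not specify its fitting procedure, and the function name and window size are invented here), is to sum the displacement vectors over a short sliding window, which smooths per-fix jitter:

```python
import math

def heading_from_track(points, window=5):
    """Estimate heading from a position-only track [(x, y), ...].

    Sums the last `window` displacement vectors and takes the angle of
    the resulting net motion. Returns None when the track is stationary
    (heading undefined). Illustrative sketch only.
    """
    pts = points[-(window + 1):]
    dx = sum(b[0] - a[0] for a, b in zip(pts, pts[1:]))
    dy = sum(b[1] - a[1] for a, b in zip(pts, pts[1:]))
    if dx == 0 and dy == 0:
        return None
    return math.atan2(dy, dx)
```

The LiDAR track can be fitted the same way; comparing the two fitted headings also gives the orientation offset between the UWB and LiDAR estimates over the same interval.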
step 6, judging whether the positioning of the mobile robot is abnormal according to the pose difference between the UWB positioning and the LiDAR positioning; if it is abnormal, reducing the LiDAR positioning weight w_l and continuing to execute step 7 for repositioning, otherwise increasing the LiDAR positioning weight w_l and executing step 8;
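The abnormality check and weight adjustment of step 6 can be sketched as below. The threshold, step size, and weight bounds are illustrative assumptions, not values from the embodiment; here the pose difference is reduced to the planar distance between the two position estimates.

```python
import math

def update_weight(pose_uwb, pose_lidar, w_l,
                  dist_thresh=0.5,  # assumed abnormality threshold (meters)
                  step=0.1, w_min=0.0, w_max=0.9):
    """Adjust the LiDAR weight w_l from the UWB/LiDAR pose difference.

    If the two position estimates disagree by more than dist_thresh,
    positioning is treated as abnormal: w_l is reduced so the fused
    result leans on UWB while relocation (step 7) runs. Otherwise w_l
    is increased toward w_max. Illustrative sketch only.
    """
    d = math.hypot(pose_lidar[0] - pose_uwb[0],
                   pose_lidar[1] - pose_uwb[1])
    abnormal = d > dist_thresh
    if abnormal:
        w_l = max(w_min, w_l - step)
    else:
        w_l = min(w_max, w_l + step)
    return w_l, abnormal
```

Ramping the weight rather than switching it outright keeps the fused output in step 8 from jumping when the judgment flips between consecutive cycles.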
step 7, if the positioning of the mobile robot is abnormal, initializing the AMCL according to the first positioning information and repositioning the mobile robot; that is, the particles in the AMCL are initialized, and LiDAR input is collected as the mobile robot moves so that the particles in the AMCL continuously converge;
and step 8, calculating the current pose according to the first positioning information, the second positioning information and the positioning weight, and outputting the real-time positioning result of the mobile robot as it moves.
The fusion positioning results during the walking of the mobile robot are connected to draw its walking route, shown as the smooth line in fig. 4b. The line with large deviation and jitter in fig. 4b is the raw positioning route calculated from UWB signals alone; manual measurement shows that the error of the fused positioning result is less than 10 cm.
Meanwhile, the invention is verified against the robot kidnapping problem, with the following specific flow: during verification, when the mobile robot moves to the "move away" position shown in fig. 5a, an instruction is sent to stop its movement, the robot is manually carried to the "place" position shown in fig. 5a, and the positioning process is entered again; after the repositioning of step 7 is completed, the robot continues to move along the predetermined route. The manual carrying simulates a positioning failure, and after the temporary failure, the invention regains accurate positioning.
In this verification of the robot kidnapping problem, the positioning result before the robot is carried away is shown in fig. 5b, the result from being carried until repositioning is completed is shown in fig. 5c, and the result from repositioning until the run finishes is shown in fig. 5d; the smooth lines in the figures represent the fusion positioning result, and the repositioning process is completed within 5 seconds. The results show that the mobile robot can be quickly and accurately repositioned after being carried to a new position, so the robot kidnapping problem can be effectively solved.
According to the technical scheme, the embodiment of the invention provides an indoor positioning method fusing UWB and LiDAR, which comprises the following steps: the method comprises the steps that an indoor UWB base station is deployed, LiDAR is deployed on a main body to be positioned, and coordinate information of the UWB base station in a UWB map coordinate system and a LiDAR map coordinate system is obtained; acquiring TDOA information output by a UWB base station, and acquiring first positioning information of a main body to be positioned by adopting a resolving method; acquiring input information of the LiDAR, and acquiring second positioning information of a main body to be positioned by adopting AMCL; calculating coordinate conversion according to the coordinate information, and converting the first positioning information calculated by the UWB base station into a LiDAR map coordinate system; fitting a motion track generated by the first positioning information and a motion track generated by the second positioning information within a period of time in the past to obtain orientation information of UWB base station positioning, and completing the first positioning information according to the orientation information; judging whether the positioning of the main body to be positioned is abnormal or not, and adjusting the positioning weight according to the judgment result; if the positioning of the main body to be positioned is abnormal, initializing the AMCL according to the first positioning information, and repositioning the main body to be positioned; and finally, calculating and outputting a real-time positioning result of the main body to be positioned according to the first positioning information, the second positioning information and the positioning weight.
In the prior art, limitations such as the line of sight of a specific sensor lead to poor positioning accuracy or an inability to solve the robot kidnapping problem. By adopting the method, two low-cost sensors, UWB and LiDAR, are fused for positioning, and real-time indoor positioning is performed on embedded devices of lower cost and weaker performance. The advantages of UWB positioning and LiDAR positioning complement each other and overcome their respective defects; compared with the prior art, the positioning accuracy can be improved, the robot kidnapping problem can be solved, and positioning efficiency is ensured.
In particular implementations, the present invention also provides a computer storage medium, where the computer storage medium may store a program that, when executed, may include some or all of the steps in embodiments of a method for UWB and LiDAR fused indoor positioning provided by the present invention. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM) or a Random Access Memory (RAM).
Those skilled in the art will readily appreciate that the techniques of the embodiments of the present invention may be implemented as software plus a required general purpose hardware platform. Based on such understanding, the technical solutions in the embodiments of the present invention may be essentially or partially implemented in the form of a software product, which may be stored in a storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method according to the embodiments or some parts of the embodiments.
The same and similar parts in the various embodiments in this specification may be referred to each other. The above-described embodiments of the present invention should not be construed as limiting the scope of the present invention.