CN111190191A - Scanning matching method based on laser SLAM - Google Patents

Scanning matching method based on laser SLAM

Info

Publication number: CN111190191A
Application number: CN201911269204.4A
Authority: CN (China)
Other languages: Chinese (zh)
Inventors: 任彧, 夏天
Current and original assignee: Hangzhou Dianzi University (application filed by Hangzhou Dianzi University)
Priority and filing date: 2019-12-11
Publication date: 2020-05-22
Prior art keywords: pose, map, grid, scanning, point
Legal status: Pending (the status listed is an assumption, not a legal conclusion)

Classifications

    • G: Physics
    • G01: Measuring; Testing
    • G01S: Radio direction-finding; radio navigation; determining distance or velocity by use of radio waves; locating or presence-detecting by use of the reflection or reradiation of radio waves; analogous arrangements using other waves
    • G01S 17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88: Lidar systems specially adapted for specific applications
    • G01S 17/89: Lidar systems specially adapted for mapping or imaging
    • G: Physics
    • G05: Controlling; Regulating
    • G05D: Systems for controlling or regulating non-electric variables
    • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02: Control of position or course in two dimensions
    • G05D 1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0257: Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • G: Physics
    • G06: Computing; Calculating or Counting
    • G06T: Image data processing or generation, in general
    • G06T 11/00: 2D [two-dimensional] image generation
    • G06T 11/20: Drawing from basic elements, e.g. lines or circles
    • G06T 11/206: Drawing of charts or graphs

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Theoretical Computer Science (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention relates to a scan matching method based on laser SLAM. A lidar collects point-cloud information consisting of the angle and distance between objects and the sensor, and the pose change of the sensor's relative motion is computed by matching the point clouds at different moments; this process is called scan matching. Scan matching can be combined with odometry information to obtain a better pose estimate. In practical applications such as drones and sweeping robots, however, odometry information is usually unavailable, which makes the robot's localization and mapping capability highly dependent on the accuracy of the scan matching technique. The invention achieves better pose estimation accuracy without an odometer.

Description

Scanning matching method based on laser SLAM
Technical Field
The invention relates to the technical field of positioning and mapping, in particular to a scanning matching method based on laser SLAM.
Background
In practical applications, a robot must perform effective path planning and autonomously explore unknown environments, so it must be able to autonomously build a map of the environment and accurately localize itself within it. This problem is called SLAM (Simultaneous Localization and Mapping). SLAM is challenging because the quality of the map and the accuracy of the robot pose estimate are interdependent: an inaccurate pose estimate produces an inaccurate map, and vice versa. In practice, many factors cause pose or map errors, such as the accuracy and noise of the sensor itself, an overly simple map representation, and localization that cannot be guaranteed to converge to the global optimum.
Laser SLAM uses a lidar to collect point-cloud information consisting of the angles and distances between objects and the sensor, and computes the pose change of the sensor's relative motion by matching the point clouds at different moments; this process is called scan matching. Scan matching can be combined with odometry information to obtain a better pose estimate. In practical applications such as drones and sweeping robots, however, odometry information is usually unavailable, which makes the robot's localization and mapping capability highly dependent on the accuracy of the scan matching technique.
Disclosure of Invention
To overcome the shortcomings of existing scan matching techniques, the invention provides a scan matching method based on laser SLAM. Without an odometer, it mitigates, to a certain extent, the original method's failure to converge and the insufficient accuracy of its local approximation, achieves better pose-estimation accuracy, and effectively improves the quality of the constructed environment map.
The invention is realized by adopting the following technical scheme, which comprises the following steps:
Step one: represent the environment map with an occupancy grid map, and determine the occupancy probability of each grid cell from the point cloud at each moment; the grid map divides the environment map into a finite number of equal-sized cells, and the cell containing any given point of the actual environment has only two states: occupied, meaning an obstacle is present, or free, meaning there is no obstacle;
that is:

    m = Σ_i m_i,  i = 1…n

where m_i is a cell of the grid map and m represents the whole environment map;
for the grid map, the state of each cell m_i at time t is estimated from the laser observations and the robot poses as the probability p(m_i | z_{1:t}, x_{1:t}), where z_{1:t} are all laser observations from time 1 to time t and x_{1:t} are all robot poses from time 1 to time t; the state of cell m_i at time t is determined by its state at the previous moment together with the observation and pose at time t; each cell of the grid map stores its occupancy probability, and this value is continuously updated with a binary Bayes filter to determine the final state of the cell, where p = 1 denotes occupied and p = 0 denotes free;
Step two: match the point clouds of two consecutive moments so that their overlap is maximized, thereby obtaining the relative pose of the robot; the localization problem in laser SLAM is solved by scan matching, which solves for the relative translation T and rotation θ between the current robot pose and the pose at the previous moment by maximally overlapping the existing map with the current point cloud; to solve for the relative pose of the robot between two moments, the current scan points {p_i} are generally rotated and translated by the previous pose q and projected into the existing map M, and the degree of match between the scan points {p_i} and the map M is measured by Euclidean distance; this is a minimization problem, described as:

    q* = argmin_q Σ_{i=1}^{n} d(M, q ⊕ p_i)²

where d(M, q ⊕ p_i) is the Euclidean distance from the projection q ⊕ p_i of scan point p_i into the map under pose q; the scan point is transformed by the rotation and translation of the pose:

    q ⊕ p_i = R(θ) p_i + T,   R(θ) = [ cos θ  −sin θ ; sin θ  cos θ ]

where R(θ) is the rotation matrix and T is the translation vector;
The minimization problem is solved with the Gauss-Newton method, which determines the direction of each gradient-descent step; for a point P_m in the map M, the map value is M(P_m), and its derivative is written

    ∇M(P_m) = ( ∂M/∂x (P_m), ∂M/∂y (P_m) )

The map value at the point can be estimated from the map values at the four integer coordinates P_00, P_01, P_10, P_11 nearest to it; with unit grid spacing, the calculation formulas are:

    M(P_m) ≈ (y − y_0)[(x − x_0) M(P_11) + (x_1 − x) M(P_01)] + (y_1 − y)[(x − x_0) M(P_10) + (x_1 − x) M(P_00)]

    ∂M/∂x (P_m) ≈ (y − y_0)(M(P_11) − M(P_01)) + (y_1 − y)(M(P_10) − M(P_00))
    ∂M/∂y (P_m) ≈ (x − x_0)(M(P_11) − M(P_10)) + (x_1 − x)(M(P_01) − M(P_00))

where (x, y) are the coordinates of point P_m;
In the grid-map representation, the distance between two adjacent cells is 1; in the grid map, for the robot pose q at a given time, the minimization function is defined as:

    q* = argmin_q Σ_{i=1}^{n} [1 − M(S_i(q))]²

q* denotes the transformation at which the best match between the laser scan points and the map is found, i.e. the pose change of the robot between two laser scans; S_i(q) denotes the world coordinates of the scan endpoint s_i under the current robot pose: S_i(q) = R(θ) s_i + T; the scan endpoint is first rotated by the rotation matrix, R(θ) s_i, and the translation T is then added to obtain the world coordinates of s_i; solving for q* is converted into solving for the pose increment Δq that minimizes the error:

    Σ_{i=1}^{n} [1 − M(S_i(q + Δq))]² → min

Taking the derivative and setting it to 0 gives:

    Δq = H⁻¹ Σ_{i=1}^{n} [∇M(S_i(q)) ∂S_i(q)/∂q]ᵀ [1 − M(S_i(q))]

where ∇M(S_i(q)) denotes the map-value derivative at the cell containing the scan endpoint s_i, with

    ∂S_i(q)/∂q = [ 1  0  −sin θ · s_{i,x} − cos θ · s_{i,y} ; 0  1  cos θ · s_{i,x} − sin θ · s_{i,y} ]

and the H matrix is expressed as the square of the derivative terms:

    H = Σ_{i=1}^{n} [∇M(S_i(q)) ∂S_i(q)/∂q]ᵀ [∇M(S_i(q)) ∂S_i(q)/∂q]
Step three: find the optimal step length of each gradient-descent step with a line search; the line search solves for the optimal step length along the gradient-descent direction, and the problem is described as a one-dimensional optimization problem:

    min_{α>0} φ(α) = f(x + α h)

where f(x) = 1 − M(S_i(x)), h is the gradient-descent direction obtained by the Gauss-Newton method, and α is the step length; the idea of the method is to find the step length α that minimizes the function φ(α) = f(x + α h), and the robot pose is then obtained from the optimal solution α of the objective function as q = x + α h; S_i(x + α h) denotes the world coordinates of the scan endpoint s_i at pose x + α h, and M(S_i(x + α h)) the map value of the cell containing s_i; in the line search with the strong Wolfe conditions, α is required to satisfy:

    f(x + α h) ≤ f(x) + c₁ α ∇f(x)ᵀ h
    |∇f(x + α h)ᵀ h| ≤ c₂ |∇f(x)ᵀ h|

where 0 < c₁ < c₂ < 1; c₁ and c₂ are user-defined parameters, usually set to c₁ = 0.001 and c₂ = 0.1;
When the search direction is chosen as the "steepest-descent direction", i.e. the negative gradient direction, the line search reaches a "globally convergent" state, and the convergence rate is then linear;
Step four: solve for the relative pose from the scan-matching result; update the current robot pose q ← q + Δq with the pose increment obtained from scan matching, and update the current map from the current pose; return to step two to solve for the pose at the next moment.
Drawings
FIG. 1 is a flow chart of the present invention.
Detailed Description
The following implementation is adopted to realize the functions of the method.
The method requires the following equipment: an AGV, a PC, and a lidar. The lidar is mounted on the AGV, which carries it through the environment; the PC controls the AGV's motion and, at the same time, parses and visualizes the environment scan data acquired by the lidar. The lidar must be able to acquire environment scan information; the PC must be able to control the AGV remotely and must have the corresponding lidar driver and the ROS robot operating system installed in order to visualize the lidar mapping results.
A scanning matching method based on laser SLAM comprises the following steps:
Step one: represent the environment map with an occupancy grid map, and determine the occupancy probability of each grid cell from the point cloud at each moment.
The grid map divides the environment map into a finite number of equal-sized cells; the cell containing any given point of the actual environment has only two states: occupied, meaning an obstacle is present, or free, meaning there is no obstacle.
That is:

    m = Σ_i m_i,  i = 1…n

where m_i is a cell of the grid map and m represents the whole environment map.
For the grid map, the state of each cell m_i at time t is estimated from the laser observations and the robot poses as the probability

    p(m_i | z_{1:t}, x_{1:t})

where z_{1:t} are all laser observations from time 1 to time t and x_{1:t} are all robot poses from time 1 to time t. The state of cell m_i at time t is determined by its state at the previous moment together with the observation and pose at time t. Each cell of the grid map stores its occupancy probability; this value is continuously updated with a binary Bayes filter to determine the final state of the cell (p = 1 denotes occupied, p = 0 denotes free).
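As an illustrative sketch (not part of the claimed method), the binary Bayes filter update for a single cell can be written in log-odds form, which turns the recursive probability update into an addition. The inverse-sensor-model values `L_OCC` and `L_FREE` below are assumed parameters, not values specified by the patent:

```python
import numpy as np

# Assumed inverse-sensor-model log-odds increments (not from the patent).
L_OCC = np.log(0.7 / 0.3)    # evidence added when the beam endpoint hits the cell
L_FREE = np.log(0.4 / 0.6)   # evidence added when the beam passes through the cell

def update_cell(logodds, hit):
    """One binary Bayes filter step for a single grid cell m_i.

    `logodds` stores l_t = log(p / (1 - p)); adding per-observation
    evidence implements p(m_i | z_{1:t}, x_{1:t}) recursively.
    """
    return logodds + (L_OCC if hit else L_FREE)

def probability(logodds):
    """Recover the occupancy probability p from the stored log-odds."""
    return 1.0 - 1.0 / (1.0 + np.exp(logodds))

l = 0.0                      # prior: p = 0.5, i.e. log-odds 0
for hit in [True, True, False, True]:
    l = update_cell(l, hit)
print(probability(l) > 0.5)  # the cell converges toward "occupied"
```

Storing log-odds instead of probabilities avoids repeated multiplications and keeps the update numerically stable near p = 0 and p = 1.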
Step two: match the point clouds of two consecutive moments so that their overlap is maximized, thereby obtaining the relative pose of the robot. The localization problem in laser SLAM is solved by scan matching, which solves for the relative translation T and rotation θ between the current robot pose and the pose at the previous moment by maximally overlapping the existing map with the current point cloud. To solve for the relative pose of the robot between two moments, the current scan points {p_i} are generally rotated and translated by the previous pose q and projected into the existing map M, and the degree of match between the scan points {p_i} and the map M is measured by Euclidean distance. This is a minimization problem, described as:

    q* = argmin_q Σ_{i=1}^{n} d(M, q ⊕ p_i)²

where d(M, q ⊕ p_i) is the Euclidean distance from the projection q ⊕ p_i of scan point p_i into the map under pose q. The scan point is transformed by the rotation and translation of the pose:

    q ⊕ p_i = R(θ) p_i + T,   R(θ) = [ cos θ  −sin θ ; sin θ  cos θ ]

where R(θ) is the rotation matrix and T is the translation vector.
The minimization problem is solved using the Gauss-Newton method, and the direction of each gradient-descent step is determined.
The discrete values in a grid map do not allow direct calculation of their derivatives, and they also limit the attainable accuracy of the map. For this reason, bilinear interpolation is employed to estimate the probability values and derivatives of the grid, so that the grid map can be viewed as carrying a continuous probability distribution.
For a point P_m in the map M, the map value is M(P_m), and its derivative is written

    ∇M(P_m) = ( ∂M/∂x (P_m), ∂M/∂y (P_m) )

The map value at the point can be estimated from the map values at the four integer coordinates P_00, P_01, P_10, P_11 nearest to it; with unit grid spacing, the calculation formulas are:

    M(P_m) ≈ (y − y_0)[(x − x_0) M(P_11) + (x_1 − x) M(P_01)] + (y_1 − y)[(x − x_0) M(P_10) + (x_1 − x) M(P_00)]

    ∂M/∂x (P_m) ≈ (y − y_0)(M(P_11) − M(P_01)) + (y_1 − y)(M(P_10) − M(P_00))
    ∂M/∂y (P_m) ≈ (x − x_0)(M(P_11) − M(P_10)) + (x_1 − x)(M(P_01) − M(P_00))

where (x, y) are the coordinates of point P_m.
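A minimal sketch of this bilinear interpolation, returning both the interpolated map value and its gradient under the unit-spacing assumption; the array layout (row index y, column index x) and names are illustrative choices:

```python
import numpy as np

def interp_map(M, x, y):
    """Bilinearly interpolate the map value M(P_m) and its gradient.

    M is a 2-D occupancy-probability array; grid spacing is 1, so
    x0 = floor(x), x1 = x0 + 1 (likewise for y), matching the four
    integer neighbours P00, P01, P10, P11 in the text.
    """
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    tx, ty = x - x0, y - y0                        # (x - x0), (y - y0)
    m00, m10 = M[y0, x0], M[y0, x0 + 1]            # P00, P10 on row y0
    m01, m11 = M[y0 + 1, x0], M[y0 + 1, x0 + 1]    # P01, P11 on row y1
    value = (ty * (tx * m11 + (1 - tx) * m01)
             + (1 - ty) * (tx * m10 + (1 - tx) * m00))
    dx = ty * (m11 - m01) + (1 - ty) * (m10 - m00)  # ∂M/∂x
    dy = tx * (m11 - m10) + (1 - tx) * (m01 - m00)  # ∂M/∂y
    return value, np.array([dx, dy])

M = np.array([[0.0, 0.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 1.0, 1.0]])
v, g = interp_map(M, 1.5, 1.5)
print(v)  # 1.0: query point lies inside the fully occupied 2x2 patch
```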
In the grid-map representation, the distance between two adjacent cells is 1 (in map coordinates; the actual physical distance between them is determined by the map resolution, i.e. the cell size). In the grid map, for the robot pose q at a given time, the minimization function is defined as:

    q* = argmin_q Σ_{i=1}^{n} [1 − M(S_i(q))]²

q* denotes the transformation at which the best match between the laser scan points and the map is found, i.e. the pose change of the robot between two laser scans. S_i(q) denotes the world coordinates of the scan endpoint s_i under the current robot pose: S_i(q) = R(θ) s_i + T; the scan endpoint is first rotated by the rotation matrix, R(θ) s_i, and the translation T is then added to obtain the world coordinates of s_i. Solving for q* is converted into solving for the pose increment Δq that minimizes the error:

    Σ_{i=1}^{n} [1 − M(S_i(q + Δq))]² → min

Taking the derivative and setting it to 0 gives:

    Δq = H⁻¹ Σ_{i=1}^{n} [∇M(S_i(q)) ∂S_i(q)/∂q]ᵀ [1 − M(S_i(q))]

where ∇M(S_i(q)) denotes the map-value derivative at the cell containing the scan endpoint s_i, with

    ∂S_i(q)/∂q = [ 1  0  −sin θ · s_{i,x} − cos θ · s_{i,y} ; 0  1  cos θ · s_{i,x} − sin θ · s_{i,y} ]

and the H matrix is expressed as the square of the derivative terms:

    H = Σ_{i=1}^{n} [∇M(S_i(q)) ∂S_i(q)/∂q]ᵀ [∇M(S_i(q)) ∂S_i(q)/∂q]
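Under these formulas, one Gauss-Newton increment can be sketched as follows. The toy map in the check is an assumed synthetic smooth bowl standing in for a real interpolated grid map, and the pseudo-inverse is used only so the single-point toy example's rank-deficient H does not break the solve:

```python
import numpy as np

def gauss_newton_step(q, scan, map_value_grad):
    """One Gauss-Newton increment Δq = H⁻¹ Σ_i [∇M ∂S_i/∂q]ᵀ [1 − M(S_i(q))].

    q is the pose (tx, ty, θ); scan is an (n, 2) array of endpoints s_i;
    map_value_grad(p) returns (M(p), ∇M(p)), e.g. by bilinear interpolation.
    """
    tx, ty, th = q
    c, s = np.cos(th), np.sin(th)
    R = np.array([[c, -s], [s, c]])
    H = np.zeros((3, 3))
    g = np.zeros(3)
    for sx, sy in scan:
        Si = R @ np.array([sx, sy]) + np.array([tx, ty])  # S_i(q) = R(θ) s_i + T
        m, dM = map_value_grad(Si)
        # ∂S_i/∂q: identity w.r.t. translation, rotation derivative w.r.t. θ
        dS = np.array([[1.0, 0.0, -s * sx - c * sy],
                       [0.0, 1.0,  c * sx - s * sy]])
        J = dM @ dS                     # row vector ∇M(S_i) ∂S_i/∂q
        H += np.outer(J, J)
        g += J * (1.0 - m)
    return np.linalg.pinv(H) @ g        # pinv tolerates a rank-deficient H

# Assumed synthetic map: a smooth bowl M(p) = 1 - ||p||²/2, one scan point
# at the origin, current pose offset along x.
def toy_map(p):
    return 1.0 - (p[0] ** 2 + p[1] ** 2) / 2.0, np.array([-p[0], -p[1]])

dq = gauss_newton_step(np.array([0.5, 0.0, 0.0]), np.array([[0.0, 0.0]]), toy_map)
print(dq)  # a step toward the map maximum at the origin (≈ [-0.25, 0, 0])
```

The pose update q ← q + Δq then moves the projected scan toward higher map values, shrinking the residual 1 − M(S_i(q)).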
Step three: find the optimal step length of each gradient-descent step with a line search.
The line search is usually used to solve for the optimal step length along the gradient-descent direction, and the problem is described as a one-dimensional optimization problem:

    min_{α>0} φ(α) = f(x + α h)

where f(x) = 1 − M(S_i(x)), h is the gradient-descent direction obtained by the Gauss-Newton method, and α is the step length. The idea of the method is to find the step length α that minimizes the function φ(α) = f(x + α h); the robot pose is then obtained from the optimal solution α of the objective function as q = x + α h. S_i(x + α h) denotes the world coordinates of the scan endpoint s_i at pose x + α h, and M(S_i(x + α h)) the map value of the cell containing s_i.
To improve the search efficiency, an inexact line search is adopted; in the line search with the strong Wolfe conditions, α is required to satisfy:

    f(x + α h) ≤ f(x) + c₁ α ∇f(x)ᵀ h
    |∇f(x + α h)ᵀ h| ≤ c₂ |∇f(x)ᵀ h|

where 0 < c₁ < c₂ < 1; c₁ and c₂ are user-defined parameters that can be set manually, usually c₁ = 0.001 and c₂ = 0.1.
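The two strong Wolfe conditions can be checked directly. The sketch below uses an assumed toy quadratic objective in place of the scan-matching cost, with c₁ = 0.001 and c₂ = 0.1 as in the text:

```python
import numpy as np

def satisfies_strong_wolfe(f, grad, x, h, alpha, c1=0.001, c2=0.1):
    """Check the two strong Wolfe conditions for step length alpha.

    Sufficient decrease: f(x + αh) ≤ f(x) + c1 α ∇f(x)ᵀh
    Curvature:           |∇f(x + αh)ᵀh| ≤ c2 |∇f(x)ᵀh|
    """
    g0 = grad(x) @ h
    xa = x + alpha * h
    decrease = f(xa) <= f(x) + c1 * alpha * g0
    curvature = abs(grad(xa) @ h) <= c2 * abs(g0)
    return decrease and curvature

# Toy objective f(x) = ||x||², searched along the steepest-descent direction.
f = lambda x: float(x @ x)
grad = lambda x: 2.0 * x
x = np.array([1.0, 0.0])
h = -grad(x)                                        # negative gradient direction
print(satisfies_strong_wolfe(f, grad, x, h, 0.5))   # True: exact minimizer step
print(satisfies_strong_wolfe(f, grad, x, h, 0.01))  # False: fails curvature
```

A too-small step passes sufficient decrease but fails the curvature condition, which is exactly what forces the inexact search to take usefully large steps.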
The inexact line search first finds an interval containing a solution, and then uses the zoom procedure to find a solution satisfying the constraint conditions. The method specifically comprises:
Step (1): find bounds on the step length such that the optimal solution α* is contained within the interval.
Step (2): evaluate the function φ(α_i) = f(x + α_i h) at the current step-length value and determine whether the sufficient-decrease condition is satisfied. If it is not, the optimal solution lies in (α_{i−1}, α_i): go to the zoom procedure. Otherwise, verify whether the strong Wolfe curvature condition is met; if so, α* = α_i: output α_i. If the condition is not met and the current gradient is positive, swap the previous and current step lengths and go to the zoom procedure.
Step (3): if the above conditions are not satisfied, repeat step (2) to solve for the next step-length point.
The zoom procedure takes a step-length interval (α_lo, α_hi) and solves for a step length satisfying the strong Wolfe conditions within that interval. The zoom procedure comprises: 1. determine whether the current trial value φ(α_j) satisfies the sufficient-decrease (Wolfe) condition; if not, shrink the interval, compute the next step-length point, and go to step (2); 2. determine whether the strong Wolfe curvature condition is satisfied; if so, α* = α_j: output α_j; 3. determine whether the current interval is a decreasing interval; if so, adjust the interval so that it satisfies the algorithm's input conditions, then repeat the zoom procedure.
When the search direction is chosen as the "steepest-descent direction", i.e. the negative gradient direction, the line search reaches a "globally convergent" state, and the convergence rate is then linear.
Step four: solve for the relative pose from the scan-matching result. Update the current robot pose q ← q + Δq with the pose increment obtained from scan matching, and update the current map from the current pose. Return to step two to solve for the pose at the next moment.
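Steps two through four form one iteration per laser scan: compute an increment, search for a step length, update the pose, repeat. This loop can be sketched abstractly; an assumed synthetic smooth cost and a simple backtracking search stand in here for the interpolated map and the strong Wolfe search:

```python
import numpy as np

# Synthetic cost standing in for Σ [1 - M(S_i(q))]²; its minimizer plays
# the role of the best-matching pose. Names and values are illustrative.
def cost_and_grad(q, target=np.array([1.0, 2.0, 0.1])):
    e = q - target
    return float(e @ e), 2.0 * e

q = np.zeros(3)                              # initial pose estimate
for _ in range(50):
    c, g = cost_and_grad(q)
    h = -g                                   # steepest-descent direction
    alpha, best = 1.0, None
    while alpha > 1e-6:                      # backtracking step-length search
        if cost_and_grad(q + alpha * h)[0] < c:
            best = alpha
            break
        alpha *= 0.5
    if best is None:                         # no further decrease possible
        break
    dq = best * h
    q = q + dq                               # pose update q ← q + Δq
    if np.linalg.norm(dq) < 1e-9:
        break
print(np.allclose(q, [1.0, 2.0, 0.1]))       # converged to the minimizer
```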

Claims (3)

1. A scan matching method based on laser SLAM, characterized in that the method comprises the following steps:
Step one: represent the environment map with an occupancy grid map, and determine the occupancy probability of each grid cell from the point cloud at each moment; the grid map divides the environment map into a finite number of equal-sized cells, and the cell containing any given point of the actual environment has only two states: occupied, meaning an obstacle is present, or free, meaning there is no obstacle;
that is:

    m = Σ_i m_i,  i = 1…n

where m_i is a cell of the grid map and m represents the whole environment map;
for the grid map, the state of each cell m_i at time t is estimated from the laser observations and the robot poses as the probability p(m_i | z_{1:t}, x_{1:t}), where z_{1:t} are all laser observations from time 1 to time t and x_{1:t} are all robot poses from time 1 to time t; the state of cell m_i at time t is determined by its state at the previous moment together with the observation and pose at time t; each cell of the grid map stores its occupancy probability, and this value is continuously updated with a binary Bayes filter to determine the final state of the cell, where p = 1 denotes occupied and p = 0 denotes free;
Step two: match the point clouds of two consecutive moments so that their overlap is maximized, thereby obtaining the relative pose of the robot; the localization problem in laser SLAM is solved by scan matching, which solves for the relative translation T and rotation θ between the current robot pose and the pose at the previous moment by maximally overlapping the existing map with the current point cloud; to solve for the relative pose of the robot between two moments, the current scan points {p_i} are generally rotated and translated by the previous pose q and projected into the existing map M, and the degree of match between the scan points {p_i} and the map M is measured by Euclidean distance; this is a minimization problem, described as:

    q* = argmin_q Σ_{i=1}^{n} d(M, q ⊕ p_i)²

where d(M, q ⊕ p_i) is the Euclidean distance from the projection q ⊕ p_i of scan point p_i into the map under pose q; the scan point is transformed by the rotation and translation of the pose:

    q ⊕ p_i = R(θ) p_i + T,   R(θ) = [ cos θ  −sin θ ; sin θ  cos θ ]

where R(θ) is the rotation matrix and T is the translation vector;
The minimization problem is solved with the Gauss-Newton method, which determines the direction of each gradient-descent step; for a point P_m in the map M, the map value is M(P_m), and its derivative is written

    ∇M(P_m) = ( ∂M/∂x (P_m), ∂M/∂y (P_m) )

The map value at the point can be estimated from the map values at the four integer coordinates P_00, P_01, P_10, P_11 nearest to it; with unit grid spacing, the calculation formulas are:

    M(P_m) ≈ (y − y_0)[(x − x_0) M(P_11) + (x_1 − x) M(P_01)] + (y_1 − y)[(x − x_0) M(P_10) + (x_1 − x) M(P_00)]

    ∂M/∂x (P_m) ≈ (y − y_0)(M(P_11) − M(P_01)) + (y_1 − y)(M(P_10) − M(P_00))
    ∂M/∂y (P_m) ≈ (x − x_0)(M(P_11) − M(P_10)) + (x_1 − x)(M(P_01) − M(P_00))

where (x, y) are the coordinates of point P_m;
In the grid-map representation, the distance between two adjacent cells is 1; in the grid map, for the robot pose q at a given time, the minimization function is defined as:

    q* = argmin_q Σ_{i=1}^{n} [1 − M(S_i(q))]²

q* denotes the transformation at which the best match between the laser scan points and the map is found, i.e. the pose change of the robot between two laser scans; S_i(q) denotes the world coordinates of the scan endpoint s_i under the current robot pose: S_i(q) = R(θ) s_i + T; the scan endpoint is first rotated by the rotation matrix, R(θ) s_i, and the translation T is then added to obtain the world coordinates of s_i; solving for q* is converted into solving for the pose increment Δq that minimizes the error:

    Σ_{i=1}^{n} [1 − M(S_i(q + Δq))]² → min

Taking the derivative and setting it to 0 gives:

    Δq = H⁻¹ Σ_{i=1}^{n} [∇M(S_i(q)) ∂S_i(q)/∂q]ᵀ [1 − M(S_i(q))]

where ∇M(S_i(q)) denotes the map-value derivative at the cell containing the scan endpoint s_i, with

    ∂S_i(q)/∂q = [ 1  0  −sin θ · s_{i,x} − cos θ · s_{i,y} ; 0  1  cos θ · s_{i,x} − sin θ · s_{i,y} ]

and the H matrix is expressed as the square of the derivative terms:

    H = Σ_{i=1}^{n} [∇M(S_i(q)) ∂S_i(q)/∂q]ᵀ [∇M(S_i(q)) ∂S_i(q)/∂q]
Step three: find the optimal step length of each gradient-descent step with a line search; the line search solves for the optimal step length along the gradient-descent direction, and the problem is described as a one-dimensional optimization problem:

    min_{α>0} φ(α) = f(x + α h)

where f(x) = 1 − M(S_i(x)), h is the gradient-descent direction obtained by the Gauss-Newton method, and α is the step length; the idea of the method is to find the step length α that minimizes the function φ(α) = f(x + α h), and the robot pose is then obtained from the optimal solution α of the objective function as q = x + α h; S_i(x + α h) denotes the world coordinates of the scan endpoint s_i at pose x + α h, and M(S_i(x + α h)) the map value of the cell containing s_i; in the line search with the strong Wolfe conditions, α is required to satisfy:

    f(x + α h) ≤ f(x) + c₁ α ∇f(x)ᵀ h
    |∇f(x + α h)ᵀ h| ≤ c₂ |∇f(x)ᵀ h|

where 0 < c₁ < c₂ < 1; c₁ and c₂ are user-defined parameters, usually set to c₁ = 0.001 and c₂ = 0.1;
When the search direction is chosen as the "steepest-descent direction", i.e. the negative gradient direction, the line search reaches a "globally convergent" state, and the convergence rate is then linear;
Step four: solve for the relative pose from the scan-matching result; update the current robot pose q ← q + Δq with the pose increment obtained from scan matching, and update the current map from the current pose; return to step two to solve for the pose at the next moment.
2. The laser-SLAM-based scan matching method according to claim 1, characterized in that the line search of step three first finds an interval containing a solution, and then uses the zoom procedure to find a solution satisfying the constraint conditions; the method specifically comprises:
Step (1): find bounds on the step length such that the optimal solution α* is contained within the interval;
Step (2): evaluate the function φ(α_i) = f(x + α_i h) at the current step-length value and determine whether the sufficient-decrease condition is satisfied; if it is not, the optimal solution lies in (α_{i−1}, α_i): go to the zoom procedure; otherwise, verify whether the strong Wolfe curvature condition is met, and if so, α* = α_i: output α_i; if the condition is not met and the current gradient is positive, swap the previous and current step lengths and go to the zoom procedure;
Step (3): if the above conditions are not satisfied, repeat step (2) to solve for the next step-length point.
3. The laser-SLAM-based scan matching method according to claim 2, characterized in that the zoom procedure takes a step-length interval (α_lo, α_hi) and solves for a step length satisfying the strong Wolfe conditions within that interval; specifically: 1. determine whether the current trial value φ(α_j) satisfies the sufficient-decrease (Wolfe) condition; if not, shrink the interval, compute the next step-length point, and go to step (2); 2. determine whether the strong Wolfe curvature condition is satisfied; if so, α* = α_j: output α_j; 3. determine whether the current interval is a decreasing interval; if so, adjust the interval so that it satisfies the input conditions, then repeat the zoom procedure.
CN201911269204.4A 2019-12-11 Scanning matching method based on laser SLAM: Pending, CN111190191A (en)

Priority Applications (1)

Application number CN201911269204.4A; priority and filing date 2019-12-11; title: Scanning matching method based on laser SLAM

Publications (1)

Publication number CN111190191A; publication date 2020-05-22; status: Pending

Family

Family ID: 70709182
Family application: CN201911269204.4A (pending), published as CN111190191A
Country: China (CN)



Legal Events

Application publication date: 2020-05-22
PB01: Publication
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication