CN110488818B - Laser radar-based robot positioning method and device and robot

Laser radar-based robot positioning method and device and robot

Info

Publication number
CN110488818B
Authority
CN
China
Prior art keywords
robot
positioning
obtaining
variance
particle filter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910731221.9A
Other languages
Chinese (zh)
Other versions
CN110488818A (en)
Inventor
罗丹平
叶力荣
张国栋
闫瑞君
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Silver Star Intelligent Group Co Ltd
Original Assignee
Shenzhen Silver Star Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Silver Star Intelligent Technology Co Ltd filed Critical Shenzhen Silver Star Intelligent Technology Co Ltd
Priority to CN201910731221.9A priority Critical patent/CN110488818B/en
Publication of CN110488818A publication Critical patent/CN110488818A/en
Application granted granted Critical
Publication of CN110488818B publication Critical patent/CN110488818B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The embodiment of the invention relates to a laser radar-based robot positioning method and device and a robot, wherein the method comprises the following steps: obtaining an initial estimated position of the robot; obtaining a local particle filter model according to the initial estimated position and a map; obtaining obstacle information of the environment around the robot through the laser radar; and positioning the robot by using the local particle filter model and the obstacle information to obtain the position of the robot in the map. According to the laser radar-based robot positioning method and device and the robot, the initial estimated position of the robot is obtained, the local particle filter model is obtained by using the initial estimated position and the map, and the robot is positioned by using the local particle filter model. Because the search range of the local particle filter model is small relative to that of global positioning, the positioning speed is high.

Description

Laser radar-based robot positioning method and device and robot
Technical Field
The embodiment of the invention relates to the technical field of artificial intelligence, in particular to a robot positioning method and device based on a laser radar and a robot.
Background
Robots are popular because they can take over heavy housework. A robot needs to move in an unknown environment while completing the user's tasks. In order to achieve autonomous positioning and navigation during movement, an incremental map needs to be built, and the robot's position in the map needs to be estimated.
In the process of implementing the invention, the inventor found that at least the following problems exist in the related art: when the robot is manually moved from one pose to another, its position and pose change greatly relative to those before the movement. If map construction simply continues at this point, the pose error causes inaccurate and overlapping map construction, so the robot needs to be repositioned. At present, a global positioning method is directly adopted to reposition the robot, but its search range is wide, so the positioning speed is low.
Disclosure of Invention
The embodiment of the invention aims to provide a laser radar-based robot positioning method and device with high positioning speed and a robot.
In a first aspect, an embodiment of the present invention provides a laser radar-based robot positioning method, which is used for a robot including a laser radar, and the method includes:
obtaining an initial estimated position of the robot;
obtaining a local particle filter model according to the initial estimation position and the map;
obtaining obstacle information of the environment around the robot through the laser radar;
and positioning the robot by using the local particle filter model and the obstacle information to obtain the position of the robot in the map.
In some embodiments, the local particle filter model comprises a first local particle filter model and a second local particle filter model;
the positioning the robot by using the local particle filter model and the obstacle information to obtain the position of the robot in the map includes:
positioning the robot by using the first local particle filter model and the obstacle information to obtain a first positioning result;
positioning the robot by using the second local particle filter model and the obstacle information to obtain a second positioning result;
and fusing the first positioning result and the second positioning result to obtain the position of the robot in the map.
In some embodiments, the first localization result comprises a first localization mean and a first localization variance, and the second localization result comprises a second localization mean and a second localization variance;
the fusing the first positioning result and the second positioning result to obtain the position of the robot in the map comprises:
if the first positioning variance and the second positioning variance both fall to or below a preset variance threshold within a preset time, and the difference between the first positioning mean and the second positioning mean falls to or below a preset difference threshold within the preset time, then,
and obtaining the position of the robot in the map according to the first positioning mean, the first positioning variance, the second positioning mean and the second positioning variance.
In some embodiments, the fusing the first positioning result and the second positioning result to obtain the position of the robot in the map further comprises:
if the first positioning variance or the second positioning variance does not fall to or below a preset variance threshold within a preset time, or the first positioning variance and the second positioning variance both fall to or below the preset variance threshold within the preset time but the difference between the first positioning mean and the second positioning mean does not fall to or below a preset difference threshold within the preset time, then,
and obtaining a third global particle filter model according to the map, and positioning the robot by using the third global particle filter model to obtain the position of the robot in the map.
In some embodiments, the obtaining the position of the robot in the map according to the first positioning mean, the first positioning variance, the second positioning mean, and the second positioning variance includes:
respectively obtaining a first positioning weight and a second positioning weight according to the first positioning variance and the second positioning variance;
and carrying out weighted average on the first positioning mean value and the second positioning mean value according to the first positioning weight and the second positioning weight to obtain the position of the robot in the map.
In some embodiments, the obtaining an initial estimated position of the robot comprises:
obtaining an original position and an original coordinate system of the robot, wherein the original position is a position of the robot before the robot is moved artificially;
dividing the time of the artificial movement of the robot into at least two sampling time intervals;
in any sampling time interval, establishing a local coordinate system according to the pose of the robot, obtaining the displacement increment of the robot under the local coordinate system, and converting the displacement increment under the local coordinate system into the displacement increment under the original coordinate system;
and obtaining the initial estimation position of the robot according to the original position and the displacement increment of the robot in the original coordinate system in each sampling time interval.
In some embodiments, the obtaining an initial estimated position of the robot further comprises:
obtaining an angular velocity and an acceleration of the robot;
performing time integration on the angular speed within the sampling time interval to obtain the pose of the robot;
performing time integration on the acceleration within the sampling time interval to obtain the speed of the robot;
and obtaining the displacement increment of the robot in the local coordinate system within the sampling time interval according to the speed and the acceleration of the robot.
In some embodiments, said locating the robot using the local particle filter model and the obstacle information comprises:
updating the weight of each particle in the local particle filter model according to the obstacle information;
and obtaining a positioning mean value and a positioning variance according to the weight and the positions of the particles in the local particle filter model.
In a second aspect, an embodiment of the present invention provides a lidar-based robot positioning apparatus for a robot, where the robot includes a lidar, and the apparatus includes:
an initial estimated position obtaining module for obtaining an initial estimated position of the robot;
the local particle filter model obtaining module is used for obtaining a local particle filter model according to the initial estimation position and the map;
the obstacle information acquisition module is used for acquiring obstacle information of the surrounding environment of the robot through the laser radar;
and the positioning module is used for positioning the robot by utilizing the local particle filter model and the obstacle information to obtain the position of the robot in the map.
In some embodiments, the local particle filter model comprises a first local particle filter model and a second local particle filter model;
the positioning module is used for:
positioning the robot by using the first local particle filter model and the obstacle information to obtain a first positioning result;
positioning the robot by using the second local particle filter model and the obstacle information to obtain a second positioning result;
and fusing the first positioning result and the second positioning result to obtain the position of the robot in the map.
In a third aspect, an embodiment of the present invention provides a robot, including:
a robot main body;
the walking mechanism is arranged on the robot main body;
the laser radar and the inertia measurement unit are arranged on the robot main body, and the laser radar is used for obtaining obstacle information of the environment around the robot;
the controller is arranged in the robot main body and connected with the laser radar and the inertia measuring unit;
the controller includes:
at least one processor, and
a memory communicatively coupled to the at least one processor, the memory storing instructions executable by the at least one processor to enable the at least one processor to perform the method described above.
In a fourth aspect, embodiments of the present invention provide a non-transitory computer-readable storage medium having stored thereon computer-executable instructions that, when executed by a robot, cause the robot to perform a method as described above.
According to the laser radar-based robot positioning method and device and the robot described above, the initial estimated position of the robot is obtained, the local particle filter model is obtained by using the initial estimated position and the map, and the robot is positioned by using the local particle filter model. Because the search range of the local particle filter model is small relative to that of global positioning, the positioning speed is high.
Drawings
One or more embodiments are illustrated by way of example in the accompanying drawings; elements with like reference numerals in the figures denote similar elements, and the figures are not drawn to scale unless otherwise specified.
Fig. 1 is a schematic view of an application scenario of a robot positioning method and apparatus according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of one embodiment of the robot of the present invention;
FIG. 3 is a schematic flow chart diagram illustrating one embodiment of a robot positioning method of the present invention;
FIG. 4 is a schematic diagram of an original coordinate system and a local coordinate system in an embodiment of the robot positioning method of the present invention;
FIG. 5 is a schematic diagram illustrating updating particle weight values according to an embodiment of the present invention;
FIG. 6 is a schematic flow chart diagram illustrating one embodiment of a robot positioning method of the present invention;
FIG. 7 is a schematic structural diagram of one embodiment of the robotic positioning device of the present invention;
fig. 8 is a schematic diagram of the hardware structure of the controller in one embodiment of the robot of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The laser radar-based robot positioning method and device provided by the embodiment of the invention can be applied to the application scene shown in fig. 1. The application scenario includes a robot 10. The robot 10 may be a mobile robot, such as a sweeping robot, an inspection robot, an unmanned sampling robot, an unmanned forklift, and so on.
The robot 10 may need to move in an unknown environment in order to accomplish a user's task or for other reasons. In order to realize autonomous positioning and navigation during movement, an incremental map needs to be built while positioning is carried out simultaneously, that is, the robot's position in the incremental map is estimated.
When the robot is manually moved from one pose to another, its position and pose change greatly relative to those before the movement. If map construction simply continues at this point, the pose error causes inaccurate and overlapping map construction, so the robot needs to be repositioned.
If global positioning is directly carried out in a map, the wide search range makes positioning slow. In the embodiment of the invention, an initial estimation is first performed to obtain the initial estimated position of the robot, a local particle filter model is then obtained in the map according to the initial estimated position, and the robot is positioned by using the local particle filter model. The local particle filter model comprises at least two particles, which are located in a certain region near the initial estimated position. Global positioning, by contrast, positions the robot with a global particle filter model whose particles are distributed over the map according to a certain distribution rule. Because the search range of the local particle filter model is small relative to that of global positioning, the positioning speed is high.
In order to make autonomous positioning of the robot more accurate, the robot must position itself with reference to its external environment, where the robot can measure the environment using external sensors (e.g., ultrasound, laser, vision sensors, etc.). In the embodiment of the invention, the robot detects the obstacle data of the surrounding environment by using the laser radar carried by the robot, and positions the robot by using the local particle filter model and the obstacle data.
In some embodiments, referring to fig. 2, the robot 10 includes a robot main body 11, a laser radar 12, a controller 13, a traveling mechanism 14, and an inertia measurement unit 15. The robot main body 11 is the main structure of the robot; its shape, structure, and material (such as hard plastic, or a metal such as aluminum or iron) can be chosen according to the actual needs of the robot 10. For example, it may be configured as the flat cylinder common to sweeping robots.
The traveling mechanism 14 is a structural device provided on the robot main body 11 that gives the robot 10 the ability to move. It can be implemented with any type of moving means, such as rollers or tracks. The laser radar 12 is used for sensing obstacles in the environment around the robot and obtaining obstacle information. In other embodiments, other sensors, such as RGB-D cameras, may be used instead of a laser radar.
In some of these embodiments, the inertia measurement unit 15 includes a gyroscope, an accelerometer, and the like, for obtaining the angular velocity and acceleration of the robot; the position of the robot is estimated from the angular velocity and acceleration to obtain the initial estimated position of the robot. In other embodiments, other methods or other measurement units may be used to obtain the initial estimated position of the robot.
The controller 13 is an electronic computing core built in the robot main body 11 for executing logical operation steps to realize intelligent control of the robot 10. The controller 13 is connected to the laser radar 12 and the inertial measurement unit 15, and is configured to execute a preset algorithm to perform map composition and robot positioning according to data collected by the laser radar 12 and the inertial measurement unit 15.
It should be noted that, depending on the task to be performed, one or more other functional modules (such as a water tank or a cleaning device) may be mounted on the robot main body 11 in addition to the above functional modules, and cooperate with each other to perform the corresponding task.
Fig. 3 is a schematic flowchart of a lidar-based robot positioning method according to an embodiment of the present invention, which may be performed by the robot 10 shown in fig. 1 or fig. 2 (specifically, in some embodiments, by the controller 13 in the robot 10). As shown in fig. 3, the method includes:
101: an initial estimated position of the robot is obtained.
The initial estimated position is an estimate of the position to which the robot has been artificially moved. In some embodiments, an Inertial Measurement Unit (IMU) may be used to obtain the angular velocity and acceleration of the robot; from these, the displacement of the robot after the artificial movement relative to before the movement is obtained, and the initial estimated position is then obtained from the original position (i.e., the position before the artificial movement).
The pose of the robot changes constantly as it travels from the original position to the position after being moved. Therefore, for ease of calculation, the time of the artificial movement (i.e., from before to after the movement) is divided into a plurality of sampling time intervals Δt; because Δt is sufficiently small, the acceleration, velocity, angular velocity, and direction of the robot are considered constant within each Δt. Accumulating the displacement increment of the robot in each Δt yields the displacement of the robot relative to the original position after being moved, and hence the initial estimated position. Whether the robot has been lifted and put down can be detected through a tact switch or a ground detection module arranged on the robot.
Taking a sampling time interval [t, t+Δt] as an example, the process of obtaining the displacement increment of the robot within Δt is as follows. Referring to fig. 4, P(0) denotes the original position, whose original coordinate system is the X-Y-Z coordinate system; P(t) denotes the position of the robot at time t, whose local coordinate system is the x(t)-y(t)-z(t) coordinate system. The angular velocity at time t-Δt can be obtained through the IMU; integrating it over [t-Δt, t] gives the angle change of the robot at time t relative to time t-Δt. Iterating over the angle changes within each Δt before time t gives the angle at time t relative to the original position. For example, the angle may be expressed in RPY form as (r_t, p_t, y_t), denoting the rotation angles of the robot about the X, Y, and Z axes at time t. From this angle, the coordinate system x(t)-y(t)-z(t) at time t can be established.
The acceleration (imuax_t, imuay_t, imuaz_t) at time t and the acceleration at time t-Δt are obtained through the IMU; integrating the acceleration at time t-Δt over [t-Δt, t] gives the velocity (vx_t, vy_t, vz_t) at time t. With the projections of the gravitational acceleration on the three axes of the x(t)-y(t)-z(t) coordinate system denoted (gx_t, gy_t, gz_t), the effective acceleration at time t is
(ax_t, ay_t, az_t) = (imuax_t, imuay_t, imuaz_t) - (gx_t, gy_t, gz_t).
Substituting the velocity (vx_t, vy_t, vz_t) and the effective acceleration (ax_t, ay_t, az_t) into the kinematic formula
Δd = v·Δt + (1/2)·a·Δt²,
the displacement increment within the interval Δt can be calculated. This displacement increment is expressed in the local coordinate system x(t)-y(t)-z(t) and can be converted into a displacement increment in the original coordinate system X-Y-Z through a spatial coordinate transformation. Adding the coordinates of the original position and the displacement increments of the robot in the X-Y-Z coordinate system over all sampling time intervals yields the initial estimated position of the robot.
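To make the scheme concrete, here is a minimal Python sketch of one sampling interval of this dead reckoning. It assumes the accelerometer reports raw (gravity-included) acceleration in the local frame and accumulates orientation as a rotation matrix via Rodrigues' formula instead of the RPY angles above; the names, the gravity sign convention, and the helper structure are illustrative assumptions, not the patent's implementation.

    import numpy as np

    def rotation_from_gyro(w, dt):
        # Rodrigues' formula: rotation produced by angular velocity w (rad/s)
        # held constant over dt, matching the constant-rate assumption above.
        angle = np.linalg.norm(w) * dt
        if angle < 1e-12:
            return np.eye(3)
        k = w / np.linalg.norm(w)
        K = np.array([[0.0, -k[2], k[1]],
                      [k[2], 0.0, -k[0]],
                      [-k[1], k[0], 0.0]])
        return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

    def dead_reckon_step(p, v, R, gyro, accel, dt,
                         g=np.array([0.0, 0.0, -9.81])):
        # p, v: position and velocity in the original X-Y-Z frame.
        # R: rotation from the local frame x(t)-y(t)-z(t) to the original frame.
        # gyro, accel: IMU angular velocity and raw acceleration in the local frame.
        a = R @ accel + g                  # remove gravity: effective acceleration
        dp = v * dt + 0.5 * a * dt ** 2    # displacement increment over this dt
        return p + dp, v + a * dt, R @ rotation_from_gyro(gyro, dt)

Accumulating the increments dp over all sampling intervals and adding them to the original position yields the initial estimated position described above.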
102: and obtaining a local particle filter model according to the initial estimation position and the map.
Specifically, in some embodiments, to obtain the local particle filter model, a region in the map is first determined from the initial estimated position (a certain region near the initial estimated position), and a plurality of particles are then sampled randomly in that region according to a certain distribution rule (e.g., uniform distribution, Gaussian distribution, etc.). Each particle has a position in the map.
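As an illustration, a minimal sketch of this sampling step in Python, assuming a Gaussian distribution around the initial estimated pose; the particle count and standard deviations are illustrative values, not taken from the patent:

    import numpy as np

    def init_local_particles(est_pose, n=500, pos_sigma=0.5, yaw_sigma=0.3):
        # est_pose: initial estimated pose (x, y, theta) in map coordinates.
        # Particles are drawn in a region near the estimate; pos_sigma is in
        # meters and yaw_sigma in radians (assumed values).
        x0, y0, th0 = est_pose
        particles = np.column_stack([
            np.random.normal(x0, pos_sigma, n),
            np.random.normal(y0, pos_sigma, n),
            np.random.normal(th0, yaw_sigma, n),
        ])
        weights = np.full(n, 1.0 / n)   # start with uniform weights
        return particles, weights

A global particle filter model would differ only in drawing particles over the whole map rather than this local region.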
103: and obtaining obstacle information of the surrounding environment of the robot through the laser radar.
And detecting obstacles in the surrounding environment of the robot through a laser radar to obtain obstacle information of the surrounding environment, wherein the obstacle information comprises the distance, the direction and the like of the obstacle from the robot.
And 104, positioning the robot by using the local particle filter model and the obstacle information to obtain the position of the robot in the map.
Each particle in the particle filter model represents a possible position of the robot. Assuming, for example, that the robot moves 0.1 meter and rotates 0.7 radian, the state of each particle can be predicted with the system model to obtain each particle's predicted position.
The weight of each particle may be updated according to the obstacle information measured by the laser radar, to ensure that particles closer to the true position get higher weights. Specifically, if a particle sees the same or similar obstacle data in the map, its weight is high; otherwise its weight is low. For example, if the robot measures obstacles in certain directions and at certain distances with the laser radar, a particle whose map position also has obstacles in the same or similar directions and distances receives a high weight.
Referring to fig. 5, the grid represents a grid map, the triangles represent particle poses, the circles represent scanning points of the laser radar, the white boxes represent free grids, and the gray boxes represent occupied grids. For each particle pose, the degree to which the laser scan data matches the occupied grids differs, and the particle's weight can be calculated from this matching degree. As can be seen from the figure, the middle particle matches best, so its weight is highest.
After the weight of each particle is updated, resampling may be performed; a weighted average is then calculated from the weights and predicted positions of the resampled particles to obtain a positioning mean and variance.
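The following sketch puts the prediction, weight update, resampling, and mean/variance steps together in Python. The motion-noise values are illustrative, and score() is a deliberately crude stand-in for the scan-to-map matching degree of fig. 5; a real system would typically use a likelihood field or a scan matcher:

    import numpy as np

    def score(pose, scan, grid):
        # Toy matching degree: count scan endpoints landing on occupied cells
        # (grid[row, col] == 1) when projected from this pose; a cell size of
        # 1 unit is assumed for simplicity.
        x, y, th = pose
        hits = 0
        for r, b in scan:                       # (range, bearing) pairs
            cx = int(round(x + r * np.cos(th + b)))
            cy = int(round(y + r * np.sin(th + b)))
            if 0 <= cy < grid.shape[0] and 0 <= cx < grid.shape[1]:
                hits += grid[cy, cx]
        return hits + 1e-9                      # keep weights strictly positive

    def filter_step(particles, control, scan, grid):
        n = len(particles)
        d, dth = control                        # odometry increment, e.g. (0.1, 0.7)
        # Predict: propagate each particle with the system model plus noise.
        particles[:, 2] += dth + np.random.normal(0.0, 0.05, n)
        step = d + np.random.normal(0.0, 0.02, n)
        particles[:, 0] += step * np.cos(particles[:, 2])
        particles[:, 1] += step * np.sin(particles[:, 2])
        # Update: weight each particle by its matching degree, then normalize.
        w = np.array([score(p, scan, grid) for p in particles])
        w /= w.sum()
        # Resample in proportion to weight, then estimate mean and variance.
        particles = particles[np.random.choice(n, size=n, p=w)]
        return particles, particles.mean(axis=0), particles.var(axis=0)

Calling filter_step repeatedly as new scans arrive drives the positioning variance down as the particles converge on the true pose.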
In some embodiments, a single local particle filter model may be used to locate the robot. In other embodiments, to reduce the rate of false positives, at least two particle filter models may be used to position the robot, and their positioning results fused to obtain the position of the robot in the map.
In the following, taking an example that the local particle filter model includes a first local particle filter model and a second local particle filter model, how to fuse the positioning results of at least two local particle filter models is described.
If the robot is located with the first local particle filter model, a first positioning mean and a first positioning variance are obtained; if the robot is located with the second local particle filter model, a second positioning mean and a second positioning variance are obtained. If the first and second positioning variances both fall to or below a preset variance threshold within a preset time (indicating that both local particle filter models have positioned successfully), and the difference between the first and second positioning means falls to or below a preset difference threshold within the preset time, the position of the robot in the map is obtained from the first positioning mean, the first positioning variance, the second positioning mean, and the second positioning variance.
Fusing the positioning results of the two local particle filter models to obtain the position of the robot reduces the false-positive rate and positions the robot more reliably.
Specifically, a first positioning weight and a second positioning weight may be obtained according to a first positioning variance and a second positioning variance, and then the first positioning mean and the second positioning mean are weighted and averaged according to the first positioning weight and the second positioning weight to obtain the position of the robot in the map.
In some of these embodiments, the inverse of the variance may be used as the weight. Let the mean and variance of the first local particle filter model be m1 = [x1, y1, theta1] and c1 = [c1x, c1y, c1t], and those of the second local particle filter model be m2 = [x2, y2, theta2] and c2 = [c2x, c2y, c2t]. The fused result is then m = [x, y, theta], where each component is the inverse-variance weighted average of the corresponding components, for example
x = ((1/c1x)·x1 + (1/c2x)·x2) / (1/c1x + 1/c2x),
and y and theta are computed analogously from their respective variances.
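In code form, this componentwise inverse-variance fusion might look as follows (a sketch; a production version would also handle angle wrap-around when averaging theta):

    import numpy as np

    def fuse(m1, c1, m2, c2):
        # m1, m2: positioning means [x, y, theta]; c1, c2: positioning
        # variances per component. Inverse variances serve as weights.
        m1, c1, m2, c2 = map(np.asarray, (m1, c1, m2, c2))
        w1, w2 = 1.0 / c1, 1.0 / c2
        return (w1 * m1 + w2 * m2) / (w1 + w2)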
in other embodiments, to ensure the robustness of the algorithm, if the local localization is unsuccessful, the robot is globally localized. That is, if the first local particle filter model or the second local particle filter model is unsuccessfully located (that is, the first location variance or the second location variance does not reach or is less than or equal to a preset variance threshold value within a preset time, or, although the first location variance and the second location variance both reach or is less than or equal to a preset variance threshold value within a preset time, the difference between the first location mean value and the second location mean value does not reach or is less than or equal to a preset difference threshold value within a preset time), a third global particle filter model is obtained according to the map, and the third global particle filter model is used to globally locate the robot, so as to obtain the position of the robot in the map.
According to the laser radar-based robot positioning method and device and the robot of the embodiment of the invention, the initial estimated position of the robot is obtained, the local particle filter model is obtained by using the initial estimated position and the map, and the robot is positioned by using the local particle filter model. Because the search range of the local particle filter model is small relative to that of global positioning, the positioning speed is high.
Fig. 6 illustrates a specific implementation of the robot positioning method, taking a sweeping robot (sweeper) as an example. In the embodiment illustrated in fig. 6, the repositioning process includes three processing stages: a first processing stage 100, which uses the sweeper's IMU data to estimate its pose after being moved from one place to another; a second processing stage 200, which performs local positioning with two local particle filter models; and a third processing stage 300, which performs global positioning with two global particle filter models.
In the embodiment shown in fig. 6, the initial estimated position of the sweeper is estimated from its IMU data; two local particle filter models pf1 and pf2 are then obtained based on the initial estimated position and the map. Positioning the sweeper with pf1 gives a positioning mean m1 and variance c1; positioning with pf2 gives a positioning mean m2 and variance c2. If both pf1 and pf2 position successfully within the preset time, i.e., c1 and c2 are both smaller than a preset variance threshold thres_c, and the difference m1-m2 between the two positioning means is smaller than a preset difference threshold thres_m, the positioning results of the two local particle filter models are fused to obtain the position of the sweeper in the map.
Otherwise, the sweeper is positioned globally: two global particle filter models (again denoted pf1 and pf2) are obtained based on the whole map, and the sweeper is positioned with each to obtain positioning means m1, m2 and variances c1, c2 as before. If both pf1 and pf2 position successfully within the preset time and the difference m1-m2 between their positioning means is smaller than the preset difference threshold thres_m, the positioning results of the two global particle filter models are fused to obtain the position of the sweeper in the map. Otherwise, positioning fails.
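A compact sketch of this two-stage flow follows. run_two_filters is a caller-supplied helper (an assumption of this sketch) that runs two particle filters of the given kind for at most the preset time and returns their positioning means and variances as numpy arrays; fuse() is the function sketched earlier, and the threshold values are illustrative:

    import numpy as np

    def relocalize(run_two_filters, thres_c=0.05, thres_m=0.2):
        # run_two_filters(kind) -> (m1, c1, m2, c2) for kind in
        # {"local", "global"}, corresponding to stages 200 and 300.
        for kind in ("local", "global"):
            m1, c1, m2, c2 = run_two_filters(kind)
            ok_var = np.all(c1 <= thres_c) and np.all(c2 <= thres_c)
            ok_mean = np.all(np.abs(m1 - m2) <= thres_m)
            if ok_var and ok_mean:
                return fuse(m1, c1, m2, c2)   # fused position in the map
        return None                           # both stages failed: positioning fails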
Correspondingly, as shown in fig. 7, an embodiment of the present invention further provides a lidar-based robot positioning apparatus, which may be used for the robot 10 shown in fig. 1 or fig. 2, where the robot positioning apparatus 700 includes:
an initial estimated position obtaining module 701, configured to obtain an initial estimated position of the robot;
a local particle filter model obtaining module 702, configured to obtain a local particle filter model according to the initial estimated position and the map;
an obstacle information obtaining module 703, configured to obtain, by using the laser radar, obstacle information of an environment around the robot;
a positioning module 704, configured to position the robot by using the local particle filter model and the obstacle information, and obtain a position of the robot in the map.
According to the laser radar-based robot positioning apparatus, the initial estimated position of the robot is obtained, the local particle filter model is obtained by using the initial estimated position and the map, and the robot is positioned by using the local particle filter model. Because the search range of the local particle filter model is small relative to that of global positioning, the positioning speed is high.
In some embodiments, the local particle filter model comprises a first local particle filter model and a second local particle filter model;
the positioning module 704 is specifically configured to:
positioning the robot by using the first local particle filter model and the obstacle information to obtain a first positioning result;
positioning the robot by using the second local particle filter model and the obstacle information to obtain a second positioning result;
and fusing the first positioning result and the second positioning result to obtain the position of the robot in the map.
In other embodiments, the first localization result comprises a first localization mean and a first localization variance, and the second localization result comprises a second localization mean and a second localization variance;
the positioning module 704 is specifically configured to:
if the first positioning variance and the second positioning variance both fall to or below a preset variance threshold within a preset time, and the difference between the first positioning mean and the second positioning mean falls to or below a preset difference threshold within the preset time, then,
and obtaining the position of the robot in the map according to the first positioning mean, the first positioning variance, the second positioning mean and the second positioning variance.
In other embodiments, the positioning module 704 is further specifically configured to:
if the first positioning variance or the second positioning variance does not fall to or below a preset variance threshold within a preset time, or the first positioning variance and the second positioning variance both fall to or below the preset variance threshold within the preset time but the difference between the first positioning mean and the second positioning mean does not fall to or below a preset difference threshold within the preset time, then,
and obtaining a third global particle filter model according to the map, and positioning the robot by using the third global particle filter model to obtain the position of the robot in the map.
In some embodiments, the positioning module 704 is specifically configured to:
respectively obtaining a first positioning weight and a second positioning weight according to the first positioning variance and the second positioning variance;
and carrying out weighted average on the first positioning mean value and the second positioning mean value according to the first positioning weight and the second positioning weight to obtain the position of the robot in the map.
In some embodiments, the initial estimated position obtaining module 701 is specifically configured to:
obtaining an original position and an original coordinate system of the robot, wherein the original position is a position of the robot before the robot is moved artificially;
dividing the time of the artificial movement of the robot into at least two sampling time intervals;
in any sampling time interval, establishing a local coordinate system according to the pose of the robot, obtaining the displacement increment of the robot under the local coordinate system, and converting the displacement increment under the local coordinate system into the displacement increment under the original coordinate system;
and obtaining the initial estimation position of the robot according to the original position and the displacement increment of the robot in the original coordinate system in each sampling time interval.
In other embodiments, the initial estimated position obtaining module 701 is further specifically configured to:
obtaining an angular velocity and an acceleration of the robot;
performing time integration on the angular speed within the sampling time interval to obtain the pose of the robot;
performing time integration on the acceleration within the sampling time interval to obtain the speed of the robot;
and obtaining the displacement increment of the robot in the local coordinate system within the sampling time interval according to the speed and the acceleration of the robot.
In some embodiments, the positioning module 704 is specifically configured to:
updating the weight of each particle in the local particle filter model according to the obstacle information;
and obtaining a positioning mean value and a positioning variance according to the weight and the positions of the particles in the local particle filter model.
It should be noted that the above-mentioned apparatus can execute the method provided by the embodiments of the present application, and has corresponding functional modules and beneficial effects for executing the method. For technical details which are not described in detail in the device embodiments, reference is made to the methods provided in the embodiments of the present application.
Fig. 8 is a schematic diagram of a hardware structure of the controller 13 according to an embodiment of the robot 10, and as shown in fig. 8, the controller 13 includes:
one or more processors 131 and a memory 132; one processor 131 is taken as an example in fig. 8.
The processor 131 and the memory 132 may be connected by a bus or other means; fig. 8 takes a bus connection as an example.
The memory 132, as a non-volatile computer-readable storage medium, may be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program instructions/modules corresponding to the robot positioning method in the embodiment of the present application (for example, the initial estimated position obtaining module 701 shown in fig. 7). The processor 131 executes the various functional applications of the controller and performs data processing, i.e., implements the robot positioning method of the above method embodiment, by running the non-volatile software programs, instructions, and modules stored in the memory 132.
The memory 132 may include a storage program area and a storage data area, wherein the storage program area may store an operating system and an application program required for at least one function, and the storage data area may store data created according to the use of the robot positioning device, and the like. Further, the memory 132 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some embodiments, the memory 132 may optionally include memory located remotely from the processor 131, which may be connected to the robot over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The one or more modules are stored in the memory 132 and, when executed by the one or more processors 131, perform the robot positioning method in any of the above method embodiments, e.g., performing method steps 101 to 104 of fig. 3 described above and realizing the functions of modules 701 to 704 in fig. 7.
The product can execute the method provided by the embodiment of the application, and has the corresponding functional modules and beneficial effects of the execution method. For technical details that are not described in detail in this embodiment, reference may be made to the methods provided in the embodiments of the present application.
Embodiments of the present application provide a non-transitory computer-readable storage medium storing computer-executable instructions, which are executed by one or more processors, such as a processor 131 in fig. 8, to enable the one or more processors to perform the robot positioning method in any of the above method embodiments, e.g., performing method steps 101 to 104 of fig. 3 described above and realizing the functions of modules 701 to 704 in fig. 7.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
Through the above description of the embodiments, those skilled in the art will clearly understand that the embodiments may be implemented by software plus a general hardware platform, and may also be implemented by hardware. It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by hardware related to instructions of a computer program, which can be stored in a computer readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-only memory (ROM), a Random Access Memory (RAM), or the like.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; within the idea of the invention, also technical features in the above embodiments or in different embodiments may be combined, steps may be implemented in any order, and there are many other variations of the different aspects of the invention as described above, which are not provided in detail for the sake of brevity; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (9)

1. A lidar based robot positioning method for a robot including a lidar, the method comprising:
obtaining an initial estimated position of the robot;
obtaining at least two local particle filter models according to the initial estimation position and a map;
obtaining obstacle information of the environment around the robot through the laser radar;
respectively positioning the robot by using the at least two local particle filter models and the obstacle information to obtain at least two positioning results, wherein the positioning results comprise a positioning mean value and a positioning variance;
if each positioning variance falls to or below a preset variance threshold within a preset time, and the difference between the positioning means falls to or below a preset difference threshold within the preset time, then,
and obtaining the position of the robot in the map according to the positioning mean values and the positioning variance values.
2. The robot positioning method of claim 1, further comprising:
if one of the positioning variances does not fall to or below a preset variance threshold within a preset time, or the positioning variances all fall to or below the preset variance threshold within the preset time but the difference of the positioning means does not fall to or below a preset difference threshold within the preset time, then,
and obtaining a third global particle filter model according to the map, and positioning the robot by using the third global particle filter model to obtain the position of the robot in the map.
3. A robot positioning method according to claim 1 or 2, characterized in that the positioning variance comprises a first positioning variance and a second positioning variance, and the positioning mean comprises a first positioning mean and a second positioning mean;
the obtaining the position of the robot in the map according to the positioning mean values and the positioning variance includes:
respectively obtaining a first positioning weight and a second positioning weight according to the first positioning variance and the second positioning variance;
and carrying out weighted average on the first positioning mean value and the second positioning mean value according to the first positioning weight and the second positioning weight to obtain the position of the robot in the map.
4. The robot positioning method of claim 3, wherein the obtaining an initial estimated position of the robot comprises:
obtaining an original position and an original coordinate system of the robot, wherein the original position is a position of the robot before the robot is moved artificially;
dividing the time of the artificial movement of the robot into at least two sampling time intervals;
in any sampling time interval, establishing a local coordinate system according to the pose of the robot, obtaining the displacement increment of the robot under the local coordinate system, and converting the displacement increment under the local coordinate system into the displacement increment under the original coordinate system;
and obtaining the initial estimation position of the robot according to the original position and the displacement increment of the robot in the original coordinate system in each sampling time interval.
5. The robot positioning method of claim 4, wherein the obtaining an initial estimated position of the robot further comprises:
obtaining an angular velocity and an acceleration of the robot;
performing time integration on the angular speed within the sampling time interval to obtain the pose of the robot;
performing time integration on the acceleration within the sampling time interval to obtain the speed of the robot;
and obtaining the displacement increment of the robot in the local coordinate system within the sampling time interval according to the speed and the acceleration of the robot.
6. The robot positioning method according to claim 1, wherein the positioning the robot using the at least two local particle filter models and the obstacle information comprises:
updating the weight of each particle in the local particle filter model according to the obstacle information;
and obtaining a positioning mean value and a positioning variance according to the weight and the positions of the particles in the local particle filter model.
7. A lidar-based robot positioning apparatus for a robot, the robot including a lidar, the apparatus comprising:
an initial estimated position obtaining module for obtaining an initial estimated position of the robot;
the local particle filter model acquisition module is used for acquiring at least two local particle filter models according to the initial estimation position and the map;
the obstacle information acquisition module is used for acquiring obstacle information of the surrounding environment of the robot through the laser radar;
the positioning module is used for positioning the robot by respectively utilizing the at least two local particle filter models and the obstacle information to obtain at least two positioning results, wherein the positioning results comprise a positioning mean value and a positioning variance;
if each positioning variance falls to or below a preset variance threshold within a preset time, and the difference between the positioning means falls to or below a preset difference threshold within the preset time, then,
and obtaining the position of the robot in the map according to the positioning mean values and the positioning variance values.
8. A robot, characterized in that the robot comprises:
a robot main body;
the walking mechanism is arranged on the robot main body;
the laser radar and the inertia measurement unit are arranged on the robot main body, and the laser radar is used for obtaining obstacle information of the environment around the robot;
the controller is arranged in the robot main body and connected with the laser radar and the inertia measuring unit;
the controller includes:
at least one processor, and
a memory communicatively coupled to the at least one processor, the memory storing instructions executable by the at least one processor to enable the at least one processor to perform the method of any of claims 1-6.
9. A non-transitory computer-readable storage medium storing computer-executable instructions that, when executed by a robot, cause the robot to perform the method of any of claims 1-6.
CN201910731221.9A 2019-08-08 2019-08-08 Laser radar-based robot positioning method and device and robot Active CN110488818B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910731221.9A CN110488818B (en) 2019-08-08 2019-08-08 Laser radar-based robot positioning method and device and robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910731221.9A CN110488818B (en) 2019-08-08 2019-08-08 Laser radar-based robot positioning method and device and robot

Publications (2)

Publication Number Publication Date
CN110488818A CN110488818A (en) 2019-11-22
CN110488818B (en) 2020-07-17

Family

ID=68550337

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910731221.9A Active CN110488818B (en) 2019-08-08 2019-08-08 Laser radar-based robot positioning method and device and robot

Country Status (1)

Country Link
CN (1) CN110488818B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111174782B (en) * 2019-12-31 2021-09-17 智车优行科技(上海)有限公司 Pose estimation method and device, electronic equipment and computer readable storage medium
CN111267103A (en) * 2020-03-09 2020-06-12 深圳拓邦股份有限公司 Method and device for acquiring initial position of robot, robot and storage medium
CN111474535B (en) * 2020-03-18 2022-03-15 广东省智能机器人研究院 Mobile robot global positioning method based on characteristic thermodynamic diagram
CN111352066B (en) * 2020-03-27 2022-02-22 西安震有信通科技有限公司 Particle filter-based positioning method and device, computer equipment and storage medium
CN111947649A (en) * 2020-06-21 2020-11-17 珠海市一微半导体有限公司 Robot positioning method based on data fusion, chip and robot
CN113558524B (en) * 2021-07-14 2022-11-29 北京小狗吸尘器集团股份有限公司 Sweeping robot and method and device for repositioning lifted sweeping robot
CN116242410B (en) * 2022-09-05 2023-12-19 浙江智马达智能科技有限公司 Calibration method, terminal and computer storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106066179A (en) * 2016-07-27 2016-11-02 湖南晖龙股份有限公司 A kind of robot location based on ROS operating system loses method for retrieving and control system
CN107991683A (en) * 2017-11-08 2018-05-04 华中科技大学 A kind of robot autonomous localization method based on laser radar
CN108507579A (en) * 2018-04-08 2018-09-07 浙江大承机器人科技有限公司 A kind of method for relocating based on localized particle filtering
CN108632761A (en) * 2018-04-20 2018-10-09 西安交通大学 A kind of indoor orientation method based on particle filter algorithm
CN108931245A (en) * 2018-08-02 2018-12-04 上海思岚科技有限公司 The local method for self-locating and equipment of mobile robot
CN109725329A (en) * 2019-02-20 2019-05-07 苏州风图智能科技有限公司 A kind of unmanned vehicle localization method and device
CN109870716A (en) * 2017-12-01 2019-06-11 北京京东尚科信息技术有限公司 Localization method and positioning device and computer readable storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5807518B2 (en) * 2011-11-09 2015-11-10 富士通株式会社 Estimation apparatus, estimation method, and estimation program


Also Published As

Publication number Publication date
CN110488818A (en) 2019-11-22

Similar Documents

Publication Publication Date Title
CN110488818B (en) Laser radar-based robot positioning method and device and robot
CN109084732B (en) Positioning and navigation method, device and processing equipment
Gomez-Balderas et al. Tracking a ground moving target with a quadrotor using switching control: nonlinear modeling and control
US8793069B2 (en) Object recognition system for autonomous mobile body
CN112183133B (en) Aruco code guidance-based mobile robot autonomous charging method
CN112740274A (en) System and method for VSLAM scale estimation on robotic devices using optical flow sensors
CN112539749B (en) Robot navigation method, robot, terminal device, and storage medium
CN111596665B (en) Dense height map construction method suitable for leg-foot robot planning
CN111915675B (en) Particle drift-based particle filtering point cloud positioning method, device and system thereof
CN111060099B (en) Real-time positioning method for unmanned automobile
CN111182174B (en) Method and device for supplementing light for sweeping robot
CN112734765A (en) Mobile robot positioning method, system and medium based on example segmentation and multi-sensor fusion
CN113587930B (en) Indoor and outdoor navigation method and device of autonomous mobile robot based on multi-sensor fusion
CN110705385B (en) Method, device, equipment and medium for detecting angle of obstacle
CN113256716A (en) Robot control method and robot
CN111002346A (en) Robot trapped detection method and robot
CN112540609A (en) Path planning method and device, terminal equipment and storage medium
CN115436955A (en) Indoor and outdoor environment positioning method
CN112033423B (en) Robot path planning method and device based on road consensus and robot
CN111553342A (en) Visual positioning method and device, computer equipment and storage medium
Pereira et al. Backward motion for estimation enhancement in sparse visual odometry
CN114740869A (en) Robot obstacle avoidance method and system based on multi-sensor fusion estimation and iterative pre-search
Kolu et al. A mapping method tolerant to calibration and localization errors based on tilting 2D laser scanner
CN114643579A (en) Robot positioning method and device, robot and storage medium
Mueller et al. Continuous stereo self-calibration on planar roads

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CP03 Change of name, title or address
CP03 Change of name, title or address

Address after: 518000 1701, building 2, Yinxing Zhijie, No. 1301-72, sightseeing Road, Xinlan community, Guanlan street, Longhua District, Shenzhen, Guangdong Province

Patentee after: Shenzhen Yinxing Intelligent Group Co.,Ltd.

Address before: 518000 building A1, Yinxing hi tech Industrial Park, Guanlan street, Longhua District, Shenzhen City, Guangdong Province

Patentee before: Shenzhen Silver Star Intelligent Technology Co.,Ltd.