CN113432533B - Robot positioning method and device, robot and storage medium - Google Patents


Info

Publication number
CN113432533B
CN113432533B (application CN202110750179.2A)
Authority
CN
China
Prior art keywords
current, laser, robot, determining, matching
Prior art date
Legal status
Active
Application number
CN202110750179.2A
Other languages
Chinese (zh)
Other versions
CN113432533A (en)
Inventor
刘嗣超
闫东坤
Current Assignee
Beijing Yingdi Mande Technology Co., Ltd.
Original Assignee
Beijing Yingdi Mande Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Yingdi Mande Technology Co., Ltd.
Publication of CN113432533A
Application granted granted Critical
Publication of CN113432533B


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29 Geographical information databases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds

Abstract

The application provides a robot positioning method, a device, a robot and a storage medium, wherein the method comprises the following steps: acquiring a current laser point cloud and a local probability map corresponding to a current running environment of a robot; judging whether laser degradation exists currently or not according to the matching condition between the current laser point cloud and the local probability map; when laser degradation exists currently, determining a laser degradation direction according to the position information of a matching point between the current laser point cloud and a local probability map; acquiring a current predicted position of the robot; and determining the current pose information of the robot according to the current predicted position, the laser degradation direction, the position information of each matching point and the angle information of each matching point. According to the method provided by the scheme, the pose information of the robot is determined by fusing the laser detection information and the predicted position information according to the laser degradation condition, so that the positioning accuracy of the robot is improved.

Description

Robot positioning method and device, robot and storage medium
Technical Field
The present application relates to the field of artificial intelligence technologies, and in particular, to a robot positioning method, a device, a robot, and a storage medium.
Background
Laser SLAM (simultaneous localization and mapping) is one of the most widely used robot positioning techniques at present. In laser SLAM, some challenging scenarios degrade positioning accuracy, such as long corridors, single-sided walls, and circular boundaries.
Existing processing can only deal with one particular scene. For a long corridor, for example, there is a method that extracts features from the point cloud at the front end to decide that the robot is currently in a long corridor, and then fuses the predicted robot position with the heading obtained from matching the laser radar scan, yielding the fused robot pose.
However, existing laser-degradation processing is limited to predetermined scenes and relies on the characteristics of those scenes as its processing basis, so its adaptability is poor and the robot's positioning accuracy is low.
Disclosure of Invention
The application provides a robot positioning method, a robot positioning device, a robot and a storage medium, which are used for solving the defects of low positioning precision and the like in the prior art.
The first aspect of the application provides a robot positioning method, comprising the following steps:
acquiring a current laser point cloud and a local probability map corresponding to a current running environment of a robot;
judging whether laser degradation exists currently or not according to the matching condition between the current laser point cloud and the local probability map;
when laser degradation exists currently, determining a laser degradation direction according to the position information of a matching point between the current laser point cloud and a local probability map;
acquiring a current predicted position of the robot;
and determining the current pose information of the robot according to the current predicted position, the laser degradation direction, the position information of each matching point and the angle information of each matching point.
Optionally, the determining the current pose information of the robot according to the current predicted position, the laser degradation direction, the position information of each matching point and the angle information of each matching point includes:
determining the laser degradation direction as a first direction and determining a direction perpendicular to the laser degradation direction as a second direction;
determining a first direction coordinate corresponding to the current predicted position as a current first direction coordinate of the robot;
determining the current second direction coordinate of the robot according to the position information of each matching point;
determining the current direction of the robot according to the angle information of each matching point;
and determining the current pose information according to the current first direction coordinate, the current second direction coordinate and the current direction of the robot.
Optionally, the determining whether there is laser degradation currently according to the matching between the current laser point cloud and the local probability map includes:
calculating a matching value between the current laser point cloud and the local probability map;
when the matching value is larger than a preset matching value threshold, determining an acquisition point corresponding to the current laser point cloud as a matching point;
and when the number of the matching points exceeds a first preset threshold value, determining that laser degradation exists currently.
Optionally, the method further comprises:
when the number of the matching points is larger than a second preset threshold value and smaller than a first preset threshold value, counting the number of the angle matching points corresponding to the current acquisition points;
and when the number of the angle matching points exceeds a third preset threshold value, determining that laser angle degradation exists currently.
Optionally, the method further comprises:
acquiring the current prediction orientation of the robot;
determining current position information of the robot according to the position information of the matching point;
and determining the current pose information of the robot according to the current predicted orientation of the robot and the current position information.
Optionally, the calculating a matching value between the current laser point cloud and the local probability map includes:
the matching value Score is determined according to the following formula:
Score = (P(x_1, y_1) + ... + P(x_i, y_i) + ... + P(x_n, y_n)) / n
wherein (x_i, y_i) represents the grid coordinates of the i-th point cloud point, P(x_i, y_i) represents the grid occupancy probability of the i-th point, and n is the number of scan points.
Optionally, the obtaining the current laser point cloud and the local probability map corresponding to the current running environment of the robot includes:
acquiring a history point cloud and a radar signal of a robot in a current running environment;
constructing a current laser point cloud according to the radar signals;
and constructing the local probability map according to the history point cloud.
A second aspect of the present application provides a robot positioning device comprising:
the first acquisition module is used for acquiring a current laser point cloud and a local probability map corresponding to a current running environment of the robot;
the judging module is used for judging whether laser degradation exists currently or not according to the matching condition between the current laser point cloud and the local probability map;
the determining module is used for determining the laser degradation direction according to the position information of the matching point between the current laser point cloud and the local probability map when the laser degradation exists currently;
the second acquisition module is used for acquiring the current predicted position of the robot;
and the positioning module is used for determining the current pose information of the robot according to the current predicted position, the laser degradation direction, the position information of each matching point and the angle information of each matching point.
A third aspect of the present application provides a robot comprising: the system comprises a laser radar, a predicted pose sensor, at least one processor and a memory;
the laser radar performs laser detection;
the predicted pose sensor is used for acquiring the current predicted position and the current predicted orientation of the robot;
the memory stores computer-executable instructions;
the at least one processor executes the computer-executable instructions stored by the memory such that the at least one processor performs the method as described above in the first aspect and the various possible designs of the first aspect.
A fourth aspect of the application provides a computer readable storage medium having stored therein computer executable instructions which when executed by a processor implement the method as described above for the first aspect and the various possible designs of the first aspect.
The technical scheme of the application has the following advantages:
according to the robot positioning method, the robot positioning device, the robot and the storage medium, the current laser point cloud and the local probability map corresponding to the current running environment of the robot are obtained; judging whether laser degradation exists currently or not according to the matching condition between the current laser point cloud and the local probability map; when laser degradation exists currently, determining a laser degradation direction according to the position information of a matching point between the current laser point cloud and a local probability map; acquiring a current predicted position of the robot; and determining the current pose information of the robot according to the current predicted position, the laser degradation direction, the position information of each matching point and the angle information of each matching point. According to the method provided by the scheme, the pose information of the robot is determined by fusing the laser detection information and the predicted position information according to the laser degradation condition, so that the positioning accuracy of the robot is improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings required by the embodiments or the prior-art description are briefly introduced below. It is obvious that the drawings in the following description are some embodiments of the present application, and that a person of ordinary skill in the art may derive other drawings from them without inventive effort.
FIG. 1 is a schematic diagram of a robotic positioning system according to an embodiment of the present application;
fig. 2 is a schematic flow chart of a robot positioning method according to an embodiment of the present application;
FIG. 3 is an exemplary robot running scene graph provided by an embodiment of the present application;
FIG. 4 is another exemplary robot operating scene graph provided by an embodiment of the present application;
fig. 5 is a schematic structural diagram of a robot positioning device according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a robot according to an embodiment of the present application.
Specific embodiments of the present application have been shown by way of the above drawings and will be described in more detail below. These drawings and the written description are not intended to limit the scope of the disclosed concept in any way, but to illustrate the inventive concept to those skilled in the art by reference to specific embodiments.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. In the following description of the embodiments, the meaning of "a plurality" is two or more, unless explicitly defined otherwise.
The following embodiments may be combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments. Embodiments of the present application will be described below with reference to the accompanying drawings.
First, a configuration of a robot positioning system according to the present application will be described:
the robot positioning method, the robot positioning device, the robot and the storage medium provided by the embodiment of the application are suitable for detecting the current pose information of the robot. Fig. 1 is a schematic structural diagram of a robot positioning system according to an embodiment of the present application, which mainly includes a data acquisition device and a robot positioning device for detecting current pose information of a robot, where both the data acquisition device and the robot positioning device may be deployed inside the robot, and may be deployed on other electronic devices if conditions allow. Specifically, a data acquisition device acquires a current laser point cloud and a local probability map of the robot, and transmits the acquired data to a robot positioning device so that the robot positioning device can determine current pose information of the current robot.
The embodiment of the application provides a robot positioning method for determining pose information of a robot.
As shown in fig. 2, a flow chart of a robot positioning method according to an embodiment of the present application is shown, where the method includes:
step 201, a current laser point cloud and a local probability map corresponding to a current running environment of the robot are obtained.
A laser radar is installed on the robot to detect the surrounding environment, and the current laser point cloud can be generated from the radar signals of the laser radar. The local probability map is built up gradually as laser point clouds are continuously generated; it represents the robot's surroundings over the whole motion process, that is, it is a representation of the robot's historical laser point clouds before the current moment.
Specifically, in an embodiment, a history point cloud and a radar signal under a current running environment of the robot may be obtained; constructing a current laser point cloud according to the radar signals; and constructing a local probability map according to the history point cloud.
For example, when a laser beam strikes the surface of an object, the reflected laser light (the radar signal) carries information such as azimuth and distance. When the laser beam is scanned along a certain track, the reflected laser point information is recorded during scanning; since laser radar scanning is extremely fine, a large number of laser points are obtained, forming a laser point cloud. As the laser point cloud is continuously formed, a corresponding local probability map can be constructed.
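For illustration only, the scan-to-cloud and map-update steps might look like the following minimal Python sketch. The function names, the grid resolution, the additive hit update, and the assumption that the grid origin coincides with the world origin are all illustrative choices, not the construction specified by this application.

```python
import numpy as np

def scan_to_point_cloud(ranges, angles, pose):
    """Convert one lidar scan (beam ranges and beam angles) into
    world-frame 2-D points, given the robot pose at scan time."""
    x, y, theta = pose
    px = x + ranges * np.cos(angles + theta)   # beam endpoints in world frame
    py = y + ranges * np.sin(angles + theta)
    return np.stack([px, py], axis=1)

def update_local_map(prob_map, points, resolution=0.05, hit_delta=0.1):
    """Fold scan endpoints into a grid of occupancy probabilities;
    each hit nudges the corresponding cell probability toward 1."""
    for px, py in points:
        i, j = int(px / resolution), int(py / resolution)
        if 0 <= i < prob_map.shape[0] and 0 <= j < prob_map.shape[1]:
            prob_map[i, j] = min(1.0, prob_map[i, j] + hit_delta)
    return prob_map
```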
Step 202, judging whether laser degradation exists currently according to the matching condition between the current laser point cloud and the local probability map.
When the robot travels through a scene such as a corridor, the laser point cloud obtained at each moment of the traversal is almost identical. It is therefore difficult for the robot to determine its current pose point from the radar signal, and in this case it can be determined that laser degradation exists.
It should be further noted that, in the prior art, whether laser degradation occurs is generally predicted from detected scene information (for example, that a long corridor is being traversed), and the reliability of such scene detection is difficult to guarantee, so the accuracy of the prior-art laser degradation judgment is low. In the embodiment of the present application, by contrast, whether laser degradation occurs is judged from the concrete feedback of the radar signal, namely its matching condition against the local probability map, so the resulting judgment is more accurate.
Step 203, when the laser degradation exists currently, determining the laser degradation direction according to the position information of the matching point between the current laser point cloud and the local probability map.
The matching points refer to certain pose points of the robot, and pose point information mainly comprises position information and orientation information of the robot.
Specifically, the pose points at which the robot experiences laser degradation are determined from the position information of the matched pose points, and the laser degradation direction is then determined from the position information of these matching points. The laser degradation direction includes a degradation direction of the position and/or a degradation direction of the angle. For example, while the robot passes through a corridor, the laser degradation direction is a degradation direction of the position, namely the direction of the corridor.
Specifically, least-squares straight-line fitting can be used to determine the laser degradation direction from the position information of the matching points.
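A minimal sketch of such a fit is shown below. It uses the orthogonal (total) least-squares direction obtained from an SVD rather than a slope-intercept fit, since a corridor aligned with the y-axis would make the latter singular; this substitution and the function name are illustrative assumptions, not the patent's specified algorithm.

```python
import numpy as np

def degradation_direction(match_positions):
    """Fit a straight line through the matched pose positions and
    return its unit direction vector (the laser degradation direction)."""
    pts = np.asarray(match_positions, dtype=float)   # shape (k, 2)
    centered = pts - pts.mean(axis=0)
    # Right singular vectors of the centered points: the first one is
    # the direction of largest spread, i.e. the fitted line direction.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[0]
```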
Step 204, obtaining a current predicted position of the robot.
It should be noted that other predicted pose sensors may be disposed on the robot, such as an odometer or an inertial measurement unit (IMU).
Specifically, during the robot's motion, the predicted pose sensor acquires the robot's state information in real time, from which the robot's position is predicted.
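For illustration, a dead-reckoning prediction from odometry velocities could be as simple as the unicycle-model sketch below. Real systems typically fuse odometry and IMU measurements; the model and names here are assumptions, not the patent's prescribed predictor.

```python
import math

def predict_pose(prev_pose, v, omega, dt):
    """Propagate (x, y, theta) with a unicycle model from linear
    velocity v and angular velocity omega over time step dt."""
    x, y, theta = prev_pose
    x += v * dt * math.cos(theta)
    y += v * dt * math.sin(theta)
    theta += omega * dt
    return (x, y, theta)
```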
Step 205, determining current pose information of the robot according to the current predicted position, the laser degradation direction, the position information of each matching point and the angle information of each matching point.
Wherein the pose information of the robot includes robot position coordinates and orientation (angle information).
Specifically, once the laser degradation direction is determined, the position coordinate along that direction is known to be unreliable, so the robot's coordinate in the laser degradation direction can be taken from the current predicted position. The position coordinate in the non-degraded direction can still be determined from the position information of the matching points, and the robot's orientation can be determined from the angle information of the matching points. The embodiment of the present application combines the laser radar's position information in the non-degraded direction with the predicted position information acquired by the other predicted pose sensors, eliminating the positioning interference caused by laser degradation and improving the robot's positioning accuracy.
Specifically, in an embodiment, the laser degradation direction may be determined as a first direction, and a direction perpendicular to the laser degradation direction may be determined as a second direction; determining a first direction coordinate corresponding to the current predicted position as the current first direction coordinate of the robot; determining the current second direction coordinates of the robot according to the position information of each matching point; determining the current direction of the robot according to the angle information of each matching point; and determining current pose information according to the current first direction coordinate, the current second direction coordinate and the current direction of the robot.
For example, if the position coordinates of the robot are (x, y), the first direction may refer to the x-axis direction and the second direction may refer to the y-axis direction.
Specifically, when the robot passes through a corridor, the laser degradation direction is the path direction through the corridor (the corridor direction): the first direction (for example, the x-axis direction) is the corridor direction, and the second direction (for example, the y-axis direction) is the width direction of the corridor. Since the coordinate along the corridor is the degraded one, the x-axis coordinate of the robot is taken from the current predicted position (x1, y1), i.e., x = x1, while the y-axis coordinate is determined from the position information (x2, y2) reflected by the matching points, i.e., y = y2, so that the position coordinate of the robot is (x1, y2). Finally, the current orientation of the robot is determined from the angle information reflected by the matching points, and the current pose information is obtained by combining the position coordinates with the current orientation.
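This axis decomposition can be written compactly as a projection, as in the following hedged sketch; the function name and argument layout are illustrative assumptions.

```python
import numpy as np

def fuse_position(predicted_xy, matched_xy, degrade_dir):
    """Take the predicted coordinate along the degradation direction and
    the laser-matched coordinate along the perpendicular direction."""
    d = np.asarray(degrade_dir, dtype=float)
    d /= np.linalg.norm(d)                 # first direction (degraded)
    n = np.array([-d[1], d[0]])            # second direction (trusted)
    p = np.asarray(predicted_xy, dtype=float)
    m = np.asarray(matched_xy, dtype=float)
    return np.dot(p, d) * d + np.dot(m, n) * n

# Corridor along the x-axis: degrade_dir = (1, 0); the prediction (x1, y1)
# and the matched position (x2, y2) fuse to (x1, y2), as in the example above.
```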
Exemplarily, as shown in fig. 3, an exemplary robot operation scene graph provided by the embodiment of the present application, in a long-corridor scene the laser degrades along the corridor direction: matching the laser against the map yields the multiple solutions shown by the dashed lines in the figure. The embodiment of the present application computes the fitted direction of these solutions, namely the corridor direction; in that direction the position predicted by the other sensors (the predicted pose sensors) is used, while the angle and the position perpendicular to the corridor direction are obtained by laser matching. The trustworthy laser information is thus exploited to the greatest extent, improving the robot's positioning accuracy. The same applies to the single-sided wall case.
On the basis of the foregoing embodiment, in order to further improve the reliability of the laser degradation determination result, as an implementation manner, in an embodiment, determining whether there is laser degradation currently according to a matching condition between the current laser point cloud and the local probability map includes:
step 2021, calculating a matching value between the current laser point cloud and the local probability map;
step 2022, determining the acquisition point corresponding to the current laser point cloud as a matching point when the matching value is greater than a preset matching value threshold;
step 2023, determining that laser degradation currently exists when the number of matching points exceeds the first preset threshold.
Specifically, each time the robot moves to a position, a current laser point cloud is generated, and the matching value (similarity) between it and the local probability map is calculated. When the matching value is greater than the matching-value threshold, the current laser point cloud is deemed to match the local probability map, and the robot's current pose point is recorded as a matching point. As this matching-point detection is repeated, the number of matching points keeps growing; when it reaches the first preset threshold, it can be determined that laser degradation currently exists.
Specifically, in one embodiment, in order to ensure the reliability of the matching value calculation result, the matching value Score may be determined according to the following formula:
Score = (P(x_1, y_1) + ... + P(x_i, y_i) + ... + P(x_n, y_n)) / n
wherein (x_i, y_i) represents the grid coordinates of the i-th point cloud point, P(x_i, y_i) represents the grid occupancy probability of the i-th point, and n is the number of scan points.
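A direct transcription of this score into Python might read as follows; the grid indexing and the resolution parameter are assumptions carried over from the earlier map sketch.

```python
import numpy as np

def matching_score(points, prob_map, resolution=0.05):
    """Score = mean grid-occupancy probability over the n scan points."""
    total = 0.0
    for px, py in points:
        i, j = int(px / resolution), int(py / resolution)
        if 0 <= i < prob_map.shape[0] and 0 <= j < prob_map.shape[1]:
            total += prob_map[i, j]        # P(x_i, y_i)
    return total / len(points)             # divide by n
```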
On the basis of the above embodiment, laser degradation is not reflected only in position-coordinate detection; laser angle degradation may also occur in practice. In that case the robot's current orientation cannot be determined from the radar signal, which may affect the robot's positioning accuracy.
Therefore, for the above problem, when the number of matching points is greater than the second preset threshold and less than the first preset threshold, the number of angle matching points corresponding to the current acquisition point can be counted; when the number of angle matching points exceeds the third preset threshold, it is determined that laser angle degradation currently exists. Otherwise, it is determined that no laser degradation currently exists.
Specifically, when the number of matching points is between the second and first preset thresholds, the positioning conditions in the operating scene can be considered good, and no laser position degradation exists. To further judge whether laser angle degradation exists, the number of angle matching points at the current acquisition point is counted: with the robot held still at the current acquisition point, laser point clouds at different orientations are matched against the map, and each matched orientation is counted as an angle matching point. When the number of angle matching points exceeds the third preset threshold, it is determined that laser angle degradation currently exists, and the robot orientation fed back by the laser radar is then not trustworthy.
The first preset threshold provided by the embodiment of the present application is larger than the second preset threshold. The relation of the third preset threshold to the other two, and the specific value of each threshold, can be set according to actual conditions; the embodiment of the present application does not limit them.
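Put together, the counting logic of this and the preceding section amounts to a small decision cascade. The sketch below is one possible reading; the threshold names are placeholders, and the handling of a match count at or below the second threshold is an assumption, since the text above does not specify it.

```python
def classify_degradation(n_match, n_angle_match, t1, t2, t3):
    """Classify the current laser degradation state from the number of
    (position) matching points and angle matching points; t1 > t2."""
    if n_match > t1:
        return "position"   # many positions match equally well
    if t2 < n_match < t1:
        if n_angle_match > t3:
            return "angle"  # position is stable but many headings match
        return "none"       # good constraints in both position and angle
    # Few matching points: treated as no degradation here (assumption).
    return "none"
```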
Further, when the laser degradation direction is a degradation direction of the angle, that is, when it is determined that laser angle degradation currently exists, the method may further include: acquiring the current predicted orientation of the robot; determining the current position information of the robot according to the position information of the matching points; and determining the current pose information of the robot according to the current predicted orientation and the current position information.
Further, to compensate for the orientation-detection failure caused by laser angle degradation, the current orientation of the robot may be determined from the current predicted orientation given by the predicted pose sensor on the robot (for example, an IMU or another predicted pose sensor). The position coordinates (the current position information) of the robot can still be determined from the position information of the matching points.
As shown in fig. 4, another exemplary robot operation scene graph provided by the embodiment of the present application, the robot is inside a ring (such as a circular-boundary scene): the position obtained by laser matching is stable, but the angle (orientation) has arbitrarily many solutions. In this case, positioning accuracy is ensured by using the angle predicted by the other sensors together with the position matched by the laser.
Similarly, when the laser degradation direction includes both the degradation direction of the angle and the degradation direction of the position, that is, when both laser position degradation and laser angle degradation are determined to exist, the predicted position and predicted angle from the other sensors may be used exclusively until the laser data becomes trustworthy again, after which laser data is once more selected for robot positioning.
In an embodiment, when the robot operates in an open scene, its laser cannot scan any obstacle, so the matching score between the laser and the map falls below the threshold. The predicted values of the other sensors can then be used exclusively until the laser data becomes trustworthy again, after which laser data is once more selected for robot positioning.
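The fallback behavior described in the last few paragraphs can be summarized as a selector like the one below. The state labels and the score threshold are assumptions tying together the cases above; poses are (x, y, theta) tuples.

```python
def select_pose(state, score, laser_pose, predicted_pose, score_threshold=0.5):
    """Choose the pose source given the degradation state and match score.
    Position-only degradation is handled by the axis fusion sketched earlier."""
    if score < score_threshold or state == "position+angle":
        # Open scene or full degradation: trust the prediction until the
        # laser match score recovers above the threshold.
        return predicted_pose
    if state == "angle":
        # Circular-boundary case: the laser position is stable but the
        # heading is ambiguous, so take the heading from the predicted pose.
        x, y, _ = laser_pose
        return (x, y, predicted_pose[2])
    return laser_pose
```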
The embodiment of the application provides a robot positioning method, which comprises the steps of obtaining a current laser point cloud and a local probability map corresponding to a current running environment of a robot; judging whether laser degradation exists currently or not according to the matching condition between the current laser point cloud and the local probability map; when laser degradation exists currently, determining a laser degradation direction according to the position information of a matching point between the current laser point cloud and a local probability map; acquiring the current predicted position of the robot; and determining the current pose information of the robot according to the current predicted position, the laser degradation direction, the position information of each matching point and the angle information of each matching point. According to the method provided by the scheme, the pose information of the robot is determined by fusing the laser detection information and the predicted position information according to the laser degradation condition, so that the positioning accuracy of the robot is improved. And by improving the reliability of the laser degradation judgment result, a foundation is laid for further improving the positioning accuracy of the robot.
The embodiment of the application provides a robot positioning device which is used for executing the robot positioning method provided by the embodiment.
Fig. 5 is a schematic structural diagram of a robot positioning device according to an embodiment of the present application. The apparatus 50 includes a first acquisition module 501, a judging module 502, a determining module 503, a second acquisition module 504, and a positioning module 505.
The first obtaining module 501 is configured to obtain a current laser point cloud and a local probability map corresponding to a current running environment of the robot; the judging module 502 is configured to judge whether there is laser degradation currently according to a matching condition between the current laser point cloud and the local probability map; a determining module 503, configured to determine, when there is laser degradation currently, a laser degradation direction according to position information of a matching point between the current laser point cloud and the local probability map; a second obtaining module 504, configured to obtain a current predicted position of the robot; the positioning module 505 is configured to determine current pose information of the robot according to the current predicted position, the laser degradation direction, the position information of each matching point, and the angle information of each matching point.
The specific manner in which the respective modules perform the operations in relation to the robotic positioning device of the present embodiment has been described in detail in relation to the embodiments of the method, and will not be described in detail herein.
The robot positioning device provided by the embodiment of the application is used for executing the robot positioning method provided by the embodiment, and the implementation mode and the principle are the same and are not repeated.
The embodiment of the application provides a robot for executing the robot positioning method provided by the embodiment.
Fig. 6 is a schematic structural diagram of a robot according to an embodiment of the present application. The robot 60 includes: at least one processor 61, memory 62, lidar 63 and predicted pose sensor 64;
the laser radar is used for laser detection; the predicted pose sensor is used for acquiring the current predicted position and the current predicted orientation of the robot; the memory stores computer-executable instructions; the at least one processor executes computer-executable instructions stored in the memory, causing the at least one processor to perform the robotic positioning method as provided by the embodiments above.
The implementation manner and principle of the robot provided by the embodiment of the application are the same, and are not repeated.
The embodiment of the application provides a computer readable storage medium, wherein computer execution instructions are stored in the computer readable storage medium, and when a processor executes the computer execution instructions, the robot positioning method provided by any embodiment is realized.
The storage medium containing the computer executable instructions in the embodiments of the present application may be used to store the computer executable instructions of the robot positioning method provided in the foregoing embodiments, and the implementation manner and principle of the storage medium are the same and are not repeated.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in hardware plus software functional units.
The integrated units implemented in the form of software functional units described above may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium, and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor (processor) to perform part of the steps of the methods according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional modules is illustrated, and in practical application, the above-described functional allocation may be performed by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to perform all or part of the functions described above. The specific working process of the above-described device may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the application.

Claims (8)

1. A robot positioning method, comprising:
acquiring a current laser point cloud and a local probability map corresponding to a current running environment of a robot;
judging whether laser degradation exists currently or not according to the matching condition between the current laser point cloud and the local probability map; the laser degradation includes laser position degradation and laser angle degradation;
when the laser position degradation exists currently, determining a laser degradation direction according to the position information of a matching point between the current laser point cloud and a local probability map;
acquiring a current predicted position of the robot;
determining current pose information of the robot according to the current predicted position, the laser degradation direction, the position information of each matching point and the angle information of each matching point;
the determining the current pose information of the robot according to the current predicted position, the laser degradation direction, the position information of each matching point and the angle information of each matching point comprises the following steps:
determining the laser degradation direction as a first direction and determining a direction perpendicular to the laser degradation direction as a second direction;
determining a first direction coordinate corresponding to the current predicted position as a current first direction coordinate of the robot;
determining the current second direction coordinate of the robot according to the position information of each matching point;
determining the current direction of the robot according to the angle information of each matching point;
determining the current pose information according to the current first direction coordinate, the current second direction coordinate and the current direction of the robot;
the determining the current pose information according to the current first direction coordinate, the current second direction coordinate and the current direction of the robot includes:
determining the position coordinates of the robot according to the current first direction coordinates and the current second direction coordinates of the robot;
determining the current pose information according to the position coordinates and the current orientation of the robot;
when the laser degradation is laser position degradation, the judging whether laser degradation exists currently according to the matching condition between the current laser point cloud and the local probability map comprises the following steps:
calculating a matching value between the current laser point cloud and the local probability map;
when the matching value is larger than a preset matching value threshold, determining an acquisition point corresponding to the current laser point cloud as a matching point;
and when the number of the matching points exceeds a first preset threshold value, determining that laser degradation exists currently.
2. The method according to claim 1, wherein the method further comprises:
when the number of the matching points is larger than a second preset threshold value and smaller than a first preset threshold value, counting the number of the angle matching points corresponding to the current acquisition points;
and when the number of the angle matching points exceeds a third preset threshold value, determining that laser angle degradation exists currently.
3. The method according to claim 2, wherein the method further comprises:
acquiring the current prediction orientation of the robot;
determining current position information of the robot according to the position information of the matching point;
and determining the current pose information of the robot according to the current predicted orientation of the robot and the current position information.
4. The method of claim 1, wherein the calculating a matching value between the current laser point cloud and the local probability map comprises:
the matching value Score is determined according to the following formula:
Score = (P(x_1, y_1) + ... + P(x_i, y_i) + ... + P(x_n, y_n)) / n
wherein (x_i, y_i) represents the grid coordinates of the i-th point cloud point, P(x_i, y_i) represents the grid occupancy probability of the i-th point, and n is the number of scan points.
5. The method of claim 1, wherein obtaining a current laser point cloud and a local probability map corresponding to a current operating environment of the robot comprises:
acquiring a history point cloud and a radar signal of a robot in a current running environment;
constructing a current laser point cloud according to the radar signals;
and constructing the local probability map according to the history point cloud.
6. A robotic positioning device, comprising:
the first acquisition module is used for acquiring a current laser point cloud and a local probability map corresponding to a current running environment of the robot;
the judging module is used for judging whether laser degradation exists currently or not according to the matching condition between the current laser point cloud and the local probability map; the laser degradation includes laser position degradation and laser angle degradation;
the determining module is used for determining the laser degradation direction according to the position information of the matching point between the current laser point cloud and the local probability map when the laser position degradation exists currently;
the second acquisition module is used for acquiring the current predicted position of the robot;
the positioning module is used for determining the current pose information of the robot according to the current predicted position, the laser degradation direction, the position information of each matching point and the angle information of each matching point;
the positioning module is specifically configured to:
determining the laser degradation direction as a first direction and determining a direction perpendicular to the laser degradation direction as a second direction;
determining a first direction coordinate corresponding to the current predicted position as a current first direction coordinate of the robot;
determining the current second direction coordinate of the robot according to the position information of each matching point;
determining the current direction of the robot according to the angle information of each matching point;
determining the current pose information according to the current first direction coordinate, the current second direction coordinate and the current direction of the robot;
the determining the current pose information according to the current first direction coordinate, the current second direction coordinate and the current direction of the robot includes:
determining the position coordinates of the robot according to the current first direction coordinates and the current second direction coordinates of the robot;
determining the current pose information according to the position coordinates and the current orientation of the robot;
when the laser degradation is laser position degradation, the judging module is specifically configured to:
calculating a matching value between the current laser point cloud and the local probability map;
when the matching value is larger than a preset matching value threshold, determining an acquisition point corresponding to the current laser point cloud as a matching point;
and when the number of the matching points exceeds a first preset threshold value, determining that laser degradation exists currently.
7. A robot, comprising: the system comprises a laser radar, a predicted pose sensor, at least one processor and a memory;
the laser radar performs laser detection;
the predicted pose sensor is used for acquiring the current predicted position and the current predicted orientation of the robot;
the memory stores computer-executable instructions;
the at least one processor executing computer-executable instructions stored in the memory causes the at least one processor to perform the method of any one of claims 1-5.
8. A computer readable storage medium having stored therein computer executable instructions which when executed by a processor implement the method of any one of claims 1 to 5.
CN202110750179.2A 2021-06-18 2021-07-02 Robot positioning method and device, robot and storage medium Active CN113432533B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110679029 2021-06-18
CN2021106790297 2021-06-18

Publications (2)

Publication Number Publication Date
CN113432533A CN113432533A (en) 2021-09-24
CN113432533B (en) 2023-08-15

Family

ID=77758885

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110750179.2A Active CN113432533B (en) 2021-06-18 2021-07-02 Robot positioning method and device, robot and storage medium

Country Status (1)

Country Link
CN (1) CN113432533B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113978512B (en) * 2021-11-03 2023-11-24 北京埃福瑞科技有限公司 Rail train positioning method and device
CN114199233B (en) * 2021-11-08 2024-04-05 北京旷视科技有限公司 Pose determining method and movable equipment
CN115267796B (en) * 2022-08-17 2024-04-09 深圳市普渡科技有限公司 Positioning method, positioning device, robot and storage medium
CN117073690B (en) * 2023-10-17 2024-03-15 山东大学 Navigation method and system based on multi-map strategy

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107991683A (en) * 2017-11-08 2018-05-04 华中科技大学 A kind of robot autonomous localization method based on laser radar
CN109579849A (en) * 2019-01-14 2019-04-05 浙江大华技术股份有限公司 Robot localization method, apparatus and robot and computer storage medium
CN111077495A (en) * 2019-12-10 2020-04-28 亿嘉和科技股份有限公司 Positioning recovery method based on three-dimensional laser
CN111337011A (en) * 2019-12-10 2020-06-26 亿嘉和科技股份有限公司 Indoor positioning method based on laser and two-dimensional code fusion
CN111443359A (en) * 2020-03-26 2020-07-24 达闼科技成都有限公司 Positioning method, device and equipment
CN111508021A (en) * 2020-03-24 2020-08-07 广州视源电子科技股份有限公司 Pose determination method and device, storage medium and electronic equipment
CN111536964A (en) * 2020-07-09 2020-08-14 浙江大华技术股份有限公司 Robot positioning method and device, and storage medium
CN111582566A (en) * 2020-04-26 2020-08-25 上海高仙自动化科技发展有限公司 Path planning method and planning device, intelligent robot and storage medium
WO2020211655A1 (en) * 2019-04-17 2020-10-22 北京迈格威科技有限公司 Laser coarse registration method, device, mobile terminal and storage medium
CN112123343A (en) * 2020-11-25 2020-12-25 炬星科技(深圳)有限公司 Point cloud matching method, point cloud matching equipment and storage medium
CN112612029A (en) * 2020-12-24 2021-04-06 哈尔滨工业大学芜湖机器人产业技术研究院 Grid map positioning method fusing NDT and ICP
CN112698345A (en) * 2020-12-04 2021-04-23 江苏科技大学 Robot simultaneous positioning and mapping optimization method for laser radar
CN112862874A (en) * 2021-04-23 2021-05-28 腾讯科技(深圳)有限公司 Point cloud data matching method and device, electronic equipment and computer storage medium
WO2021104497A1 (en) * 2019-11-29 2021-06-03 广州视源电子科技股份有限公司 Positioning method and system based on laser radar, and storage medium and processor
CN112904369A (en) * 2021-01-14 2021-06-04 深圳市杉川致行科技有限公司 Robot repositioning method, device, robot and computer-readable storage medium
CN112904358A (en) * 2021-01-21 2021-06-04 中国人民解放军军事科学院国防科技创新研究院 Laser positioning method based on geometric information
CN112923933A (en) * 2019-12-06 2021-06-08 北理慧动(常熟)车辆科技有限公司 Laser radar SLAM algorithm and inertial navigation fusion positioning method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109781119B (en) * 2017-11-15 2020-01-21 百度在线网络技术(北京)有限公司 Laser point cloud positioning method and system

Patent Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107991683A (en) * 2017-11-08 2018-05-04 华中科技大学 A kind of robot autonomous localization method based on laser radar
CN109579849A (en) * 2019-01-14 2019-04-05 浙江大华技术股份有限公司 Robot localization method, apparatus and robot and computer storage medium
WO2020211655A1 (en) * 2019-04-17 2020-10-22 北京迈格威科技有限公司 Laser coarse registration method, device, mobile terminal and storage medium
WO2021104497A1 (en) * 2019-11-29 2021-06-03 广州视源电子科技股份有限公司 Positioning method and system based on laser radar, and storage medium and processor
CN112923933A (en) * 2019-12-06 2021-06-08 北理慧动(常熟)车辆科技有限公司 Laser radar SLAM algorithm and inertial navigation fusion positioning method
CN111077495A (en) * 2019-12-10 2020-04-28 亿嘉和科技股份有限公司 Positioning recovery method based on three-dimensional laser
CN111337011A (en) * 2019-12-10 2020-06-26 亿嘉和科技股份有限公司 Indoor positioning method based on laser and two-dimensional code fusion
CN111508021A (en) * 2020-03-24 2020-08-07 广州视源电子科技股份有限公司 Pose determination method and device, storage medium and electronic equipment
CN111443359A (en) * 2020-03-26 2020-07-24 达闼科技成都有限公司 Positioning method, device and equipment
CN111582566A (en) * 2020-04-26 2020-08-25 上海高仙自动化科技发展有限公司 Path planning method and planning device, intelligent robot and storage medium
CN111536964A (en) * 2020-07-09 2020-08-14 浙江大华技术股份有限公司 Robot positioning method and device, and storage medium
CN112123343A (en) * 2020-11-25 2020-12-25 炬星科技(深圳)有限公司 Point cloud matching method, point cloud matching equipment and storage medium
CN112698345A (en) * 2020-12-04 2021-04-23 江苏科技大学 Robot simultaneous positioning and mapping optimization method for laser radar
CN112612029A (en) * 2020-12-24 2021-04-06 哈尔滨工业大学芜湖机器人产业技术研究院 Grid map positioning method fusing NDT and ICP
CN112904369A (en) * 2021-01-14 2021-06-04 深圳市杉川致行科技有限公司 Robot repositioning method, device, robot and computer-readable storage medium
CN112904358A (en) * 2021-01-21 2021-06-04 中国人民解放军军事科学院国防科技创新研究院 Laser positioning method based on geometric information
CN112862874A (en) * 2021-04-23 2021-05-28 腾讯科技(深圳)有限公司 Point cloud data matching method and device, electronic equipment and computer storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Mobile robot localization based on incremental landmark appearance learning; Wu Hua et al.; Journal of Beijing University of Aeronautics and Astronautics (Issue 06); 81-85 *

Also Published As

Publication number Publication date
CN113432533A (en) 2021-09-24

Similar Documents

Publication Publication Date Title
CN113432533B (en) Robot positioning method and device, robot and storage medium
CN110632921B (en) Robot path planning method and device, electronic equipment and storage medium
CN110286389B (en) Grid management method for obstacle identification
US20180113234A1 (en) System and method for obstacle detection
CN111258320B (en) Robot obstacle avoidance method and device, robot and readable storage medium
CN109001757B (en) Parking space intelligent detection method based on 2D laser radar
CN110749901B (en) Autonomous mobile robot, map splicing method and device thereof, and readable storage medium
CN111308500B (en) Obstacle sensing method and device based on single-line laser radar and computer terminal
CN114236564B (en) Method for positioning robot in dynamic environment, robot, device and storage medium
KR102569900B1 (en) Apparatus and method for performing omnidirectional sensor-fusion and vehicle including the same
CN112781599A (en) Method for determining the position of a vehicle
CN110426714B (en) Obstacle identification method
CN114556442A (en) Three-dimensional point cloud segmentation method and device and movable platform
KR101030317B1 (en) Apparatus for tracking obstacle using stereo vision and method thereof
WO2022116831A1 (en) Positioning method and apparatus, electronic device and readable storage medium
CN114815851A (en) Robot following method, robot following device, electronic device, and storage medium
CN113376638A (en) Unmanned logistics trolley environment sensing method and system
CN113625232A (en) Method, device, medium and equipment for suppressing multipath false target in radar detection
KR20220041485A (en) Method and apparatus for tracking an object using LIDAR sensor, vehicle including the apparatus, and recording medium for recording program performing the method
CN111781606A (en) Novel miniaturization implementation method for fusion of laser radar and ultrasonic radar
CN111723724A (en) Method and related device for identifying road surface obstacle
CN113203424B (en) Multi-sensor data fusion method and device and related equipment
CN116434181A (en) Ground point detection method, device, electronic equipment and medium
CN112344966B (en) Positioning failure detection method and device, storage medium and electronic equipment
US11267130B2 (en) Robot localization method and apparatus and robot using the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant