CN116698038A - Positioning loss judging method and device of robot and robot - Google Patents
- Publication number
- CN116698038A (application CN202310633486.1A)
- Authority
- CN
- China
- Prior art keywords
- dist
- yaw
- pose
- threshold
- delt
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/0095—Means or methods for testing manipulators
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/005—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
Abstract
The invention provides a robot positioning loss judging method, which comprises the following steps: maintaining sliding-window data that continuously stores the coincidence degree score, the positioning point, and the wheel odometer reading for a preset number of frames; calculating the difference delt_dist between the wheel-odometer displacement change and the positioning displacement change across the window, and the difference delt_yaw between the wheel-odometer angle change and the positioning angle change; calculating the standard deviation std_score_30 of the coincidence scores of all laser radar frames in the sliding window with the map; and, when the first condition or the second condition is met, considering the robot positioning to be lost.
Description
Technical Field
The present invention relates to the field of robots, and in particular, to a method and an apparatus for determining loss of positioning of a robot, and a robot.
Background
Autonomous navigation requires that a mobile robot be able to travel from point to point on its own, which presupposes that the robot knows both its own location and the location of the target point to be reached. Positioning technology for autonomous mobile robots has been a research hotspot in recent years, and lidar-based positioning is one of the main approaches for indoor mobile robots.
The patrol robot realizes positioning and navigation based on a pre-built 2D grid navigation map. The positioning principle is to find a pose at which the current laser scan frame best coincides with the 2D navigation grid map. In theory, the more accurate the match, the more accurate the positioning result. Therefore, most algorithms use the degree of matching to determine whether the robot's position is lost.
The background description provided herein is for the purpose of generally presenting the context of the disclosure. Unless otherwise indicated herein, the materials described in this section are not prior art to the claims in this application and are not admitted to be prior art by inclusion in this section.
Disclosure of Invention
Aiming at the technical problems in the related art, the invention provides a robot positioning loss judging method, which comprises the following steps:
s1, acquiring a laser scanning frame Scan and a robot pose position at the current moment;
s2, converting the laser scanning frame Scan at the current moment into a map coordinate system by using the pose position of the robot so as to obtain the projection Scan_m of the current laser frame in the map coordinate system;
s3, calculating the superposition degree score of the projection Scan_m of the laser frame at the current moment on the map coordinate system and the map;
s4, maintaining sliding window data for continuously storing the coincidence degree score, the positioning point and the wheel type odometer in a preset number of frames; respectively calculating a difference value delt_dist between the displacement variable quantity of the wheel-type odometer and the positioning displacement variable quantity before and after the window, and a difference value delt_yaw between the angle variable quantity of the wheel-type odometer and the positioning angle variable quantity; calculating standard deviation std_score_30 of the overlap ratio score of all laser radar frames and the map in the sliding window;
S5, when the first condition or the second condition is met, the robot positioning is considered to be lost, wherein the first condition is: delt_dist > first threshold or delt_yaw > second threshold, and std_score_30 < third threshold; the second condition is: all score values within the sliding window are < fourth threshold, and delt_dist > fifth threshold, and delt_yaw > sixth threshold.
Specifically, step S3 comprises: traversing all points in Scan_m, finding for each point the nearest occupied grid point m on the map, and calculating the coincidence degree score of the projection Scan_m of the current laser frame with the map using the following formula;
wherein: n is the total number of points in Scan_m;
[p_j, m_j] is the corresponding closest point pair.
Specifically, in step S4, the difference delt_dist between the wheel-odometer displacement change and the positioning displacement change across the window, and the difference delt_yaw between the wheel-odometer angle change and the positioning angle change, are calculated respectively as follows:
odom_dist=odom_dist_30-odom_dist_1
odom_yaw=odom_yaw_30-odom_yaw_1
pose_yaw=pose_yaw_30-pose_yaw_1
pose_dist=pose_dist_30-pose_dist_1
delt_dist=|odom_dist-pose_dist|
delt_yaw=|odom_yaw-pose_yaw|。
Specifically, the fifth threshold is 0.6 × the first threshold, and the sixth threshold is 0.6 × the second threshold.
Specifically, the preset number is 30.
In a second aspect, another embodiment of the present invention discloses a robot positioning loss judgment device, which includes:
the scanning frame and position acquisition unit is used for acquiring a laser scanning frame Scan at the current moment and a robot pose position;
the projection acquisition unit is used for converting the laser scanning frame Scan at the current moment into a map coordinate system by using the pose position of the robot so as to obtain the projection Scan_m of the current laser frame in the map coordinate system;
the overlap ratio calculating unit is used for calculating the overlap ratio score of the projection Scan_m of the laser frame at the current moment on the map coordinate system and the map;
the sliding window judging parameter calculating unit is used for maintaining sliding window data and continuously storing the coincidence degree score, the positioning point and the wheel type odometer in a preset number of frames; respectively calculating a difference value delt_dist between the displacement variable quantity of the wheel-type odometer and the positioning displacement variable quantity before and after the window, and a difference value delt_yaw between the angle variable quantity of the wheel-type odometer and the positioning angle variable quantity; calculating standard deviation std_score_30 of the overlap ratio score of all laser radar frames and the map in the sliding window;
the loss judging unit is used for considering that the robot positioning is lost when the first condition or the second condition is met, wherein the first condition is: delt_dist > first threshold or delt_yaw > second threshold, and std_score_30 < third threshold; the second condition is: all score values within the sliding window are < fourth threshold, and delt_dist > fifth threshold, and delt_yaw > sixth threshold.
Specifically, the coincidence degree calculating unit traverses all points in Scan_m, finds for each point the nearest occupied grid point m on the map, and calculates the coincidence degree score of the projection Scan_m of the current laser frame with the map using the following formula;
wherein: n is the total number of points in Scan_m;
[p_j, m_j] is the corresponding closest point pair.
Specifically, the sliding window judgment parameter calculation unit calculates the difference delt_dist between the wheel-odometer displacement change and the positioning displacement change across the window, and the difference delt_yaw between the wheel-odometer angle change and the positioning angle change, respectively, as follows:
odom_dist=odom_dist_30-odom_dist_1
odom_yaw=odom_yaw_30-odom_yaw_1
pose_yaw=pose_yaw_30-pose_yaw_1
pose_dist=pose_dist_30-pose_dist_1
delt_dist=|odom_dist-pose_dist|
delt_yaw=|odom_yaw-pose_yaw|。
Specifically, the fifth threshold is 0.6 × the first threshold, and the sixth threshold is 0.6 × the second threshold.
In a third aspect, another embodiment of the present invention discloses a robot comprising a processing module, a chassis, a storage module, and a laser radar, wherein the storage module stores instructions which, when executed, implement the robot positioning loss judging method described above.
In a fourth aspect, another embodiment of the present invention discloses a robot comprising a processing module, a chassis, a storage module, a three-dimensional laser radar, and the robot positioning loss judging device described above.
In a fifth aspect, another embodiment of the present invention discloses a nonvolatile memory storing instructions which, when executed by a processor, implement the foregoing robot positioning loss judging method.
According to the robot positioning loss judging method, whether the robot positioning is lost is judged from the difference delt_dist between the wheel-odometer displacement change and the positioning displacement change across the sliding window, the difference delt_yaw between the wheel-odometer angle change and the positioning angle change, and the standard deviation std_score_30 of the coincidence scores of all laser radar frames in the sliding window with the map. The method of this embodiment requires neither additional sensor algorithms nor hardware modification.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are needed in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of a method for determining loss of positioning of a robot according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a device for determining loss of positioning of a robot according to an embodiment of the present invention;
fig. 3 is a schematic diagram of a robot positioning loss judgment device according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which are derived by a person skilled in the art based on the embodiments of the invention, fall within the scope of protection of the invention.
Example 1
Referring to fig. 1, the embodiment discloses a method for determining loss of positioning of a robot, which includes the following steps:
s1, acquiring a laser scanning frame Scan and a robot pose position at the current moment;
the robot positioning loss judging method of the embodiment is applied to an organic robot. The robot of the embodiment comprises a chassis and a laser radar. Wherein the chassis is a controllable chassis that is capable of movement.
In addition, the robot is provided with a point cloud map of the current operation scene. The robot builds the point cloud map of the current operation scene using SLAM, which is a common mapping approach in the field and is not described further in this embodiment.
The robot of this embodiment has a pre-built 2D navigation grid map of the environment. When the robot is started, a laser positioning program is launched and the initial position of the robot in the map is given manually; the robot is then controlled to walk (if the robot does not walk, the positioning result is not updated), and the positioning program updates the current position of the robot in real time and outputs the robot pose.
S2, converting the laser scanning frame Scan at the current moment into a map coordinate system by using the pose position of the robot so as to obtain the projection Scan_m of the current laser frame in the map coordinate system;
the method comprises the steps of converting a laser scanning frame Scan at the current moment, wherein the laser scanning frame is represented by a series of 2D points (x, y), and the laser scanning frame is converted to a Map coordinate system through a robot pose position;
Scan*pose=Scan_m
scan_m-represents the projection of a laser Scan frame onto the map;
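The transform Scan*pose = Scan_m above can be sketched as follows. This is only an illustrative 2D rigid-body implementation; the function name, the (x, y, yaw) pose representation, and the array layout are assumptions, not taken from the patent:

```python
import numpy as np

def scan_to_map(scan_xy, pose):
    """Project a 2D laser scan into the map frame.

    scan_xy: (n, 2) array of scan points in the robot frame.
    pose:    (x, y, yaw) of the robot in the map frame.
    Returns the (n, 2) projection Scan_m in map coordinates.
    """
    x, y, yaw = pose
    c, s = np.cos(yaw), np.sin(yaw)
    rot = np.array([[c, -s], [s, c]])          # 2D rotation by yaw
    return scan_xy @ rot.T + np.array([x, y])  # rotate, then translate
```

For example, a scan point at (1, 0) seen by a robot at (2, 3) facing 90° lands at (2, 4) in the map frame.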
s3, calculating the superposition degree score of the projection Scan_m of the laser frame at the current moment on the map coordinate system and the map;
Specifically, this embodiment traverses all points in Scan_m, finds for each point the nearest occupied grid point m on the map, and calculates the coincidence degree score of the projection Scan_m of the current laser frame with the map using the following formula;
wherein: n is the total number of points in Scan_m;
[p_j, m_j] is the corresponding closest point pair.
The point p is a point of the projection Scan_m of the current laser frame in the map coordinate system, i.e., of the laser scan frame Scan at the current moment transformed into the map coordinate system; m denotes the point on the map closest to p, called the closest point of p on the map.
A two-dimensional grid map describes a point on the map with coordinates (x, y). (p_jx, p_jy) and (m_jx, m_jy) respectively denote the x and y coordinates of the j-th points p and m on the two-dimensional grid map.
Two-dimensional grid maps are typically stored in a computer using a search binary tree in which a nearest-point query has already been implemented. This embodiment uses the nearest-point query to obtain the closest point m of each point p.
Score indicates the degree to which the current laser scan frame, projected into the map via the robot pose, coincides with the map.
S4, maintaining sliding window data for continuously storing the coincidence degree score, the positioning point and the wheel type odometer in 30 frames; respectively calculating a difference value delt_dist between the displacement variable quantity of the wheel-type odometer and the positioning displacement variable quantity before and after the window, and a difference value delt_yaw between the angle variable quantity of the wheel-type odometer and the positioning angle variable quantity; calculating standard deviation std_score_30 of the overlap ratio score of all laser radar frames and the map in the sliding window;
odom_dist=odom_dist_30-odom_dist_1
odom_yaw=odom_yaw_30-odom_yaw_1
pose_yaw=pose_yaw_30-pose_yaw_1
pose_dist=pose_dist_30-pose_dist_1
delt_dist=|odom_dist-pose_dist|
delt_yaw=|odom_yaw-pose_yaw|
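The six difference equations above can be written directly. The sketch assumes each window entry stores the cumulative odometer distance/yaw and the corresponding localization-derived values (the dict keys and window layout are illustrative, not from the patent; the _1 and _30 suffixes correspond to the oldest and newest frames):

```python
def window_deltas(window):
    """Compute delt_dist and delt_yaw across a sliding window.

    window: sequence of dicts with cumulative 'odom_dist', 'odom_yaw'
            and localization-derived 'pose_dist', 'pose_yaw';
            window[0] is the oldest frame, window[-1] the newest.
    """
    first, last = window[0], window[-1]
    odom_dist = last["odom_dist"] - first["odom_dist"]  # odom_dist_30 - odom_dist_1
    odom_yaw = last["odom_yaw"] - first["odom_yaw"]     # odom_yaw_30 - odom_yaw_1
    pose_dist = last["pose_dist"] - first["pose_dist"]  # pose_dist_30 - pose_dist_1
    pose_yaw = last["pose_yaw"] - first["pose_yaw"]     # pose_yaw_30 - pose_yaw_1
    delt_dist = abs(odom_dist - pose_dist)
    delt_yaw = abs(odom_yaw - pose_yaw)
    return delt_dist, delt_yaw
```

When localization tracks odometry closely, both deltas stay near zero; a localization jump shows up as a large delt_dist or delt_yaw.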
calculating the standard deviation std_score_30 of all scores in the sliding window;
S5, when the first condition or the second condition is met, the robot positioning is considered to be lost, wherein the first condition is: delt_dist > first threshold or delt_yaw > second threshold, and std_score_30 < third threshold; the second condition is: all score values within the sliding window are < fourth threshold, and delt_dist > fifth threshold, and delt_yaw > sixth threshold;
wherein the third threshold is 0.1, the fourth threshold is 0.35, the fifth threshold is 0.6 × the first threshold, and the sixth threshold is 0.6 × the second threshold.
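The two loss conditions of S5, with the thresholds just listed, reduce to a plain predicate. In this sketch the first and second thresholds (th1, th2) are left as assumed tuning parameters, since the patent does not fix their values; the relaxed fifth and sixth thresholds are derived as 0.6 × th1 and 0.6 × th2:

```python
import statistics

def positioning_lost(scores, delt_dist, delt_yaw, th1, th2,
                     th3=0.1, th4=0.35):
    """Decide whether positioning is lost, per conditions 1 and 2.

    scores:   coincidence scores of all frames in the sliding window.
    th1, th2: first/second thresholds (assumed tuning parameters).
    th3, th4: third/fourth thresholds, 0.1 and 0.35 per the text.
    """
    std_score = statistics.pstdev(scores)  # std_score_30 over the window
    # Condition 1: odometry and localization disagree while the
    # score barely changes (std below the third threshold).
    cond1 = (delt_dist > th1 or delt_yaw > th2) and std_score < th3
    # Condition 2: every score in the window is low and both
    # disagreements exceed the relaxed (0.6x) thresholds.
    cond2 = (all(s < th4 for s in scores)
             and delt_dist > 0.6 * th1 and delt_yaw > 0.6 * th2)
    return cond1 or cond2
```

Condition 1 catches the case where the pose drifts away from odometry while the match score is suspiciously flat; condition 2 catches a window of uniformly poor matches combined with moderate disagreement.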
According to the robot positioning loss judging method, whether the robot positioning is lost is judged from the difference delt_dist between the wheel-odometer displacement change and the positioning displacement change across the sliding window, the difference delt_yaw between the wheel-odometer angle change and the positioning angle change, and the standard deviation std_score_30 of the coincidence scores of all laser radar frames in the sliding window with the map. The method of this embodiment requires neither additional sensor algorithms nor hardware modification.
Example two
Referring to fig. 2, the present embodiment discloses a robot positioning loss judgment device, which includes the following units:
the scanning frame and position acquisition unit is used for acquiring a laser scanning frame Scan at the current moment and a robot pose position;
the robot of the embodiment has built the 2D navigation grid Map of the environment, the robot is started, a laser positioning program is started, and the position of the initial robot in the Map is manually given; and controlling a robot walking (the robot does not walk and the positioning result is not updated) positioning program to update the current position of the robot in real time, and outputting the pose position of the robot.
The projection acquisition unit is used for converting the laser scanning frame Scan at the current moment into a map coordinate system by using the pose position of the robot so as to obtain the projection Scan_m of the current laser frame in the map coordinate system;
The laser scan frame Scan at the current moment is represented by a series of 2D points (x, y) and is transformed into the map coordinate system via the robot pose:
Scan*pose=Scan_m
where Scan_m represents the projection of the laser scan frame onto the map.
the overlap ratio calculating unit is used for calculating the overlap ratio score of the projection Scan_m of the laser frame at the current moment on the map coordinate system and the map;
Specifically, this embodiment traverses all points in Scan_m, finds for each point the nearest occupied grid point m on the map, and calculates the coincidence degree score of the projection Scan_m of the current laser frame with the map using the following formula;
wherein: n is the total number of points in Scan_m;
[p_j, m_j] is the corresponding closest point pair.
The point p is a point of the projection Scan_m of the current laser frame in the map coordinate system, i.e., of the laser scan frame Scan at the current moment transformed into the map coordinate system; m denotes the point on the map closest to p, called the closest point of p on the map.
A two-dimensional grid map describes a point on the map with coordinates (x, y). (p_jx, p_jy) and (m_jx, m_jy) respectively denote the x and y coordinates of the j-th points p and m on the two-dimensional grid map.
Two-dimensional grid maps are typically stored in a computer using a search binary tree in which a nearest-point query has already been implemented. This embodiment uses the nearest-point query to obtain the closest point m of each point p.
Score indicates the degree to which the current laser scan frame, projected into the map via the robot pose, coincides with the map.
The sliding window judging parameter calculating unit is used for maintaining sliding window data and continuously storing the coincidence degree score, the positioning point and the wheel type odometer in 30 frames; respectively calculating a difference value delt_dist between the displacement variable quantity of the wheel-type odometer and the positioning displacement variable quantity before and after the window, and a difference value delt_yaw between the angle variable quantity of the wheel-type odometer and the positioning angle variable quantity; calculating standard deviation std_score_30 of the overlap ratio score of all laser radar frames and the map in the sliding window;
odom_dist=odom_dist_30-odom_dist_1
odom_yaw=odom_yaw_30-odom_yaw_1
pose_yaw=pose_yaw_30-pose_yaw_1
pose_dist=pose_dist_30-pose_dist_1
delt_dist=|odom_dist-pose_dist|
delt_yaw=|odom_yaw-pose_yaw|
calculating the standard deviation std_score_30 of all scores in the sliding window;
the loss judging unit is used for considering that the robot positioning is lost when the first condition or the second condition is met, wherein the first condition is: delt_dist > first threshold or delt_yaw > second threshold, and std_score_30 < third threshold; the second condition is: all score values within the sliding window are < fourth threshold, and delt_dist > fifth threshold, and delt_yaw > sixth threshold;
wherein the third threshold is 0.1, the fourth threshold is 0.35, the fifth threshold is 0.6 × the first threshold, and the sixth threshold is 0.6 × the second threshold.
According to the robot positioning loss judging device, whether the robot positioning is lost is judged from the difference delt_dist between the wheel-odometer displacement change and the positioning displacement change across the sliding window, the difference delt_yaw between the wheel-odometer angle change and the positioning angle change, and the standard deviation std_score_30 of the coincidence scores of all laser radar frames in the sliding window with the map. The device of this embodiment requires neither additional sensor algorithms nor hardware modification.
Example III
This embodiment discloses a robot comprising a processing module, a chassis, a storage module, and a laser radar, wherein the storage module stores instructions which, when executed, implement the robot positioning loss judging method of the first embodiment.
In another embodiment, a robot comprises a processing module, a chassis, a storage module, a laser radar, and the robot positioning loss judging device described in the second embodiment.
Example IV
Referring to fig. 3, fig. 3 is a schematic diagram of the structure of a robot loss of positioning judging apparatus of the present embodiment. The robot positioning loss judging device 20 of this embodiment includes a processor 21, a memory 22, and a computer program stored in the memory 22 and executable on the processor 21. The steps of the above-described method embodiments are implemented by the processor 21 when executing the computer program. Alternatively, the processor 21 may implement the functions of the modules/units in the above-described device embodiments when executing the computer program.
Illustratively, the computer program may be partitioned into one or more modules/units that are stored in the memory 22 and executed by the processor 21 to complete the present invention. The one or more modules/units may be a series of computer program instruction segments capable of performing a specific function for describing the execution of the computer program in the robot positioning loss judging device 20. For example, the computer program may be divided into modules in the second embodiment, and specific functions of each module refer to the working process of the apparatus described in the foregoing embodiment, which is not described herein.
The robot positioning loss determination device 20 may include, but is not limited to, the processor 21 and the memory 22. It will be appreciated by those skilled in the art that the schematic diagram is merely an example of the robot positioning loss determination device 20 and does not constitute a limitation of it; the device may include more or fewer components than illustrated, combine certain components, or use different components. For example, the robot positioning loss determination device 20 may further include input-output devices, a network access device, a bus, and the like.
The processor 21 may be a central processing unit (Central Processing Unit, CPU), another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor or any conventional processor. The processor 21 is the control center of the robot positioning loss determination device 20 and connects the parts of the entire device using various interfaces and lines.
The memory 22 may be used to store the computer program and/or modules; the processor 21 implements the various functions of the robot positioning loss determination device 20 by running the computer program and/or modules stored in the memory 22 and invoking data stored in the memory 22. The memory 22 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system and the application programs required for at least one function (such as a sound playing function or an image playing function), and the data storage area may store data created according to use of the device (such as audio data or a phonebook). In addition, the memory 22 may include high-speed random access memory, and may also include non-volatile memory such as a hard disk, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a Flash Card, at least one disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The modules/units integrated in the robot positioning loss judging device 20 may be stored in a computer readable storage medium if implemented in the form of software functional units and sold or used as independent products. Based on such understanding, the present invention may implement all or part of the flow of the method of the above embodiment by instructing related hardware through a computer program, which may be stored in a computer readable storage medium and which, when executed by the processor 21, implements the steps of each of the method embodiments described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, etc. The computer readable medium may include any entity or device capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content included in the computer readable medium may be adjusted according to the requirements of legislation and patent practice in each jurisdiction; for example, in certain jurisdictions, in accordance with legislation and patent practice, the computer readable medium does not include electrical carrier signals and telecommunication signals.
It should be noted that the above-described apparatus embodiments are merely illustrative; the units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units, i.e. they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. In addition, in the drawings of the apparatus embodiments provided by the present invention, the connection relations between modules indicate that they have communication connections, which may be specifically implemented as one or more communication buses or signal lines. Those of ordinary skill in the art can understand and implement the present invention without undue burden.
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, alternatives, and improvements that fall within the spirit and scope of the invention.
Claims (10)
1. A robot positioning loss judging method, comprising the following steps:
s1, acquiring a laser scanning frame Scan and a robot pose position at the current moment;
s2, converting the laser scanning frame Scan at the current moment into a map coordinate system by using the pose position of the robot so as to obtain the projection Scan_m of the current laser frame in the map coordinate system;
s3, calculating the superposition degree score of the projection Scan_m of the laser frame at the current moment on the map coordinate system and the map;
s4, maintaining sliding window data for continuously storing the coincidence degree score, the positioning point and the wheel type odometer in a preset number of frames; respectively calculating a difference value delt_dist between the displacement variable quantity of the wheel-type odometer and the positioning displacement variable quantity before and after the window, and a difference value delt_yaw between the angle variable quantity of the wheel-type odometer and the positioning angle variable quantity; calculating standard deviation std_score_30 of the overlap ratio score of all laser radar frames and the map in the sliding window;
s5, when the first condition or the second condition is met, the robot positioning is considered to be lost, wherein the first condition is as follows: a delt_dist > first threshold or a delt_yw > second threshold, and std_score_30< third threshold; the second condition is: all score's within the sliding window are all < fourth threshold, and delt_dist > fifth threshold, and delt_yaw > sixth threshold.
2. The method according to claim 1, wherein the step S3 specifically is: traversing all points in Scan_m, finding for each point the nearest occupied grid point m on the map, and calculating the coincidence score of the projection Scan_m of the current laser frame in the map coordinate system with the map using the following formula;
wherein n is the total number of points in Scan_m, and [p_j, m_j] is the corresponding closest point pair.
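The patent's exact scoring formula did not survive extraction; purely as a hedged illustration, the sketch below uses a common scan-to-map match score, a Gaussian weighting of each closest-point distance averaged over the n points. The function name, brute-force nearest-neighbour search, and the `sigma` smoothing parameter are all assumptions, not the patent's formula.

```python
import math

def coincidence_score(scan_m, occupied_points, sigma=0.1):
    """Assumed scan-to-map match score: mean Gaussian-weighted
    nearest-neighbour distance over the n projected points."""
    # scan_m: projected laser points [(x, y), ...]
    # occupied_points: occupied grid cell centres on the map [(x, y), ...]
    n = len(scan_m)
    total = 0.0
    for p in scan_m:
        # Brute-force nearest occupied grid point m_j to the point p_j
        # (a real system would use a KD-tree or distance map).
        m = min(occupied_points,
                key=lambda q: (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2)
        d2 = (p[0] - m[0]) ** 2 + (p[1] - m[1]) ** 2
        total += math.exp(-d2 / (2.0 * sigma ** 2))
    return total / n  # 1.0 for a perfect overlap, near 0 for a bad match
```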
3. The method according to claim 2, wherein in the step S4, the difference delt_dist between the wheel odometer displacement change and the localization displacement change from the start to the end of the window, and the difference delt_yaw between the wheel odometer angle change and the localization angle change, are calculated as follows:
odom_dist=odom_dist_30-odom_dist_1
odom_yaw=odom_yaw_30-odom_yaw_1
pose_yaw=pose_yaw_30-pose_yaw_1
pose_dist=pose_dist_30-pose_dist_1
delt_dist=|odom_dist-pose_dist|
delt_yaw=|odom_yaw-pose_yaw|。
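Under the assumption that odom_dist_k, odom_yaw_k, pose_dist_k, and pose_yaw_k are the accumulated odometer/localization distances and yaw angles stored at the first and last (1st and 30th) frames of the window, the six formulas of claim 3 reduce to the following sketch (the function name is invented for illustration):

```python
def window_deltas(odom_dist_1, odom_dist_30, odom_yaw_1, odom_yaw_30,
                  pose_dist_1, pose_dist_30, pose_yaw_1, pose_yaw_30):
    """Illustrative transcription of the claim-3 formulas."""
    # Change across the window, for odometry and localization separately.
    odom_dist = odom_dist_30 - odom_dist_1
    odom_yaw = odom_yaw_30 - odom_yaw_1
    pose_dist = pose_dist_30 - pose_dist_1
    pose_yaw = pose_yaw_30 - pose_yaw_1
    # Disagreement between the two sources over the same window.
    delt_dist = abs(odom_dist - pose_dist)
    delt_yaw = abs(odom_yaw - pose_yaw)
    return delt_dist, delt_yaw
```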
4. The method according to claim 3, wherein the fifth threshold is 0.6 × the first threshold, and the sixth threshold is 0.6 × the second threshold.
5. The method of claim 4, wherein the preset number is 30.
6. A robot positioning loss judging device, comprising the following units:
the scanning frame and position acquisition unit is used for acquiring a laser scanning frame Scan at the current moment and a robot pose position;
the projection acquisition unit is used for converting the laser scanning frame Scan at the current moment into a map coordinate system by using the pose position of the robot so as to obtain the projection Scan_m of the current laser frame in the map coordinate system;
the coincidence calculating unit is used for calculating the coincidence score of the projection Scan_m of the laser frame at the current moment in the map coordinate system with the map;
the sliding window judging parameter calculating unit is used for maintaining sliding window data that continuously stores the coincidence score, the localization point, and the wheel odometer reading for a preset number of frames; calculating the difference delt_dist between the wheel odometer displacement change and the localization displacement change from the start to the end of the window, and the difference delt_yaw between the wheel odometer angle change and the localization angle change; and calculating the standard deviation std_score_30 of the coincidence scores of all lidar frames in the sliding window against the map;
the loss judging unit is used for determining that the robot positioning is lost when the first condition or the second condition is met, wherein the first condition is: delt_dist > first threshold or delt_yaw > second threshold, and std_score_30 < third threshold; the second condition is: all scores within the sliding window are < fourth threshold, and delt_dist > fifth threshold, and delt_yaw > sixth threshold.
7. The apparatus according to claim 6, wherein the coincidence calculating unit specifically: traverses all points in Scan_m, finds for each point the nearest occupied grid point m on the map, and calculates the coincidence score of the projection Scan_m of the current laser frame in the map coordinate system with the map using the following formula;
wherein n is the total number of points in Scan_m, and [p_j, m_j] is the corresponding closest point pair.
8. The device according to claim 7, wherein the sliding window judging parameter calculating unit calculates the difference delt_dist between the wheel odometer displacement change and the localization displacement change from the start to the end of the window, and the difference delt_yaw between the wheel odometer angle change and the localization angle change, as follows:
odom_dist=odom_dist_30-odom_dist_1
odom_yaw=odom_yaw_30-odom_yaw_1
pose_yaw=pose_yaw_30-pose_yaw_1
pose_dist=pose_dist_30-pose_dist_1
delt_dist=|odom_dist-pose_dist|
delt_yaw=|odom_yaw-pose_yaw|。
9. The apparatus of claim 8, wherein the fifth threshold is 0.6 × the first threshold, and the sixth threshold is 0.6 × the second threshold.
10. A robot, comprising: a processing module, a chassis, a storage module, and a lidar, the storage module storing instructions which, when executed, implement the robot positioning loss judging method according to any one of claims 1-5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310633486.1A CN116698038A (en) | 2023-05-30 | 2023-05-30 | Positioning loss judging method and device of robot and robot |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310633486.1A CN116698038A (en) | 2023-05-30 | 2023-05-30 | Positioning loss judging method and device of robot and robot |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116698038A true CN116698038A (en) | 2023-09-05 |
Family
ID=87830520
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310633486.1A Pending CN116698038A (en) | 2023-05-30 | 2023-05-30 | Positioning loss judging method and device of robot and robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116698038A (en) |
- 2023-05-30: CN CN202310633486.1A patent/CN116698038A/en active Pending
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112179330B (en) | Pose determination method and device of mobile equipment | |
US11422261B2 (en) | Robot relocalization method and apparatus and robot using the same | |
CN112643664B (en) | Positioning error eliminating method, positioning error eliminating device, robot and storage medium | |
CN107223244B (en) | Localization method and device | |
CN115326051A (en) | Positioning method and device based on dynamic scene, robot and medium | |
JP7351892B2 (en) | Obstacle detection method, electronic equipment, roadside equipment, and cloud control platform | |
CN111352430A (en) | Path planning method and device and robot | |
CN116698038A (en) | Positioning loss judging method and device of robot and robot | |
CN115223135B (en) | Parking space tracking method and device, vehicle and storage medium | |
CN114646317A (en) | Vehicle visual positioning navigation control method and device, computer equipment and medium | |
CN112415524A (en) | Robot and positioning navigation method and device thereof | |
CN117252912A (en) | Depth image acquisition method, electronic device and storage medium | |
CN113138596A (en) | Robot automatic charging method, system, terminal device and storage medium | |
CN113808196A (en) | Plane fusion positioning method and device, electronic equipment and storage medium | |
CN116892925A (en) | 2D grid map dynamic updating method, device and robot | |
CN112729349A (en) | Method and device for on-line calibration of odometer, electronic equipment and storage medium | |
CN116778201A (en) | Point cloud matching algorithm failure judgment method, device and robot | |
CN110647890B (en) | High-performance image feature extraction and matching method, system and storage medium | |
CN115837921B (en) | Vehicle track collision detection method, device, equipment and storage medium | |
CN115963822A (en) | Particle filter positioning-based slip processing method and device, medium and robot | |
WO2023070441A1 (en) | Movable platform positioning method and apparatus | |
CN115409986A (en) | Laser SLAM loop detection method and device based on point cloud semantics and robot | |
CN117111616A (en) | Robot turning judgment method and device, medium and robot | |
CN116642514A (en) | Positioning initialization method and device for robot during charging and robot | |
CN116793345A (en) | Posture estimation method and device of self-mobile equipment and readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||