CN111590595B - Positioning method and device, mobile robot and storage medium - Google Patents

Positioning method and device, mobile robot and storage medium Download PDF

Info

Publication number
CN111590595B
CN111590595B (granted publication of application CN202010622515.0A)
Authority
CN
China
Prior art keywords
straight line
data segment
mobile robot
angle
search range
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010622515.0A
Other languages
Chinese (zh)
Other versions
CN111590595A (en)
Inventor
胡旭
闫瑞君
眭灵慧
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Silver Star Intelligent Group Co Ltd
Original Assignee
Shenzhen Silver Star Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Silver Star Intelligent Technology Co Ltd
Priority to CN202010622515.0A
Publication of CN111590595A
Application granted
Publication of CN111590595B
Legal status: Active

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J 9/161 Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B25J 9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J 9/1661 Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
    • B25J 9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B25J 19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J 19/02 Sensing devices
    • B25J 19/027 Electromagnetic sensing devices

Abstract

The invention relates to the technical field of mobile robots and discloses a positioning method applied to a mobile robot that includes a radar sensor. The method comprises: acquiring an initial pose of the mobile robot in a preset map and radar data, and mapping the radar data onto the preset map according to the initial pose to generate a mapping point cloud; judging whether the mapping point cloud includes a first straight-line data segment; if so, searching the preset map for a second straight-line data segment whose point set corresponds, on the preset map, to the point set of the first straight-line data segment, determining an angle difference, correcting the initial pose of the mobile robot based on the angle difference, generating a first candidate pose set, and positioning the mobile robot according to the first candidate pose set; if not, generating a second candidate pose set to position the mobile robot. By correcting the initial pose with the calculated angle difference, the amount of calculation can be reduced and the positioning accuracy improved.

Description

Positioning method and device, mobile robot and storage medium
Technical Field
The embodiment of the invention relates to the technical field of mobile robots, in particular to a positioning method, a positioning device, a mobile robot and a storage medium.
Background
With the development of technology and the improvement of living standards, mobile robots such as cleaning robots, service robots and remote monitoring robots are gradually entering people's lives. A cleaning robot based on a laser radar generally needs to construct an environment map from the sensing data of the laser radar, and the pose (position and orientation) of the robot must be estimated in real time during map construction.
In the conventional positioning method, an initial pose estimate is provided by an odometer. The accuracy of this initial estimate is affected by factors such as the odometer model, slippage and noise, so it usually deviates from the true value, with the orientation error larger than the position error. A search range and a sampling interval are then set, the search range comprising a position range and an angle range, and candidate poses are determined accordingly. The smaller the sampling interval, the more candidate poses there are and the larger the amount of calculation, which is unfavourable for the real-time performance of the system. Because the orientation error of the odometer is large, the true orientation may lie at the edge of the angle search range or even outside it, so the matching result is poor and the best pose cannot be obtained.
In view of the foregoing, there is a need for improvement in the art.
Disclosure of Invention
In order to solve the above technical problems, embodiments of the present invention provide a positioning method, an apparatus, a mobile robot, and a storage medium, so as to solve the technical problem that the amount of calculation is large in the current positioning through the search range and the sampling interval, reduce the amount of calculation, and improve the accuracy of angle estimation.
In order to solve the above technical problem, an embodiment of the present invention provides the following technical solutions:
in a first aspect, an embodiment of the present invention provides a positioning method applied to a mobile robot, where the mobile robot includes a radar sensor, and the method includes:
acquiring an initial pose of the mobile robot in a preset map and radar data acquired by a radar sensor, and mapping the radar data onto the preset map according to the initial pose to generate a mapping point cloud;
judging whether the mapping point cloud includes a first straight-line data segment;
if so, searching a second straight-line data segment on the preset map according to the first straight-line data segment, wherein the point set of the second straight-line data segment corresponds, on the preset map, to the point set of the first straight-line data segment, determining an angle difference according to the first straight-line data segment and the second straight-line data segment, correcting the initial pose of the mobile robot based on the angle difference, generating a first candidate pose set according to the corrected initial pose, and positioning the mobile robot according to the first candidate pose set;
if not, generating a second candidate pose set according to the initial pose, and positioning the mobile robot according to the second candidate pose set.
In some embodiments, said determining an angular difference from said first rectilinear data segment and said second rectilinear data segment comprises:
the first straight line data segment comprises a mapping point set, straight line fitting is carried out on the mapping point set, a first straight line equation is determined, and a first straight line angle of the first straight line data segment in a preset map is determined according to the first straight line equation;
searching an obstacle point set corresponding to the mapping point set of the first linear data segment on a preset map, performing linear fitting on the obstacle point set, determining a second linear equation, and determining a second linear angle of the obstacle point set in the preset map according to the second linear equation;
an angular difference between the first and second rectilinear angles is calculated.
In some embodiments, said determining an angular difference from said first rectilinear data segment and said second rectilinear data segment comprises:
if a plurality of first straight line data sections in the radar data are identified, determining a straight line equation corresponding to each first straight line data section, and determining a plurality of first straight line angles;
and calculating a plurality of angle differences between the plurality of first straight line angles and the plurality of second straight line angles, averaging the plurality of angle differences, and taking the average as the angle difference between the first straight line angle and the second straight line angle.
In some embodiments, the correcting the initial pose of the mobile robot based on the angle difference includes:
determining the initial pose of the mobile robot in a preset map as (x, y, psi);
the determining an angle difference according to the first straight line data segment and the second straight line data segment, and correcting the initial pose of the mobile robot based on the angle difference comprises:
the angle difference is delta psi, and the corrected initial pose is (x, y, psi + delta psi).
In some embodiments, the generating a first set of candidate poses from the corrected initial poses comprises:
and generating a first candidate pose set according to a preset first search range, wherein the first search range comprises a first angle search range and a first position search range, and the first angle search range and the first position search range are generated according to the corrected initial pose.
In some embodiments, said generating a second set of candidate poses from the initial poses comprises:
and generating a second candidate pose set according to a preset second search range, wherein the second search range comprises a second angle search range and a second position search range, and the second angle search range is larger than the first angle search range.
In some embodiments, the searching for the second straight-line data segment on the preset map according to the first straight-line data segment includes:
determining a center mapping point of the center of the radar sensor on the preset map according to the initial pose;
the point set of the first straight-line data segment comprises a plurality of mapping points, and a plurality of scanning rays extending from the central mapping point to the plurality of mapping points are determined;
and respectively searching a plurality of obstacle points corresponding to the plurality of mapping points in the plurality of scanning ray directions, wherein the distance between each obstacle point and the adjacent mapping point meets a preset distance range, and the plurality of obstacle points form the second straight line data section.
In some embodiments, after acquiring the radar data transmitted by the radar sensor, the method further comprises:
converting the radar data from a polar coordinate system to a radar coordinate system;
and then converting the radar data from a radar coordinate system to a map coordinate system.
In a second aspect, an embodiment of the present invention provides a positioning apparatus applied to a mobile robot, where the mobile robot includes a radar sensor, the apparatus includes:
the mapping unit is used for acquiring an initial pose of the mobile robot in a preset map and radar data acquired by a radar sensor, mapping the radar data onto the preset map according to the initial pose and generating a mapping point cloud;
a judging unit for judging whether the mapping point cloud includes a first linear data segment,
if the judgment result is yes, searching a second straight line data segment on a preset map according to the first straight line data segment, determining an angle difference according to the first straight line data segment and the second straight line data segment, correcting the initial pose of the mobile robot based on the angle difference, generating a first candidate pose set according to the corrected initial pose, and positioning the mobile robot according to the first candidate pose set;
if not, generating a second candidate pose set according to the initial pose, and positioning the mobile robot according to the second candidate pose set.
In a third aspect, an embodiment of the present invention provides a mobile robot, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the positioning method described above.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to implement the positioning method described above.
The beneficial effects of the embodiment of the invention are as follows. In contrast to the prior art, the embodiment of the present invention provides a positioning method applied to a mobile robot that includes a radar sensor. The method comprises: acquiring an initial pose of the mobile robot in a preset map and radar data acquired by the radar sensor, and mapping the radar data onto the preset map according to the initial pose to generate a mapping point cloud; judging whether the mapping point cloud includes a first straight-line data segment; if so, searching a second straight-line data segment on the preset map according to the first straight-line data segment, wherein the point set of the second straight-line data segment corresponds, on the preset map, to the point set of the first straight-line data segment, determining an angle difference according to the two segments, correcting the initial pose of the mobile robot based on the angle difference, generating a first candidate pose set according to the corrected initial pose, and positioning the mobile robot according to the first candidate pose set. On the one hand, by identifying the first straight-line data segment in the mapping point cloud, the angle of the mobile robot in the scene can be obtained, which reduces the search range and the amount of calculation; on the other hand, the angle difference between the first and second straight-line data segments is calculated by straight-line fitting and used to correct the initial pose, which improves the positioning accuracy.
Drawings
One or more embodiments are illustrated by way of example, and not by way of limitation, in the corresponding drawings, in which elements having the same reference numeral designation represent similar elements; unless specifically noted, the figures are not drawn to scale.
Fig. 1 is a schematic structural diagram of a mobile robot according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of a positioning method according to an embodiment of the present invention;
FIG. 3 is a detailed flowchart of step S30 in FIG. 2;
FIG. 4 is a detailed flowchart of step S40 in FIG. 2;
FIG. 5 is a schematic diagram of an obstacle point provided by an embodiment of the present invention;
FIG. 6 is a detailed flowchart of step S60 in FIG. 2;
fig. 7 is a schematic structural diagram of a positioning device according to an embodiment of the present invention.
Detailed Description
In order to facilitate an understanding of the invention, the invention is described in more detail below with reference to the accompanying drawings and detailed description. It will be understood that when an element is referred to as being "secured to" another element, it can be directly on the other element or intervening elements may also be present. When an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may be present. The terms "vertical," "horizontal," "left," "right," and the like as used herein are for descriptive purposes only.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
A cleaning robot based on a laser radar generally needs to construct an environment map from the sensing data of the laser radar, and the pose (position and orientation) of the robot must be estimated in real time during map construction. In the conventional positioning method, an initial pose estimate is provided by an odometer; its accuracy is affected by factors such as the odometer model, slippage and noise, so it usually deviates from the true value, with the orientation error larger than the position error. A search range and a sampling interval are set, the search range comprising a position range (for example x, y ∈ (-0.2 m, 0.2 m), which generates a rectangular search area) and an angle range (for example angle ∈ (-20°, 20°)). If the sampling interval of x and y is set to 0.025 m and the angle sampling interval to 0.5°, the number of candidate poses is (0.2 × 2 / 0.025) × (20 × 2 / 0.5) = 16 × 80. The radar data are projected onto the map according to each candidate pose, the sum of the probability values of all mapping points is the matching degree of that pose, and the pose with the highest matching degree is taken as the positioning pose of the mobile robot.
In this method, the positioning accuracy is closely related to the size of the sampling interval: the smaller the sampling interval, the higher the positioning accuracy, and conversely the lower. However, the smaller the sampling interval, the more candidate poses there are and the larger the amount of calculation, which is unfavourable for the real-time performance of the system, so accuracy and efficiency cannot both be achieved. In addition, the angle sampling interval cannot be too large, because when the distance value of the radar data is large, a small angle deviation already produces a large difference in the mapped position on the map.
Because the direction error of the odometer is large, the real direction may be at the edge of the search angle range, even exceed the angle search range, so that the matching result is poor and the best pose cannot be obtained.
The invention calculates the angle of the mobile robot in the scene by identifying straight-line information in the environment, thereby reducing the search range, improving the accuracy of the angle estimate and reducing the amount of calculation.
The present invention will be described in detail with reference to the accompanying drawings.
Referring to fig. 1, fig. 1 is a schematic structural diagram of a mobile robot according to an embodiment of the present invention;
as shown in fig. 1, the mobile robot 10 may include an apparatus main body, a plurality of information collecting devices 11 installed on the apparatus main body, and a controller 12 connected to the information collecting devices.
The main body of the apparatus is the main structure of the mobile robot 10, and provides a basic structural framework and is used to accommodate a plurality of different functional modules to form the complete mobile robot 10.
The information collecting apparatus 11 is a device through which the mobile robot 10 senses external information and interacts with the outside world. Appropriate types and numbers of information acquisition devices can be selected according to actual needs.
Specifically, with continued reference to fig. 1, the information collecting device at least includes: the navigation device 11D.
The navigation device 11D is a functional module that provides positioning and mapping services for the robot and provides information on the current position of the mobile robot in real time. It may in particular be implemented using any type of structural and functional module. For example, based on SLAM (simultaneous localization and mapping), a range of a set size is continuously explored, and a map of the entire environment to be cleaned is created during walking.
In other embodiments, the information collecting apparatus includes: trigger device 11A, action collection system 11B and depth sensor 11C. The triggering device 11A is an interactive device for acquiring a triggering instruction to start an object calibration process and enter a visual education mode. The action acquisition device 11B is used for identifying action instructions and analyzing the instruction meanings of the limb actions of the visual education people. The depth sensor 11C is a device that collects the distance between the environment and the object and the robot (i.e., depth information). The three-dimensional space acquisition system can acquire three-dimensional coordinate information of the surface of the three-dimensional structure, so that the robot can construct a three-dimensional space.
The controller 12 is a control core of the whole mobile robot, and can receive external information and call a corresponding computer software program to execute one or more logic processing steps, such as completing an object calibration process. The controller may be embodied in any type of structure including, but not limited to, a microprocessor or the like.
The controller 12 may include a processor, a memory and a communication module. The embodiment of the invention takes a bus connection as an example, establishing a communication connection between any two of the processor, the memory and the communication module.
The processor is a control center of the mobile robot, connects various parts of the whole mobile robot by various interfaces and lines, and executes various functions and processes data of the mobile robot by running or executing software programs and/or modules stored in the memory and calling data stored in the memory, thereby performing overall monitoring of the mobile robot. A processor may include one or more processing units; preferably, the processor may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor.
The mobile robot 10 may further include a power source (e.g., a battery) for supplying power to various components, and preferably, the power source may be logically connected to the processor through a power management system, so that functions of managing charging, discharging, and power consumption are implemented through the power management system.
The apparatus main body is provided with a corresponding traveling mechanism 13 and a cleaning mechanism 14. One skilled in the art may add or remove one or more other functional modules according to the actual application scenario of the robot.
The traveling mechanism 13 is a structural device for providing a moving capability for the sweeping robot and driving the sweeping robot to move along a set route. The running gear can be realized in particular by means of any type of moving means, such as rollers, tracks, etc.
Sweeping mechanism 14 is one or more devices for accomplishing floor sweeping tasks including, but not limited to, a water storage tank, a mop, a V-roll brush, and a dust bin, among others. The cleaning mechanism can be detachable, and different cleaning mechanisms can be replaced and used for cleaning areas with specific areas according to different cleaning tasks.
The mobile robot 10 further includes a radar sensor 15, and the radar sensor 15 includes a laser emitting port and a laser receiving port, the laser emitting port is used for emitting laser, and the laser receiving port is used for receiving radar data. In an embodiment of the present invention, the radar sensor is a laser radar sensor.
In addition, the mobile robot 10 includes some functional modules, such as an odometer, which are not shown, and thus will not be described in detail.
Preferably, an embodiment of the present invention further provides a mobile robot, including a processor, a memory, and a computer program stored in the memory and executable on the processor. When executed by the processor, the computer program implements the processes of the above positioning method embodiment and can achieve the same technical effects; details are not repeated here to avoid repetition.
The mobile robot 10 of the present embodiment exists in various forms, including but not limited to a cleaning robot, a service robot, a remote monitoring robot and other mobile robots. In some embodiments, the number of mobile robots 10 may be one or more.
Referring to fig. 2, fig. 2 is a schematic flow chart of a positioning method according to an embodiment of the present invention;
As shown in fig. 2, the positioning method is applied to a mobile robot that includes a radar sensor, and the execution subject of the positioning method is the processor of the mobile robot. The method includes steps S10 to S70:
step S10: acquiring an initial pose of the mobile robot in a preset map and radar data acquired by a radar sensor, and mapping the radar data onto the preset map according to the initial pose to generate a mapping point cloud;
specifically, the mobile robot includes an odometer, and the odometer can provide initial pose data of the mobile robot in a preset map (the accuracy of the value is affected by factors such as an odometer model, slippage, noise, and the like, and usually has an error with a real value, and a direction error is larger than a position error, so that a relatively accurate initial pose value can be obtained after the initial pose is corrected), wherein the initial pose includes the position and the direction of the mobile robot in the preset map. In this embodiment, the preset map may be a grid probability map in which the probability value is larger at a position where an obstacle is present. The mobile robot can measure the distance data of obstacles through the radar sensor to construct the grid probability map. In other embodiments, the preset map may also be other types of maps, such as a topological map.
The radar sensor may be a mechanical laser radar or a solid-state laser radar. The radar sensor includes a laser emitting port and a receiving port; each emission and reception yields one datum. The radar rotates through 360 degrees, so if its angle step is 1 degree, 360 data are obtained. Of course, the angle step can be set according to actual needs and is not limited here.
The radar data is obtained at the initial pose, and the radar data is coordinate data based on a polar coordinate system centered on a radar sensor. The radar data are mapped to a preset map according to the initial pose, and actually, coordinate transformation is carried out on the radar data, so that mapping data of the radar data on a map coordinate system can be obtained.
Specifically, the mapping the radar data to a preset map according to the initial pose to generate a mapping point cloud includes:
converting the radar data from a polar coordinate system to a radar coordinate system;
and then converting the radar data from a radar coordinate system to a map coordinate system.
It will be appreciated that the data acquired by the radar are based on a radar coordinate system and each datum is expressed in polar coordinates, while the map is 2D. Each datum is therefore first converted into Euclidean coordinates (x, y) in the radar coordinate system, and each radar coordinate is then converted into a position in the map coordinate system, i.e. projected, according to the relative pose of the radar coordinate system in the map coordinate system.
The relative pose of the radar coordinate system in the map coordinate system is obtained by combining the pose of the radar coordinate system in the robot coordinate system with the pose of the robot coordinate system in the map coordinate system. The pose of the radar coordinate system in the robot coordinate system is fixed by the structure of the mobile robot at design time; the pose of the robot coordinate system in the map coordinate system is the positioning result of the robot in the map and changes in real time as the mobile robot moves; a candidate pose is an assumed pose of the robot coordinate system in the map coordinate system.
Specifically, the radar coordinate system is a 2D cartesian coordinate system, the radar data is converted from a polar coordinate system to a radar coordinate system, and the conversion is completed by the following formula (1):
x = ρ·cos θ,  y = ρ·sin θ    (1)
where ρ is a detection distance of the radar sensor at the θ angle.
Specifically, the radar data is converted from a radar coordinate system to a map coordinate system, and the conversion is completed by the following formula (2):
x_m = x·cos(ψ_odom) − y·sin(ψ_odom) + x_odom
y_m = x·sin(ψ_odom) + y·cos(ψ_odom) + y_odom    (2)
where the initial pose estimated by the odometer is (x_odom, y_odom, ψ_odom), (x, y) are 2D coordinates in the radar coordinate system, and (x_m, y_m) are 2D coordinates in the map coordinate system.
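For illustration only, the two conversions above can be sketched in Python as follows. This is not code from the patent; the function names (polar_to_radar, radar_to_map) and the example values are assumptions, and the sketch applies the odometer pose directly as the radar-to-map transform, as in formula (2).

```python
import math

def polar_to_radar(rho, theta):
    # Formula (1): one radar return (distance rho at scan angle theta)
    # converted to 2D Cartesian coordinates in the radar coordinate system.
    return rho * math.cos(theta), rho * math.sin(theta)

def radar_to_map(x, y, odom_pose):
    # Formula (2): project a radar-frame point into the map coordinate system
    # using the initial pose (x_odom, y_odom, psi_odom) estimated by the odometer.
    x_odom, y_odom, psi_odom = odom_pose
    x_m = x * math.cos(psi_odom) - y * math.sin(psi_odom) + x_odom
    y_m = x * math.sin(psi_odom) + y * math.cos(psi_odom) + y_odom
    return x_m, y_m

# Example: a 2.0 m return at 30 degrees, robot at (1.0, 0.5) with heading 0.1 rad.
x, y = polar_to_radar(2.0, math.radians(30.0))
print(radar_to_map(x, y, (1.0, 0.5, 0.1)))
```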
The step S20: determining whether the mapped point cloud includes a first linear data segment,
in step S20, the mapping point cloud includes a plurality of mapping points arranged in sequence, where the mapping points are used to represent the contour of an obstacle, the mapping points are divided into a plurality of mapping point groups according to a detection sequence, each mapping point group includes at least two mapping points, the mapping point groups are respectively subjected to straight line fitting to obtain a plurality of line segments, an included angle between two adjacent line segments is calculated in sequence, and if it is determined that the included angle is less than or equal to a preset angle threshold, it is determined that the two adjacent mapping point groups are located on the same straight line, so that one or more first straight line data segments in the mapping point cloud can be identified. If the included angle is larger than the preset angle threshold value, the data is truncated at the moment, namely the next data is taken as a starting point to start counting again. The preset angle threshold may be manually set as required, for example: set to ± 10 degrees.
The radar sensor comprises a laser transmitting port and a receiving port, one-time transmitting and receiving are carried out to obtain one datum, the radar rotates 360 degrees, and if the angle step length of the radar is 1 degree, 360 data can be obtained. After rotating for one circle, 360 detected data are sent out, the data are stored in an array, and the index number of the data is the detection sequence. According to the detection order of the radar data, the detection order of each mapping point in the corresponding mapping point cloud can be determined.
It is understood that one or more first straight-line data segments may be obtained by calculating the angle difference between adjacent radar data. To improve the accuracy of the straight-line angle calculation, a number threshold is set for the first straight-line data segments; if the number of first straight-line data segments is greater than the number threshold, the first straight-line angle can be calculated by averaging.
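A minimal sketch of this segmentation step, assuming the mapping points arrive in detection order as (x, y) tuples; the group size, the ±10° threshold and the minimum point count are illustrative parameters, and the function names are not taken from the patent.

```python
import math

def group_angle(points):
    # Least-squares direction of a short group of mapping points (undirected line angle).
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points)
    return math.atan2(sxy, sxx) if sxx > 0 else math.pi / 2

def split_into_line_segments(points, group_size=3,
                             angle_threshold=math.radians(10), min_points=6):
    # Walk the mapping points in detection order, fit short groups, and keep
    # extending the current segment while the angle between adjacent groups
    # stays within the threshold; otherwise truncate and restart counting
    # from the next group (as described for step S20).
    segments, current, prev_angle = [], [], None
    for i in range(0, len(points) - group_size + 1, group_size):
        group = points[i:i + group_size]
        angle = group_angle(group)
        if prev_angle is not None:
            diff = abs(angle - prev_angle)
            diff = min(diff, math.pi - diff)        # undirected lines wrap at 180 degrees
        if prev_angle is None or diff <= angle_threshold:
            current.extend(group)
        else:
            if len(current) >= min_points:
                segments.append(current)
            current = list(group)
        prev_angle = angle
    if len(current) >= min_points:
        segments.append(current)
    return segments
```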
Step S30: if it is determined that the mapping point cloud includes the first straight-line data segment, searching a second straight-line data segment on the preset map according to the first straight-line data segment, wherein the point set of the second straight-line data segment corresponds, on the preset map, to the point set of the first straight-line data segment;
referring to fig. 3, fig. 3 is a detailed flowchart of step S30 in fig. 2;
as shown in fig. 3, in step S30, the searching for the second straight-line data segment on the preset map according to the first straight-line data segment includes steps S31 to S33:
step S31: and determining a center mapping point of the center of the radar sensor on the preset map according to the initial pose.
Step S32: the point set of the first straight-line data segment includes a plurality of mapping points, and a plurality of scanning rays extending from the central mapping point toward the plurality of mapping points are determined.
Step S33: and respectively searching a plurality of obstacle points corresponding to the plurality of mapping points in the plurality of scanning ray directions, wherein the distance between each obstacle point and the adjacent mapping point meets a preset distance range. The plurality of obstacle points form the second straight-line data segment.
Specifically, the greater the probability value of a cell in the grid probability map, the greater the probability that an obstacle is present there. If there is a wall, the probability value at the position of the wall is large and the probability values on both sides of the wall decrease gradually. The obstacle points corresponding to the mapping points are searched in the respective scanning ray directions; specifically, the first probability peak point found from near to far around each data mapping point is taken as the obstacle point of that radar point in the map. In this way the obstacle points corresponding to the mapping points can be found on the grid probability map. These obstacle points represent the real positions of the obstacle contour corresponding to the first straight-line data segment, so the second straight-line data segment can be formed from them, and the angle difference between the second straight-line data segment and the first straight-line data segment serves as a reliable basis for correcting the initial pose.
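A possible sketch of this obstacle-point search on a grid probability map stored as a 2D NumPy array (probability values in [0, 1]); the cell indexing convention, the ±10-cell search window and the 0.5 peak threshold are assumptions made only for illustration.

```python
import numpy as np

def find_obstacle_point(prob_map, center_px, mapping_px, search_cells=10, peak_min=0.5):
    # Sketch of step S33: walk along the scanning ray (radar-center mapping
    # point -> data mapping point), from near to far around the mapping point,
    # and return the first probability peak as the obstacle point.
    center = np.asarray(center_px, dtype=float)
    target = np.asarray(mapping_px, dtype=float)
    direction = (target - center) / np.linalg.norm(target - center)
    prev_cell, prev_prob = None, -1.0
    for step in range(-search_cells, search_cells + 1):   # near -> far along the ray
        cell = tuple(np.round(target + step * direction).astype(int))
        if not (0 <= cell[0] < prob_map.shape[1] and 0 <= cell[1] < prob_map.shape[0]):
            continue
        prob = prob_map[cell[1], cell[0]]                  # map stored row-major (y, x)
        if prev_prob >= peak_min and prob < prev_prob:     # probability starts to drop: peak passed
            return prev_cell
        prev_cell, prev_prob = cell, prob
    return None                                            # no obstacle peak within the search range
```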
Step S40: determining an angle difference according to the first straight line data segment and the second straight line data segment;
referring back to fig. 4, fig. 4 is a detailed flowchart of step S40 in fig. 2;
in step S40, the determining an angle difference from the first straight data piece and the second straight data piece includes steps S41 to S44:
step S41: the first straight line data segment comprises a mapping point set, straight line fitting is carried out on the mapping point set, a first straight line equation is determined, and a first straight line angle of the first straight line data segment in a preset map is determined according to the first straight line equation;
step S42: searching a set of obstacle points corresponding to the set of mapping points of the first linear data segment on a preset map,
step S43: performing linear fitting on the obstacle point set, determining a second linear equation, and determining a second linear angle of the obstacle point set in a preset map according to the second linear equation;
step S44: an angular difference between the first and second rectilinear angles is calculated.
In step S41, the mapping point set consists of the plurality of mapping points on the first straight-line data segment. The mapping points of the first straight-line data segment are fitted by the least squares method to obtain the straight-line equation of the first straight-line data segment, so as to determine its straight-line angle in the preset map. The first straight-line equation is generated by fitting the data of the first straight-line data segment, i.e. its mapping points, for example to the equation y = ax + b, so that the straight-line angle arctan(a) of the equation is determined.
It can be understood that if a plurality of first straight line data segments in the mapping point cloud are identified, a first straight line equation corresponding to each first straight line data segment is determined, and a plurality of first straight line angles are determined. In step S44, a plurality of angle differences between the first and second straight line angles are calculated, the plurality of angle differences are averaged, and the average is defined as the angle difference between the first and second straight line angles.
In step S42, the obstacle point set is composed of a plurality of obstacle points corresponding to the plurality of mapping points of the first straight line data segment.
In step S43, the straight-line equation of the second straight-line data segment is obtained by least-squares fitting of the obstacle points of the second straight-line data segment, so as to determine its straight-line angle in the preset map. The second straight-line equation is generated by fitting the data of the second straight-line data segment, i.e. its obstacle points, for example to the equation y = ax + b, so that the straight-line angle arctan(a) of the second straight-line equation is determined.
Specifically, the obstacle point set represents an obstacle in the grid probability map. Owing to the complexity of the grid probability map, the obstacle points in the set may contain errors or positioning deviations, so the straight-line fitting is performed with a RANSAC-based least squares method and outliers are removed, giving a more accurate straight-line angle.
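Steps S41 to S44 can be sketched as below with a plain least-squares fit of y = ax + b and the angle arctan(a); the RANSAC-based outlier rejection mentioned above is omitted for brevity, and the function names are illustrative rather than taken from the patent.

```python
import math

def line_angle(points):
    # Least-squares fit of y = a*x + b over a point set; returns the line angle
    # arctan(a) in the map frame (steps S41 and S43). Assumes the segment is
    # not exactly vertical (sxx > 0).
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points)
    return math.atan(sxy / sxx)

def angle_difference(mapping_points, obstacle_points):
    # Step S44: second (map) straight-line angle minus first (mapped) straight-line angle.
    return line_angle(obstacle_points) - line_angle(mapping_points)
```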
In step S44, the calculating an angle difference between the first straight angle and the second straight angle includes:
if a plurality of first straight line data sections in the radar data are identified, determining a straight line equation corresponding to each first straight line data section, and determining a plurality of first straight line angles;
calculating a plurality of angle differences between the plurality of first straight-line angles and the corresponding second straight-line angles, averaging the angle differences, and taking the average as the angle difference between the first straight-line angle and the second straight-line angle. For example, the angle difference δψ equals the corresponding (map) straight-line angle minus the mapped straight-line angle, and the corrected pose is (x_odom, y_odom, ψ_odom + δψ). In the embodiment of the present invention, the mobile robot includes an odometer, and the initial pose is acquired by the odometer.
By averaging the angle differences obtained from the plurality of first straight-line data segments, the error can be reduced.
Step S50: correcting the initial pose of the mobile robot based on the angle difference;
in step S50, the correcting the initial pose of the mobile robot based on the angle difference includes:
determining the initial pose of the mobile robot in a preset map as (x, y, psi);
the determining an angle difference according to the first straight line data segment and the second straight line data segment, and correcting the initial pose of the mobile robot based on the angle difference comprises:
the angle difference is delta psi, and the corrected initial pose is (x, y, psi + delta psi).
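As a small illustration (not from the patent), the averaged correction of steps S40 and S50 could look like the following; the list of per-segment angle differences and the example numbers are assumptions.

```python
def correct_initial_pose(initial_pose, angle_differences):
    # Average the angle differences obtained from the first straight-line data
    # segments and add the mean to the odometer heading:
    # (x, y, psi) -> (x, y, psi + delta_psi).
    x, y, psi = initial_pose
    delta_psi = sum(angle_differences) / len(angle_differences)
    return (x, y, psi + delta_psi)

# Example: three straight-line data segments give slightly different estimates.
print(correct_initial_pose((1.0, 0.5, 0.10), [0.03, 0.05, 0.04]))  # ~ (1.0, 0.5, 0.14)
```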
Compared with the initial pose before correction, the angle of the corrected initial pose is more accurate. When a search window (comprising an angle search range and a position search range) is subsequently set for pose matching, the angle search range can be set smaller (or the angle search can even be omitted), and searching near the true value improves the matching accuracy.
Referring to fig. 5, fig. 5 is a schematic diagram of an obstacle point according to an embodiment of the present invention;
As shown in fig. 5, an obstacle point search range is set. Obstacle points in the grid probability map are sought on the scanning ray formed by the radar center and the data mapping point, and the first probability peak point found from near to far around the data mapping point is taken as the obstacle point of that radar point in the map.
Step S60: and generating a first candidate pose set according to the corrected initial pose, and positioning the mobile robot according to the first candidate pose set.
In step S60, the generating a first candidate pose set according to the corrected initial poses includes:
and generating a first candidate pose set according to a preset first search range, wherein the first search range comprises a first angle search range and a first position search range, and the first angle search range and the first position search range are generated according to the corrected initial pose.
Specifically, the first search range includes a first angle search range and a first position search range, from which the first candidate pose set can be determined. More specifically, the number of candidate poses is determined by the position sampling interval and the angle sampling interval, so the size of the first candidate pose set is (length of the first position search range / position sampling interval) × (first angle search range / angle sampling interval).
It is understood that the first angle search range and the first position search range are generated from the corrected initial pose, and since the initial pose of the mobile robot has been corrected based on the angle difference between the first straight angle and the second straight angle, the first angle search range can be set small or the search process can be eliminated.
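A sketch of generating the first candidate pose set from the corrected initial pose; the specific ranges and sampling intervals below are placeholders, chosen only to show that a narrow angle range suffices once the heading has been corrected.

```python
import itertools
import math

def candidate_poses(center_pose, pos_range=0.2, pos_step=0.025,
                    angle_range=math.radians(5), angle_step=math.radians(0.5)):
    # Enumerate candidate poses on a grid around the (corrected) initial pose.
    # Because the heading was already corrected from the straight-line angle
    # difference, angle_range can be kept small (or set to zero).
    x0, y0, psi0 = center_pose

    def samples(half_range, step):
        n = int(round(half_range / step))
        return [i * step for i in range(-n, n + 1)]

    return [(x0 + dx, y0 + dy, psi0 + dpsi)
            for dx, dy, dpsi in itertools.product(samples(pos_range, pos_step),
                                                  samples(pos_range, pos_step),
                                                  samples(angle_range, angle_step))]

print(len(candidate_poses((1.0, 0.5, 0.14))))  # 17 * 17 * 21 candidate poses
```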
Referring to fig. 6, fig. 6 is a detailed flowchart of step S60 in fig. 2;
In step S60, the mobile robot is positioned according to the first candidate pose set, which includes steps S61 to S62:
step S61: calculating a matching degree of each candidate pose in the first candidate pose set based on the first candidate pose set;
specifically, the probability value of each obstacle point corresponding to each candidate pose on the grid probability map is determined, the sum of the probability values of all the obstacle points is calculated, and the sum of the probability values of all the obstacle points is determined as the matching degree of the candidate pose.
It can be understood that the grid probability map is a 2D probability map, similar to an image, in which the probability value of each pixel represents the probability that an obstacle exists there. The probability of the corresponding pixel is updated each time radar data are inserted: the more often a pixel is hit by radar data, the higher its probability value, whereas a pixel that is not hit keeps a lower probability value.
Step S62: and determining the candidate pose with the highest matching degree as the current pose of the mobile robot.
Specifically, the matching degree of each candidate pose in the candidate pose set is determined, and the candidate pose with the highest matching degree is determined as the current pose of the mobile robot.
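Steps S61 and S62 can be sketched as follows, scoring each candidate pose by the sum of grid-map probabilities of the projected radar points; the map resolution and the assumption that the map origin coincides with the world origin are illustrative simplifications, not details from the patent.

```python
import math
import numpy as np

def match_score(prob_map, radar_points, pose, resolution=0.05):
    # Project the radar points with a candidate pose and sum the grid-map
    # probability values of the cells they land in (step S61).
    x0, y0, psi = pose
    score = 0.0
    for x, y in radar_points:                       # points in the radar coordinate system
        xm = x * math.cos(psi) - y * math.sin(psi) + x0
        ym = x * math.sin(psi) + y * math.cos(psi) + y0
        c, r = int(round(xm / resolution)), int(round(ym / resolution))
        if 0 <= r < prob_map.shape[0] and 0 <= c < prob_map.shape[1]:
            score += prob_map[r, c]
    return score

def best_pose(prob_map, radar_points, candidates):
    # Step S62: the candidate with the highest matching degree is the current pose.
    return max(candidates, key=lambda p: match_score(prob_map, radar_points, p))
```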
Step S70: if not, generating a second candidate pose set according to the initial pose, and positioning the mobile robot according to the second candidate pose set.
In step S70, the generating a second set of candidate poses from the initial poses includes:
and generating a second candidate pose set according to a preset second search range, wherein the second search range comprises a second angle search range and a second position search range, and the second angle search range is larger than the first angle search range.
It can be understood that when no first straight-line data segment is identified in the radar data, the initial pose cannot be angle-corrected, so a small search range cannot be relied on and the search range needs to be expanded. Specifically, a preset second search range is used, wherein the second search range includes a second angle search range and a second position search range, the second angle search range is larger than the first angle search range, and the second position search range is equal to the first position search range.
And determining the current pose of the mobile robot through the second candidate pose set, wherein the processing mode of the current pose is similar to that of the first candidate pose set, and the details are not repeated here.
On the one hand, the angle of the mobile robot in the scene can be obtained by identifying the first straight-line data segment in the mapping point cloud, which reduces the search range and the amount of calculation. On the other hand, the angle difference between the first straight-line data segment and the second straight-line data segment is calculated through straight-line fitting and used to correct the initial pose, which improves the positioning accuracy. The first straight-line data segment represents the obstacle contour and position information measured by the radar sensor, which contains measurement error; the second straight-line data segment represents the real contour and position information of the obstacle corresponding to the first straight-line data segment. Calculating the angle difference between the two therefore gives a better orientation with which to correct the initial pose.
In an embodiment of the present invention, a positioning method is provided, which is applied to a mobile robot that includes a radar sensor. The method comprises: acquiring an initial pose of the mobile robot in a preset map and radar data acquired by the radar sensor, and mapping the radar data onto the preset map according to the initial pose to generate a mapping point cloud; judging whether the mapping point cloud includes a first straight-line data segment; if so, searching a second straight-line data segment on the preset map according to the first straight-line data segment, wherein the point set of the second straight-line data segment corresponds, on the preset map, to the point set of the first straight-line data segment, determining an angle difference according to the two segments, correcting the initial pose of the mobile robot based on the angle difference, generating a first candidate pose set according to the corrected initial pose, and positioning the mobile robot according to the first candidate pose set. On the one hand, identifying the first straight-line data segment in the mapping point cloud yields the angle of the mobile robot in the scene, which reduces the search range and the amount of calculation; on the other hand, the angle difference between the first and second straight-line data segments, obtained by straight-line fitting, is used to correct the initial pose, which improves the positioning accuracy.
Referring to fig. 7, fig. 7 is a schematic structural diagram of a positioning device according to an embodiment of the present invention;
as shown in fig. 7, the positioning apparatus 70 is applied to a mobile robot including a radar sensor, and includes:
the mapping unit 71 is configured to acquire an initial pose of the mobile robot in a preset map and radar data acquired by a radar sensor, map the radar data onto the preset map according to the initial pose, and generate a mapping point cloud;
a judging unit 72 for judging whether the mapping point cloud includes a first linear data segment,
a pose correction unit 73, configured, when the determination result of the judging unit is yes, to search a second straight-line data segment on the preset map according to the first straight-line data segment, wherein the point set of the second straight-line data segment corresponds, on the preset map, to the point set of the first straight-line data segment, determine an angle difference according to the first and second straight-line data segments, correct the initial pose of the mobile robot based on the angle difference, generate a first candidate pose set according to the corrected initial pose, and position the mobile robot according to the first candidate pose set;
and, when the determination result is no, to generate a second candidate pose set according to the initial pose and position the mobile robot according to the second candidate pose set.
In an embodiment of the present invention, the pose correction unit is specifically configured to:
the first straight line data segment comprises a mapping point set, straight line fitting is carried out on the mapping point set, a first straight line equation is determined, and a first straight line angle of the first straight line data segment in a preset map is determined according to the first straight line equation;
searching an obstacle point set corresponding to the mapping point set of the first linear data segment on a preset map, performing linear fitting on the obstacle point set, determining a second linear equation, and determining a second linear angle of the obstacle point set in the preset map according to the second linear equation;
an angular difference between the first and second rectilinear angles is calculated.
In an embodiment of the present invention, the pose correction unit is further specifically configured to:
if a plurality of first straight line data sections in the radar data are identified, determining a straight line equation corresponding to each first straight line data section, and determining a plurality of first straight line angles;
and calculating a plurality of angle differences between the plurality of first straight line angles and the plurality of second straight line angles, averaging the plurality of angle differences, and taking the average as the angle difference between the first straight line angle and the second straight line angle.
In an embodiment of the present invention, the pose correction unit is further specifically configured to:
determining the initial pose of the mobile robot in a preset map as (x, y, psi);
the determining an angle difference according to the first straight line data segment and the second straight line data segment, and correcting the initial pose of the mobile robot based on the angle difference comprises:
the angle difference is delta psi, and the corrected initial pose is (x, y, psi + delta psi).
In an embodiment of the present invention, a positioning apparatus applied to a mobile robot that includes a radar sensor is provided. The apparatus includes: a mapping unit for acquiring an initial pose of the mobile robot in a preset map and radar data acquired by the radar sensor, mapping the radar data onto the preset map according to the initial pose and generating a mapping point cloud; a judging unit for judging whether the mapping point cloud includes a first straight-line data segment; and a pose correction unit which, if the judgment result is yes, searches a second straight-line data segment on the preset map according to the first straight-line data segment, determines an angle difference according to the first and second straight-line data segments, corrects the initial pose of the mobile robot based on the angle difference, generates a first candidate pose set according to the corrected initial pose, and positions the mobile robot according to the first candidate pose set; and which, if not, generates a second candidate pose set according to the initial pose and positions the mobile robot according to the second candidate pose set.
An embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to implement the positioning method described above.
The computer-readable storage medium may be implemented by any type or combination of volatile or non-volatile memory devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
It should be noted that the description and the accompanying drawings illustrate preferred embodiments of the present invention, but the invention may be embodied in many different forms and is not limited to the embodiments described in this specification; these embodiments are not intended as additional limitations on the invention but are provided so that the disclosure will be understood more thoroughly and completely. Furthermore, the above technical features may be combined with one another to form various embodiments not listed above, all of which are regarded as falling within the scope of the invention described in the specification. Further, modifications and variations will occur to those skilled in the art in light of the foregoing description, and all such modifications and variations are intended to fall within the scope of the invention as defined by the appended claims.

Claims (8)

1. A positioning method applied to a mobile robot, wherein the mobile robot comprises a radar sensor, the method comprising:
acquiring an initial pose of the mobile robot in a preset map and radar data acquired by a radar sensor, and mapping the radar data onto the preset map according to the initial pose to generate a mapping point cloud;
judging whether the mapping point cloud comprises a first straight line data segment or not;
if it is determined that the mapping point cloud includes the first straight-line data segment, searching a second straight-line data segment on the preset map according to the first straight-line data segment, wherein the point set of the second straight-line data segment corresponds, on the preset map, to the point set of the first straight-line data segment, determining an angle difference according to the first straight-line data segment and the second straight-line data segment, correcting the initial pose of the mobile robot based on the angle difference, generating a first candidate pose set according to the corrected initial pose, and positioning the mobile robot according to the first candidate pose set;
if not, generating a second candidate pose set according to the initial pose, and positioning the mobile robot according to the second candidate pose set;
wherein the generating a first candidate pose set according to the corrected initial pose comprises:
generating a first candidate pose set according to a preset first search range, wherein the first search range comprises a first angle search range and a first position search range, and the first angle search range and the first position search range are generated according to the corrected initial pose;
and the generating a second candidate pose set according to the initial pose comprises:
and generating a second candidate pose set according to a preset second search range, wherein the second search range comprises a second angle search range and a second position search range, and the second angle search range is larger than the first angle search range.
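As a non-limiting illustration of the two candidate pose sets in claim 1, the sketch below enumerates poses on a grid around a center pose; the window sizes, step sizes, and names are assumptions rather than values from the claims, the only property carried over being that the second angle search range is wider than the first.

import numpy as np

def candidate_poses(center, angle_range, pos_range, angle_step=0.01, pos_step=0.02):
    """Enumerate candidate poses on a grid around a center pose.

    center      : (x, y, psi) center of the search window
    angle_range : half-width of the heading search window, in radians
    pos_range   : half-width of the x/y search window, in meters
    """
    x0, y0, psi0 = center
    xs = np.arange(x0 - pos_range, x0 + pos_range + 1e-9, pos_step)
    ys = np.arange(y0 - pos_range, y0 + pos_range + 1e-9, pos_step)
    psis = np.arange(psi0 - angle_range, psi0 + angle_range + 1e-9, angle_step)
    return [(x, y, psi) for x in xs for y in ys for psi in psis]

corrected_pose = (1.20, 0.80, 0.53)   # example pose after the line-based heading correction
initial_pose = (1.20, 0.80, 0.50)     # example pose without any correction

# First candidate pose set: the heading has already been corrected, so a narrow
# angle search range suffices. Second candidate pose set: no correction was
# possible, so the angle search range is wider. The numbers are illustrative.
first_candidate_set = candidate_poses(corrected_pose, angle_range=0.05, pos_range=0.10)
second_candidate_set = candidate_poses(initial_pose, angle_range=0.30, pos_range=0.10)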
2. The method of claim 1, wherein the determining an angle difference according to the first straight line data segment and the second straight line data segment comprises:
wherein the first straight line data segment comprises a mapping point set; performing straight line fitting on the mapping point set to determine a first straight line equation, and determining a first straight line angle of the first straight line data segment in the preset map according to the first straight line equation;
searching, on the preset map, for an obstacle point set corresponding to the mapping point set of the first straight line data segment, performing straight line fitting on the obstacle point set to determine a second straight line equation, and determining a second straight line angle of the obstacle point set in the preset map according to the second straight line equation;
and calculating the angle difference between the first straight line angle and the second straight line angle.
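Claim 2 does not prescribe a particular fitting method; the following sketch uses a PCA-based total least squares fit, one common choice, to obtain each straight line angle and then wraps their difference to the smaller equivalent rotation. All names and numeric values are illustrative.

import math
import numpy as np

def fit_line_angle(points):
    """Fit a straight line to a set of 2D points and return its angle in the map.

    Uses the principal direction of the point set (total least squares), so
    vertical lines are handled, and normalizes the angle into [0, pi).
    """
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    dx, dy = vt[0]                      # direction of the fitted line
    return math.atan2(dy, dx) % math.pi

def angle_difference(mapped_points, obstacle_points):
    """Angle difference between the mapped line segment and the map's obstacle line."""
    a1 = fit_line_angle(mapped_points)      # first straight line angle
    a2 = fit_line_angle(obstacle_points)    # second straight line angle
    diff = a2 - a1
    # Wrap into (-pi/2, pi/2] so the correction uses the smaller rotation.
    if diff > math.pi / 2:
        diff -= math.pi
    elif diff <= -math.pi / 2:
        diff += math.pi
    return diff

# Example: the mapped wall leans slightly relative to the wall in the map.
mapped = [(0.0, 0.0), (1.0, 0.05), (2.0, 0.10)]
in_map = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
print(angle_difference(mapped, in_map))   # small negative angle, roughly -0.05 rad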
3. The method of claim 1, wherein the determining an angle difference according to the first straight line data segment and the second straight line data segment comprises:
if a plurality of first straight line data segments are identified in the radar data, determining a straight line equation corresponding to each first straight line data segment, and thereby determining a plurality of first straight line angles;
calculating a plurality of angle differences between the plurality of first straight line angles and the corresponding second straight line angles, averaging the plurality of angle differences, and taking the average value as the angle difference between the first straight line angle and the second straight line angle.
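A minimal sketch of the averaging in claim 3, assuming the per-segment angle differences have already been computed (for example with a helper like the one sketched under claim 2):

def combined_angle_difference(angle_diffs):
    """Average the per-segment angle differences into a single heading correction."""
    return sum(angle_diffs) / len(angle_diffs)

# e.g. three detected wall segments giving slightly different per-segment estimates:
print(combined_angle_difference([0.031, 0.027, 0.035]))   # -> approximately 0.031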
4. The method according to claim 1, wherein the correcting the initial pose of the mobile robot based on the angle difference comprises:
wherein the initial pose of the mobile robot in the preset map is (x, y, psi);
and the determining an angle difference according to the first straight line data segment and the second straight line data segment, and correcting the initial pose of the mobile robot based on the angle difference, comprises:
determining the angle difference as delta psi, and taking the corrected initial pose as (x, y, psi + delta psi).
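The correction of claim 4 amounts to adding the angle difference to the heading component of the pose, for example (names are illustrative):

def correct_pose(pose, delta_psi):
    """Apply the heading correction: (x, y, psi) -> (x, y, psi + delta_psi)."""
    x, y, psi = pose
    return (x, y, psi + delta_psi)

print(correct_pose((1.20, 0.80, 0.50), 0.031))   # -> (1.2, 0.8, 0.531)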
5. The method of claim 1, wherein the searching for a second straight line data segment on the preset map according to the first straight line data segment comprises:
determining a center mapping point of the center of the radar sensor on the preset map according to the initial pose;
wherein the point set of the first straight line data segment comprises a plurality of mapping points, determining a plurality of scanning rays extending from the center mapping point to the plurality of mapping points;
and respectively searching, in the directions of the plurality of scanning rays, for a plurality of obstacle points corresponding to the plurality of mapping points, wherein the distance between each obstacle point and the adjacent mapping point meets a preset distance range, and the plurality of obstacle points form the second straight line data segment.
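One possible reading of the ray search in claim 5, sketched against an occupancy-grid map; the grid representation, the marching step, and the 0.3 m distance threshold are illustrative assumptions only.

import math

def find_obstacle_points(grid, resolution, center, mapped_points,
                         max_range=6.0, max_offset=0.3):
    """For each mapped point, march along the ray from the sensor's center mapping
    point and keep the first occupied map cell that lies near that mapped point.

    grid          : 2D occupancy values (1 = obstacle), indexed as grid[iy][ix]
    resolution    : meters per grid cell
    center        : (cx, cy), center mapping point of the radar sensor in map coords
    mapped_points : (mx, my) points of the first straight line data segment
    max_offset    : accept an obstacle cell only if it lies within this distance of
                    the corresponding mapped point (the "preset distance range")
    """
    cx, cy = center
    obstacle_points = []
    for mx, my in mapped_points:
        theta = math.atan2(my - cy, mx - cx)   # scanning ray direction
        step = resolution / 2.0
        r = 0.0
        while r < max_range:
            px, py = cx + r * math.cos(theta), cy + r * math.sin(theta)
            ix, iy = int(px / resolution), int(py / resolution)
            if 0 <= iy < len(grid) and 0 <= ix < len(grid[0]) and grid[iy][ix] == 1:
                if math.hypot(px - mx, py - my) <= max_offset:
                    obstacle_points.append((px, py))
                break
            r += step
    # Together these obstacle points form the second straight line data segment.
    return obstacle_points

# Example: a 10x10 grid with 0.05 m cells and a wall along the row at y ~ 0.40 m.
grid = [[1 if iy == 8 else 0 for _ in range(10)] for iy in range(10)]
print(find_obstacle_points(grid, 0.05, (0.25, 0.10), [(0.25, 0.42)]))  # one point near (0.25, 0.4)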
6. A positioning device for a mobile robot, the mobile robot including a radar sensor, the device comprising:
a mapping unit, configured to acquire an initial pose of the mobile robot in a preset map and radar data collected by the radar sensor, map the radar data onto the preset map according to the initial pose, and generate a mapping point cloud;
a judging unit, configured to judge whether the mapping point cloud comprises a first straight line data segment;
and a pose correction unit, configured to: if the judgment result is yes, search for a second straight line data segment on the preset map according to the first straight line data segment, determine an angle difference according to the first straight line data segment and the second straight line data segment, correct the initial pose of the mobile robot based on the angle difference, generate a first candidate pose set according to the corrected initial pose, and position the mobile robot according to the first candidate pose set;
if not, generate a second candidate pose set according to the initial pose, and position the mobile robot according to the second candidate pose set;
wherein the pose correction unit is specifically configured to:
generating a first candidate pose set according to a preset first search range, wherein the first search range comprises a first angle search range and a first position search range, and the first angle search range and the first position search range are generated according to the corrected initial pose;
and generating a second candidate pose set according to a preset second search range, wherein the second search range comprises a second angle search range and a second position search range, and the second angle search range is larger than the first angle search range.
7. A mobile robot, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the positioning method of any one of claims 1 to 5.
8. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the positioning method according to any one of claims 1 to 5.
CN202010622515.0A 2020-06-30 2020-06-30 Positioning method and device, mobile robot and storage medium Active CN111590595B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010622515.0A CN111590595B (en) 2020-06-30 2020-06-30 Positioning method and device, mobile robot and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010622515.0A CN111590595B (en) 2020-06-30 2020-06-30 Positioning method and device, mobile robot and storage medium

Publications (2)

Publication Number Publication Date
CN111590595A CN111590595A (en) 2020-08-28
CN111590595B (en) 2021-09-28

Family

ID=72189771

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010622515.0A Active CN111590595B (en) 2020-06-30 2020-06-30 Positioning method and device, mobile robot and storage medium

Country Status (1)

Country Link
CN (1) CN111590595B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114102577B (en) * 2020-08-31 2023-05-30 北京极智嘉科技股份有限公司 Robot and positioning method applied to robot
CN111966109B (en) * 2020-09-07 2021-08-17 中国南方电网有限责任公司超高压输电公司天生桥局 Inspection robot positioning method and device based on flexible direct current converter station valve hall
CN112577500A (en) * 2020-11-27 2021-03-30 北京迈格威科技有限公司 Positioning and map construction method and device, robot and computer storage medium
CN112433211B (en) * 2020-11-27 2022-11-29 浙江商汤科技开发有限公司 Pose determination method and device, electronic equipment and storage medium
CN112859836A (en) * 2020-12-30 2021-05-28 广东美的白色家电技术创新中心有限公司 Autonomous mobile device, correction method and computer storage medium
CN113295175A (en) * 2021-04-30 2021-08-24 广州小鹏自动驾驶科技有限公司 Map data correction method and device
WO2022246812A1 (en) * 2021-05-28 2022-12-01 上海高仙自动化科技发展有限公司 Positioning method and apparatus, electronic device, and storage medium
CN113256722B (en) * 2021-06-21 2021-10-15 浙江华睿科技股份有限公司 Pose determination method, pose determination device and storage medium
CN113984071B (en) * 2021-09-29 2023-10-13 云鲸智能(深圳)有限公司 Map matching method, apparatus, robot, and computer-readable storage medium
CN113607166B (en) * 2021-10-08 2022-01-07 广东省科学院智能制造研究所 Indoor and outdoor positioning method and device for autonomous mobile robot based on multi-sensor fusion
CN114800504A (en) * 2022-04-26 2022-07-29 平安普惠企业管理有限公司 Robot posture analysis method, device, equipment and storage medium
CN114794986A (en) * 2022-05-05 2022-07-29 深圳市无限动力发展有限公司 Wall-following breaking and continuous sweeping method and device for sweeper, computer equipment and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5341459A (en) * 1991-05-09 1994-08-23 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Generalized compliant motion primitive
CN104010274A (en) * 2014-06-12 2014-08-27 国家电网公司 Indoor wireless positioning method based on path matching
CN106607907A (en) * 2016-12-23 2017-05-03 西安交通大学 Mobile vision robot and measurement and control method thereof
CN108983248A (en) * 2018-06-26 2018-12-11 长安大学 It is a kind of that vehicle localization method is joined based on the net of 3D laser radar and V2X
CN110926485A (en) * 2019-11-11 2020-03-27 华中科技大学 Mobile robot positioning method and system based on linear features
CN111308481A (en) * 2020-02-21 2020-06-19 深圳市银星智能科技股份有限公司 Laser positioning method and device and mobile robot

Also Published As

Publication number Publication date
CN111590595A (en) 2020-08-28

Similar Documents

Publication Publication Date Title
CN111590595B (en) Positioning method and device, mobile robot and storage medium
US10823576B2 (en) Systems and methods for robotic mapping
CN110091326B (en) Mobile robot and control method for mobile robot
Georgiev et al. Localization methods for a mobile robot in urban environments
EP3764186A1 (en) Method for controlling autonomous mobile robot to travel along edge
CN111881239B (en) Construction method, construction device, intelligent robot and readable storage medium
WO2019033712A1 (en) Map creation method for mobile robot and path planning method based on map
WO2023005377A1 (en) Map building method for robot, and robot
JP6649743B2 (en) Matching evaluation device and matching evaluation method
CN110895408B (en) Autonomous positioning method and device and mobile robot
CN113475977A (en) Robot path planning method and device and robot
CN112578392A (en) Environment boundary construction method based on remote sensor and mobile robot
CN113566808A (en) Navigation path planning method, device, equipment and readable storage medium
JP2016191735A (en) Map creation device, autonomous traveling body, autonomous traveling body system, portable terminal, map creation method, map creation program and computer readable recording medium
CN112013840A (en) Sweeping robot and map construction method and device thereof
CN116465393A (en) Synchronous positioning and mapping method and device based on area array laser sensor
CN114777761A (en) Cleaning machine and map construction method
CN114995459A (en) Robot control method, device, equipment and storage medium
CN112284387A (en) Positioning method for cleaning device, and storage medium
CN109959935B (en) Map establishing method, map establishing device and robot
Fang et al. A real-time and low-cost 3D SLAM system based on a continuously rotating 2D laser scanner
CN116185046B (en) Mobile robot positioning method, mobile robot and medium
CN114115263B (en) Autonomous mapping method and device for AGV, mobile robot and medium
Lee et al. The design of practical mapping system for mobile robots using laser range sensor
US20220214443A1 (en) Method for simultaneous localization and mapping and mobile platform using the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: 518000 1701, building 2, Yinxing Zhijie, No. 1301-72, sightseeing Road, Xinlan community, Guanlan street, Longhua District, Shenzhen, Guangdong Province

Patentee after: Shenzhen Yinxing Intelligent Group Co.,Ltd.

Address before: 518000 building A1, Yinxing hi tech Industrial Park, Guanlan street, Longhua District, Shenzhen City, Guangdong Province

Patentee before: Shenzhen Silver Star Intelligent Technology Co.,Ltd.
