CN112097772A - Robot and map construction method and device thereof - Google Patents

Robot and map construction method and device thereof

Info

Publication number
CN112097772A
CN112097772A (application number CN202010843669.2A)
Authority
CN
China
Prior art keywords
robot
particle
positions
particles
estimated
Prior art date
Legal status
Granted
Application number
CN202010843669.2A
Other languages
Chinese (zh)
Other versions
CN112097772B (en)
Inventor
何婉君
刘志超
熊友军
Current Assignee
Ubtech Robotics Corp
Original Assignee
Ubtech Robotics Corp
Priority date
Filing date
Publication date
Application filed by Ubtech Robotics Corp filed Critical Ubtech Robotics Corp
Priority to CN202010843669.2A priority Critical patent/CN112097772B/en
Publication of CN112097772A publication Critical patent/CN112097772A/en
Priority to PCT/CN2020/140420 priority patent/WO2022036981A1/en
Application granted granted Critical
Publication of CN112097772B publication Critical patent/CN112097772B/en
Priority to US18/171,630 priority patent/US20230205212A1/en
Legal status: Active

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20: Instruments for performing navigational calculations
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0231: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • G05D1/0268: Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274: Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Optics & Photonics (AREA)
  • Electromagnetism (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application provides a robot map construction method, which includes: obtaining particle positions of a robot by a particle filtering method; detecting whether a particle position of the robot matches the position of a historical track point of the robot; if the particle position of the robot matches the position of the historical track point, optimizing the trajectory of the robot by a graph optimization method; and constructing a map according to the optimized trajectory. Because the map is constructed from the doubly optimized trajectory of the robot, particle filtering and graph optimization are effectively combined to reduce particle positioning errors, which is conducive to obtaining a more accurate map.

Description

Robot and map construction method and device thereof
Technical Field
The application belongs to the field of robots, and particularly relates to a robot and a map construction method and device thereof.
Background
With the continuous development of robot technology, more and more robots are being applied in people's daily life and work, bringing convenience to both. While automatically executing a task, a robot generally needs to acquire scene map information, and it can then complete the task according to the acquired scene map information.
When the scene where the robot is located is mapped, the robot is usually positioned using either a particle filtering method or a graph optimization method. However, in a large scene the particle swarm may fail to match and converge to the correct position; alternatively, when the graph optimization method updates along a single path, measurement errors cause a large deviation in the robot's trajectory. Both cases are detrimental to reducing the error of the map constructed by the robot.
Disclosure of Invention
In view of this, the embodiments of the present application provide a robot and a map construction method and apparatus thereof, so as to solve the problem in the prior art that, when a robot constructs a map, measurement errors cause a large deviation in the robot's trajectory, which is not conducive to reducing the error of the constructed map.
A first aspect of an embodiment of the present application provides a mapping method for a robot, where the method includes:
acquiring the particle position of the robot by a particle filtering method;
detecting whether the particle position of the robot matches the position of a historical track point of the robot;
if the particle position of the robot is matched with the position of the historical track point of the robot, optimizing the track of the robot by a graph optimization method;
and constructing a map according to the optimized track.
With reference to the first aspect, in a first possible implementation manner of the first aspect, the acquiring a particle position of a robot through a particle filtering method includes:
estimating the corresponding particle position of the robot in the map;
determining the matching degree of the position of the robot and the estimated position of the particles according to the scanned obstacle information;
acquiring the weight of the estimated particle position according to the matching degree of the robot and the particle;
and resampling the particles according to the weight of the particles to obtain the screened particles, and returning to the step of estimating the positions of the particles corresponding to the robot in the map until the screened particles meet the preset requirement.
With reference to the first possible implementation manner of the first aspect, in a second possible implementation manner of the first aspect, determining a matching degree of the robot position and the estimated multiple particle positions according to the scanned obstacle information includes:
acquiring a scanning position of an obstacle scanned by the robot;
obtaining the estimated position of the corresponding obstacle according to the estimated positions of the plurality of particles;
and determining the matching degree of the robot and the particle position according to the distance between the scanning position and the estimated position.
With reference to the second possible implementation manner of the first aspect, in a third possible implementation manner of the first aspect, the determining, according to a distance between the scanning position and the estimated position, a matching degree between the robot and the particle position includes:
respectively determining the estimated positions of the obstacles corresponding to the particle positions and the shortest distance between the estimated positions and the scanning positions of the obstacles scanned by the robot;
and calculating the matching degree of the estimated particle positions and the robot according to the shortest distance between the estimated positions of the obstacles in the particle positions and the scanning position.
With reference to the first possible implementation manner of the first aspect, in a fourth possible implementation manner of the first aspect, the obtaining a weight of the estimated particle position according to a matching degree between the robot and the particle includes:
obtaining the weight value of the particle according to the sum of the matching degrees of the robot and the particle at different moments;
and normalizing the weight values of the plurality of particles to obtain the weight of the particle position corresponding to the normalized particle.
With reference to the first aspect, in a fifth possible implementation manner of the first aspect, detecting whether a particle position of the robot matches a position of a historical track point of the robot includes:
acquiring current poses of a plurality of particles at current time;
obtaining historical track points respectively determined at a plurality of moments before the current time;
and matching the current pose with the historical track points.
With reference to the fifth possible implementation manner of the first aspect, in a sixth possible implementation manner of the first aspect, optimizing the trajectory of the robot by a graph optimization method includes:
constructing a graph network according to the historical track points, and determining node poses and edge poses in the graph network;
and determining an error optimization function according to the node pose and the edge pose, and optimizing the particle pose in the historical track point according to the error optimization function.
A second aspect of an embodiment of the present application provides a mapping apparatus for a robot, the apparatus including:
a particle position acquisition unit for acquiring a particle position of the robot by a particle filtering method;
the position matching unit is used for detecting whether the particle position of the robot matches the position of a historical track point of the robot;
the track optimization unit is used for optimizing the track of the robot by a graph optimization method if the positions of the particles of the robot are matched with the positions of the historical track points of the robot;
and the map building unit is used for building a map according to the optimized track.
A third aspect of embodiments of the present application provides a robot comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the method according to any one of the first aspect when executing the computer program.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium, in which a computer program is stored, which, when executed by a processor, performs the steps of the method according to any one of the first aspect.
Compared with the prior art, the embodiment of the application has the advantages that: in the gradual movement process of the robot, the particle positions of the robot can be optimized through particle filtering, when the position matching of the robot and the historical track points is detected, the track of the robot is further optimized through graph optimization, and a map is constructed according to the track of the robot after double optimization, so that the particle positioning errors can be reduced by effectively combining the particle filtering and graph optimization methods, and the method is favorable for obtaining a more accurate map.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.
Fig. 1 is a schematic flow chart of an implementation of a mapping method for a robot according to an embodiment of the present disclosure;
fig. 2 is a schematic flow chart of an implementation of obtaining a particle position of a robot through a particle filtering method according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a graphical network provided by an embodiment of the present application;
fig. 4 is a schematic diagram of a mapping apparatus of a robot according to an embodiment of the present disclosure;
fig. 5 is a schematic view of a robot provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
Fig. 1 is a schematic flow chart of an implementation of a mapping method for a robot according to an embodiment of the present application, which is detailed as follows:
and S101, acquiring the particle position of the robot by a particle filter method.
Particle filtering uses a set of particles to represent a probability distribution, the distribution being expressed by random state particles drawn from the posterior probability. After the robot receives laser data, pose matching is carried out on each scanned frame of laser data, motion estimation is performed on the pose of the robot, the weights of the particles are updated after motion estimation and scan matching, and the particles with updated weights are resampled to guarantee the diversity of the particle set. The process of acquiring the particle positions of the robot by the particle filtering method may be as shown in fig. 2, and includes:
S201, estimating the particle positions corresponding to the robot in the map.
The robot can estimate its pose through an odometry motion model according to the mileage data output by the wheel encoders, obtaining a plurality of particle positions corresponding to the robot. A particle position is used to represent a possible position of the robot; in the particle filtering process, the particle positions are also the particles in the map.
In a possible implementation manner, the corresponding particle position of the robot in the map can also be estimated through other sensing devices of the robot. For example, the moving distance and moving direction of the robot may be detected by a speed sensor and an orientation sensor, and the position of the robot at the current time may be estimated by the moving speed and moving direction, so as to determine the particle position corresponding to the robot position.
The estimated particle positions of the robot at the current time may be obtained by motion estimation based on the particle positions corresponding to the robot at the previous time (the time immediately before the current time). The robot may correspond to a plurality of particle positions at the previous time, and the plurality of particle positions at the current time may be estimated from those at the previous time. The particle positions at the previous time may be a particle group optimized by the particle filtering method, that is, they may include one or more particle positions. Alternatively, the particle position at the previous time may be a position determined by the robot based on the acquired obstacle information.
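By way of a non-limiting illustration, the following Python sketch shows one way such a motion update could be realized, assuming an odometry motion model with Gaussian noise; the function name and the noise levels are illustrative assumptions, not taken from this application.

import numpy as np

def sample_motion_update(particles, d_trans, d_rot, trans_std=0.05, rot_std=0.02):
    """Propagate each particle pose (x, y, theta) using odometry deltas.

    particles: (M, 3) array of particle poses at the previous time.
    d_trans, d_rot: translation and rotation reported by the wheel encoders.
    trans_std, rot_std: illustrative noise levels (assumed values).
    """
    rng = np.random.default_rng()
    m = particles.shape[0]
    # Perturb the measured motion independently per particle, so the particle
    # cloud spreads to cover the odometry uncertainty.
    trans = d_trans + rng.normal(0.0, trans_std, m)
    rot = d_rot + rng.normal(0.0, rot_std, m)
    out = particles.copy()
    theta = particles[:, 2] + rot
    out[:, 0] += trans * np.cos(theta)
    out[:, 1] += trans * np.sin(theta)
    out[:, 2] = theta
    return out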
S202, determining the matching degree between the position of the robot and the estimated particle positions according to the scanned obstacle information.
At the current moment, after the estimated particle positions of the robot are obtained through motion estimation or other sensing devices, the obstacle information collected at the current moment can be acquired.
From the estimated particle positions of the robot at the current time in S201, obstacle information corresponding to the estimated particle positions can be obtained by combining the obstacle information scanned by the robot before the current time. The obstacle information may include a distance between the particle position and the obstacle, an orientation relationship between the particle position and the obstacle, and the like.
The estimated obstacle information corresponding to each particle position at the current time may be compared with the currently acquired obstacle information, including a comparison of distances to the respective obstacles and a comparison of a directional relationship with the respective obstacles. The matching degree of the particle position and the robot position can be determined according to the matching degree of the distance and the orientation information.
In one implementation, the scanned obstacles and the obstacles corresponding to a particle position may be subjected to matching calculation: for each scanned obstacle, the closest obstacle corresponding to the particle position is searched for, and the distance between the two is obtained. For example, when N (N ≥ 1) obstacles are scanned, the closest counterpart of each of the N obstacles is searched for among the obstacles corresponding to the particle position, and the distance between each scanned obstacle and its closest counterpart is calculated. Assuming that the pose of a scanned obstacle in the map (which may be a grid map) is Pmap_i and the pose of the corresponding obstacle at the estimated particle position is Phit_i, the distance between each scanned obstacle and its closest counterpart is denoted dis(Pmap_i, Phit_i). In one possible implementation, the matching degree of the particle position with the obstacle information scanned in the laser frame can be expressed as:
score = Σ_{i=1}^{N} exp(-dis(Pmap_i, Phit_i)² / σ²)

where σ is a smoothing parameter controlling how quickly the contribution of a scanned obstacle decays with its distance to the closest mapped obstacle.
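A minimal sketch of this per-particle scoring, assuming the Gaussian-kernel form shown above; the kernel width sigma and the KD-tree nearest-obstacle lookup are illustrative choices, not specified in this application.

import numpy as np
from scipy.spatial import cKDTree

def match_score(scan_points_map, occupied_cells, sigma=0.1):
    """Matching degree of one particle: for every scanned obstacle point
    Pmap_i (already projected into the map frame with the particle pose),
    find the closest mapped obstacle Phit_i and accumulate a Gaussian
    kernel of the shortest distance dis(Pmap_i, Phit_i)."""
    tree = cKDTree(occupied_cells)       # nearest-obstacle lookup over the map
    d, _ = tree.query(scan_points_map)   # shortest distances dis(Pmap_i, Phit_i)
    return float(np.sum(np.exp(-(d ** 2) / (sigma ** 2))))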
and S203, acquiring the weight of the estimated particle position according to the matching degree of the robot and the particle.
After the matching degree between the robot and the particle is calculated, the matching degree can be used as the weight of the robot at the particle position, that is, the weight of the particle position.
In one implementation, the weight w of a particle position may be represented as the sum of the weights accumulated over all times. That is, denoting the weight of the particle position at each time instant as w_1, w_2, ..., w_i, the weight of the particle position at the i-th time may be expressed as: w = w_1 + w_2 + ... + w_i.
In one implementation, to facilitate using the particle weights at each time, after the weight w of a particle position is obtained, the weights may be normalized. The normalized weight represents the weight of the current particle position divided by the sum of the weights of all particle positions, formulated as:

w_i = Σ_t score_t

w̃_i = w_i / Σ_{j=1}^{M} w_j

where score_t is the particle matching degree at time t, w_i is the weight of the i-th particle, w̃_i is its normalized weight, and M is the number of particles.
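The accumulation and normalization above could be sketched as follows; the function and variable names are illustrative.

import numpy as np

def update_weights(weights, scores):
    """Add this frame's matching degree score_t to each particle weight
    (w = w_1 + w_2 + ... + w_i), then normalize so the normalized weights
    of all M particles sum to 1."""
    weights = weights + np.asarray(scores, dtype=float)
    total = weights.sum()
    if total <= 0.0:
        # Degenerate case: fall back to a uniform distribution.
        normalized = np.full(len(weights), 1.0 / len(weights))
    else:
        normalized = weights / total
    return weights, normalized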
S204, resampling the particles according to their weights to obtain the screened particles, and returning to the step of estimating the particle positions corresponding to the robot in the map until the screened particles meet the preset requirement.
After the weights corresponding to the particle positions at the current time are obtained, particle resampling can be performed according to the weights. In the resampling process, the probability that the particle position with the larger weight is sampled and selected is higher, and the probability that the particle position with the smaller weight is sampled and selected is lower.
The similarity of the particle weights, i.e. whether the particle distribution is uniform, can be determined by a preset index. For example, the similarity of the particles can be evaluated using the monitoring index Neff:
N_eff = 1 / Σ_{i=1}^{M} (w̃_i)²

where w̃_i is the normalized weight of the i-th particle and M is the number of particles.
When the monitoring index is lower than the preset threshold, the particles are distributed unevenly: some particles are close to the true value and some are far from it, so resampling is needed. Resampling deletes the particle positions with smaller weights and samples multiple times around the particle positions with larger weights to form a new particle set. This is repeated until the monitoring index is greater than or equal to the preset threshold, at which point the particles are distributed evenly around the true value.
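A sketch of the monitoring index and a systematic (low-variance) resampler consistent with the description above; the resampling threshold M/2 in the usage comment is an assumed value, not from this application.

import numpy as np

def neff(normalized_weights):
    """Monitoring index Neff = 1 / sum(w_tilde_i ** 2)."""
    return 1.0 / np.sum(normalized_weights ** 2)

def low_variance_resample(particles, normalized_weights, rng=None):
    """Systematic resampling: large-weight positions are drawn (possibly
    several times) while small-weight positions are dropped."""
    rng = rng or np.random.default_rng()
    m = len(particles)
    positions = (rng.random() + np.arange(m)) / m
    idx = np.searchsorted(np.cumsum(normalized_weights), positions)
    idx = np.minimum(idx, m - 1)          # guard against rounding at 1.0
    return particles[idx].copy()

# Typical use: resample only when the particle set has degenerated, e.g.
# if neff(w_tilde) < m / 2:   # threshold M/2 is an assumed example
#     particles = low_variance_resample(particles, w_tilde)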
S102, detecting whether the particle position of the robot matches the position of a historical track point of the robot.
In a possible implementation manner, when detecting whether the particle positions of the robot match the positions of the historical track points, the particle with the largest weight can be selected from the particle swarm corresponding to the robot at the current time and matched against the historical track formed by the largest-weight particles. If the difference between the pose of the particle at the current moment and the pose of a particle in the historical track is smaller than a preset difference threshold (the particle can be the particle with the maximum weight, or a particle swarm subjected to particle filtering), it can be determined that the current particle position and the historical track generate a loop, and the current particle position is a loopback point.
In a possible implementation manner, if the difference between the pose of the particle at the current time and the pose of a particle in the historical track is smaller than the preset difference threshold (the particle may be the particle with the largest weight, or a particle swarm subjected to particle filtering), and the matched particle in the historical track is additionally not connected with the particle at the current time, it may be determined that a loop is generated between the current particle position and the historical track, and the current particle position is a loopback point. The additional condition avoids mistakenly recognizing a loop when the robot has not moved.
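As a sketch of this detection step, the helper below matches the current highest-weight particle pose against the historical track points; the distance threshold and the minimum index gap (used to rule out a stationary robot) are illustrative assumptions.

import numpy as np

def find_loopback(current_pose, history, dist_thresh=0.3, min_gap=50):
    """Return the index of a matching historical track point, or None.

    current_pose: (x, y, theta) of the highest-weight particle.
    history: list of historical poses, oldest first.
    min_gap: skip the most recent poses so that track points connected to
             the current particle cannot trigger a false loop."""
    candidates = history[:-min_gap] if len(history) > min_gap else []
    for i, (px, py, _theta) in enumerate(candidates):
        if np.hypot(current_pose[0] - px, current_pose[1] - py) < dist_thresh:
            return i          # loopback point found
    return None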
S103, if the particle position of the robot is matched with the position of the historical track point of the robot, optimizing the track of the robot by a graph optimization method.
When the pose of the robot at the current moment differs from the pose of a particle in the historical track by less than the preset difference threshold, or when it is further confirmed that the matched particle in the historical track is not connected with the particle at the current moment, the current particle position is confirmed to be a loopback point. As shown in fig. 3, the particle positions at the various times in this loop are the nodes of the loop. The nodes are connected by edges that represent pose constraint relations; the edge connecting the particle at the current moment with the loopback point can be called the loopback edge, and the nodes and edges together form a graph network.
As shown in FIG. 3, the node poses are represented as T1, T2, ..., Tn, the relative pose of the edge between node i and node j is Tij, and the optimization error function can be represented as e_ij = ln(T_ij⁻¹ · T_i⁻¹ · T_j). The optimization can be carried out using the LM (Levenberg-Marquardt) method to adjust the track and obtain the pose of each track node that minimizes the overall error.
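For illustration, the residual of a single edge can be sketched in SE(2) as below; reading the translation and angle straight off the error transform is a common first-order simplification of the matrix logarithm, and feeding the stacked residuals of all edges (including the loopback edge) to a Levenberg-Marquardt solver, for example scipy.optimize.least_squares with method='lm', is one way, among others, to minimize the overall error.

import numpy as np

def pose_to_T(x, y, theta):
    """Homogeneous SE(2) matrix for a node pose Ti."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0.0, 0.0, 1.0]])

def edge_error(Ti, Tj, Tij):
    """Residual e_ij = ln(Tij^-1 * Ti^-1 * Tj) for one edge, returned as
    (dx, dy, dtheta) of the error transform."""
    E = np.linalg.inv(Tij) @ np.linalg.inv(Ti) @ Tj
    dtheta = np.arctan2(E[1, 0], E[0, 0])
    return np.array([E[0, 2], E[1, 2], dtheta])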
The particles corrected by graph optimization have higher accuracy. On the basis of the corrected particles, the particles with smaller weights can be further deleted by forced resampling, and a new particle set can be formed by sampling around the particle poses with higher weights. After the weights of all sampled particles are set to 1/M (M particles sampled in total), the matching degree is calculated again until the screened particles meet the preset requirement.
S104, constructing a map according to the optimized track.
After particle filtering and graph optimization have been applied to the particle positions of the robot, a relatively accurate historical track can be obtained. The particle with the higher weight can then be selected, the laser data corresponding to the whole historical track traversed, and the laser frames spliced to obtain a two-dimensional laser map, for example a two-dimensional laser grid map.
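A sketch of this splicing step, assuming each laser frame is given as a set of 2-D obstacle points in the robot frame; the grid resolution and size are illustrative parameters.

import numpy as np

def splice_grid(trajectory, scans, resolution=0.05, size=1024):
    """Splice laser frames along the optimized track into a 2-D grid map.

    trajectory: optimized poses (x, y, theta), one per laser frame.
    scans: list of (N, 2) obstacle point sets in the robot frame."""
    grid = np.zeros((size, size), dtype=np.uint8)
    origin = size // 2                          # map origin at the grid center
    for (x, y, th), pts in zip(trajectory, scans):
        c, s = np.cos(th), np.sin(th)
        wx = x + c * pts[:, 0] - s * pts[:, 1]  # rotate + translate into map frame
        wy = y + s * pts[:, 0] + c * pts[:, 1]
        ix = (wx / resolution).astype(int) + origin
        iy = (wy / resolution).astype(int) + origin
        ok = (ix >= 0) & (ix < size) & (iy >= 0) & (iy < size)
        grid[iy[ok], ix[ok]] = 1                # mark occupied cells
    return grid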
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 4 is a schematic diagram of a mapping apparatus of a robot according to an embodiment of the present disclosure, where the mapping apparatus of the robot includes:
a particle position acquisition unit 401 for acquiring a particle position of the robot by a particle filtering method;
a position matching unit 402, configured to detect whether a particle position of the robot matches the position of a historical track point of the robot;
a trajectory optimization unit 403, configured to optimize a trajectory of the robot by a graph optimization method if a particle position of the robot matches a position of a historical trajectory point of the robot;
and a map building unit 404, configured to build a map according to the optimized track.
The robot mapping apparatus shown in fig. 4 corresponds to the robot mapping method shown in fig. 1.
Fig. 5 is a schematic view of a robot provided in an embodiment of the present application. As shown in fig. 5, the robot 5 of this embodiment includes: a processor 50, a memory 51 and a computer program 52, such as a robot mapping program, stored in said memory 51 and executable on said processor 50. The processor 50, when executing the computer program 52, implements the steps in the above-described embodiments of the mapping method for each robot. Alternatively, the processor 50 implements the functions of the modules/units in the above-described device embodiments when executing the computer program 52.
Illustratively, the computer program 52 may be partitioned into one or more modules/units, which are stored in the memory 51 and executed by the processor 50 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 52 in the robot 5.
The robot may include, but is not limited to, a processor 50, a memory 51. Those skilled in the art will appreciate that fig. 5 is merely an example of a robot 5 and does not constitute a limitation of robot 5 and may include more or fewer components than shown, or some components in combination, or different components, e.g., the robot may also include input output devices, network access devices, buses, etc.
The Processor 50 may be a Central Processing Unit (CPU), another general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 51 may be an internal storage unit of the robot 5, such as a hard disk or a memory of the robot 5. The memory 51 may also be an external storage device of the robot 5, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), or the like, provided on the robot 5. Further, the memory 51 may also include both an internal storage unit and an external storage device of the robot 5. The memory 51 is used for storing the computer program and other programs and data required by the robot. The memory 51 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow in the methods of the embodiments described above can be realized by a computer program, which can be stored in a computer-readable storage medium and which, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer readable medium may be increased or decreased as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, computer readable media may not include electrical carrier signals and telecommunications signals.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A method of robot mapping, the method comprising:
acquiring the particle position of the robot by a particle filtering method;
detecting whether the particle position of the robot matches the position of a historical track point of the robot;
if the particle position of the robot is matched with the position of the historical track point of the robot, optimizing the track of the robot by a graph optimization method;
and constructing a map according to the optimized track.
2. The method of claim 1, wherein obtaining the particle positions of the robot by a particle filtering method comprises:
estimating the corresponding particle position of the robot in the map;
determining the matching degree of the position of the robot and the estimated position of the particles according to the scanned obstacle information;
acquiring the weight of the estimated particle position according to the matching degree of the robot and the particle;
and resampling the particles according to the weight of the particles to obtain the screened particles, and returning to the step of estimating the positions of the particles corresponding to the robot in the map until the screened particles meet the preset requirement.
3. The method of claim 2, wherein determining a degree of matching of the robot position to the estimated plurality of particle positions based on the scanned obstacle information comprises:
acquiring a scanning position of an obstacle scanned by the robot;
obtaining the estimated position of the corresponding obstacle according to the estimated positions of the plurality of particles;
and determining the matching degree of the robot and the particle position according to the distance between the scanning position and the estimated position.
4. The method of claim 3, wherein determining a degree of matching of the robot to the particle location based on the distance of the scanned location from the estimated location comprises:
respectively determining the estimated positions of the obstacles corresponding to the particle positions and the shortest distance between the estimated positions and the scanning positions of the obstacles scanned by the robot;
and calculating the matching degree of the estimated particle positions and the robot according to the shortest distance between the estimated positions of the obstacles in the particle positions and the scanning position.
5. The method of claim 2, wherein obtaining the weight of the estimated particle location based on the degree of matching of the robot to the particle comprises:
obtaining the weight value of the particle according to the sum of the matching degrees of the robot and the particle at different moments;
and normalizing the weight values of the plurality of particles to obtain the weight of the particle position corresponding to the normalized particle.
6. The method of claim 1, wherein detecting whether the particle locations of the robot match the locations of the historical trajectory points of the robot comprises:
acquiring current poses of a plurality of particles at current time;
obtaining historical track points respectively determined at a plurality of moments before the current time;
and matching the current pose with the historical track points.
7. The method of claim 6, wherein optimizing the trajectory of the robot by a graph optimization method comprises:
constructing a graph network according to the historical track points, and determining node poses and edge poses in the graph network;
and determining an error optimization function according to the node pose and the edge pose, and optimizing the particle pose in the historical track point according to the error optimization function.
8. A robot mapping apparatus, comprising:
a particle position acquisition unit for acquiring a particle position of the robot by a particle filtering method;
the position matching unit is used for detecting whether the particle position of the robot matches the position of a historical track point of the robot;
the track optimization unit is used for optimizing the track of the robot by a graph optimization method if the positions of the particles of the robot are matched with the positions of the historical track points of the robot;
and the map building unit is used for building a map according to the optimized track.
9. A robot comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the steps of the method according to any of claims 1 to 7 are implemented when the computer program is executed by the processor.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 7.
CN202010843669.2A 2020-08-20 2020-08-20 Robot and map construction method and device thereof Active CN112097772B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202010843669.2A CN112097772B (en) 2020-08-20 2020-08-20 Robot and map construction method and device thereof
PCT/CN2020/140420 WO2022036981A1 (en) 2020-08-20 2020-12-28 Robot, and map construction method and device thereof
US18/171,630 US20230205212A1 (en) 2020-08-20 2023-02-20 Mapping method for mobile robot, mobile robot and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010843669.2A CN112097772B (en) 2020-08-20 2020-08-20 Robot and map construction method and device thereof

Publications (2)

Publication Number Publication Date
CN112097772A (en) 2020-12-18
CN112097772B (en) 2022-06-28

Family

ID=73754192

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010843669.2A Active CN112097772B (en) 2020-08-20 2020-08-20 Robot and map construction method and device thereof

Country Status (3)

Country Link
US (1) US20230205212A1 (en)
CN (1) CN112097772B (en)
WO (1) WO2022036981A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112946660A (en) * 2021-01-28 2021-06-11 西北工业大学 Multi-beam forward-looking sonar-based simultaneous positioning and mapping method
WO2022036981A1 (en) * 2020-08-20 2022-02-24 深圳市优必选科技股份有限公司 Robot, and map construction method and device thereof

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140350839A1 (en) * 2013-05-23 2014-11-27 Irobot Corporation Simultaneous Localization And Mapping For A Mobile Robot
CN105509755A (en) * 2015-11-27 2016-04-20 重庆邮电大学 Gaussian distribution based mobile robot simultaneous localization and mapping method
CN105892461A (en) * 2016-04-13 2016-08-24 上海物景智能科技有限公司 Method and system for matching and recognizing the environment where robot is and map
CN107063264A (en) * 2017-04-13 2017-08-18 杭州申昊科技股份有限公司 A kind of robot map creating method suitable for extensive substation
US20170276501A1 (en) * 2016-03-28 2017-09-28 Fetch Robotics, Inc. System and Method for Localization of Robots
CN109343540A (en) * 2018-11-30 2019-02-15 广东工业大学 A kind of rear end SLAM track optimizing method based on winding detection
CN109556611A (en) * 2018-11-30 2019-04-02 广州高新兴机器人有限公司 A kind of fusion and positioning method based on figure optimization and particle filter
CN110472585A (en) * 2019-08-16 2019-11-19 中南大学 A kind of VI-SLAM closed loop detection method based on inertial navigation posture trace information auxiliary
CN110487276A (en) * 2019-08-20 2019-11-22 北京理工大学 A kind of sample vector matching locating method based on correlation analysis
CN111337018A (en) * 2020-05-21 2020-06-26 上海高仙自动化科技发展有限公司 Positioning method and device, intelligent robot and computer readable storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112097772B (en) * 2020-08-20 2022-06-28 深圳市优必选科技股份有限公司 Robot and map construction method and device thereof


Also Published As

Publication number Publication date
WO2022036981A1 (en) 2022-02-24
CN112097772B (en) 2022-06-28
US20230205212A1 (en) 2023-06-29


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant