CN111486842B - Repositioning method and device and robot

Info

Publication number
CN111486842B
CN111486842B (application CN201910085267.8A)
Authority
CN
China
Prior art keywords
particle
particles
matching
global
points
Prior art date
Legal status
Active
Application number
CN201910085267.8A
Other languages
Chinese (zh)
Other versions
CN111486842A (en)
Inventor
刘志超
张健
熊友军
庞建新
Current Assignee
Ubtech Robotics Corp
Original Assignee
Ubtech Robotics Corp
Priority date
Filing date
Publication date
Application filed by Ubtech Robotics Corp filed Critical Ubtech Robotics Corp
Priority to CN201910085267.8A priority Critical patent/CN111486842B/en
Priority to US16/699,765 priority patent/US11474204B2/en
Publication of CN111486842A publication Critical patent/CN111486842A/en
Application granted granted Critical
Publication of CN111486842B publication Critical patent/CN111486842B/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/4808 Evaluating distance, position or velocity data
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/60 Type of objects
    • G06V20/64 Three-dimensional objects

Abstract

The invention discloses a repositioning method comprising the following steps: acquiring a plurality of sub-graph boundary points and a plurality of particles of a current frame; mapping the plurality of sub-graph boundary points into a global coordinate system based on each particle to obtain a plurality of global boundary points of each particle; matching the plurality of global boundary points of a particle with a static object point set in a known map to find matching boundary points among the plurality of global boundary points; calculating the distance between each matching boundary point of the particle and the corresponding static object point and, if the distance is smaller than a preset threshold, increasing the weight of that matching boundary point; matching the plurality of global boundary points of the particle with the known map to calculate the weight of the particle; and estimating the repositioning result of the current frame from the weights and poses of the plurality of particles. The invention also discloses a repositioning device, a robot, and a readable storage medium. In this way, the robustness and accuracy of repositioning can be improved.

Description

Repositioning method and device and robot
Technical Field
The present invention relates to the field of positioning, and in particular, to a repositioning method and apparatus, a robot, and a readable storage medium.
Background
Robots, unmanned aerial vehicles, and other carriers with autonomous movement capability collect data from their onboard sensors during operation and, by combining these data with existing map data, estimate their position and attitude (pose for short), thereby enabling autonomous navigation.
During navigation, the carrier often needs to sense its surroundings and confirm its current position in a known map, for example when the position is wrong at initialization or becomes wrong during navigation; this process may also be referred to as repositioning.
Common repositioning methods are Monte Carlo Localization (MCL) and Adaptive Monte Carlo Localization (AMCL). These methods require little computing power and, when the environment is unchanged and its features are distinctive, deliver high positioning accuracy in a short time. When the environment changes, however, accuracy drops, the error rate rises, and the time consumed grows. In practical applications environmental changes are difficult to control, so repositioning may fail and compromise the safety of navigation.
Disclosure of Invention
The invention mainly addresses the technical problem of the high repositioning error rate under environmental change in the prior art, and to that end provides a repositioning method and apparatus, a robot, and a readable storage medium.
In order to solve the above technical problem, the present invention provides a repositioning method, including: acquiring a plurality of sub-graph boundary points and a plurality of particles of a current frame; mapping the plurality of sub-graph boundary points into a global coordinate system based on each particle to obtain a plurality of global boundary points of each particle, wherein the position information of each global boundary point of a particle relative to that particle is the same as the position information of the corresponding sub-graph boundary point; matching the plurality of global boundary points of the particle with a static object point set in a known map to find matching boundary points among the plurality of global boundary points; calculating the distance between each matching boundary point of the particle and the corresponding static object point and, if the distance is smaller than a preset threshold, increasing the weight of that matching boundary point; matching the plurality of global boundary points of the particle with the known map to calculate the weight of the particle; and estimating the repositioning result of the current frame using the weights and poses of the plurality of particles.
In order to solve the above technical problem, the present invention provides a relocation apparatus, which includes at least one processor, working alone or in cooperation, for executing instructions to implement the foregoing relocation method.
In order to solve the technical problem, the invention provides a robot, which comprises a processor and a distance sensor, wherein the processor is connected with the distance sensor, and the processor is used for executing instructions to realize the repositioning method.
In order to solve the above technical problem, the present invention provides a readable storage medium storing instructions that when executed implement the foregoing relocation method.
The invention has the following beneficial effects. During repositioning, the global boundary points of a particle are matched against a static object point set in a known map to find matching boundary points; the distance between each matching boundary point and the corresponding static object point is calculated, and if it is smaller than a preset threshold the weight of that matching boundary point is increased; the global boundary points of the particle are then matched against the known map to calculate the particle's weight. Because the particle weight computation uses the weights of the global boundary points, and the weights of the matching boundary points among them have been increased, the computation gives more consideration to the degree of match with static object points. Static objects are unaffected by environmental change, so the influence of such change on repositioning is reduced, and the robustness and accuracy of repositioning are improved.
Drawings
FIG. 1 is a flow chart illustrating an embodiment of a relocation method according to the present invention;
FIG. 2 is a schematic flow diagram of particle filtering;
FIG. 3 is a flow chart illustrating a specific embodiment of the relocation method of the present invention;
FIG. 4 is a schematic structural view of a first embodiment of the relocating device in accordance with the present invention;
FIG. 5 is a schematic structural diagram of a first embodiment of the robot of the present invention;
FIG. 6 is a schematic structural diagram of a first embodiment of the readable storage medium of the present invention.
Detailed Description
The present invention will be described in detail below with reference to the accompanying drawings and examples. Non-conflicting ones of the following embodiments may be combined with each other.
As shown in fig. 1, an embodiment of the relocation method of the present invention includes:
s1: a plurality of sub-picture boundary points and a plurality of particles of a current frame are obtained.
For convenience of explanation, the following describes a process of repositioning the carrier in the navigation process by using a robot as an example. In practical applications, the carrier may also be other devices capable of autonomous movement and navigation, such as an unmanned aerial vehicle.
The relocation method of this embodiment may be based on particle-filter localization, i.e., Monte Carlo Localization (MCL). The essence of particle filtering is to approximate the posterior probability density of an arbitrary state by a finite set of weighted random samples (particles). Its strength lies in handling complex problems, such as recursive state estimation for highly nonlinear, non-Gaussian dynamical systems.
As shown in fig. 2, the main steps of particle filtering include:
s11: and acquiring the state of the particles.
The initial particles may be randomly generated and the current state of the particles may be subsequently predicted based on a motion model of the system and the state of the particles at a previous time.
S12: Update the particle weights.
Each particle has a weight for evaluating how well the particle matches the actual state of the system. The initial weights of all particles are generally set to be the same, and the particle weights can be subsequently updated using measurement data, with particles that match the measurement data to a higher degree being weighted higher.
S13: Resample the particles.
In actual computation, after several cycles only a few particles retain large weights while the weights of the rest become negligible; the variance of the particle weights grows over time and the number of effective particles falls. This problem is called weight degeneracy. As the number of ineffective particles grows, a large amount of computation is wasted on particles that contribute almost nothing, so estimation performance degrades. To avoid weight degeneracy, the particles can be resampled according to their weights, i.e., copies of particles with larger weights replace particles with smaller weights.
S14: State estimation.
The state and weight of the particles are used to estimate the current state of the system.
The steps of S11-S14 may be performed in a loop, and each execution of S11-S14 may be referred to as an iteration.
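By way of illustration, one such iteration can be sketched in Python with numpy; predict and weigh are hypothetical callables standing in for the motion and measurement models, so this is a minimal sketch rather than the exact implementation of this embodiment:

    import numpy as np

    def particle_filter_step(particles, weights, control, measurement,
                             predict, weigh, rng=None):
        rng = rng or np.random.default_rng()
        # S11: propagate each particle through the motion model.
        particles = predict(particles, control)
        # S12: reweight by how well each particle explains the measurement.
        weights = weights * weigh(particles, measurement)
        weights = weights / weights.sum()
        # S13: resample when the effective particle count drops (weight degeneracy).
        n = len(particles)
        if 1.0 / np.sum(weights ** 2) < n / 2:
            idx = rng.choice(n, size=n, p=weights)
            particles, weights = particles[idx], np.full(n, 1.0 / n)
        # S14: estimate the state as the weighted mean of the particles.
        estimate = np.average(particles, axis=0, weights=weights)
        return particles, weights, estimate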
The robot may need to perform multiple iterations based on particle filtering to achieve repositioning, each iteration process may be referred to as a frame, and a current frame may refer to a current iteration process.
A common sensor for robot localization is the distance sensor. The robot can scan the surrounding environment with a distance sensor, ranging nearby objects to obtain a number of sub-graph boundary points. Each sub-graph boundary point carries angle and distance information, describing the map boundary (e.g., an obstacle) at that angle. The distance sensor may be a lidar, an ultrasonic ranging sensor, an infrared ranging sensor, or the like.
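As an illustrative sketch (not part of the patent text), turning such angle/distance readings into sub-graph boundary points in the robot coordinate system is a polar-to-Cartesian conversion:

    import numpy as np

    def scan_to_points(angles, distances):
        # angles in radians, distances in meters -> (N, 2) points (x, y)
        angles = np.asarray(angles)
        distances = np.asarray(distances)
        return np.column_stack((distances * np.cos(angles),
                                distances * np.sin(angles)))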
Each particle represents one possible pose of the robot. If the current frame is the initial frame, the particles can be generated randomly; otherwise, the particles of the current frame can be predicted from the particles of the previous frame and the control instructions. For example, if the control instructions cause the robot to move 0.1 meters and rotate 0.7 radians, each particle of the previous frame can be moved 0.1 meters and rotated 0.7 radians, with reasonable control noise added as needed, to obtain the particles of the current frame.
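A minimal sketch of this prediction step, assuming poses are stored as (x, y, theta) rows and Gaussian control noise with illustrative scales:

    import numpy as np

    def predict_particles(particles, d_trans, d_rot,
                          trans_noise=0.02, rot_noise=0.05, rng=None):
        # particles: (N, 3) array of poses (x, y, theta); translate along the
        # old heading, then rotate, adding control noise to both components.
        rng = rng or np.random.default_rng()
        n = len(particles)
        out = particles.copy()
        trans = d_trans + rng.normal(0.0, trans_noise, n)
        out[:, 0] += trans * np.cos(out[:, 2])
        out[:, 1] += trans * np.sin(out[:, 2])
        out[:, 2] += d_rot + rng.normal(0.0, rot_noise, n)
        return out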
MCL has the problem that it cannot recover from robot kidnapping or global localization failure. As the number of iterations grows, the particles concentrate around a single pose; if that pose happens to be incorrect, the algorithm cannot recover. This matters greatly in practice, since all particles near the correct pose may be accidentally discarded during the resampling step.
To solve this problem, in an embodiment of the present invention the plurality of particles includes at least one randomly injected particle, and the localization algorithm is based on Adaptive Monte Carlo Localization (AMCL).
In contrast to the MCL algorithm, the AMCL algorithm maintains a short-term likelihood average estimate w_fast and a long-term likelihood average estimate w_slow to determine whether to inject particles randomly. Specifically, max(0, 1 - w_fast/w_slow) is used as the probability of random sampling: when w_fast < w_slow, particles are injected randomly according to the ratio of the two; otherwise, no particles are injected.
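A sketch of this injection test; the smoothing rates alpha_slow << alpha_fast follow the usual AMCL convention and are assumptions here, not values from the patent:

    def injection_probability(w_avg, state, alpha_slow=0.001, alpha_fast=0.1):
        # state carries the running likelihood averages w_slow and w_fast;
        # w_avg is the mean measurement likelihood of the current frame.
        state['w_slow'] += alpha_slow * (w_avg - state['w_slow'])
        state['w_fast'] += alpha_fast * (w_avg - state['w_fast'])
        if state['w_slow'] <= 0.0:
            return 0.0
        # Inject random particles with probability max(0, 1 - w_fast / w_slow).
        return max(0.0, 1.0 - state['w_fast'] / state['w_slow'])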
S2: Map the plurality of sub-graph boundary points into the global coordinate system based on each particle, respectively, to obtain a plurality of global boundary points of each particle.
In the description of S2-S5 below, for convenience, the weight update of a single particle is described except where otherwise indicated; in practical applications, the weight computations of different particles are not ordered and may be performed in parallel.
The angle and distance of each sub-graph boundary point are relative to the current pose of the robot, so the coordinates of each sub-graph boundary point in the robot coordinate system can be determined. The robot coordinate system is the coordinate system established from the robot's current pose. For a single particle, mapping transforms the coordinates of the sub-graph boundary points from the robot coordinate system into the global coordinate system based on that particle, yielding the corresponding global boundary points. The position information, comprising an angle and a distance, of each global boundary point relative to its particle is the same as that of the corresponding sub-graph boundary point.
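The mapping is a 2D rigid-body transform; a minimal sketch, assuming a particle pose (x, y, theta) and robot-frame points as an (N, 2) array:

    import numpy as np

    def to_global(points_robot, particle_pose):
        # Rotate the robot-frame points by theta, then translate by (x, y).
        x, y, theta = particle_pose
        c, s = np.cos(theta), np.sin(theta)
        rot = np.array([[c, -s], [s, c]])
        return points_robot @ rot.T + np.array([x, y])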
S3: a plurality of global boundary points of the particle are matched with a set of static object points in a known map to find matching boundary points among the plurality of global boundary points.
The known map, which may also be called the current map, defines the scope of repositioning; depending on whether the known map is complete, repositioning can be divided into global repositioning and local repositioning. The static object point set comprises a number of static object points lying on the edges of static objects in the known map; static objects are objects that are fixed and unaffected by environmental changes, such as doors, windows, and pillars.
A matching boundary point is a global boundary point that is successfully matched to a static object point. Specifically, the global boundary point closest to a static object point may be taken as the matching boundary point corresponding to that static object point; alternatively, the closest point in the known map is found for each global boundary point, and if that point is a static object point, the global boundary point is a matching boundary point.
The static object point set is acquired before this step is performed. Specifically, deep learning may be used to extract static objects from the known map to obtain the static object point set. Alternatively, the known map may be registered against static features to obtain the static object point set. Static features are known features representing static objects, and can be extracted manually from a map in advance. For example, after the known map for local repositioning is determined, Iterative Closest Point (ICP) registration of the static features against the known map locates the static object point set in the known map.
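Under the second (nearest-point) interpretation above, the matching step can be sketched with a k-d tree; representing the static object point set via a static_mask over the map points is an assumption for illustration:

    import numpy as np
    from scipy.spatial import cKDTree

    def find_matching_points(global_points, map_points, static_mask):
        # static_mask[i] is True if map_points[i] is a static object point.
        tree = cKDTree(map_points)
        dists, idx = tree.query(global_points)
        is_match = np.asarray(static_mask)[idx]
        return is_match, dists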
S4: Calculate the distance between each matching boundary point of the particle and the corresponding static object point and, if the distance is smaller than a preset threshold, increase the weight of that matching boundary point.
Each global boundary point has a weight, typically inherited from the corresponding sub-graph boundary point. If the distance between a matching boundary point and the corresponding static object point is smaller than the preset threshold, the two match well, and the weight of that matching boundary point can be increased.
If the distance is greater than or equal to the preset threshold, the weight of the matching boundary point may not be adjusted.
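Continuing the sketch, the threshold test and weight increase of S4 might look as follows; the threshold and boost factor are illustrative assumptions, not values from the patent:

    import numpy as np

    def boost_matching_weights(point_weights, is_match, dists,
                               threshold=0.2, boost=2.0):
        # Increase the weight only of matching boundary points whose distance
        # to the corresponding static object point is below the threshold.
        out = np.asarray(point_weights, dtype=float).copy()
        out[is_match & (dists < threshold)] *= boost
        return out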
S5: Match the plurality of global boundary points of the particle with the known map to calculate the weight of the particle.
Specifically, the matching point closest to each global boundary point may be found in the known map; the matching degree between each global boundary point and its matching point is then computed (generally negatively correlated with the distance between the two, and referred to below as the matching degree of the global boundary point); finally, the weighted average of all matching degrees is taken as the weight of the particle, where the weight of each matching degree is the weight of the corresponding global boundary point.
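For instance, with a matching degree that decays exponentially with the nearest-point distance (the decay form is our assumption; the patent only requires a negative correlation with distance), the particle weight becomes:

    import numpy as np

    def particle_weight(dists_to_map, point_weights, sigma=0.1):
        # Matching degree falls with distance; the particle weight is the
        # average of matching degrees, weighted by the (possibly boosted)
        # weights of the global boundary points.
        match_degree = np.exp(-np.asarray(dists_to_map) / sigma)
        return np.average(match_degree, weights=point_weights)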
In the prior art, an environmental change such as an object appearing, disappearing, or moving abnormally lowers the matching degree of at least some non-matching boundary points, which may introduce large errors into the computed particle weight. In this embodiment, because the weights of the matching boundary points are increased, their matching degrees have more influence on the particle weight and those of non-matching boundary points less, so the influence of environmental change on the particle weight is reduced.
S6: Estimate the repositioning result of the current frame using the weights and poses of the plurality of particles.
Optionally, the pose of the particle with the largest weight may be selected as the repositioning result. Further, the particles may be resampled using their weights; in this case the order of resampling and result estimation is not restricted, since resampling does not affect the particle with the largest weight.
Alternatively, the particles may be resampled using their weights, and the weighted average of the resampled particles' poses computed as the repositioning result. In this case, the weight of each particle's pose in the computation of the repositioning result is the weight of that particle.
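A sketch of this second variant, averaging theta through its sine and cosine to avoid angle wrap-around (that detail is our addition, not stated in the patent):

    import numpy as np

    def estimate_pose(particles, weights, rng=None):
        # Resample with the particle weights, then take the weighted mean pose.
        rng = rng or np.random.default_rng()
        n = len(particles)
        idx = rng.choice(n, size=n, p=weights / weights.sum())
        resampled, w = particles[idx], weights[idx]
        x = np.average(resampled[:, 0], weights=w)
        y = np.average(resampled[:, 1], weights=w)
        theta = np.arctan2(np.average(np.sin(resampled[:, 2]), weights=w),
                           np.average(np.cos(resampled[:, 2]), weights=w))
        return np.array([x, y, theta])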
The above steps can be executed in a loop until a repositioning result that meets the requirements is obtained.
With the above embodiment, the weight computation of a particle uses the weights of its global boundary points, and the weights of the matching boundary points among them are increased; the particle weight therefore gives more consideration to the degree of match with static object points. Since static objects are unaffected by environmental change, the influence of such change on repositioning is reduced, and the robustness and accuracy of repositioning are improved.
As shown in fig. 3, a specific embodiment of the relocation method of the present invention includes two major parts: static object identification and improved AMCL.
In the static object identification part, ICP registration is performed between the known map and the static features to obtain the static object point set SD.
In the improved AMCL part, a plurality of particles are first acquired in (i.e., scattered over) the known map, and a plurality of sub-graph boundary points (which may also be called the laser point cloud) in the robot coordinate system are acquired by lidar scanning. Then, for each particle, the sub-graph boundary points are mapped into the particle-based global coordinate system to obtain a plurality of global boundary points; the global boundary points are matched against the static object point set SD obtained by the static object identification part to find the matching boundary points; for each matching boundary point, whether its distance to the corresponding static object point is smaller than the preset threshold is judged, and if so its weight is increased, otherwise its weight is left unchanged. The weight of the particle is then calculated. After the weights of all particles have been computed, maximum likelihood estimation yields the repositioning result, i.e., the pose of the particle with the largest weight is taken as the robot's localization result.
As shown in fig. 4, the first embodiment of the relocating device of the present invention includes: a processor 110. Only one processor 110 is shown, and the actual number may be larger. The processors 110 may operate individually or in concert.
The processor 110 controls the operation of the relocating device; the processor 110 may also be referred to as a central processing unit (CPU). The processor 110 may be an integrated circuit chip with signal processing capability. The processor 110 may also be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The processor 110 is configured to execute instructions to implement the method provided by any embodiment of the relocation method of the present invention, or any non-conflicting combination thereof.
As shown in fig. 5, the first embodiment of the robot of the present invention includes: a processor 210 and a distance sensor 220.
The processor 210 controls the operation of the robot; the processor 210 may also be referred to as a central processing unit (CPU). The processor 210 may be an integrated circuit chip with signal processing capability. The processor 210 may also be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The distance sensor 220 can acquire distance information between an obstacle within a measurement range and the distance sensor 220. The distance sensor may be a laser radar, an ultrasonic ranging sensor, an infrared ranging sensor, or the like.
The processor 210 is configured to execute instructions to implement the method provided by any embodiment of the relocation method of the present invention, or any non-conflicting combination thereof.
As shown in fig. 6, the first embodiment of the readable storage medium of the present invention includes a memory 310 storing instructions that, when executed, implement the method provided by any embodiment of the relocation method of the present invention, or any non-conflicting combination thereof.
The memory 310 may include a read-only memory (ROM), a random access memory (RAM), a flash memory, a hard disk, an optical disk, and the like.
In the embodiments provided in the present invention, it should be understood that the disclosed method and apparatus can be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may be physically included alone, or two or more units may be integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented as a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention may be embodied as a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the methods of the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above description is only an embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes performed by the present specification and drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (13)

1. A relocation method, comprising:
acquiring a plurality of sub-graph boundary points and a plurality of particles of a current frame;
respectively mapping the plurality of sub-graph boundary points to a global coordinate system based on each particle to obtain a plurality of global boundary points of each particle, wherein the position information of each global boundary point of each particle relative to the particle is the same as the position information of the corresponding sub-graph boundary point;
matching the plurality of global boundary points of the particle with a set of static object points in a known map to find matching boundary points among the plurality of global boundary points;
calculating the distance between the matching boundary point of the particle and the corresponding static object point, and if the distance is smaller than a preset threshold value, increasing the weight of the matching boundary point;
matching the plurality of global boundary points of the particle with the known map to calculate a weight of the particle;
and estimating the repositioning result of the current frame by utilizing the weights and the poses of a plurality of particles.
2. The method of claim 1,
the matching the plurality of global boundary points of the particle with a set of static object points to find matching boundary points among the plurality of global boundary points further comprises:
the set of static object points is obtained.
3. The method of claim 2,
the acquiring the set of static object points comprises:
extracting static objects from the known map using deep learning to obtain the set of static object points.
4. The method of claim 2,
the acquiring the set of static object points comprises:
registering the known map with a static feature to obtain the set of static object points.
5. The method of claim 1,
the estimating the repositioning result of the current frame by using the weights and the poses of a plurality of particles comprises:
selecting a pose of a most weighted one of the plurality of particles as the repositioning result.
6. The method of claim 5,
the matching the plurality of global boundary points of the particle with the known map to calculate the weight of the particle further comprises:
resampling a plurality of the particles with the weights of the particles.
7. The method of claim 1,
the estimating the repositioning result of the current frame by using the weights and the poses of a plurality of particles comprises:
resampling a plurality of said particles with their weights;
and calculating a weighted average of the poses of the plurality of particles after resampling as the repositioning result.
8. The method of claim 1,
after said calculating a distance between the matching boundary point of the particle and the corresponding static object point, further comprising:
and if the distance is greater than or equal to the preset threshold, not adjusting the weight of the matching boundary point.
9. The method of claim 1,
the plurality of particles includes at least one randomly implanted particle.
10. A relocating device comprising at least one processor, acting alone or in conjunction, for executing instructions to implement a method according to any one of claims 1 to 9.
11. A robot comprising a processor and a distance sensor, the processor being coupled to the distance sensor, the processor being configured to execute instructions to implement the method of any of claims 1-9.
12. A robot as claimed in claim 11, characterized in that the distance sensor is a lidar.
13. A readable storage medium storing instructions that, when executed, implement the method of any one of claims 1-9.
CN201910085267.8A 2019-01-29 2019-01-29 Repositioning method and device and robot Active CN111486842B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910085267.8A CN111486842B (en) 2019-01-29 2019-01-29 Repositioning method and device and robot
US16/699,765 US11474204B2 (en) 2019-01-29 2019-12-02 Localization method and robot using the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910085267.8A CN111486842B (en) 2019-01-29 2019-01-29 Repositioning method and device and robot

Publications (2)

Publication Number Publication Date
CN111486842A CN111486842A (en) 2020-08-04
CN111486842B true CN111486842B (en) 2022-04-15

Family

ID=71732399

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910085267.8A Active CN111486842B (en) 2019-01-29 2019-01-29 Repositioning method and device and robot

Country Status (2)

Country Link
US (1) US11474204B2 (en)
CN (1) CN111486842B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112068547A (en) * 2020-08-05 2020-12-11 歌尔股份有限公司 Robot positioning method and device based on AMCL and robot
CN112305520B (en) * 2020-10-27 2024-03-12 三一机器人科技有限公司 Correction method and device for detection position of reflection column of single laser radar
CN112509027B (en) * 2020-11-11 2023-11-21 深圳市优必选科技股份有限公司 Repositioning method, robot, and computer-readable storage medium
CN112418316B (en) * 2020-11-24 2023-09-29 深圳市优必选科技股份有限公司 Robot repositioning method and device, laser robot and readable storage medium
CN112462769A (en) * 2020-11-25 2021-03-09 深圳市优必选科技股份有限公司 Robot positioning method and device, computer readable storage medium and robot
CN114721364A (en) * 2020-12-22 2022-07-08 莱克电气绿能科技(苏州)有限公司 Mobile robot control method, device, equipment and storage medium
CN112762928B (en) * 2020-12-23 2022-07-15 重庆邮电大学 ODOM and DM landmark combined mobile robot containing laser SLAM and navigation method
CN112612862B (en) * 2020-12-24 2022-06-24 哈尔滨工业大学芜湖机器人产业技术研究院 Grid map positioning method based on point cloud registration
CN112987027B (en) * 2021-01-20 2024-03-15 长沙海格北斗信息技术有限公司 Positioning method of AMCL algorithm based on Gaussian model and storage medium
CN113483747B (en) * 2021-06-25 2023-03-24 武汉科技大学 Improved AMCL (advanced metering library) positioning method based on semantic map with corner information and robot
CN115375869B (en) * 2022-10-25 2023-02-10 杭州华橙软件技术有限公司 Robot repositioning method, robot and computer-readable storage medium
CN116840820B (en) * 2023-08-29 2023-11-24 上海仙工智能科技有限公司 Method and system for detecting 2D laser positioning loss and storage medium

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105928505A (en) * 2016-04-19 2016-09-07 深圳市神州云海智能科技有限公司 Determination method and apparatus for position and orientation of mobile robot

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7774158B2 (en) * 2002-12-17 2010-08-10 Evolution Robotics, Inc. Systems and methods for landmark generation for visual simultaneous localization and mapping
KR100809352B1 (en) * 2006-11-16 2008-03-05 삼성전자주식회사 Method and apparatus of pose estimation in a mobile robot based on particle filter
US9766074B2 (en) * 2008-03-28 2017-09-19 Regents Of The University Of Minnesota Vision-aided inertial navigation
JP7147119B2 (en) * 2015-11-02 2022-10-05 スターシップ テクノロジーズ オウ Device and method for autonomous self-localization
CN105652871A (en) * 2016-02-19 2016-06-08 深圳杉川科技有限公司 Repositioning method for mobile robot
US9766349B1 (en) * 2016-09-14 2017-09-19 Uber Technologies, Inc. Localization and tracking using location, signal strength, and pseudorange data
CN106441279B (en) * 2016-12-08 2019-03-29 速感科技(北京)有限公司 Robot localization method, the system explored based on autonomous positioning and edge
CN107908185A (en) * 2017-10-14 2018-04-13 北醒(北京)光子科技有限公司 A kind of robot autonomous global method for relocating and robot
CN107991683B (en) * 2017-11-08 2019-10-08 华中科技大学 A kind of robot autonomous localization method based on laser radar
CN112639502A (en) * 2018-09-07 2021-04-09 华为技术有限公司 Robot pose estimation

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105928505A (en) * 2016-04-19 2016-09-07 深圳市神州云海智能科技有限公司 Determination method and apparatus for position and orientation of mobile robot

Also Published As

Publication number Publication date
US11474204B2 (en) 2022-10-18
CN111486842A (en) 2020-08-04
US20200241112A1 (en) 2020-07-30

Similar Documents

Publication Publication Date Title
CN111486842B (en) Repositioning method and device and robot
CN111060101B (en) Vision-assisted distance SLAM method and device and robot
US9607401B2 (en) Constrained key frame localization and mapping for vision-aided inertial navigation
Montemerlo et al. Simultaneous localization and mapping with unknown data association using FastSLAM
CN109932713B (en) Positioning method, positioning device, computer equipment, readable storage medium and robot
US20150199556A1 (en) Method of using image warping for geo-registration feature matching in vision-aided positioning
CN111380510B (en) Repositioning method and device and robot
US20220051031A1 (en) Moving object tracking method and apparatus
JP6330068B2 (en) Image analysis system and method
WO2012071320A1 (en) Coded filter
US11574480B2 (en) Computerized device for driving assistance
CN111373336A (en) State awareness method and related equipment
CN112991389A (en) Target tracking method and device and mobile robot
KR20230020845A (en) Electronic deivce and method for tracking object thereof
Naujoks et al. Fast 3D extended target tracking using NURBS surfaces
WO2024001083A1 (en) Localization method, apparatus and device, and storage medium
CN110648353A (en) Monocular sensor-based robot indoor positioning method and device
CN115993132A (en) Visual inertial odometer initialization method and device and aerocar
WO2022186777A1 (en) Method and system for localizing a mobile robot
KR101426040B1 (en) Method for tracking moving object and apparatus in intelligent visual surveillance system
CN108917768B (en) Unmanned aerial vehicle positioning navigation method and system
Zhang et al. Mobile robot global localization using particle swarm optimization with a 2D range scan
Soloviev et al. Assured vision aided inertial localization
An et al. Tracking an RGB-D camera on mobile devices using an improved frame-to-frame pose estimation method
US20220334238A1 (en) Method and Device for Estimating a Velocity of an Object

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant