WO2022183785A1 - Robot positioning method and apparatus, robot, and readable storage medium - Google Patents
Robot positioning method and apparatus, robot, and readable storage medium
- Publication number
- WO2022183785A1 (PCT/CN2021/132992)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pose
- robot
- point cloud
- laser point
- positioning
- Prior art date
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
Definitions
- the present application relates to the technical field of robot positioning, and in particular, to a robot positioning method, device, robot and readable storage medium.
- Robot positioning accuracy is an important indicator.
- The principle of robot laser positioning technology is to calculate the robot's own position from the features scanned by the lidar; it is a relative positioning technology.
- This relative positioning technology is easily affected by changes in the surrounding environment.
- The map of the surrounding environment is likewise pre-scanned and built by the lidar.
- When mapping, a resolution is set for the map (for example, one pixel represents an actual 5 cm). This resolution usually balances navigation accuracy against the amount of calculation, so the navigation accuracy is also constrained by the map resolution. Therefore, for scenarios that require high-precision positioning (for example, within 3 cm) and high positioning stability, such as quickly correcting positioning deviations caused by uneven ground, a high-precision and robust positioning system is a technical problem that needs to be solved urgently.
- the present application provides a robot positioning method, device, robot and readable storage medium.
- An embodiment of the present application provides a method for positioning a robot, where the robot is equipped with a lidar, and the method includes:
- Gradient optimization is performed on the score optimization function of the corresponding laser point cloud based on the initial pose and the transformed coordinates to solve the matching pose between the scanning data of the lidar and the grid map, and the matching pose is taken as the second pose of the robot; the score optimization function is constructed according to the converted coordinates of the laser point cloud and the Gaussian distribution information of the grid where the laser point cloud is located in the grid map;
- In the case that the second pose satisfies the preset matching condition, the second pose is used as the current final pose of the robot; in the case that the preset matching condition is not satisfied, the first pose is used as the current final pose of the robot.
- the second pose satisfies a preset matching condition, including:
- the score of the laser point cloud under the second pose is greater than or equal to a confidence threshold, and the difference between the second pose and the first pose is less than or equal to a preset error threshold.
- Performing gradient optimization on the score optimization function of the corresponding laser point cloud based on the initial pose and the transformed coordinates to solve the matching pose between the data scanned by the lidar and the grid map includes:
- a second pose is obtained by adding the initial pose and the current pose increment, wherein in the second pose, the scanning data of the lidar and the grid map have the highest matching degree.
- Before solving the second pose of the robot, the method further includes:
- Positioning the position of the robot once to obtain the first pose includes:
- a corresponding number of particles are randomly generated around the initial position of the robot, and each particle has its own initial pose and the same initial weight;
- the first pose of the robot is calculated according to the updated pose and updated weight of each particle obtained by the final iteration.
- The transformation formula of the coordinate transformation is:

  $X'_i = \begin{pmatrix} \cos T_\theta & -\sin T_\theta \\ \sin T_\theta & \cos T_\theta \end{pmatrix} \begin{pmatrix} x_i \\ y_i \end{pmatrix} + \begin{pmatrix} T_x \\ T_y \end{pmatrix}$

- $(x_i, y_i)$ represents the coordinates of the i-th laser point cloud in the grid map;
- $X'_i$ represents the converted coordinates of the i-th laser point cloud after conversion;
- $(T_x, T_y)$ represents the position of the robot;
- $T_\theta$ represents the attitude angle of the robot.
- The expression of the score optimization function of the i-th laser point cloud is:

  $s_i = \exp\left( -\frac{(X'_i - q_i)^{T}\, \Sigma_i^{-1}\, (X'_i - q_i)}{2} \right)$

- $q_i$ and $\Sigma_i$ represent the mean and variance of the Gaussian distribution of the grid where the i-th laser point cloud is located, respectively.
- the embodiment of the present application also provides a robot positioning device, the robot is equipped with a laser radar, and the device includes:
- A positioning module, used to position the robot once to obtain the first pose;
- A conversion module, used to take the first pose as the initial pose of the robot and use the initial pose to convert the coordinates of the current laser point cloud in the grid map to coordinates in the robot coordinate system, obtaining the transformed coordinates of the laser point cloud;
- A secondary positioning module, used to perform gradient optimization on the score optimization function of the laser point cloud based on the initial pose and the transformed coordinates to solve the matching pose between the scanning data of the lidar and the grid map, the matching pose being used as the second pose of the robot; the score optimization function is constructed according to the converted coordinates of the laser point cloud and the Gaussian distribution information of the grid where the laser point cloud is located in the grid map;
- A positioning selection module, configured to use the second pose as the current final pose of the robot when the second pose satisfies a preset matching condition, and to use the first pose as the current final pose of the robot when the preset matching condition is not met.
- An embodiment of the present application further provides a robot, the robot includes a processor and a memory, the memory stores a computer program, and when the computer program is executed on the processor, the above-mentioned robot positioning method is implemented.
- Embodiments of the present application further provide a readable storage medium, which stores a computer program, and when the computer program is executed by a processor, implements the above-mentioned method for positioning a robot.
- The robot positioning method of the embodiment of the present application first obtains the first positioning position of the robot; then, based on the first position and combining the robot's surrounding environment information with an iterative optimization method, it solves the robot's position again to obtain the second positioning position, and the current final positioning result is selected by comparing the two positions.
- This method determines the final position through two rounds of positioning and can adapt to different environments and scenarios: the positioning stays stable when very high accuracy is not required, the required accuracy can still be achieved in demanding scenarios, and the method therefore has strong positioning robustness.
- FIG. 1 shows a schematic flowchart of a robot positioning method according to an embodiment of the present application
- FIG. 2 shows a schematic flowchart of one positioning of the robot positioning method according to the embodiment of the present application
- FIG. 3 shows a schematic flowchart of the secondary positioning of the robot positioning method according to the embodiment of the present application
- FIG. 4 shows a schematic structural diagram of a robot positioning device according to an embodiment of the present application.
- The method of particle filtering fuses odometer data, inertial measurement unit (IMU) data, and lidar measurement data, and then calculates the state values of multiple particles.
- the average position is taken as the final robot position.
- This method estimates an initial pose using the sensor's motion model and then, using Newton iteration, least-squares optimization, or similar methods, finds an optimal pose within a certain range near the initial pose. It can achieve higher accuracy because it computes the optimal position when environmental features are distinct. However, when the environment changes to some extent, or in areas with similar environments, the position becomes unstable and may jump; once the position jumps to a wrong location, the robot's continued movement takes it farther and farther from the correct position, and once it drifts beyond the set matching range the positioning cannot be recovered. For example, if the matching range is 1 meter, this method cannot recover after the position drifts more than 1 meter.
- an embodiment of the present application proposes a robot positioning method, which can not only resist environmental changes, but also have strong positioning robustness, and can also obtain high-precision positioning results.
- The method of the embodiment of the present application first combines the particle filter algorithm to perform a primary positioning of the robot, then uses the result of the primary positioning as the basis for a secondary positioning performed with an optimization-matching method, and finally selects the final positioning result by comparing the two positioning results.
- this embodiment proposes a method for positioning a robot, which can be used for self-positioning of a robot equipped with a lidar in different situations.
- Step S110: the position of the robot is positioned once to obtain the first pose.
- the position of the robot is positioned once to ensure the robustness of the positioning.
- the steps for obtaining the first pose include:
- Sub-step S111: a corresponding number of particles are randomly generated around the initial position of the robot, each particle having its own initial pose and the same initial weight.
- a series of particles can be generated near the initial position of the robot, and these particles generally conform to a corresponding distribution law, such as a Gaussian distribution.
- The initial pose of each particle includes its distance r and direction angle θ relative to the robot; the position can equivalently be represented by two-dimensional plane coordinates (x, y).
- the weight w represents the probability that the pose of the particle is the real position of the robot. In the initial state, the weight normalization process will be performed on all particles, so that the initial weight value of each particle is the same.
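- For illustration only, the following is a minimal Python sketch of this initialization step (it is not part of the application itself); the particle count m and the spreads sigma_xy and sigma_theta are illustrative assumptions, not values prescribed here:

```python
import numpy as np

def init_particles(x0, y0, theta0, m=500, sigma_xy=0.2, sigma_theta=0.1):
    """Randomly generate m particles around the robot's initial pose
    (x0, y0, theta0); each particle gets the same normalized weight 1/m."""
    particles = np.empty((m, 3))
    particles[:, 0] = np.random.normal(x0, sigma_xy, m)         # x [m]
    particles[:, 1] = np.random.normal(y0, sigma_xy, m)         # y [m]
    particles[:, 2] = np.random.normal(theta0, sigma_theta, m)  # heading [rad]
    weights = np.full(m, 1.0 / m)  # identical initial weights after normalization
    return particles, weights
```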
- The robot's motion is then used, via the motion model, to update the poses of these particles, and the lidar's observation data of the surrounding environment is used, via the observation model, to score the particles and obtain the weight distribution of each particle.
- Finally, the pose derived from the particles is taken as the position of the robot.
- Sub-step S112: the robot is moved in the grid map according to the movement instruction, and the pose of each particle is updated after each movement to obtain the updated pose of each particle.
- The updated pose of each particle can be calculated according to the motion model, which can be expressed as follows:

  $x_t^{(m)} = f_u\left(x_{t-1}^{(m)},\, u_t\right)$

  where $x_t^{(m)}$ denotes the pose of the m-th particle at time t, $u_t$ represents the control information of the mobile robot, and $f_u$ represents the motion model.
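- As a non-limiting sketch of what the motion model $f_u$ might look like for a planar robot, assuming odometry-style controls u = (dx, dy, dtheta) in the robot frame and illustrative Gaussian noise levels:

```python
import numpy as np

def motion_update(particles, u, noise=(0.02, 0.02, 0.01)):
    """Motion model f_u: propagate every particle with the control
    u = (dx, dy, dtheta), given in the robot frame, plus Gaussian noise."""
    dx, dy, dtheta = u
    n = len(particles)
    c, s = np.cos(particles[:, 2]), np.sin(particles[:, 2])
    particles[:, 0] += c * dx - s * dy + np.random.normal(0, noise[0], n)
    particles[:, 1] += s * dx + c * dy + np.random.normal(0, noise[1], n)
    particles[:, 2] += dtheta + np.random.normal(0, noise[2], n)
    return particles
```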
- Sub-step S113: the weight of the corresponding particle is updated according to the particle's updated pose and the observation data obtained by the robot through the mounted lidar, yielding the updated weight of each particle.
- In this way, the poses of the particles can be made closer to the real position of the robot.
- The updated weight of the m-th particle can be expressed as follows:

  $w_t^{(m)} = f_z\left(z_t,\, x_t^{(m)}\right)$

  where $z_t$ represents the observation data obtained by the lidar measurement at time t, and $f_z$ represents the observation model.
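- A minimal sketch of this weight update follows; the observation model $f_z$ is passed in as an abstract likelihood function, since its concrete form is not fixed here:

```python
import numpy as np

def weight_update(particles, z_t, f_z):
    """Observation-model step: score each particle's updated pose against
    the lidar scan z_t via f_z, then normalize the scores so the weights
    again form a probability distribution over the particles."""
    weights = np.array([f_z(z_t, x) for x in particles], dtype=float)
    return weights / weights.sum()
```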
- Sub-step S114: particle resampling is performed according to the distribution of the updated weights of all particles to obtain a resampled set of particles, and the process returns to the above moving step for a preset number of iterations.
- Specifically, the particles are sorted by the size of their weights, and particle resampling is performed according to the weight distribution of these particles to obtain M particles with a constant count; the process then returns to step S112 and repeats the pose-update and weight-update steps until the preset number of iterations has been executed.
- During resampling, particles with high weights (that is, particles closer to the real position of the robot) are retained, while particles with low weights (that is, particles with unreliable poses) are discarded.
- The above preset number of iterations may be preset in the robot, generated in advance by the robot, or set by the user according to actual needs.
- The number of iterations can also be constrained by other conditions, for example requiring the particle weights to meet corresponding criteria, and is not limited to a fixed iteration count.
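- The application does not prescribe a particular resampling scheme; as one common possibility, a low-variance (systematic) resampler that keeps the particle count constant could look like this sketch:

```python
import numpy as np

def low_variance_resample(particles, weights):
    """Resample M particles in proportion to their weights: high-weight
    particles are duplicated, low-weight ones tend to be dropped, and the
    particle count M stays constant. Weights are reset to uniform."""
    m = len(particles)
    positions = (np.arange(m) + np.random.uniform()) / m
    idx = np.searchsorted(np.cumsum(weights), positions)
    return particles[idx].copy(), np.full(m, 1.0 / m)
```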
- Sub-step S115: the first pose of the robot is calculated according to the updated pose and updated weight of each particle obtained in the final iteration.
- the first pose of the robot may be calculated by using a weighted average algorithm or the like.
- other methods may also be used to calculate the position of the robot based on the poses of these particles, which is not limited here.
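- A minimal sketch of the weighted-average option mentioned above (averaging the heading on the unit circle is an implementation choice, not mandated by the application):

```python
import numpy as np

def weighted_mean_pose(particles, weights):
    """First pose as the weighted average of the final particle set; the
    heading is averaged on the unit circle to avoid wrap-around near +/- pi."""
    x = np.dot(weights, particles[:, 0])
    y = np.dot(weights, particles[:, 1])
    theta = np.arctan2(np.dot(weights, np.sin(particles[:, 2])),
                       np.dot(weights, np.cos(particles[:, 2])))
    return np.array([x, y, theta])
```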
- this embodiment will solve the position of the robot again on the basis of solving the first pose.
- the Gauss-Newton iterative optimization method is used to obtain the high-precision positioning position of the robot.
- Before performing the gradient optimization solution, this embodiment first grids the map preloaded by the robot to obtain a grid map composed of several grids. Then the mean and variance of the Gaussian distribution of each grid are calculated according to the positions of the obstacles in the map. The mean and variance of these grids are retained for subsequent matching optimization and can be computed with the standard formulas for the mean and variance of a Gaussian distribution.
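- As an illustrative sketch of this gridization step (the cell size, the minimum point count per cell, and the small regularization term are assumptions, not values from the application), the per-grid Gaussian statistics could be computed as:

```python
import numpy as np

def build_gaussian_grid(obstacle_points, cell_size=1.0):
    """Partition map obstacle points into grid cells and store, for each
    cell, the mean q and covariance Sigma of the points inside it."""
    cells = {}
    for x, y in obstacle_points:
        key = (int(x // cell_size), int(y // cell_size))
        cells.setdefault(key, []).append((x, y))
    stats = {}
    for key, pts in cells.items():
        pts = np.asarray(pts)
        if len(pts) >= 3:  # need a few points for a usable covariance
            cov = np.cov(pts.T) + 1e-6 * np.eye(2)  # regularize near-singular cells
            stats[key] = (pts.mean(axis=0), cov)
    return stats
```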
- Step S120: taking the first pose as the initial pose of the robot, and using the initial pose to convert the coordinates of the current laser point cloud in the grid map to the coordinates in the robot coordinate system, obtaining the converted coordinates of the laser point cloud.
- The score of each laser point cloud in the grid can then be calculated. It can be understood that the higher the score, the higher the matching degree between the laser scan and the map under the corresponding pose, meaning that the pose is closer to the real pose of the robot; conversely, a lower score means a lower matching degree.
- The transformed coordinates of the i-th laser point cloud are $X'_i$, so the transformation formula of the coordinate transformation is:

  $X'_i = \begin{pmatrix} \cos T_\theta & -\sin T_\theta \\ \sin T_\theta & \cos T_\theta \end{pmatrix} \begin{pmatrix} x_i \\ y_i \end{pmatrix} + \begin{pmatrix} T_x \\ T_y \end{pmatrix}$

  where $T = (T_x, T_y, T_\theta)$ is the pose of the robot, with $(T_x, T_y)$ and $T_\theta$ representing the coordinate position and attitude angle of the robot, respectively.
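- A direct sketch of this coordinate transformation, applying T = (Tx, Ty, Ttheta) to an (N, 2) array of laser points:

```python
import numpy as np

def transform_points(points, pose):
    """Apply T = (Tx, Ty, Ttheta) to laser points: rotate each (xi, yi)
    by Ttheta and translate by (Tx, Ty), yielding the points X'_i."""
    tx, ty, th = pose
    rot = np.array([[np.cos(th), -np.sin(th)],
                    [np.sin(th),  np.cos(th)]])
    return np.asarray(points) @ rot.T + np.array([tx, ty])  # shape (N, 2)
```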
- Step S130: based on the initial pose and the transformed coordinates, gradient optimization is performed on the score optimization function of the laser point cloud to solve the matching pose between the scanning data of the lidar and the grid map, and the matching pose is used as the second pose of the robot.
- the score optimization function of each laser point cloud is pre-built according to the converted coordinates of the laser point cloud and the Gaussian distribution information of the grid where the laser point cloud is located in the grid map.
- The Gaussian distribution information may include the mean, the variance, and the like.
- The score optimization function of the i-th laser point cloud can be expressed as:

  $s_i = \exp\left( -\frac{(X'_i - q_i)^{T}\, \Sigma_i^{-1}\, (X'_i - q_i)}{2} \right)$

  where $q_i$ and $\Sigma_i$ represent the mean and variance of the Gaussian distribution of the grid where the i-th laser point cloud is located, respectively. The score optimization function has the same form for each laser point cloud. In addition, the above expression of the score optimization function is only an example and is not limited thereto.
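- A minimal sketch that evaluates the total score by summing the per-point score optimization function over the grid cells built earlier (the helper build_gaussian_grid and the cell_size parameter are assumptions carried over from the sketches above):

```python
import numpy as np

def ndt_score(points_t, stats, cell_size=1.0):
    """Total score: for each transformed point X'_i, look up the Gaussian
    (q_i, Sigma_i) of its cell and add exp(-0.5 * d^T Sigma_i^{-1} d) with
    d = X'_i - q_i. Higher totals mean a better scan-to-map match."""
    total = 0.0
    for p in points_t:
        key = (int(p[0] // cell_size), int(p[1] // cell_size))
        if key not in stats:
            continue  # point falls in a cell without a fitted Gaussian
        q, cov = stats[key]
        d = p - q
        total += np.exp(-0.5 * d @ np.linalg.solve(cov, d))
    return total
```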
- the gradient solution of the score optimization function can be performed to obtain the optimal pose.
- the steps of solving the second pose include:
- Sub-step S131: using the Gauss-Newton iterative algorithm, the corresponding gradient and Hessian matrix are solved for the score optimization function of the corresponding laser point cloud based on the transformed coordinates.
- Taking the first-order partial derivatives of the score optimization function yields the Jacobian matrix composed of the first-order derivatives, where the gradient is the direction vector composed of these first-order derivatives.
- Taking the second-order partial derivatives of the first-order derivative expressions yields the Hessian matrix composed of the second-order partial derivatives.
- Sub-step S132: the gradients and Hessian matrices of all laser point clouds are superimposed to calculate the current pose increment.
- Taking the pose of the robot as T, the first derivative of the score of the i-th laser point cloud with respect to T follows from the score optimization function by the chain rule:

  $\frac{\partial p_i}{\partial T} = -\, p_i \,(X'_i - q_i)^{T}\, \Sigma_i^{-1}\, \frac{\partial X'_i}{\partial T}$

  where $p_i$ represents the matching probability of the i-th laser point cloud, i.e., the value of the score optimization function $s_i$ at $X'_i$.
- Sub-step S133: the initial pose and the current pose increment are added to obtain the second pose.
- That is, the second pose is $T_0 + \Delta T$.
- The second pose achieves the optimal alignment between the scan data of the lidar and the grid map.
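- Sub-steps S131 to S133 can be sketched together as one Gauss-Newton iteration. Note that this is a simplified illustration: the Hessian below keeps only the first-order (Gauss-Newton) term, whereas a full derivation for this score function carries additional second-order terms, and the small damping term is an implementation safeguard rather than part of the method:

```python
import numpy as np

def gauss_newton_step(points, pose, stats, cell_size=1.0):
    """One Gauss-Newton iteration on the negated total score: accumulate
    the gradient g and an approximate Hessian H over all laser points,
    then solve H * dT = -g for the pose increment dT (so T <- T0 + dT)."""
    tx, ty, th = pose
    c, s = np.cos(th), np.sin(th)
    rot = np.array([[c, -s], [s, c]])
    g = np.zeros(3)
    h = 1e-9 * np.eye(3)  # tiny damping keeps H invertible if few points match
    for x, y in points:
        xp = rot @ (x, y) + (tx, ty)  # transformed point X'_i
        key = (int(xp[0] // cell_size), int(xp[1] // cell_size))
        if key not in stats:
            continue
        q, cov = stats[key]
        d = xp - q
        s_inv = np.linalg.inv(cov)
        e = np.exp(-0.5 * d @ s_inv @ d)              # score contribution p_i
        jac = np.array([[1.0, 0.0, -x * s - y * c],   # dX'_i / d(Tx, Ty, Ttheta)
                        [0.0, 1.0,  x * c - y * s]])
        g += e * (jac.T @ s_inv @ d)                  # gradient of the negated score
        h += e * (jac.T @ s_inv @ jac)                # Gauss-Newton Hessian term
    return np.asarray(pose) + np.linalg.solve(h, -g)  # second pose T0 + dT
```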
- the first pose or the second pose is selected as the current final pose of the robot according to whether the second pose satisfies the preset matching condition.
- Step S140: in the case that the second pose satisfies the preset matching condition, the second pose is used as the current final pose of the robot.
- The preset matching condition may be: the score corresponding to the second pose is greater than or equal to a preset confidence threshold, and the difference between the second pose and the first pose is less than or equal to a preset error threshold. In other words, when the score of the second pose is sufficiently high and its deviation from the first pose is within an acceptable range, the later positioning result is selected as the final one.
- Step S150: in the case that the second pose does not satisfy the preset matching condition, the first pose is taken as the current final pose of the robot.
- That is, the first pose is taken as the current final positioning position of the robot.
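- A compact sketch of this selection logic; the thresholds score_min and err_max stand for the confidence threshold and the preset error threshold described above, and measuring the pose difference with a norm (after wrapping the heading) is an illustrative choice:

```python
import numpy as np

def select_final_pose(pose1, pose2, score2, score_min, err_max):
    """Return the second pose only when its score clears the confidence
    threshold AND its deviation from the first pose is within the error
    threshold; otherwise fall back to the robust first pose."""
    diff = np.asarray(pose2, dtype=float) - np.asarray(pose1, dtype=float)
    diff[2] = np.arctan2(np.sin(diff[2]), np.cos(diff[2]))  # wrap heading error
    if score2 >= score_min and np.linalg.norm(diff) <= err_max:
        return np.asarray(pose2)
    return np.asarray(pose1)
```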
- The robot positioning method of this embodiment first performs a primary positioning with the particle filter method; since the robot's position changes slowly, it does not jump when the environment changes, which gives good robustness. The secondary positioning then starts from the first position and, using the robot's surrounding environment information and an iterative optimization method, obtains a higher-precision position. The method can therefore adapt to different environments and scenarios, offering both good robustness and high positioning accuracy.
- this embodiment provides a robot positioning device.
- the robot is equipped with a laser radar.
- the robot positioning device 100 includes:
- The primary positioning module 110 is used to position the robot once to obtain the first pose;
- The conversion module 120 is configured to take the first pose as the initial pose of the robot and use the initial pose to convert the coordinates of the current laser point cloud in the grid map to coordinates in the robot coordinate system, obtaining the transformed coordinates of the laser point cloud.
- The secondary positioning module 130 is configured to perform gradient optimization on the score optimization function of the laser point cloud based on the initial pose and the transformed coordinates to solve the matching pose between the scan data of the lidar and the grid map, the matching pose being used as the second pose of the robot.
- the score optimization function is constructed according to the converted coordinates of the laser point cloud and the Gaussian distribution information of the grid where the laser point cloud is located in the grid map.
- The positioning selection module 140 is configured to use the second pose as the current final pose of the robot when the second pose satisfies a preset matching condition, and to use the first pose as the current final pose of the robot when the preset matching condition is not met.
- the present application also provides a robot, for example, the robot is a mobile robot, which can realize self-positioning during the moving process.
- the robot includes a processor and a memory, wherein the memory stores a computer program, and the processor executes the computer program so that the robot executes the functions of the above-mentioned robot positioning method or each module in the above-mentioned robot positioning device.
- the present application also provides a readable storage medium for storing the computer program used in the above-mentioned robot.
- Each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which contains one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures.
- Each block of the block diagrams and/or flow diagrams, and combinations of blocks in the block diagrams and/or flow diagrams, can be implemented by dedicated hardware-based systems that perform the specified functions or actions, or by a combination of special-purpose hardware and computer instructions.
- each functional module or unit in each embodiment of the present application may be integrated together to form an independent part, or each module may exist independently, or two or more modules may be integrated to form an independent part.
- If the functions are implemented in the form of software function modules and sold or used as independent products, they can be stored in a computer-readable storage medium.
- The technical solution of the present application, in essence, or the part that contributes to the prior art, or a part of the technical solution, can be embodied in the form of a software product.
- The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a smart phone, a personal computer, a server, a network device, etc.) to execute all or part of the steps of the methods described in the various embodiments of the present application.
- The aforementioned storage medium includes media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
Abstract
The present invention relates to a robot positioning method and apparatus, as well as a robot and a readable storage medium. The method comprises the steps of: performing positioning once on the position of a robot to obtain a first pose; and, taking the first pose as the initial pose of the robot, performing coordinate conversion on the coordinates of a laser point cloud, and performing a gradient search on a score optimization function of the corresponding laser point cloud using the converted coordinates so as to solve a scan-matching pose, the scan-matching pose being taken as the second pose of the robot. Finally, the first pose or the second pose is selected as the final positioning result according to whether or not the second pose satisfies a preset matching condition. The present invention can adapt to different environments and scenarios, can achieve stable positioning when very high accuracy is not required while achieving the required accuracy in certain scenarios, and therefore has relatively strong positioning robustness. The positioning apparatus corresponds to the positioning method; the robot executes the positioning method; and the readable storage medium stores a computer program for implementing the positioning method.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110246953.6A CN113050116B (zh) | 2021-03-05 | 2021-03-05 | 机器人定位方法、装置、机器人和可读存储介质 |
CN202110246953.6 | 2021-03-05 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022183785A1 (fr) | 2022-09-09
Family
ID=76510211
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/132992 WO2022183785A1 (fr) | 2021-03-05 | 2021-11-25 | Procédé et appareil de positionnement de robot, robot et support de stockage lisible |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN113050116B (fr) |
WO (1) | WO2022183785A1 (fr) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113050116B (zh) * | 2021-03-05 | 2024-02-27 | 深圳市优必选科技股份有限公司 | 机器人定位方法、装置、机器人和可读存储介质 |
CN113671527B (zh) * | 2021-07-23 | 2024-08-06 | 国电南瑞科技股份有限公司 | 一种提高配网带电作业机器人的精准作业方法及装置 |
CN113739819B (zh) * | 2021-08-05 | 2024-04-16 | 上海高仙自动化科技发展有限公司 | 校验方法、装置、电子设备、存储介质及芯片 |
CN115267812B (zh) * | 2022-07-28 | 2024-07-30 | 广州高新兴机器人有限公司 | 一种基于高亮区域的定位方法、装置、介质及机器人 |
CN115390530A (zh) * | 2022-09-05 | 2022-11-25 | 北京天玛智控科技股份有限公司 | 柔性装配技术的自适应优化方法及其装置 |
CN118067130B (zh) * | 2024-04-16 | 2024-07-09 | 江苏苏亿盟智能科技有限公司 | 基于数据融合的机器人高精度运动规划方法及系统 |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110895408B (zh) * | 2018-08-22 | 2023-05-02 | 杭州海康机器人股份有限公司 | 一种自主定位方法、装置及移动机器人 |
- 2021-03-05: CN CN202110246953.6A patent/CN113050116B/zh active Active
- 2021-11-25: WO PCT/CN2021/132992 patent/WO2022183785A1/fr active Application Filing
Patent Citations (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105487535A (zh) * | 2014-10-09 | 2016-04-13 | 东北大学 | 一种基于ros的移动机器人室内环境探索系统与控制方法 |
WO2018180338A1 (fr) * | 2017-03-30 | 2018-10-04 | パイオニア株式会社 | Dispositif de traitement d'informations, dispositif de serveur, procédé de commande, programme et support de stockage |
CN107702722A (zh) * | 2017-11-07 | 2018-02-16 | 云南昆船智能装备有限公司 | 一种激光导引agv自然导航定位方法 |
CN107991683A (zh) * | 2017-11-08 | 2018-05-04 | 华中科技大学 | 一种基于激光雷达的机器人自主定位方法 |
US20200080860A1 (en) * | 2018-01-12 | 2020-03-12 | Zhejiang Guozi Robot Technology Co., Ltd. | Method and system for creating map based on 3d laser |
CN108917759A (zh) * | 2018-04-19 | 2018-11-30 | 电子科技大学 | 基于多层次地图匹配的移动机器人位姿纠正算法 |
CN109579849A (zh) * | 2019-01-14 | 2019-04-05 | 浙江大华技术股份有限公司 | 机器人定位方法、装置和机器人及计算机存储介质 |
CN109932713A (zh) * | 2019-03-04 | 2019-06-25 | 北京旷视科技有限公司 | 定位方法、装置、计算机设备、可读存储介质和机器人 |
CN110285806A (zh) * | 2019-07-05 | 2019-09-27 | 电子科技大学 | 基于多次位姿校正的移动机器人快速精确定位算法 |
CN110927740A (zh) * | 2019-12-06 | 2020-03-27 | 合肥科大智能机器人技术有限公司 | 一种移动机器人定位方法 |
CN111113422A (zh) * | 2019-12-30 | 2020-05-08 | 深圳市优必选科技股份有限公司 | 机器人定位方法、装置、计算机可读存储介质及机器人 |
CN111508021A (zh) * | 2020-03-24 | 2020-08-07 | 广州视源电子科技股份有限公司 | 一种位姿确定方法、装置、存储介质及电子设备 |
CN111578959A (zh) * | 2020-05-19 | 2020-08-25 | 鲲鹏通讯(昆山)有限公司 | 一种基于改进Hector SLAM算法的未知环境自主定位方法 |
CN111949943A (zh) * | 2020-07-24 | 2020-11-17 | 北京航空航天大学 | 一种面向高级自动驾驶的v2x和激光点云配准的车辆融合定位方法 |
CN112082553A (zh) * | 2020-07-24 | 2020-12-15 | 广州易来特自动驾驶科技有限公司 | 基于wifi和激光雷达的室内定位方法、定位装置和机器人 |
CN113050116A (zh) * | 2021-03-05 | 2021-06-29 | 深圳市优必选科技股份有限公司 | 机器人定位方法、装置、机器人和可读存储介质 |
Non-Patent Citations (3)
Title |
---|
BOURAINE SARA; BOUGOUFFA ABDELHAK; AZOUAOUI OUAHIBA: "Particle swarm optimization for solving a scan-matching problem based on the normal distributions transform", EVOLUTIONARY INTELLIGENCE, SPRINGER BERLIN HEIDELBERG, BERLIN/HEIDELBERG, vol. 15, no. 1, 3 January 2021 (2021-01-03), Berlin/Heidelberg, pages 683 - 694, XP037707769, ISSN: 1864-5909, DOI: 10.1007/s12065-020-00545-y * |
HAN MINGRUI: "3D Localization and Mapping of Outdoor Mobile Robots Using a LIDAR", MASTER THESIS, TIANJIN POLYTECHNIC UNIVERSITY, CN, no. 3, 15 March 2017 (2017-03-15), CN , XP055964348, ISSN: 1674-0246 * |
LIU YUXIANG: "Mobile Robot Localization Algorithm Based on Multi-sensor Fusion and Point Cloud Matching", MASTER THESIS, TIANJIN POLYTECHNIC UNIVERSITY, CN, no. 7, 15 July 2020 (2020-07-15), CN , XP055964345, ISSN: 1674-0246 * |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115479599A (zh) * | 2022-09-19 | 2022-12-16 | 中国电子科技集团公司第五十四研究所 | 一种基于粒子群优化的光电传感设备的位姿偏差估计方法 |
CN115290098A (zh) * | 2022-09-30 | 2022-11-04 | 成都朴为科技有限公司 | 一种基于变步长的机器人定位方法和系统 |
CN115476362A (zh) * | 2022-10-10 | 2022-12-16 | 苏州大学 | 移动操作机器人定位引导方法 |
CN115727836A (zh) * | 2022-11-23 | 2023-03-03 | 锐趣科技(北京)有限公司 | 一种基于激光反光板和里程计的融合定位方法及系统 |
CN117689698A (zh) * | 2024-02-04 | 2024-03-12 | 安徽蔚来智驾科技有限公司 | 点云配准方法、智能设备及存储介质 |
CN117689698B (zh) * | 2024-02-04 | 2024-04-19 | 安徽蔚来智驾科技有限公司 | 点云配准方法、智能设备及存储介质 |
CN117824667A (zh) * | 2024-03-06 | 2024-04-05 | 成都睿芯行科技有限公司 | 一种基于二维码和激光的融合定位方法及介质 |
CN117824667B (zh) * | 2024-03-06 | 2024-05-10 | 成都睿芯行科技有限公司 | 一种基于二维码和激光的融合定位方法及介质 |
CN118365848A (zh) * | 2024-04-11 | 2024-07-19 | 北京化工大学 | 一种动态场景的智能车定位方法及装置 |
CN118365848B (zh) * | 2024-04-11 | 2024-10-15 | 北京化工大学 | 一种动态场景的智能车定位方法及装置 |
CN118154676A (zh) * | 2024-05-09 | 2024-06-07 | 北京理工大学前沿技术研究院 | 一种基于激光雷达的场景定位方法和系统 |
Also Published As
Publication number | Publication date |
---|---|
CN113050116B (zh) | 2024-02-27 |
CN113050116A (zh) | 2021-06-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2022183785A1 (fr) | Robot positioning method and apparatus, robot, and readable storage medium | |
KR102531197B1 (ko) | 무인 비행체의 최적 경로 생성 방법 및 장치 | |
Troiani et al. | 2-point-based outlier rejection for camera-imu systems with applications to micro aerial vehicles | |
WO2022017131A1 (fr) | Dispositif et procédé de traitement de données en nuage de points, et dispositif et procédé de commande d'entraînement intelligent | |
CN108332758B (zh) | 一种移动机器人的走廊识别方法及装置 | |
CN106599108A (zh) | 一种三维环境中多模态环境地图构建方法 | |
CN106548486A (zh) | 一种基于稀疏视觉特征地图的无人车位置跟踪方法 | |
Strader et al. | Perception‐aware autonomous mast motion planning for planetary exploration rovers | |
CN111273312B (zh) | 一种智能车辆定位与回环检测方法 | |
WO2023273169A1 (fr) | Vision- and laser-fused 2.5D map construction method | |
Gurfil et al. | Partial aircraft state estimation from visual motion using the subspace constraints approach | |
CN110986956A (zh) | 一种基于改进的蒙特卡洛算法的自主学习全局定位方法 | |
CN112444246B (zh) | 高精度的数字孪生场景中的激光融合定位方法 | |
CN112985391B (zh) | 一种基于惯性和双目视觉的多无人机协同导航方法和装置 | |
Van Dalen et al. | Absolute localization using image alignment and particle filtering | |
CN118230231B (zh) | 一种无人车的位姿构建方法、装置、电子设备及存储介质 | |
CN111457923B (zh) | 路径规划方法、装置及存储介质 | |
CN111510704A (zh) | 校正摄像头错排的方法及利用其的装置 | |
CN111045433B (zh) | 一种机器人的避障方法、机器人及计算机可读存储介质 | |
CN116681733A (zh) | 一种空间非合作目标近距离实时位姿跟踪方法 | |
Lu et al. | Shortest paths through 3-dimensional cluttered environments | |
CN113954080B (zh) | 机器人的转向行走轨迹规划方法、装置、设备及介质 | |
CN114115231B (zh) | 移动机器人空间位姿点云校正方法及系统 | |
CN111283730B (zh) | 基于点线特征机器人初始位姿获取方法及开机自定位方法 | |
Kuo et al. | A hybrid approach to RBPF based SLAM with grid mapping enhanced by line matching |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21928857; Country of ref document: EP; Kind code of ref document: A1
 | NENP | Non-entry into the national phase | Ref country code: DE
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 21928857; Country of ref document: EP; Kind code of ref document: A1