WO2022252482A1 - Robot, and environment map construction method and apparatus therefor - Google Patents

Robot, and environment map construction method and apparatus therefor

Info

Publication number
WO2022252482A1
WO2022252482A1 PCT/CN2021/126706 CN2021126706W
Authority
WO
WIPO (PCT)
Prior art keywords
data frame
sensing data
absolute position
robot
library
Prior art date
Application number
PCT/CN2021/126706
Other languages
English (en)
Chinese (zh)
Inventor
汤煜
熊友军
Original Assignee
深圳市优必选科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市优必选科技股份有限公司
Publication of WO2022252482A1

Links

Images

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 - Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24 - Querying
    • G06F16/245 - Query processing
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 - Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29 - Geographical information databases

Definitions

  • The present application relates to the field of robots, and in particular to a robot and to a method and apparatus for constructing its environment map.
  • Before a robot performs a task, it usually needs to build a map of the scene where it is located. The map constructed by the robot improves positioning accuracy, so that the paths planned by the robot are more reasonable and navigation is safer.
  • When the robot is building a map, it usually recursively deduces the poses of subsequent key frames from the motion estimation of the robot and the initial pose of the first key frame. Because errors accumulate, the pose errors of subsequent key frames grow larger and larger, which is not conducive to accurate mapping by the robot.
  • In order to reduce this cumulative error, after the robot acquires a new key frame, it performs similarity detection between the acquired key frame and previous key frames to determine whether the current position of the robot coincides with a previous position, that is, it detects loop closure points through loop closure detection.
  • The robot can then correct the cumulative error according to the loop closure detection result to improve the accuracy of the map.
  • When the robot compares and matches the newly acquired key frame with previous key frames, a global matching method requires a large amount of calculation and a long matching time, while local matching carries a probability that the match will not succeed. If the scene where the robot is located contains repetitive content, such as similar rooms or warehouses, wrong loop closures are easily generated, which is not conducive to accurate and effective map construction.
  • The embodiments of the present application provide a robot and a method and apparatus for constructing its environment map, to solve the problems in the prior art that, when the environment map is constructed, the matching efficiency of loop closure detection is low, loop closure detection is prone to errors, or loop closure points cannot be detected.
  • the first aspect of the embodiments of the present application provides a method for constructing a robot environment map, the method comprising:
  • acquiring a sensing data frame during the movement of the robot, obtaining the absolute position corresponding to the sensing data frame, and generating a sensing data frame library according to the acquired sensing data frames and absolute positions;
  • searching, according to the absolute position corresponding to the sensing data frame currently acquired by the robot and a preset distance threshold, the sensing data frame library for an absolute position matching the current absolute position and for the sensing data frame corresponding to the matched absolute position; and
  • matching the searched sensing data frame with the currently acquired sensing data frame, and performing loop closure optimization and environment map construction according to the matching result.
  • the sensor data frame is a key frame.
  • acquiring the sensing data frame during the movement of the robot includes one or more of the following methods:
  • key frames during the movement of the robot are determined.
  • In a third possible implementation of the first aspect, searching, according to the absolute position corresponding to the sensing data frame currently acquired by the robot and the preset distance threshold, the sensing data frame library for the absolute position matching the current absolute position and for the sensing data frame corresponding to the matched absolute position includes:
  • the sensing data frame corresponding to the determined absolute position is searched.
  • matching the determined sensing data frame with the currently acquired sensing data frame, and performing loop closure optimization and environment map construction according to the matching result, including:
  • optimizing the trajectory of the robot based on the selected sensing data frame and absolute position;
  • constructing the environment map according to the optimized trajectory.
  • obtaining the absolute position corresponding to the sensing data frame includes:
  • determining the absolute position corresponding to the robot at the time the sensing data frame is acquired.
  • When searching the sensing data frame library for an absolute position matching the current absolute position, if no absolute position matching the current absolute position is found in the sensing data frame library, the currently acquired sensing data frame and its absolute position are added to the sensing data frame library.
  • the second aspect of the embodiment of the present application provides a robot environment map construction device, the device comprising:
  • the data acquisition unit is used to acquire the sensing data frame during the movement of the robot, and acquire the absolute position corresponding to the sensing data frame, and generate the sensing data frame library according to the acquired sensing data frame and absolute position;
  • a data search unit, used to search, according to the absolute position corresponding to the sensing data frame currently acquired by the robot and the preset distance threshold, the sensing data frame library for the absolute position matching the current absolute position and for the sensing data frame corresponding to the matched absolute position;
  • a sensing data frame matching unit, configured to match the searched sensing data frame with the currently acquired sensing data frame, and to perform loop closure optimization and environment map construction according to the matching result.
  • the third aspect of the embodiments of the present application provides a robot, including a memory, a processor, and a computer program stored in the memory and operable on the processor.
  • When the processor executes the computer program, the steps of the method according to any one of the implementations of the first aspect are realized.
  • A fourth aspect of the embodiments of the present application provides a computer-readable storage medium storing a computer program, wherein, when the computer program is executed by a processor, the steps of the method according to any one of the implementations of the first aspect are implemented.
  • The embodiments of the present application have the beneficial effect that, during the movement of the robot, the sensing data frame currently acquired by the robot and its corresponding absolute position are obtained; according to that absolute position, a matching absolute position is found in the pre-acquired sensing data frame library; the sensing data frame corresponding to the found absolute position is matched with the currently acquired sensing data frame; and loop closure optimization and environment map construction are performed according to the matching result.
  • Since the sensing data frames are screened by absolute position, the problem of detecting a wrong loop closure point in similar scenes can be avoided, and the probability that matching cannot be completed is reduced, which helps to reduce the amount of matching computation for sensing data frames and improves the construction efficiency of the environment map.
  • FIG. 1 is a schematic diagram of a robot loopback detection in the prior art provided by an embodiment of the present application
  • Fig. 2 is a schematic diagram of the implementation flow of a robot environment map construction method provided by the embodiment of the present application;
  • Fig. 3 is a schematic diagram of a position correction provided by the embodiment of the present application.
  • Fig. 4 is a schematic diagram of the moving trajectory of an adjusted robot provided in the embodiment of the present application.
  • FIG. 5 is a schematic diagram of an environment map construction device for a robot provided in an embodiment of the present application.
  • Fig. 6 is a schematic diagram of a robot provided by an embodiment of the present application.
  • Before the robot performs a task, it needs to construct a map of the scene it is currently in, so as to plan the trajectory or route of the task according to the constructed scene map.
  • When a robot constructs a scene map, it usually builds a continuous and consistent map of the surrounding environment based on its motion estimation combined with the collected scene data.
  • When obtaining the motion estimation, the robot usually fixes the initial position of the first laser frame and combines the motion information of the robot to determine the pose of each subsequent laser frame.
  • When an error occurs in one of these pose estimates, it is carried into the subsequent pose calculation results, which is not conducive to building a globally consistent trajectory and map.
  • the robot can optimize the pose of the robot by means of loop closure detection. For example, the robot compares and matches the currently collected laser frame with the previously collected laser frame, and determines the loopback point of the robot's motion trajectory according to the matched laser frame. The robot adjusts the trajectory of the robot according to the detected loop points, so as to achieve the purpose of reducing the pose estimation error generated by motion estimation.
  • In loop closure detection, if the scene being mapped is large, the robot acquires many laser frames. If global matching is used, that is, the current laser frame is matched against every previously stored laser frame, the amount of calculation is large and the time consumed is long. If local matching is used, the cumulative error must be kept within a small range for the matching to succeed.
  • In the prior-art example of FIG. 1, the robot starts from point O; the solid-line trajectory in the figure is obtained from motion estimation, and the dotted line is the actual trajectory of the robot.
  • When the robot is at position O', it searches, by local matching, for laser frames that can be matched within a predetermined range of the estimated position. Because the matching degree between the searched laser frames and the laser frame at position O' is very low, the robot may fail to match successfully.
  • The embodiments of the present application therefore propose a robot environment map construction method that adds absolute position information during the movement of the robot, establishing the absolute position corresponding to each sensing data frame acquired by the robot.
  • Positions that can be matched are first screened out by matching against the absolute position, and the loop closure detection result is then determined by matching the sensing data frames, thereby improving the accuracy of loop closure detection and of the map construction. Because of the absolute position screening, the mismatching problem of similar scenes can be avoided, which also helps to improve matching efficiency.
  • FIG. 2 is a schematic diagram of the implementation process of the robot environment map construction method proposed in the embodiments of the present application, which includes:
  • the sensing data frame during the moving process of the robot is acquired, and the absolute position corresponding to the sensing data frame is acquired, and a sensing data frame library is generated according to the acquired sensing data frame and absolute position.
  • the sensing data frame in the embodiment of the present application may include information such as scene images, distances between obstacles in the scene and the robot, and poses of the robot.
  • the scene image may be collected by a camera, or may be a laser image collected by a laser radar.
  • the distance between the obstacle in the scene and the robot can be detected by laser radar, or the distance between the obstacle in the scene where the robot is located and the robot can be collected by a binocular camera.
  • The pose information of the robot can be determined by the robot's odometer and IMU (Inertial Measurement Unit).
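  • As a rough, illustrative sketch (not taken from the patent text) of how odometer and IMU readings between two frames can be turned into pose information, a planar dead-reckoning update might look as follows; the state layout (x, y, heading) and the function name propagate_pose are assumptions made for illustration.

      import math

      def propagate_pose(x, y, theta, distance, delta_theta):
          """Planar dead reckoning: advance the pose by the distance reported by the
          odometer and the heading change reported by the IMU over the same interval.
          Errors in each step accumulate, which is why loop closure is needed later."""
          theta_mid = theta + 0.5 * delta_theta      # integrate along the mean heading
          x_new = x + distance * math.cos(theta_mid)
          y_new = y + distance * math.sin(theta_mid)
          return x_new, y_new, theta + delta_theta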
  • the absolute position can be determined by the robot in combination with an auxiliary positioning device.
  • a positioning base station may be set in the scene where the robot is located, and a detection signal is sent through the positioning base station.
  • the robot determines the distance between the robot and the positioning base station according to the received positioning signal.
  • The absolute position of the robot is determined according to the two, three, or more determined distances and the preset positions of the positioning base stations.
  • The positioning base station may be a UWB (Ultra Wideband) base station.
  • the positioning base station may also be a Bluetooth base station, such as iBeacon positioning based on Bluetooth 4.0, or Wifi positioning based on a Wifi base station.
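  • As an illustration of how such an absolute position might be computed, the following is a minimal sketch, assuming two-dimensional positioning from measured ranges to base stations at known, preset positions; the function name trilaterate_2d and the linear least-squares formulation are assumptions for illustration and are not taken from the patent text.

      import numpy as np

      def trilaterate_2d(anchors, ranges):
          """anchors: (N, 2) preset base-station positions; ranges: (N,) measured distances.
          Subtracting the first range equation from the others removes the quadratic
          terms in (x, y), leaving a linear system solved by least squares."""
          anchors = np.asarray(anchors, dtype=float)
          ranges = np.asarray(ranges, dtype=float)
          x0, y0 = anchors[0]
          A = 2.0 * (anchors[1:] - anchors[0])
          b = (ranges[0] ** 2 - ranges[1:] ** 2
               + np.sum(anchors[1:] ** 2, axis=1) - (x0 ** 2 + y0 ** 2))
          position, *_ = np.linalg.lstsq(A, b, rcond=None)
          return position  # estimated absolute position (x, y)

      # Example with three UWB anchors and range readings (values are made up).
      print(trilaterate_2d([(0.0, 0.0), (10.0, 0.0), (0.0, 8.0)], [5.0, 7.1, 6.4]))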
  • the sensing data frame obtained from the starting position of the robot can be added to the sensing data frame library.
  • During the movement of the robot, the sensing data frame library is searched for an absolute position that matches the currently acquired absolute position. If no matching absolute position can be found, or the sensing data frame corresponding to the found absolute position cannot be matched with the current sensing data frame, the currently acquired sensing data frame and its corresponding absolute position are added to the sensing data frame library. That is, the data in the sensing data frame library is updated through this comparison and matching as the robot moves.
  • the sensing data frame may be a key frame.
  • the key frame is a representative data frame selected from common data frames.
  • the keyframes may be determined according to repetition information between keyframes.
  • the first key frame can be determined according to the starting position.
  • A subsequent key frame can be determined by computing the similarity between the current data frame and the adjacent previous key frame. For example, it may be determined to generate a new key frame when the similarity is greater than a predetermined key frame similarity constant.
  • whether to generate a new keyframe can be determined based on the distance the robot moves.
  • the first keyframe can be determined according to the starting position.
  • For the next key frame, whether to generate a new key frame may be determined according to the distance the robot has moved relative to the position corresponding to the previous key frame. For example, when the moving distance is greater than a predetermined key frame distance constant, a new key frame is generated.
  • whether to generate a new key frame may be determined according to the rotation angle of the robot.
  • the first keyframe can be determined according to the starting position.
  • whether to generate a new key frame can be determined according to whether the rotation angle of the robot is greater than a predetermined key frame angle constant.
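  • The three key frame criteria described above (similarity to the previous key frame, moving distance, and rotation angle) could be combined as in the following minimal sketch; the threshold values and the function name is_new_keyframe are illustrative assumptions rather than values given by the patent.

      def is_new_keyframe(similarity, moved_distance, rotated_angle,
                          sim_const=0.3, dist_const=0.5, angle_const=0.35):
          """Decide whether the current data frame becomes a new key frame.
          similarity: similarity measure against the adjacent previous key frame
          (the patent generates a new key frame when it exceeds a predetermined
          constant; the comparison direction depends on how similarity is defined);
          moved_distance: meters moved since the previous key frame;
          rotated_angle: radians rotated since the previous key frame."""
          return (similarity > sim_const
                  or moved_distance > dist_const
                  or rotated_angle > angle_const)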
  • Selecting key frames from the sensing data, such as image data, distance data, and trajectory data, can effectively reduce the amount of sensing data collected and compared, and improve the construction efficiency of the scene map.
  • According to the absolute position corresponding to the sensing data frame currently acquired by the robot and the preset distance threshold, the sensing data frame library is searched for an absolute position matching the current absolute position and for the sensing data frame corresponding to the matched absolute position.
  • Suppose n-1 sensing data frames are stored in the sensing data frame library (the value of n changes as the sensing data frame library is updated).
  • the absolute position corresponding to the nth sensing data frame currently collected is obtained, for example, it may be expressed as pn(xn, yn).
  • preliminary screening can be performed according to the matching degree of the absolute position.
  • The absolute position pn(xn, yn) corresponding to the currently acquired sensing data frame can be compared for matching against any absolute position pk(xk, yk) in the sensing data frame library, where k < n.
  • The distance T between the absolute position pn and any absolute position pk in the sensing data frame library can be calculated as T = √((xn - xk)² + (yn - yk)²).
  • The absolute positions whose distance T satisfies the preset distance threshold can be selected, and, according to the correspondence between absolute positions and sensing data frames in the sensing data frame library, the sensing data frames corresponding to the selected absolute positions are found.
  • In this way, the sensing data frame corresponding to the matched absolute position is obtained.
  • In the schematic diagram of position correction shown in Fig. 3, the absolute positioning device of the robot determines that the absolute position of the robot is O'(P'). According to the preset distance threshold R, the search range corresponding to the dotted circle in Fig. 3 is obtained, and the absolute positions in the sensing data frame library that fall within this search range are determined. For example, it may be determined that the absolute position O is within the search range, and the sensing data frame corresponding to the absolute position O is then found.
  • multiple absolute positions can be found within the determined range. According to the found multiple absolute positions, multiple corresponding sensing data frames are determined.
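  • A minimal sketch of the absolute-position screening described above is given below; it assumes the sensing data frame library is simply a list of (absolute position, sensing data frame) pairs, and the names FrameLibrary and find_within are illustrative, not taken from the patent.

      import math

      class FrameLibrary:
          """Illustrative sensing data frame library of (absolute position, frame) pairs."""
          def __init__(self):
              self.entries = []  # list of ((x, y), sensing data frame)

          def add(self, position, frame):
              self.entries.append((position, frame))

          def find_within(self, current_position, distance_threshold):
              """Return stored entries whose absolute position lies within the preset
              distance threshold R of the current absolute position pn."""
              cx, cy = current_position
              matches = []
              for (x, y), frame in self.entries:
                  t = math.hypot(cx - x, cy - y)  # distance T between pn and pk
                  if t <= distance_threshold:
                      matches.append(((x, y), frame))
              return matches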
  • the searched sensing data frame is matched with the currently acquired sensing data frame, and loop-closing optimization and environment map construction are performed according to the matching result.
  • The one or more searched sensing data frames can be matched with the currently acquired sensing data frame, that is, the similarity between each searched sensing data frame and the currently acquired sensing data frame is calculated. If the similarity is greater than a predetermined similarity threshold, that sensing data frame and its absolute position are selected to optimize the moving trajectory of the robot. For example, the loop closure point of the robot can be determined according to the absolute position, and the attitude information of the robot at the loop closure point can be determined according to the sensing data frame. According to the determined loop closure point and attitude information, the moving trajectory of the robot is adjusted, and the attitude change information of the robot during the movement is adjusted. An environment map is then constructed from the adjusted trajectory and the adjusted attitude change information.
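  • Building on the sketch above, the frame-level matching and the library update described in this step could look as follows; the frame_similarity function and the similarity threshold are placeholders (assumptions), since the patent does not fix a particular frame-matching algorithm here.

      def detect_loop_closure(library, current_position, current_frame,
                              distance_threshold, similarity_threshold,
                              frame_similarity):
          """Screen candidates by absolute position, then confirm by frame matching.
          frame_similarity(a, b) -> float is a placeholder for whatever scan or image
          matching the robot uses. Returns the matched (position, frame) when a loop
          closure is found; otherwise adds the current frame to the library and
          returns None."""
          best, best_score = None, similarity_threshold
          for position, frame in library.find_within(current_position, distance_threshold):
              score = frame_similarity(frame, current_frame)
              if score >= best_score:
                  best, best_score = (position, frame), score
          if best is None:
              # No matching absolute position, or no frame match: update the library.
              library.add(current_position, current_frame)
          return best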
  • The absolute position O'(P') of the robot is determined according to the UWB base station set in the scene where the robot is located. The detected current absolute position O'(P') and the absolute position O in the robot's moving trajectory form the loop closure point; the attitude of the robot's moving trajectory is adjusted according to this loop closure point, and the adjusted moving trajectory shown in FIG. 4 is obtained.
  • In this way, the environment map of the scene where the robot is located can be determined more accurately.
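  • Once the loop closure point is known, one simple way to adjust the trajectory is to spread the detected end-point error back over the poses recorded since the loop closure point; this linear error distribution is only an illustrative assumption and is not necessarily the optimization used by the patent (graph-based optimizers are common in practice).

      import numpy as np

      def correct_trajectory(trajectory, loop_index, loop_closure_position):
          """trajectory: (N, 2) estimated robot positions; loop_index: index of the earlier
          pose recognised as the loop closure point; loop_closure_position: the absolute
          position the current (last) pose should coincide with. The residual error is
          distributed linearly over the poses after the loop closure point."""
          trajectory = np.asarray(trajectory, dtype=float).copy()
          error = np.asarray(loop_closure_position, dtype=float) - trajectory[-1]
          n = len(trajectory) - loop_index - 1
          if n <= 0:
              return trajectory
          for i, idx in enumerate(range(loop_index + 1, len(trajectory)), start=1):
              trajectory[idx] += error * (i / n)
          return trajectory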
  • Because the present application uses absolute-distance screening, it can reduce the amount of matching computation for sensing data frames while avoiding false detection of loop closure points in similar scenes and reducing the probability of matching failure in local matching.
  • The present application can also correct the robot's moving trajectory according to the absolute position detected by the robot, combined with the robot's moving trajectory, so as to further improve the accuracy of the robot's map construction.
  • FIG. 5 is a schematic diagram of an apparatus for constructing an environment map for a robot provided in an embodiment of the present application, and the apparatus corresponds to the method for constructing an environment map for a robot in FIG. 2 .
  • the device includes:
  • the data acquisition unit 501 is configured to acquire the sensing data frame during the movement of the robot, and acquire the absolute position corresponding to the sensing data frame, and generate a sensing data frame library according to the acquired sensing data frame and absolute position;
  • The data search unit 502 is configured to search, according to the absolute position corresponding to the sensing data frame currently acquired by the robot and the preset distance threshold, the sensing data frame library for the absolute position matching the current absolute position and for the sensing data frame corresponding to the matched absolute position.
  • The sensing data frame matching unit 503 is configured to match the searched sensing data frame with the currently acquired sensing data frame, and to perform loop closure optimization and environment map construction according to the matching result.
  • the sensing data frame may be a key frame.
  • key frames during the movement of the robot are determined.
  • the data acquisition unit may be used for:
  • the sensing data frame corresponding to the determined absolute position is searched.
  • the sensing data frame matching unit may be used for:
  • optimizing the trajectory of the robot based on the selected sensing data frame and absolute position;
  • constructing the environment map according to the optimized trajectory.
  • The sensing data frame library can be updated in real time: when searching the sensing data frame library, according to the absolute position corresponding to the sensing data frame currently acquired by the robot and the preset distance threshold, for an absolute position matching the current absolute position, if no matching absolute position is found, the currently acquired sensing data frame and its absolute position are added to the sensing data frame library.
  • the absolute position in the embodiment of the present application may be determined according to a positioning signal sent by a preset UWB base station.
  • Fig. 6 is a schematic diagram of a robot provided by an embodiment of the present application.
  • the robot 6 of this embodiment includes: a processor 60, a memory 61, and a computer program 62 stored in the memory 61 and operable on the processor 60, such as the environment map construction program of the robot .
  • When the processor 60 executes the computer program 62, the steps in the above embodiments of the robot environment map construction method are realized.
  • Alternatively, when the processor 60 executes the computer program 62, the functions of the modules/units in the above-mentioned device embodiments are implemented.
  • the computer program 62 can be divided into one or more modules/units, and the one or more modules/units are stored in the memory 61 and executed by the processor 60 to complete this application.
  • the one or more modules/units may be a series of computer program instruction segments capable of accomplishing specific functions, and the instruction segments are used to describe the execution process of the computer program 62 in the robot 6 .
  • The robot may include, but is not limited to, a processor 60 and a memory 61.
  • Fig. 6 is only an example of the robot 6 and does not constitute a limitation on the robot 6; the robot may include more or fewer components than shown, or combine certain components, or have different components. For example, the robot may also include input and output devices, network access devices, buses, and the like.
  • the so-called processor 60 can be a central processing unit (Central Processing Unit, CPU), and can also be other general-purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), Field-Programmable Gate Array (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, etc.
  • a general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
  • The memory 61 may be an internal storage unit of the robot 6, such as a hard disk or memory of the robot 6. The memory 61 may also be an external storage device of the robot 6, for example a plug-in hard disk equipped on the robot 6, a smart media card (SMC), a Secure Digital (SD) card, a flash card, etc. Further, the memory 61 may also include both an internal storage unit of the robot 6 and an external storage device. The memory 61 is used to store the computer program and other programs and data required by the robot. The memory 61 can also be used to temporarily store data that has been output or will be output.
  • the disclosed apparatus/terminal device and method may be implemented in other ways.
  • the device/terminal device embodiments described above are only illustrative.
  • the division of the modules or units is only a logical function division.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place, or may be distributed to multiple network units. Part or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units can be implemented in the form of hardware or in the form of software functional units.
  • If the integrated module/unit is realized in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium. Based on this understanding, all or part of the processes in the methods of the above embodiments of this application can also be implemented by instructing relevant hardware through a computer program.
  • the computer program can be stored in a computer-readable storage medium.
  • the computer program When executed by a processor, the steps in the above-mentioned various method embodiments can be realized.
  • the computer program includes computer program code, and the computer program code may be in the form of source code, object code, executable file or some intermediate form.
  • The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased according to the requirements of legislation and patent practice in the jurisdiction. For example, in some jurisdictions, according to legislation and patent practice, computer-readable media exclude electrical carrier signals and telecommunication signals.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Linguistics (AREA)
  • Remote Sensing (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

Robot, and environment map construction method and apparatus therefor. The method comprises: acquiring a sensing data frame during the movement of a robot, together with an absolute position, and generating a sensing data frame library; according to an absolute position corresponding to a sensing data frame currently acquired by the robot and a preset distance threshold, searching the sensing data frame library for an absolute position matching the current absolute position and for the sensing data frame corresponding to the matched absolute position; and matching the found sensing data frame with the currently acquired sensing data frame, and performing loop closure optimization and environment map construction according to the matching result. Because the sensing data frames are screened by absolute position, the problem of detecting a wrong loop closure point in a similar scene can be avoided; and because the sensing data frames are screened by absolute position, the probability of failing to complete the matching can be reduced, which helps to reduce the amount of matching computation for the sensing data frames and improves the efficiency of environment map construction.
PCT/CN2021/126706 2021-05-31 2021-10-27 Robot, and environment map construction method and apparatus therefor WO2022252482A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110598469.XA CN113297259A (zh) 2021-05-31 2021-05-31 Robot and environment map construction method and apparatus therefor
CN202110598469.X 2021-05-31

Publications (1)

Publication Number Publication Date
WO2022252482A1 true WO2022252482A1 (fr) 2022-12-08

Family

ID=77326174

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/126706 WO2022252482A1 (fr) 2021-05-31 2021-10-27 Robot, and environment map construction method and apparatus therefor

Country Status (2)

Country Link
CN (1) CN113297259A (fr)
WO (1) WO2022252482A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113297259A (zh) * 2021-05-31 2021-08-24 深圳市优必选科技股份有限公司 Robot and environment map construction method and apparatus therefor

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104390643A (zh) * 2014-11-24 2015-03-04 上海美琦浦悦通讯科技有限公司 Method for realizing indoor positioning based on multi-information fusion
US20180356492A1 * 2015-06-16 2018-12-13 Michael Hamilton Vision based location estimation system
CN109141393A (zh) * 2018-07-02 2019-01-04 北京百度网讯科技有限公司 Relocalization method, device, and storage medium
CN109141442A (zh) * 2018-09-07 2019-01-04 高子庆 Navigation method and mobile terminal based on UWB positioning and image feature matching
CN109974701A (zh) * 2017-12-28 2019-07-05 深圳市优必选科技有限公司 Robot positioning method and apparatus
CN110554396A (zh) * 2019-10-21 2019-12-10 深圳市元征科技股份有限公司 Lidar mapping method, apparatus, device, and medium for indoor scenes
CN110727265A (zh) * 2018-06-28 2020-01-24 深圳市优必选科技有限公司 Robot relocalization method, apparatus, and storage device
CN111881233A (zh) * 2020-06-28 2020-11-03 广州文远知行科技有限公司 Distributed point cloud map construction method and apparatus, server, and computer-readable storage medium
CN113297259A (zh) * 2021-05-31 2021-08-24 深圳市优必选科技股份有限公司 Robot and environment map construction method and apparatus therefor

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109813319B (zh) * 2019-03-07 2021-09-28 山东大学 Open-loop optimization method and system based on SLAM mapping
CN110533587B (zh) * 2019-07-03 2023-06-13 浙江工业大学 SLAM method based on visual prior information and map recovery
CN111145634B (zh) * 2019-12-31 2022-02-22 深圳市优必选科技股份有限公司 Map correction method and apparatus
GB2597335A (en) * 2020-07-20 2022-01-26 Navenio Ltd Map matching trajectories


Also Published As

Publication number Publication date
CN113297259A (zh) 2021-08-24

Similar Documents

Publication Publication Date Title
CN110657803B (zh) Robot positioning method, apparatus, and storage device
US10996062B2 Information processing device, data management device, data management system, method, and program
CN112179330B (zh) Pose determination method and apparatus for a mobile device
CN110979346B (zh) Method, apparatus, and device for determining the lane in which a vehicle is located
JP2020067439A (ja) Moving body position estimation system and moving body position estimation method
WO2022121018A1 (fr) Robot, mapping method, and related apparatus
CN113313763B (zh) Neural-network-based monocular camera pose optimization method and apparatus
CN105606102A (zh) Grid-model-based PDR indoor positioning method and system
CN114088081B (zh) Map construction method for precise positioning based on multi-segment joint optimization
CN111383246B (zh) Banner detection method, apparatus, and device
CN111177295A (zh) Method and apparatus for eliminating ghosting in mapping, computer-readable storage medium, and robot
CN115375870B (zh) Loop closure detection optimization method, electronic device, and computer-readable storage device
CN116592897B (zh) Improved ORB-SLAM2 positioning method based on pose uncertainty
WO2022252482A1 (fr) Robot, and environment map construction method and apparatus therefor
CN111368860B (zh) Relocalization method and terminal device
CN114187418A (zh) Loop closure detection method, point cloud map construction method, electronic device, and storage medium
WO2022036981A1 (fr) Robot, map construction method, and related device
CN112923938A (zh) Map optimization method, apparatus, storage medium, and system
CN115908498B (zh) Multi-target tracking method and apparatus based on optimal category matching
CN116563352A (zh) Single-line lidar loop closure detection method and system fusing depth vision information
CN115952248A (zh) Pose processing method, apparatus, device, medium, and product for a terminal device
CN112561956B (zh) Video target tracking method and apparatus, electronic device, and storage medium
CN111814114A (zh) Lane positioning verification method, device, electronic device, vehicle, and storage medium
CN116539026B (zh) Map construction method, apparatus, device, and storage medium
CN112614162B (zh) Indoor visual fast matching and positioning method and system based on a spatial optimization strategy

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21943834

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21943834

Country of ref document: EP

Kind code of ref document: A1