WO2023078112A1 - Method and apparatus for determining robot state, storage medium, and electronic apparatus - Google Patents

Method and apparatus for determining the state of a robot, storage medium, and electronic apparatus

Info

Publication number
WO2023078112A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
cloud data
point cloud
short
point
Prior art date
Application number
PCT/CN2022/127006
Other languages
English (en)
Chinese (zh)
Inventor
孙伟
Original Assignee
追觅创新科技(苏州)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 追觅创新科技(苏州)有限公司
Publication of WO2023078112A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20: Information retrieval of structured data, e.g. relational data
    • G06F 16/29: Geographical information databases
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10028: Range image; Depth image; 3D point clouds
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00: Road transport of goods or passengers
    • Y02T 10/10: Internal combustion engine [ICE] based vehicles
    • Y02T 10/40: Engine management systems

Definitions

  • the present disclosure relates to the communication field, and in particular, to a method and device for determining a state of a robot, a storage medium, and an electronic device.
  • Embodiments of the present disclosure provide a method and device for determining the state of a robot, a storage medium, and an electronic device, so as to at least solve the problem in the related art that an abnormality of a robot cannot be detected in time.
  • A method for determining the state of a robot is provided, including: acquiring a first position and a second position determined by an odometer while the robot is walking, wherein the first position is the position of the robot at a first time point, the second position is the position of the robot at a second time point, and the time difference between the first time point and the second time point is less than a predetermined threshold; optimizing the first position using first point cloud data of the surrounding environment acquired by the robot at the first time point to obtain a first pose, and optimizing the second position using second point cloud data of the surrounding environment acquired by the robot at the second time point to obtain a second pose; determining a first short-term variation of the pose of the robot based on the first pose and the second pose; determining a second short-term variation of the pose of the robot based on the first point cloud data and the second point cloud data; and determining whether the state of the robot is abnormal by comparing the first short-term variation with the second short-term variation.
  • Optimizing the first position using the first point cloud data of the surrounding environment acquired by the robot at the first time point to obtain the first pose includes: determining, based on the first point cloud data, target point cloud data of a predetermined area centered on the first position; and matching the target point cloud data with an area map pre-generated by the robot to obtain the first pose.
  • Determining, based on the first point cloud data, the target point cloud data of a predetermined area centered on the first position includes: extracting the point cloud data of the predetermined area from the first point cloud data; rotating the point cloud data of the predetermined area multiple times according to a predetermined rotation mode, so as to obtain the point cloud data of the predetermined area after each rotation; and determining the point cloud data of the predetermined area after each rotation as the target point cloud data.
  • Rotating the point cloud data of the predetermined area multiple times according to a predetermined rotation mode includes: rotating the point cloud data of the predetermined area multiple times in steps of one degree at a time.
  • Determining the second short-term variation of the pose of the robot based on the first point cloud data and the second point cloud data includes: matching the first point cloud data with the second point cloud data to obtain a matching result; determining rotation information of the second point cloud data relative to the first point cloud data based on the matching result; and determining the second short-term variation based on the rotation information.
  • Determining whether the state of the robot is abnormal includes: comparing the first short-term variation with the second short-term variation to determine the pose increment difference between them; determining that the state of the robot is abnormal when the pose increment difference exceeds a predetermined threshold; and determining that the state of the robot is normal when the pose increment difference does not exceed the predetermined threshold.
  • A device for determining the state of a robot is provided, including: an acquisition module, configured to acquire the first position and the second position determined by the odometer while the robot is walking, wherein the first position is the position of the robot at the first time point, the second position is the position of the robot at the second time point, and the time difference between the first time point and the second time point is less than a predetermined threshold; an optimization module, configured to optimize the first position using the first point cloud data of the surrounding environment acquired by the robot at the first time point to obtain a first pose, and to optimize the second position using the second point cloud data of the surrounding environment acquired by the robot at the second time point to obtain a second pose; a first determination module, configured to determine the first short-term variation of the pose of the robot based on the first pose and the second pose; a second determination module, configured to determine the second short-term variation of the pose of the robot based on the first point cloud data and the second point cloud data; and a third determination module, configured to determine whether the state of the robot is abnormal by comparing the first short-term variation with the second short-term variation.
  • The optimization module includes: a determining unit, configured to determine target point cloud data of a predetermined area centered on the first position based on the first point cloud data; and a matching unit, configured to match the target point cloud data with the area map pre-generated by the robot to obtain the first pose.
  • A computer-readable storage medium is provided, where the computer-readable storage medium stores a program which, when run, executes the method described in any one of the above embodiments.
  • An electronic device is provided, including a memory and a processor, where a computer program is stored in the memory and the processor is configured to execute, through the computer program, the method described in any one of the above embodiments.
  • FIG. 1 is a block diagram of a hardware structure of a method for determining a robot state according to an embodiment of the present disclosure
  • FIG. 2 is a flowchart of a method for determining a state of a robot according to an embodiment of the present disclosure
  • Fig. 3 is a structural block diagram of an apparatus for determining a state of a robot according to an embodiment of the present disclosure.
  • FIG. 1 is a block diagram of a hardware structure of a method for determining a state of a robot according to an embodiment of the present disclosure.
  • The mobile robot may include one or more processors 102 (only one is shown in FIG. 1; the processor 102 may include, but is not limited to, a processing device such as a microprocessor (MCU) or a programmable logic device (FPGA)) and a memory 104 for storing data.
  • the above-mentioned mobile robot may also include a transmission device 106 and an input and output device 108 for communication functions.
  • the structure shown in Figure 1 is only schematic, and it does not limit the structure of the above-mentioned mobile robot.
  • The mobile robot may also include more or fewer components than those shown in FIG. 1, or have a different but functionally equivalent configuration.
  • The memory 104 can be used to store computer programs, for example, software programs and modules of application software, such as the computer program corresponding to the method for determining the state of the robot in the embodiments of the present disclosure. The processor 102 runs the computer program stored in the memory 104, thereby executing various functional applications and data processing, that is, realizing the above-mentioned method.
  • the memory 104 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory.
  • the memory 104 may further include memory located remotely from the processor 102, and these remote memories may be connected to the mobile robot through a network. Examples of the aforementioned networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
  • the transmission device 106 is used to receive or transmit data via a network.
  • a specific example of the above-mentioned network may include a wireless network provided by a mobile robot's communication provider.
  • the transmission device 106 includes a network interface controller (NIC for short), which can be connected to other network devices through a base station so as to communicate with the Internet.
  • the transmission device 106 may be a radio frequency (Radio Frequency, referred to as RF) module, which is used to communicate with the Internet in a wireless manner.
  • the robot relies on the odometer combined with the laser point cloud for joint positioning.
  • When the robot is stuck or slipping, the wheels spin idly, making the odometer inaccurate; this in turn makes the positioning of the machine inaccurate and prevents it from getting out of trouble in time.
  • the robot may need to drive to different areas during its work, and the road types in the path it drives may be diverse.
  • Some small debris (for example, stones, balls of wire, cables, plastic bags, etc.) may become caught in the wheels, which may cause the robot to get stuck or slip.
  • Because the robot drives autonomously, dedicated personnel are rarely present near it. In this case, when the robot fails, the failure cannot be detected in time, which may result in irreversible damage to the robot.
  • an embodiment of the present disclosure proposes a method for determining the state of the robot, which can promptly confirm whether the state of the robot is abnormal.
  • The robots involved in the embodiments of the present disclosure may be sweeping robots, robots with transportation functions (for example, express delivery robots), guide robots in shopping malls, and so on; any robot with automatic driving capability falls within the scope of protection of the present disclosure.
  • a method for determining the state of a robot includes the following steps:
  • S210: Determine whether the state of the robot is abnormal by comparing the first short-term variation with the second short-term variation.
  • Through the above steps, the positioning result that relies on the odometer can be compared with the positioning result that relies entirely on the laser. If the two results deviate greatly, the result given by the odometer is seriously wrong, and the machine may be slipping or stuck; corresponding processing can then be performed in real time.
  • the aforementioned predetermined threshold can be flexibly set, for example, set to 1 second, 2 seconds, 5 seconds, 10 seconds and so on.
  • The odometer reading is derived from the pulse signals of the robot's wheels.
  • The wheels emit pulse signals while walking, and from the pulse counts of the left and right wheels (or other types of wheels) it can be inferred how many turns each wheel has made, so that the trajectory of the robot can be estimated.
  • The odometry is one input used to calculate the pose of the machine; in the above embodiment, the approximate position of the machine's movement can first be inferred from the odometer.
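  • As a concrete illustration of how wheel pulses become a trajectory estimate, the sketch below integrates a differential-drive pose from left/right encoder counts. The tick count, wheel radius, and wheel-base values are hypothetical placeholders, not parameters from this disclosure:

```python
import math

# Hypothetical parameters; actual values depend on the robot's hardware.
TICKS_PER_REV = 1024   # encoder pulses per wheel revolution
WHEEL_RADIUS = 0.035   # metres
WHEEL_BASE = 0.23      # distance between left and right wheels, metres

def odometry_step(x, y, theta, left_ticks, right_ticks):
    """Advance a differential-drive pose estimate from wheel encoder pulses."""
    # Convert pulse counts to the distance travelled by each wheel.
    dl = 2 * math.pi * WHEEL_RADIUS * left_ticks / TICKS_PER_REV
    dr = 2 * math.pi * WHEEL_RADIUS * right_ticks / TICKS_PER_REV
    d = (dl + dr) / 2.0              # distance of the robot centre
    dtheta = (dr - dl) / WHEEL_BASE  # heading change
    # Integrate along the (approximately circular) arc at the mid heading.
    x += d * math.cos(theta + dtheta / 2.0)
    y += d * math.sin(theta + dtheta / 2.0)
    theta += dtheta
    return x, y, theta
```

  • For example, one full revolution of both wheels (equal tick counts) advances the robot straight ahead by one wheel circumference with no heading change.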
  • the execution subject of the above-mentioned operations may be an intelligent robot (for example, a sweeping robot, a delivery robot, a guide robot, etc.), or a processor provided in an intelligent robot, or other devices with similar processing capabilities.
  • Optimizing the first position using the first point cloud data of the surrounding environment acquired by the robot at the first time point to obtain the first pose includes: determining, based on the first point cloud data, target point cloud data of a predetermined area centered on the first position; and matching the target point cloud data with the area map pre-generated by the robot to obtain the first pose.
  • The predetermined area centered on the first position can be a circular area, a square area, a rectangular area, or another type of area; the size of the area can be set flexibly according to the actual scene. For example, a circular area centered on the first position with a radius of 10 centimeters can be taken as the predetermined area; or a square area obtained by extending 5 centimeters up, down, left, and right from the first position; or a rectangular area obtained by extending 10 centimeters up and down and 15 centimeters left and right from the first position.
  • The matching method in this embodiment may be the aforementioned brute-force matching method, or another matching method, for example, selecting a predetermined number of points from the target point cloud data to match with the area map.
  • Determining, based on the first point cloud data, the target point cloud data of a predetermined area centered on the first position includes: extracting the point cloud data of the predetermined area from the first point cloud data; rotating the point cloud data of the predetermined area multiple times according to the predetermined rotation mode, so as to obtain the point cloud data of the predetermined area after each rotation; and determining the point cloud data of the predetermined area after each rotation as the target point cloud data.
  • Rotating the point cloud data in the predetermined area multiple times according to a predetermined rotation method includes rotating it in steps of one degree at a time; of course, other rotation methods may also be used.
  • For example, a step of 5 degrees or 10 degrees per rotation can also be selected, and the angle of each rotation may even vary. The following describes this with a specific example:
  • Suppose the odometer indicates that the robot has walked 30 centimeters at an angle of 0 degrees, and the laser sensor acquires a point cloud of the surrounding environment. Among all the rotated candidates, the rotation whose point cloud best matches the area map corresponds to the most accurate pose; this pose can be taken as the current pose of the robot, completing the optimization.
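  • The brute-force rotation search described above can be sketched as follows. The grid cell size, the scoring-by-cell-overlap scheme, and the candidate angle range are illustrative assumptions; the disclosure does not specify the matching score:

```python
import math

def rotate(points, angle_deg):
    """Rotate 2-D points about the origin by angle_deg degrees."""
    a = math.radians(angle_deg)
    c, s = math.cos(a), math.sin(a)
    return [(c * x - s * y, s * x + c * y) for x, y in points]

def score(points, occupied, cell=0.05):
    """Count how many points fall into occupied grid cells of the area map."""
    return sum((round(x / cell), round(y / cell)) in occupied for x, y in points)

def best_rotation(points, occupied, step_deg=1):
    """Brute-force search over rotation candidates; return (best_angle, best_score)."""
    best = (0, -1)
    for angle in range(0, 360, step_deg):
        s = score(rotate(points, angle), occupied)
        if s > best[1]:
            best = (angle, s)
    return best
```

  • With a one-degree step this evaluates 360 candidates; coarser steps (5 or 10 degrees) trade accuracy for speed, matching the alternatives mentioned above.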
  • Determining the second short-term variation of the pose of the robot based on the first point cloud data and the second point cloud data includes: matching the first point cloud data with the second point cloud data to obtain a matching result; determining rotation information of the second point cloud data relative to the first point cloud data based on the matching result; and determining the second short-term variation based on the rotation information.
  • In this step, instead of relying on the odometer, the Iterative Closest Point (ICP) algorithm is used to determine the pose of the robot relying entirely on laser data.
  • The ICP method infers the change of pose by matching two successive point clouds against each other. Suppose the machine obtains a frame of point cloud after moving 30 centimeters. The point cloud taken before the 30-centimeter move is matched with the point cloud taken after it, such that the overlap between the two point clouds is highest, that is, the distance between the point clouds is minimal. The rotation and translation that bring the previous frame's point cloud onto the next frame's point cloud give the estimated pose change of the robot.
  • This method isolates the influence of the odometer: by comparing successive poses obtained purely by laser matching in real time, the short-term variation of the machine's pose can be obtained.
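  • A minimal 2-D version of the ICP idea, nearest-neighbour matching followed by a closed-form rigid alignment, can be sketched as below. Production implementations add k-d trees, outlier rejection, and convergence tests; this is only an illustration under those simplifying assumptions:

```python
import math

def icp_2d(prev_cloud, curr_cloud, iters=20):
    """Minimal 2-D ICP sketch: estimate the rotation theta and translation
    (tx, ty) that align curr_cloud onto prev_cloud."""
    theta, tx, ty = 0.0, 0.0, 0.0
    src = list(curr_cloud)
    for _ in range(iters):
        # 1. Nearest-neighbour correspondences (brute force).
        pairs = []
        for p in src:
            q = min(prev_cloud, key=lambda r: (r[0] - p[0]) ** 2 + (r[1] - p[1]) ** 2)
            pairs.append((p, q))
        # 2. Closed-form 2-D rigid alignment of the matched pairs.
        n = len(pairs)
        mpx = sum(p[0] for p, _ in pairs) / n
        mpy = sum(p[1] for p, _ in pairs) / n
        mqx = sum(q[0] for _, q in pairs) / n
        mqy = sum(q[1] for _, q in pairs) / n
        sxx = sum((p[0] - mpx) * (q[0] - mqx) + (p[1] - mpy) * (q[1] - mqy) for p, q in pairs)
        sxy = sum((p[0] - mpx) * (q[1] - mqy) - (p[1] - mpy) * (q[0] - mqx) for p, q in pairs)
        dth = math.atan2(sxy, sxx)
        c, s = math.cos(dth), math.sin(dth)
        dtx = mqx - (c * mpx - s * mpy)
        dty = mqy - (s * mpx + c * mpy)
        # 3. Apply the increment to the source cloud and accumulate the pose.
        src = [(c * x - s * y + dtx, s * x + c * y + dty) for x, y in src]
        theta += dth
        tx, ty = c * tx - s * ty + dtx, s * tx + c * ty + dty
    return theta, tx, ty
```

  • The returned (theta, tx, ty) is exactly the short-term pose variation obtained without any odometer input.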
  • Determining whether the state of the robot is abnormal includes: comparing the first short-term variation with the second short-term variation to determine the pose increment difference between them; determining that the state of the robot is abnormal when the pose increment difference exceeds a predetermined threshold; and determining that the state of the robot is normal when the pose increment difference does not exceed the predetermined threshold.
  • Comparing the difference between the two short-term variations is actually comparing the pose increment difference between them.
  • ICP tracks one pose change, and the brute-force search tracks another; each yields the amount of pose change within a certain period, for example within one second.
  • The pose changes obtained by the two methods within one second should not differ greatly. If the difference is greater than the preset value, there is a problem with the odometer or the point cloud data: the prior given by the odometer is inaccurate, and the wheels are slipping or jammed. At this time, appropriate measures can be taken to handle the abnormal situation.
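  • The consistency check between the two short-term variations could look like the following sketch. The distance and angle thresholds are assumed tuning values, not numbers from this disclosure:

```python
import math

# Assumed tuning values for the consistency check.
DIST_THRESHOLD = 0.05               # metres
ANGLE_THRESHOLD = math.radians(5)   # radians

def pose_delta(p1, p2):
    """Increment between two poses (x, y, theta), with the angle wrapped."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    dth = math.atan2(math.sin(p2[2] - p1[2]), math.cos(p2[2] - p1[2]))
    return dx, dy, dth

def is_abnormal(odom_delta, icp_delta):
    """True if the odometer-based and laser-only pose increments disagree
    beyond the thresholds, suggesting wheel slip or jamming."""
    ddx = odom_delta[0] - icp_delta[0]
    ddy = odom_delta[1] - icp_delta[1]
    ddth = abs(math.atan2(math.sin(odom_delta[2] - icp_delta[2]),
                          math.cos(odom_delta[2] - icp_delta[2])))
    return math.hypot(ddx, ddy) > DIST_THRESHOLD or ddth > ANGLE_THRESHOLD
```

  • For instance, if the odometer reports a 30-centimeter advance while laser ICP reports almost no motion, the check flags an abnormal state, which is exactly the wheel-slip scenario described above.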
  • the robot can create a local map.
  • When the state is normal, the local map is used to perform a small-scale relocation on the original map to calibrate the pose.
  • The method according to the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and of course also by hardware, but in many cases the former is the better implementation.
  • The technical solution of the present disclosure, in essence or in the part that contributes over the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc) and contains several instructions enabling a terminal device (which may be a mobile phone, a computer, a server, a network device, etc.) to execute the methods described in the various embodiments of the present disclosure.
  • A device for determining the state of a robot is also provided. The device is used to implement the above embodiments and preferred implementation modes; what has already been explained will not be repeated here.
  • the term "module” may be a combination of software and/or hardware that realizes a predetermined function.
  • The devices described in the following embodiments are preferably implemented in software, although implementations in hardware, or in a combination of software and hardware, are also possible and contemplated.
  • Fig. 3 is a structural block diagram of a device for determining the state of a robot according to an embodiment of the present disclosure. As shown in Fig. 3 , the device includes:
  • An acquisition module 32, configured to acquire a first position and a second position determined by the odometer while the robot is walking, wherein the first position is the position of the robot at a first time point, the second position is the position of the robot at a second time point, and the time difference between the first time point and the second time point is less than a predetermined threshold;
  • An optimization module 34, configured to optimize the first position using the first point cloud data of the surrounding environment acquired by the robot at the first time point to obtain a first pose, and to optimize the second position using the second point cloud data of the surrounding environment acquired by the robot at the second time point to obtain a second pose;
  • a first determination module 36 configured to determine a first short-term change in the posture of the robot based on the first posture and the second posture;
  • a second determining module 38 configured to determine a second short-term change in the attitude of the robot based on the first point cloud data and the second point cloud data;
  • the third determining module 310 is configured to determine whether the state of the robot is abnormal by comparing the first short-term variation with the second short-term variation.
  • The optimization module 34 includes: a determination unit, configured to determine target point cloud data of a predetermined area centered on the first position based on the first point cloud data; and a matching unit, configured to match the target point cloud data with the area map pre-generated by the robot to obtain the first pose.
  • The determining unit is configured to determine the target point cloud data of a predetermined area centered on the first position based on the first point cloud data in the following manner: extracting the point cloud data of the predetermined area from the first point cloud data; rotating the point cloud data of the predetermined area multiple times according to the predetermined rotation mode to obtain the point cloud data of the predetermined area after each rotation; and determining the point cloud data of the predetermined area after each rotation as the target point cloud data.
  • The determining unit is configured to rotate the point cloud data of the predetermined area multiple times according to a predetermined rotation method in the following manner: rotating the point cloud data of the predetermined area multiple times in steps of one degree at a time.
  • The second determination module 38 is configured to determine the second short-term variation of the pose of the robot based on the first point cloud data and the second point cloud data in the following manner: matching the first point cloud data with the second point cloud data to obtain a matching result; determining rotation information of the second point cloud data relative to the first point cloud data based on the matching result; and determining the second short-term variation based on the rotation information.
  • The third determination module 310 is configured to determine whether the state of the robot is abnormal by comparing the first short-term variation with the second short-term variation in the following manner: comparing the first short-term variation with the second short-term variation to determine the pose increment difference between them; determining that the state of the robot is abnormal when the pose increment difference exceeds a predetermined threshold; and determining that the state of the robot is normal when the pose increment difference does not exceed the predetermined threshold.
  • The above-mentioned modules can be realized by software or hardware. In the latter case, this can be realized in, but is not limited to, the following ways: the above modules are all located in the same processor; or the above modules are located in different processors in any combination.
  • Embodiments of the present disclosure also provide a computer-readable storage medium, in which a computer program is stored, wherein the computer program is configured to execute the steps in any one of the above method embodiments when running.
  • the above-mentioned computer-readable storage medium may be configured to store a computer program for performing the following steps:
  • S5. Determine whether the state of the robot is abnormal by comparing the first short-term variation with the second short-term variation.
  • The above-mentioned computer-readable storage medium may include, but is not limited to: a USB flash disk, a read-only memory (ROM), a random access memory (RAM), a mobile hard disk, a magnetic disk, an optical disc, and other media that can store a computer program.
  • Embodiments of the present disclosure also provide an electronic device, including a memory and a processor, where a computer program is stored in the memory, and the processor is configured to run the computer program to execute the steps in any one of the above method embodiments.
  • the electronic device may further include a transmission device and an input and output device, wherein the transmission device is connected to the processor, and the input and output device is connected to the processor.
  • the above-mentioned processor may be configured to execute the following steps through a computer program:
  • S5. Determine whether the state of the robot is abnormal by comparing the first short-term variation with the second short-term variation.
  • Each module or step of the above disclosure can be realized by a general-purpose computing device. They can be concentrated on a single computing device or distributed across a network composed of multiple computing devices, and can be implemented in program code executable by a computing device; thus they can be stored in a storage device and executed by a computing device. In some cases, the steps shown or described can be executed in an order different from that given here, or they can be fabricated into individual integrated circuit modules, or multiple modules or steps among them can be fabricated into a single integrated circuit module. As such, the present disclosure is not limited to any specific combination of hardware and software.

Abstract

Method and apparatus for determining the state of a robot, a storage medium, and an electronic apparatus. The method includes: acquiring a first position and a second position determined by an odometer while a robot is walking; optimizing the first position using first point cloud data of the environment acquired by the robot at a first time point to obtain a first pose; optimizing the second position using second point cloud data of the environment acquired by the robot at a second time point to obtain a second pose; determining a first short-term variation of the robot's pose from the first and second poses; determining a second short-term variation of the robot's pose from the first and second point cloud data; and determining whether the state of the robot is abnormal by comparing the first and second short-term variations.
PCT/CN2022/127006 2021-11-05 2022-10-24 Method and apparatus for determining robot state, storage medium and electronic apparatus WO2023078112A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111309025.6A CN116091593A (zh) 2021-11-05 2021-11-05 机器人状态的确定方法和装置、存储介质、电子装置
CN202111309025.6 2021-11-05

Publications (1)

Publication Number Publication Date
WO2023078112A1 (fr)

Family

ID=86185504

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/127006 WO2023078112A1 (fr) 2022-10-24 Method and apparatus for determining robot state, storage medium and electronic apparatus

Country Status (2)

Country Link
CN (1) CN116091593A (fr)
WO (1) WO2023078112A1 (fr)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110166763A1 (en) * 2010-01-05 2011-07-07 Samsung Electronics Co., Ltd. Apparatus and method detecting a robot slip
CN107643186A (zh) * 2017-09-15 2018-01-30 深圳市杉川机器人有限公司 机器打滑检测的方法、装置及系统
CN108638053A (zh) * 2018-04-03 2018-10-12 珠海市微半导体有限公司 一种机器人打滑的检测方法及其矫正方法
CN110187323A (zh) * 2019-05-14 2019-08-30 北京云迹科技有限公司 机器人空转识别方法及装置
CN111708047A (zh) * 2020-06-16 2020-09-25 浙江大华技术股份有限公司 机器人定位评估方法、机器人及计算机存储介质
CN113376650A (zh) * 2021-08-09 2021-09-10 浙江华睿科技股份有限公司 移动机器人定位方法及装置、电子设备及存储介质

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117689698A (zh) * 2024-02-04 2024-03-12 安徽蔚来智驾科技有限公司 点云配准方法、智能设备及存储介质
CN117689698B (zh) * 2024-02-04 2024-04-19 安徽蔚来智驾科技有限公司 点云配准方法、智能设备及存储介质

Also Published As

Publication number Publication date
CN116091593A (zh) 2023-05-09

Similar Documents

Publication Publication Date Title
KR102403504B1 (ko) 이동 로봇 및 그 제어 방법
Sünderhauf et al. Visual odometry using sparse bundle adjustment on an autonomous outdoor vehicle
US9298183B2 (en) Robot and method for autonomous inspection or processing of floor areas
WO2023078112A1 (fr) Procédé et appareil de détermination d'état de robot, support de stockage et appareil électronique
WO2020192407A1 (fr) Procédé de recharche de dispositif mobile et dispositif mobile
US20210255638A1 (en) Area Division and Path Forming Method and Apparatus for Self-Moving Device and Automatic Working System
EP3974778B1 (fr) Procédé et appareil pour mettre à jour une carte de travail d'un robot mobile, et support de stockage
CN110146098B (zh) 一种机器人地图扩建方法、装置、控制设备和存储介质
WO2023045644A1 (fr) Procédé et dispositif de positionnement pour robot mobile, support de stockage et dispositif électronique
EP4066078A1 (fr) Procédé et appareil de division de zone et de formation de trajet de dispositif automoteur et système de travail automatique
US20210263514A1 (en) Method and Apparatus for Recognizing a Stuck Status as well as Computer Storage Medium
WO2023005377A1 (fr) Procédé de construction de carte pour robot, et robot
CN113064408B (zh) 自主机器人及其控制方法、计算机存储介质
CN111609853A (zh) 三维地图构建方法、扫地机器人及电子设备
CN111679664A (zh) 基于深度相机的三维地图构建方法及扫地机器人
WO2022222490A1 (fr) Procédé de commande de robot et robot
US20220121211A1 (en) Mobile robot and control method therefor
WO2022222345A1 (fr) Procédé et appareil de correction de positionnement pour robot mobile, support de stockage et appareil électronique
CN112526985B (zh) 一种行走禁区规划方法及其装置、自移动机器人
CN112799389B (zh) 自动行走区域路径规划方法及自动行走设备
JP2020021372A (ja) 情報処理方法および情報処理システム
WO2024017032A1 (fr) Procédé de recharge de robot de tonte, robot de tonte et support de stockage
Yuen et al. A comparison between extended Kalman filtering and sequential Monte Carlo techniques for simultaneous localisation and map-building
Zapata-Impata et al. Autotrans: an autonomous open world transportation system
Yuan et al. Lost robot self-recovery via exploration using hybrid topological-metric maps

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 22889137

Country of ref document: EP

Kind code of ref document: A1