WO2021253698A1 - Robot walking control method and system, robot, and storage medium - Google Patents

Robot walking control method and system, robot, and storage medium

Info

Publication number
WO2021253698A1
WO2021253698A1 · PCT/CN2020/123186
Authority
WO
WIPO (PCT)
Prior art keywords
robot
boundary
outer boundary
planned path
actual
Prior art date
Application number
PCT/CN2020/123186
Other languages
English (en)
Chinese (zh)
Inventor
朱绍明
任雪
Original Assignee
苏州科瓴精密机械科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 苏州科瓴精密机械科技有限公司 filed Critical 苏州科瓴精密机械科技有限公司
Publication of WO2021253698A1

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means

Definitions

  • The invention relates to the field of intelligent control, and in particular to a robot walking control method, system, robot and storage medium.
  • A low repetition rate and a high coverage rate are the goals pursued by traversal-type mobile robots such as vacuuming, mowing and swimming-pool-cleaning robots.
  • The lawn-mowing robot uses the lawn enclosed by the boundary as the working area in which to perform mowing operations, and the area outside the lawn is defined as the non-working area.
  • the purpose of the present invention is to provide a robot walking control method, system, robot and storage medium.
  • An embodiment of the present invention provides a robot walking control method.
  • The method includes: obtaining a planned path preset for the robot;
  • when the planned path includes at least part of the outer boundary, the outer boundary being the separation boundary between the working area and the non-working area: at least when the robot reaches the outer boundary, starting the camera device to capture environment images, and driving the robot, at least once at the outer boundary, to walk along the actual boundary obtained by analyzing the environment images and to work synchronously.
  • In this way, the walking and working route of the robot can be selected in combination with the images captured by the robot, improving the robot's working efficiency.
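The two steps above (obtain the planned path; work boundary segments by camera guidance and the rest by positioning) can be sketched as a toy routine. This is a minimal illustration, not the patent's implementation: the segment labels and the `image_follow`/`fixed_point` mode names are assumptions introduced here.

```python
def plan_route(planned_path, outer_boundary_segments):
    """S1/S2 sketch: segments of the planned path that lie on the outer
    boundary are worked by camera-guided boundary following; all other
    segments fall back to traditional fixed-point positioning."""
    if not any(s in outer_boundary_segments for s in planned_path):
        # No boundary in the planned path: just follow the plan.
        return [("fixed_point", s) for s in planned_path]
    return [("image_follow" if s in outer_boundary_segments else "fixed_point", s)
            for s in planned_path]

route = plan_route(["L2", "L1"], {"L2"})
```

Running this yields a per-segment mode list, mirroring the claim's split between boundary work and interior work.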
  • Optionally, when the planned path includes at least part of the outer boundary (the separation boundary between the working area and the non-working area), starting the camera device to capture environment images at least when the robot reaches the outer boundary, and driving the robot, at least once at the outer boundary, to walk along the actual boundary obtained by analyzing the environment images and work synchronously includes:
  • using image recognition to perform operations on the outer boundary, improving the working efficiency of the robot.
  • Optionally, when the planned path includes at least part of the outer boundary (the separation boundary between the working area and the non-working area), starting the camera device to capture environment images at least when the robot reaches the outer boundary, and driving the robot, at least once at the outer boundary, to walk along the actual boundary obtained by analyzing the environment images and work synchronously further includes:
  • after the robot returns to the starting position, driving the robot to walk along the other paths and work synchronously.
  • Optionally, when the planned path includes at least part of the outer boundary (the separation boundary between the working area and the non-working area), starting the camera device to capture environment images at least when the robot reaches the outer boundary, and driving the robot, at least once at the outer boundary, to walk along the actual boundary obtained by analyzing the environment images and work synchronously includes:
  • performing the following steps:
  • in this way, the walking and working route of the robot can be selected in combination with the images captured by the robot, improving the robot's working efficiency.
  • Optionally, when the planned path includes at least part of the outer boundary (the separation boundary between the working area and the non-working area), starting the camera device to capture environment images at least when the robot reaches the outer boundary, and driving the robot, at least once at the outer boundary, to walk along the actual boundary obtained by analyzing the environment images and work synchronously includes:
  • starting the camera device in real time to capture environment images, and analyzing the environment images in real time to determine whether the actual boundary has been obtained; if so, once the actual boundary is obtained, driving the robot to leave the planned path and walk along the actual boundary in the planned direction while working synchronously; if not, driving the robot to walk along the planned path and work synchronously on the paths in the planned path that do not include the outer boundary;
  • in this way, image recognition and the traditional fixed-point positioning method are both used to find the path, and the two methods are alternated through specific judgment conditions to perform operations on the outer boundary and on the area within the outer boundary, so that the walking and working route of the robot can be selected in combination with the captured images, improving the robot's working efficiency.
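The per-tick alternation described here can be condensed into a tiny decision function. A hypothetical sketch only: the function name and the convention that image analysis returns `None` when no boundary is seen are assumptions, not from the patent.

```python
def next_action(boundary_in_image, planned_waypoint):
    """Real-time alternation sketch: the camera runs continuously; whenever
    image analysis yields an actual boundary, the robot leaves the planned
    path and follows that boundary, otherwise it keeps to the plan.
    `boundary_in_image` is None when no boundary is detected."""
    if boundary_in_image is not None:
        return ("follow_actual_boundary", boundary_in_image)
    return ("follow_planned_path", planned_waypoint)
```

Calling this once per control cycle reproduces the "alternate through a judgment condition" behavior.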
  • Optionally, when the planned path includes at least part of the outer boundary (the separation boundary between the working area and the non-working area), starting the camera device to capture environment images at least when the robot reaches the outer boundary, and driving the robot, at least once at the outer boundary, to walk along the actual boundary obtained by analyzing the environment images and work synchronously includes:
  • when the robot reaches each pre-planned coordinate point on the planned path, driving the robot to rotate in place while capturing environment images with the camera device;
  • driving the robot to walk along the actual boundary and work synchronously.
  • the method further includes:
  • In this way, image recognition and the traditional fixed-point positioning method are both used to find the path, and the two methods are alternated through specific judgment conditions to perform operations on the outer boundary and on the area within the outer boundary, so that the walking and working route of the robot can be selected in combination with the captured images, improving the robot's working efficiency.
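The rotate-in-place scan at each pre-planned coordinate point can be sketched as follows. This is an illustrative assumption about the data flow (one classified view per heading), not the patent's actual image pipeline.

```python
def scan_at_waypoint(views):
    """Waypoint-scan sketch: at a pre-planned coordinate point the robot
    rotates in place while the camera captures one view per heading; the
    first heading whose image analysis yields an actual boundary is chosen.
    `views` is a list of (heading_degrees, boundary_or_None) pairs."""
    for heading, boundary in views:
        if boundary is not None:
            return heading, boundary
    return None  # no boundary visible from this waypoint
```

If the scan returns `None`, the robot would simply continue along the planned path, matching the behavior described for position E1 in the fourth embodiment.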
  • An embodiment of the present invention provides a robot walking control system. The system includes: an acquisition module for acquiring a planned path preset for the robot;
  • and a processing module for, when the planned path includes at least part of the outer boundary (the separation boundary between the working area and the non-working area), starting the camera device to capture environment images at least when the robot reaches the outer boundary, and driving the robot, at least once at the outer boundary, to walk along the actual boundary obtained by analyzing the environment images and work synchronously.
  • An embodiment of the present invention provides a robot, including a memory and a processor; the memory stores a computer program, and the processor implements the steps of the aforementioned robot walking control method when executing the computer program.
  • An embodiment of the present invention provides a readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the steps of the robot walking control method described above are realized.
  • the robot walking control method, system, robot and storage medium of the present invention can select the walking and working route of the robot in combination with the images taken by the robot, thereby improving the working efficiency of the robot.
  • Fig. 1 is a schematic flowchart of the robot walking control method provided by an embodiment of the present invention;
  • Figs. 2, 4, 5 and 7 are schematic diagrams of specific implementation processes of one of the steps in Fig. 1;
  • Figs. 3, 6 and 8 are schematic diagrams of different examples of the present invention;
  • Fig. 9 is a schematic diagram of the modules of the robot walking control system provided by the present invention.
  • The robot system of the present invention can be a mowing robot system, a sweeping robot system, a snow-sweeping robot system, a leaf-collecting robot system, a golf-course ball-picking robot system, etc.; each system can automatically walk in its work area and perform the corresponding task.
  • the robot system is taken as an example of a lawn mowing robot system.
  • the working area may be a lawn.
  • a lawn mower robot system usually includes: a lawn mower (RM), a charging station, and a boundary line.
  • the lawn mower robot includes a main body, a walking unit and a control unit arranged on the main body.
  • the walking unit is used to control the walking and turning of the robot;
  • the control unit is used to plan the walking direction and walking route of the robot, store the external parameters obtained by the robot, process and analyze the obtained parameters, and control the robot according to the processing and analysis results;
  • the control unit is for example: MCU or DSP.
  • The lawn-mowing robot of the present invention further includes a camera device and a fixed-point positioning device that cooperate with the control unit; the camera device is used to capture the scene within a certain range of its viewing angle.
  • The camera device locates the outer boundary by means of image analysis,
  • while the fixed-point positioning system locates the inner area enclosed by the outer boundary by searching for coordinate points on the working path;
  • the control unit combines the camera device and the fixed-point positioning device to control the robot to traverse the working area, as described in detail below.
  • The robot also includes various sensors, storage modules (e.g. EPROM, Flash or an SD card), a working mechanism, and a power supply. In this embodiment, the working mechanism is a mowing blade, and the sensors sense the walking state of the robot, e.g. tip-over, lift-off and collision sensors, geomagnetic sensors, gyroscopes, etc., which are not detailed here.
  • the robot walking control method provided in the first implementation of the present invention includes the following steps:
  • when the planned path includes at least part of the outer boundary, the outer boundary being the separation boundary between the working area and the non-working area: at least when the robot reaches the outer boundary, the camera device is activated to capture environment images, and, at least once at the outer boundary, the robot is driven to walk along the actual boundary obtained by analyzing the environment images and to work synchronously.
  • step S2 further includes: if the outer boundary is not included in the planned path, driving the robot to walk along the planned path and work synchronously.
  • The traditional path planning method usually drives the robot to walk around the work area before the robot works, and uses the fixed-point positioning method to plan the work path in combination with the robot's walking route.
  • The planned work path usually does not include the actual outer boundary of the working area; that is, the outer boundary in the planned path is formed by offsetting the actual outer boundary inward.
  • The outer boundary specifically refers to the separation boundary between the working area and the non-working area; in this way, the outer boundary can be accurately identified through image recognition.
  • The camera device of the present invention photographs the scene in front of the robot to form an environment image, the environment being the ground in the robot's direction of advance; further, when the main controller receives the environment image, it analyzes it, so that it can judge from the environment image whether the outer boundary exists within a preset distance of the robot's current position. The technology for identifying the outer boundary from images is relatively mature in the prior art and is not described in detail here.
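The "boundary within a preset distance" judgment can be illustrated with a toy classifier. This is a deliberately simplified stand-in, assuming the image has already been reduced to a per-row grass/non-grass classification; the real patent relies on mature image-recognition techniques it does not detail.

```python
def boundary_within(rows_ahead, preset_distance):
    """Toy stand-in for 'analyze the environment image to judge whether the
    outer boundary lies within a preset distance of the current position'.
    `rows_ahead` classifies the ground row by row in front of the robot as
    grass (True) or non-grass (False); the index of the first non-grass row
    is taken as the distance to the boundary."""
    for distance, is_grass in enumerate(rows_ahead):
        if not is_grass:
            return distance <= preset_distance
    return False  # all grass: no boundary in view
```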
  • the step S2 specifically includes: judging whether the outer boundary included in the planned path is continuous, and if so, executing the following steps:
  • Here, "continuous" means that all road sections are connected end to end in sequence.
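The end-to-end continuity test is straightforward to state in code. A minimal sketch, assuming each road section is represented as a (start_point, end_point) pair; the representation is an assumption introduced here.

```python
def is_continuous(sections):
    """'Continuous' per the patent: road sections connected end to end in
    sequence, i.e. each section starts where the previous one ends.
    Sections are (start_point, end_point) pairs."""
    return all(prev[1] == nxt[0] for prev, nxt in zip(sections, sections[1:]))
```

A closed rectangle like the boundary L2 of Figure 3 passes this test, while a boundary with a gap fails it.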
  • step S2 further includes: judging whether the planned path includes paths other than the outer boundary, and if so, performing the following steps:
  • after the robot returns to the starting position, the robot is driven to walk along the other paths and work synchronously.
  • the method also includes: immediately turning off the camera device, thus saving resources; and preventing the robot from repeating operations on the actual outer boundary.
  • The area shown in Figure 3 is the work area as a whole; the planned path constructed by the traditional method is the path L1 formed by the bow-shaped solid line, and the work area actually also contains a path L2 formed by the dashed rectangular frame.
  • The path L2 is a continuous outer boundary, and the arrows indicate the walking direction of the robot. If the mowing operation were carried out with the traditional positioning method alone, then because fixed-point positioning such as UWB is not highly accurate, the robot could not work on the dashed rectangular path L2.
  • With this embodiment the complete work area can be mowed. Specifically, starting from the starting position A, the camera device is started, and after the environment image is captured by the camera device,
  • the actual boundary is obtained by analyzing the environment image, that is, the rectangular path L2 formed by the dashed line in the figure.
  • The robot is driven to move from the starting position A to the starting position B of the actual boundary, and starts from position B.
  • The robot keeps working along the path L2.
  • When the robot has worked a full circle along the path L2 and returned to the starting position B, completion of the outer-boundary operation is confirmed; at this time the camera device is turned off, and the robot returns from position B to position A;
  • the robot then walks along the path L1 and works synchronously; when the robot reaches position C, the work ends.
  • In this embodiment, the image recognition method is first used to work on the outer boundary, and then the traditional fixed-point positioning method is used to work on the area within the outer boundary. In this way, the walking and working route of the robot can be selected in combination with the images captured by the robot, improving the robot's working efficiency.
  • Optionally, the step S2 specifically includes: if the outer boundary included in the planned path is continuous, and the planned path also includes other paths besides the outer boundary, performing the following steps:
  • starting the camera device to capture environment images, driving the robot to walk to the actual boundary obtained by analyzing the environment images, then walking along that actual boundary and working synchronously.
  • This second preferred embodiment is similar to the first preferred embodiment above. The difference is that the first embodiment first operates on the outer boundary and then on the area within it, whereas the second embodiment first operates on the area within the outer boundary and then on the outer boundary itself.
  • With this embodiment the complete work area can be mowed. Specifically, starting from the starting position A, the camera is not activated, and the robot walks along
  • the path L1 using the traditional fixed-point positioning method and works synchronously. When the robot reaches position C, the task on path L1 is complete; the robot is then driven to work on L2.
  • The robot can turn on the camera at the current position C and, after starting it, walk to the point on the actual boundary nearest to position C, and from that nearest point walk along the path L2 and work a full circle. Alternatively, the robot can walk to any point on the outer boundary according to a predetermined rule and then turn on the camera device; in that case, after the robot is again driven to the nearest point on the actual boundary according to the principle of proximity, it starts from that nearest point, walks along the path L2, and works a full circle.
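The "principle of proximity" used above reduces to a nearest-point selection. A minimal sketch, assuming the actual boundary has been discretized into candidate points (an assumption of this illustration, not stated in the patent):

```python
import math

def nearest_boundary_point(position, boundary_points):
    """'Principle of proximity' sketch: from the current position, pick the
    closest point on the actual boundary as the place to start the loop."""
    return min(boundary_points, key=lambda p: math.dist(p, position))
```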
  • the robot is driven to return to position A, and then the camera is started.
  • The actual boundary can be obtained by analyzing the environment image; it is the path formed by the dashed line in the figure.
  • The robot moves from position A to the starting point B of the actual boundary and, starting from position B, works along the path L2; when the robot has worked a full circle along the path L2 and returned to the starting position B, the work ends.
  • In this embodiment, the traditional fixed-point positioning method is first used to work on the area within the outer boundary, and then the image recognition method is used to work on the outer boundary. In this way, the walking and working route of the robot can be selected in combination with the captured images, improving the robot's working efficiency.
  • the step S2 specifically includes:
  • starting the camera in real time to capture environment images, and analyzing the environment images in real time to determine whether the actual boundary has been obtained; if so, once the actual boundary is obtained, driving the robot to leave the planned path and walk along the actual boundary in the planned direction while working synchronously; if not, walking along the planned path and working synchronously on the paths in the planned path that do not include the outer boundary.
  • The planned path constructed by the traditional method is the path L1 formed by the bow-shaped solid line together with the connecting offset paths of L1 (not shown).
  • The actual work area also contains a path L2 formed by a broken line; in addition, the thick solid lines are the connecting paths along which the robot moves between the path L1 and the path L2, and the path L2 is a discontinuous outer boundary. If the mowing operation were performed with the traditional positioning method alone, then because fixed-point positioning such as UWB has low accuracy, the robot would miss the job on path L2 when walking along the planned path.
  • With this embodiment the complete work area can be mowed. Specifically, starting from the starting position A, the camera device is started, and after the environment image is captured by the camera device,
  • the actual boundary is obtained by analyzing the environment image, that is, the B-B1 section of the path L2 formed by the dashed line in the figure and the B1-B2 section connected to the B-B1 section.
  • When the robot reaches the position B2, it walks along the solid line to B3 and continues to work on the B3-C section of the path L1. Since the B3-C section is not the outer boundary, at this time, although the camera on the robot is activated, no corresponding outer boundary is found.
  • In this way, image recognition and the traditional fixed-point positioning method are both used to find the path, and the two methods are alternated through specific judgment conditions to perform operations on the outer boundary and on the area within it,
  • so that the walking and working route of the robot can be selected in combination with the images captured by the robot, improving the robot's working efficiency.
  • the step S2 specifically includes: driving the robot to walk according to the planned path and work synchronously, and synchronously starting the camera device;
  • when the robot reaches each pre-planned coordinate point on the planned path, the robot is driven to rotate in place while the environment images are captured by the camera device;
  • the robot is driven to walk along the actual boundary and work synchronously.
  • the method further includes:
  • The preset threshold is a set distance constant whose value can be set as required.
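The threshold check described here (return to the planned path once the shortest distance to it exceeds the preset threshold) can be sketched directly. A simplified illustration: approximating the planned path by its anchor points is an assumption of this sketch, not the patent's method.

```python
import math

def off_planned_path(position, planned_points, preset_threshold):
    """Threshold-check sketch: if the robot's shortest distance to the
    planned path (approximated here by its anchor points) exceeds the
    preset threshold, the robot should return to the planned path."""
    shortest = min(math.dist(position, p) for p in planned_points)
    return shortest > preset_threshold
```

On the AF section of Figure 8, this predicate would become true, triggering the return to anchor point E1.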
  • The entire area shown in Figure 8 is the working area. Because fixed-point positioning such as UWB has low accuracy, the planned path is the rectangular path L1 formed by connecting A1-B1-C1-D1 in turn with solid lines; the path L1 contains a discontinuous outer boundary. However, if the machine operated only according to path L1, the actual paths EA and ABCDE connected by the dashed lines would be missed; with the method of the fourth embodiment of the present invention, the mowing operation can completely traverse the actual outer boundary.
  • The fourth preferred embodiment of the present invention combines image recognition to mow the planned working area. Specifically, starting from the starting position A1, the camera device is started, and after an environment image is captured by the camera device, the actual boundary point E can be obtained by analyzing the environment image. Further, with the cooperation of the camera device, the robot walks along the path EA to point A. Since the camera device is always on, the robot continues to walk in the AF direction after rotating at point A. In this process, it is determined that when the robot is on the AF section, its shortest distance from the planned path L1 exceeds the preset threshold.
  • Therefore, the robot returns and walks to the anchor point E1 on the path L1; further, at the position E1,
  • the robot rotates and, through environment-image analysis, cannot find the actual boundary, so it continues to walk along the E1-B1 segment; when the robot reaches the position B1, environment-image analysis successively obtains the boundaries BC, CD and DE, and after the robot returns to position E along the path, the work ends.
  • In this way, image recognition and the traditional fixed-point positioning method are both used to find the path, and the two methods are alternated through specific judgment conditions to perform work on the outer boundary and on the planned area within it, so that the walking and working route of the robot can be selected in combination with the images captured by the robot, improving the robot's working efficiency.
  • There is provided a robot including a memory and a processor; the memory stores a computer program, and the processor implements the steps of the robot walking control method according to any one of the above embodiments when executing the computer program.
  • There is provided a readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the steps of the robot walking control method described in any of the above embodiments are realized.
  • As shown in Fig. 9, a robot walking control system is provided.
  • the system includes: an acquisition module 100 and a processing module 200.
  • the obtaining module 100 is used to obtain the planned path preset for the robot
  • the processing module 200 is used for when the planned path includes at least part of the outer boundary, the outer boundary is the separation boundary between the working area and the non-working area; at least when the robot reaches the outer boundary, the camera device is activated to take environmental images, At least once at the outer boundary, the robot is driven to walk along the actual boundary obtained by analyzing the environment image and work synchronously.
  • The obtaining module 100 is used to implement step S1, and the processing module 200 is used to implement step S2. Those skilled in the art can clearly understand that, for convenience and conciseness of description, the specific working process of the system described above can be found in the corresponding process of the foregoing method embodiments and is not repeated here.
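The two-module structure of Fig. 9 can be sketched as a pair of small classes. This is an illustrative decomposition only; the class and method names, and the mode labels, are assumptions introduced here, not the patent's implementation.

```python
class AcquisitionModule:
    """Module 100 sketch: acquires the planned path preset for the robot."""
    def __init__(self, planned_path):
        self._path = planned_path

    def get_planned_path(self):
        return list(self._path)


class ProcessingModule:
    """Module 200 sketch: decides, per path segment, whether to use
    camera-guided boundary following or fixed-point positioning."""
    def __init__(self, outer_boundary_segments):
        self._boundary = set(outer_boundary_segments)

    def process(self, planned_path):
        return [("image_follow" if s in self._boundary else "fixed_point", s)
                for s in planned_path]


acq = AcquisitionModule(["L2", "L1"])
route = ProcessingModule({"L2"}).process(acq.get_planned_path())
```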
  • the robot walking control method, system, robot and storage medium of the present invention can select the walking and working route of the robot in combination with the images taken by the robot, thereby improving the working efficiency of the robot.
  • Modules described as separate components may or may not be physically separate, and components displayed as modules may or may not be physical modules; that is, they may be located in one place or distributed across multiple network modules. Some or all of the modules may be selected according to actual needs to achieve the objectives of the solutions of this embodiment.
  • the functional modules in the various embodiments of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module.
  • the above-mentioned integrated modules can be implemented in the form of hardware, or in the form of hardware plus software functional modules.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

Disclosed are a robot walking control method and system, a robot, and a storage medium. The method includes the steps of: presetting a planned path for a robot (S1); and, if the planned path includes at least part of an outer boundary, the outer boundary being a separation boundary between a working area and a non-working area, at least when the robot reaches the outer boundary, starting a camera device to capture an environment image, and driving the robot to walk along an actual boundary, obtained by analyzing the environment image, at least once at the outer boundary and to work synchronously (S2). By means of the robot walking control method and system, the robot, and the storage medium, the walking and working routes of the robot can be selected with reference to images captured by the robot, thereby improving the working efficiency of the robot.
PCT/CN2020/123186 2020-06-17 2020-10-23 Robot walking control method and system, robot and storage medium WO2021253698A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010554973.5 2020-06-17
CN202010554973.5A CN113885485B (zh) 2020-06-17 2020-06-17 Robot walking control method and system, robot and storage medium

Publications (1)

Publication Number Publication Date
WO2021253698A1 (fr) 2021-12-23

Family

ID=79011867

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/123186 WO2021253698A1 (fr) 2020-06-17 2020-10-23 Robot walking control method and system, robot and storage medium

Country Status (2)

Country Link
CN (1) CN113885485B (fr)
WO (1) WO2021253698A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115136781A (zh) * 2022-06-21 2022-10-04 松灵机器人(深圳)有限公司 Mowing method and device, mowing robot, and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103901890A (zh) * 2014-04-09 2014-07-02 中国科学院深圳先进技术研究院 Outdoor automatic walking device based on a home courtyard, and control system and method thereof
CN106998984A (zh) * 2014-12-16 2017-08-01 伊莱克斯公司 Cleaning method for a robotic cleaning device
US10037029B1 (en) * 2016-08-08 2018-07-31 X Development Llc Roadmap segmentation for robotic device coordination
CN108873880A (zh) * 2017-12-11 2018-11-23 北京石头世纪科技有限公司 Intelligent mobile device, path planning method thereof, and computer-readable storage medium
CN109984685A (zh) * 2019-04-11 2019-07-09 云鲸智能科技(东莞)有限公司 Cleaning control method and apparatus, cleaning robot and storage medium
US10613541B1 (en) * 2016-02-16 2020-04-07 AI Incorporated Surface coverage optimization method for autonomous mobile machines

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007316966A (ja) * 2006-05-26 2007-12-06 Fujitsu Ltd Mobile robot, control method therefor, and program
CN107402573B (zh) * 2016-05-19 2021-05-14 苏州宝时得电动工具有限公司 Automatic working system, self-moving device and control method thereof
CN111185899B (zh) * 2018-11-14 2022-05-13 苏州科瓴精密机械科技有限公司 Robot control method and robot system


Also Published As

Publication number Publication date
CN113885485A (zh) 2022-01-04
CN113885485B (zh) 2023-12-22

Similar Documents

Publication Publication Date Title
US11845189B2 (en) Domestic robotic system and method
JP6915209B2 (ja) 移動ロボットの地図作成方法および当該地図に基づく経路計画方法
WO2021169188A1 (fr) Procédé et système de suivi de chemin, robot, et support d'informations lisible
CN110245599A (zh) 一种智能三维焊缝自主寻迹方法
US20220342426A1 (en) Map building method, self-moving device, and automatic working system
JP4665857B2 (ja) アームを誘導可能な移動体およびアームを誘導する方法
US20150163993A1 (en) Autonomous gardening vehicle with camera
CN113296495B (zh) 自移动设备的路径形成方法、装置和自动工作系统
CN111328017B (zh) 一种地图传输方法和装置
WO2021253698A1 (fr) Procédé et système de commande de marche de robot, robot et support de stockage
CN108490932A (zh) 一种割草机器人的控制方法及自动控制割草系统
CN113031616B (zh) 一种清洁机器人返回路径规划方法、系统和清洁机器人
CN114721385A (zh) 虚拟边界建立方法、装置、智能终端及计算机存储介质
CN114937258A (zh) 割草机器人的控制方法、割草机器人以及计算机存储介质
CN111700553A (zh) 避障方法、装置、机器人和存储介质
WO2023050545A1 (fr) Système et procédé de commande de fonctionnement automatique extérieur basés sur la visionique, et dispositif
WO2021208352A1 (fr) Procédé et système de traversée, robot et support de stockage lisible
CN109213154A (zh) 一种基于Slam定位方法、装置、电子设备及计算机存储介质
CN109782771A (zh) 一种果园移动机器人及地头转向方法
CN114610035A (zh) 回桩方法、装置及割草机器人
WO2022088313A1 (fr) Procédé et système de charge automatique de robot, robot et support d'enregistrement
EP4328698A1 (fr) Dispositif automoteur, procédé de réglage de trajectoire de déplacement et support de stockage lisible par ordinateur
WO2019109228A1 (fr) Procédé et appareil utilisant une relocalisation visuelle pour permettre à une balayeuse de continuer à balayer après une mise hors tension, et balayeuse
CN113126613B (zh) 智能割草系统及其自主建图方法
CN110411452A (zh) 一种基于双目视觉的农田喷药机器人导航路径识别方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20941315

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20941315

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 20941315

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 28/06/2023)