CN113397437A - Sweeping robot and obstacle avoidance method thereof - Google Patents


Info

Publication number
CN113397437A
CN113397437A (application CN202110784179.4A)
Authority
CN
China
Prior art keywords
obstacle
sweeping robot
point cloud
depth camera
planned path
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202110784179.4A
Other languages
Chinese (zh)
Inventor
丁杨
李侃
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN202110784179.4A priority Critical patent/CN113397437A/en
Publication of CN113397437A publication Critical patent/CN113397437A/en
Withdrawn legal-status Critical Current

Classifications

    • A — HUMAN NECESSITIES
    • A47 — FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L — DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L 11/00 — Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L 11/24 — Floor-sweeping machines, motor-driven
    • A47L 11/40 — Parts or details of machines not provided for in groups A47L 11/02–A47L 11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L 11/4002 — Installations of electric equipment
    • A47L 11/4008 — Arrangements of switches, indicators or the like
    • A47L 11/4061 — Steering means; means for avoiding obstacles; details related to the place where the driver is accommodated
    • A47L 2201/00 — Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L 2201/04 — Automatic control of the travelling movement; automatic obstacle detection

Landscapes

  • Manipulator (AREA)

Abstract

The application discloses a sweeping robot and an obstacle avoidance method thereof, relating to the technical field of smart homes. The method includes: detecting whether the sweeping robot is in a normal operating state or trapped by an obstacle; when the sweeping robot is trapped by an obstacle, invoking a first depth camera and a second depth camera located at the two ends of the sweeping robot to respectively acquire a corresponding first depth image and second depth image; extracting a first point cloud and a second point cloud corresponding to the first depth image and the second depth image respectively; reconstructing the appearance of the obstacle based on the overlapping point cloud regions in the first point cloud and the second point cloud; and determining a planned path according to the reconstructed obstacle appearance, so that the sweeping robot can bypass the obstacle along the planned path. Reconstructing the obstacle appearance with point cloud matching in this way can effectively improve the efficiency with which the sweeping robot bypasses obstacles.

Description

Sweeping robot and obstacle avoidance method thereof
Technical Field
The application belongs to the technical field of intelligent household equipment, and particularly relates to a sweeping robot and an obstacle avoidance method of the sweeping robot.
Background
As people's demand for smart-home living keeps rising, the functions of sweeping robots are continuously updated and iterated to improve product competitiveness.
Household environments are complex and varied. While cleaning according to a preset pattern, a sweeping robot is easily trapped by the various obstacles in the environment, which interrupts its operation and prevents it from automatically completing whole-house cleaning, seriously degrading the user experience of the product.
For the above problems, the industry has not yet provided a good solution.
Disclosure of Invention
The embodiments of the application provide a sweeping robot and an obstacle avoidance method for the sweeping robot, intended at least to solve the prior-art problem that a sweeping robot is easily trapped by obstacles in its operating environment and therefore cannot automatically complete whole-house cleaning.
In a first aspect, an embodiment of the present application provides an obstacle avoidance method for a sweeping robot, including: detecting whether the sweeping robot is in a normal operating state or trapped by an obstacle; when the sweeping robot is trapped by an obstacle, invoking a first depth camera and a second depth camera located at the two ends of the sweeping robot to respectively acquire a corresponding first depth image and second depth image, wherein the fields of view of the first depth camera and the second depth camera intersect but do not coincide; extracting a first point cloud and a second point cloud corresponding to the first depth image and the second depth image respectively; reconstructing the obstacle appearance based on the overlapping point cloud regions in the first point cloud and the second point cloud; and determining a planned path according to the reconstructed obstacle appearance, so that the sweeping robot can bypass the obstacle along the planned path.
In a second aspect, an embodiment of the present application provides a sweeping robot, including: a first depth camera; a second depth camera, the first and second depth cameras being respectively located at the two ends of the sweeping robot with fields of view that intersect but do not coincide; and an obstacle avoidance device configured to: detect whether the sweeping robot is in a normal operating state or trapped by an obstacle; when the sweeping robot is trapped by an obstacle, invoke the first depth camera and the second depth camera to respectively acquire a corresponding first depth image and second depth image; extract a first point cloud and a second point cloud corresponding to the first depth image and the second depth image respectively; reconstruct the obstacle appearance based on the overlapping point cloud regions in the first point cloud and the second point cloud; and determine a planned path according to the reconstructed obstacle appearance, so that the sweeping robot can bypass the obstacle along the planned path.
In a third aspect, an embodiment of the present application provides a machine-readable storage medium, on which a computer program is stored, and the program, when executed by a processor, implements the steps of the above method.
The embodiments of the application have the following beneficial effects:
when the sweeping robot is trapped by an obstacle, it can invoke the first and second depth cameras located at its two ends and reconstruct the obstacle's outer surface information from the corresponding depth images, so that the planned path is determined from the obstacle's outer surface. Performing point cloud registration by fusing the overlapping regions of the first and second point clouds effectively improves the accuracy of the reconstructed obstacle appearance, lets the sweeping robot escape the trapped state quickly, and maintains efficient cleaning even in complex operating environments.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 shows a schematic structural diagram of an example of a sweeping robot according to an embodiment of the present application;
fig. 2 shows a flowchart of an example of an obstacle avoidance method of a sweeping robot according to an embodiment of the present application;
Fig. 3 shows a flowchart of an example implementation of step 240 in fig. 2;
fig. 4 shows a flowchart of an example of an obstacle avoidance method of a sweeping robot according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present disclosure more apparent, the embodiments of the present disclosure will be described clearly and completely with reference to the accompanying drawings. It is to be understood that the embodiments described are only some, not all, of the embodiments of the present disclosure. All other embodiments obtained by those of ordinary skill in the art from the embodiments given herein without creative effort fall within the scope of this specification.
As used herein, the term "include" and its variants are open-ended terms meaning "including, but not limited to". The term "based on" means "based at least in part on". The terms "one embodiment" and "an embodiment" mean "at least one embodiment". The term "another embodiment" means "at least one other embodiment". The terms "first," "second," and the like may refer to different or the same object. Other definitions, whether explicit or implicit, may be included below. The definition of a term is consistent throughout the specification unless the context clearly dictates otherwise.
The application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
Finally, it should be further noted that the terms "comprises" and "comprising," as used herein, are non-exclusive: a process, method, article, or device that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to it. Without further limitation, an element preceded by "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or device that comprises the element.
Fig. 1 shows a schematic structural diagram of an example of a sweeping robot according to an embodiment of the present application.
As shown in fig. 1, the sweeping robot includes a robot body 110, a first depth camera 121, a second depth camera 123, and a key area 130. Specifically, the first depth camera 121 and the second depth camera 123 are disposed opposite each other at the two ends (e.g., left end and right end) of the robot, and their shooting directions may be aimed at the same or similar positions, with fields of view that intersect but do not coincide, so that both cameras can simultaneously obtain depth image information of the same object (e.g., an obstacle). The key area 130 may be provided with various means for user interaction, such as keys or a touch screen, to enable corresponding preset functions, such as starting/pausing the cleaning function or an alarm function.
It should be noted that because dust easily accumulates on a sweeping robot during cleaning, the depth cameras' lenses are easily obscured and need frequent cleaning; however, a depth camera lens must be cleaned with a dedicated cleaning solution, and the sweeping robot's own cleaning agent cannot be used.
In some preferred embodiments, the first depth camera 121 and the second depth camera 123 are detachably connected to the sweeping robot through a first connector (not shown) and a second connector (not shown), respectively. Various connection methods may be used here, which is not limited.
In some specific cases, the first connector and the second connector are snap-fit connectors, which makes it more convenient for the user to frequently detach the depth cameras.
With this arrangement, when the user needs to clean the sweeping robot, the depth cameras can be removed from the robot body, making it convenient to clean the body and the cameras separately and professionally, which preserves the cameras' imaging quality and service life.
Fig. 2 shows a flowchart of an example of an obstacle avoidance method of a sweeping robot according to an embodiment of the present application.
As shown in fig. 2, in step 210, it is detected whether the sweeping robot is in a normal operating state or trapped by an obstacle. Various trapped-state detection methods may be used here, and no limitation is intended.
In some examples of the embodiments, the moving distance of the sweeping robot within a preset detection time window (e.g., 1 minute or 2 minutes) may be obtained, and whether the robot is operating normally or trapped by an obstacle may be determined from that distance. Specifically, if the moving distance is greater than or equal to a preset distance threshold, the sweeping robot is determined to be in a normal operating state; if the moving distance is smaller than the threshold, the sweeping robot is determined to be trapped by an obstacle. For example, the absolute value of the robot's displacement in one or more specific directions (e.g., lateral or longitudinal) within the window can be accumulated; when this value is small, the robot can be judged to be trapped.
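The sliding-window distance check described above can be sketched as follows. This is a minimal illustration; the class name, window length, and threshold are assumptions, not values from the patent.

```python
from collections import deque
import math

class StuckDetector:
    """Judge the trapped state by comparing the distance travelled inside a
    sliding time window against a threshold, as described in step 210."""

    def __init__(self, window_s=60.0, min_distance_m=0.30):
        self.window_s = window_s            # detection time window, seconds
        self.min_distance_m = min_distance_m
        self.samples = deque()              # (timestamp, x, y) odometry samples

    def add_pose(self, t, x, y):
        self.samples.append((t, x, y))
        # Drop samples that have fallen out of the detection window.
        while self.samples and t - self.samples[0][0] > self.window_s:
            self.samples.popleft()

    def is_stuck(self):
        # Accumulate path length over the window; a near-zero value means the
        # robot is blocked even though its wheels may still be turning.
        if len(self.samples) < 2:
            return False
        pts = list(self.samples)
        dist = sum(math.hypot(x1 - x0, y1 - y0)
                   for (_, x0, y0), (_, x1, y1) in zip(pts, pts[1:]))
        return dist < self.min_distance_m
```

A controller would feed odometry into `add_pose` at a fixed rate and poll `is_stuck` each cycle.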
In step 220, when the sweeping robot is trapped by an obstacle, the first and second depth cameras located at the two ends of the robot are invoked to acquire the corresponding first and second depth images. The fields of view of the two cameras intersect but do not coincide, so both cameras can capture surface depth information of the obstacle at the same time.
In step 230, a first point cloud and a second point cloud corresponding to the first depth image and the second depth image are extracted. For example, the depth image may be subjected to image preprocessing, and a target point cloud corresponding to the preprocessed depth image may be obtained according to the camera calibration parameters.
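The extraction in step 230 amounts to back-projecting each valid depth pixel through the camera intrinsics. A standard pinhole-model sketch is shown below; the parameter names and depth scale are assumptions, since the patent gives no formulas.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy, depth_scale=0.001):
    """Back-project a depth image (H x W, raw sensor units) to an N x 3 point
    cloud in the camera frame using pinhole calibration parameters."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel coordinates
    z = depth.astype(np.float64) * depth_scale       # raw units -> metres
    valid = z > 0                                    # zero depth = no reading
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x[valid], y[valid], z[valid]], axis=1)
```

Applying this to each camera's depth image yields the first and second point clouds in their respective camera frames.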
In step 240, an obstacle appearance is reconstructed based on the overlapping point cloud regions in the first point cloud and the second point cloud.
Specifically, point cloud registration using the overlapping regions of the first and second point clouds effectively calibrates out deviations in the point clouds and yields smooth obstacle appearance information.
It should be noted that current point cloud registration is generally performed against a standard object model; in personalized household scenarios, however, the registration result exhibits various deviations. In the embodiments of the application, point cloud information for the same surface region of an object is obtained with different cameras, which safeguards registration accuracy.
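The core of registering the two overlapping clouds is estimating the rigid transform between them. The patent does not specify an algorithm; a common choice, shown here as an illustrative sketch, is one Kabsch/SVD alignment step over matched point pairs, which real pipelines iterate inside ICP with nearest-neighbour matching.

```python
import numpy as np

def rigid_align(src, dst):
    """Given matched N x 3 points from the two clouds' overlapping region,
    find the rotation R and translation t minimising ||src @ R.T + t - dst||."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                     # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force det(R) = +1 so R is a proper rotation.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

After alignment, the two clouds can be merged into one consistent surface model of the obstacle.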
In step 250, a planned path is determined from the reconstructed obstacle appearance, so that the sweeping robot can bypass the obstacle along the planned path.
In some examples of the embodiments, a target obstacle type corresponding to the reconstructed obstacle appearance may be determined using a preset obstacle recognition model, and a target working mode corresponding to that type may be looked up in a preset obstacle-type/mode table. The table maps multiple obstacle types to corresponding working modes, such as a rotational travel mode or an S-line travel mode. The sweeping robot is then controlled to operate in the target working mode along the planned path so as to bypass the obstacle. For example, when the obstacle is a cable, the S-line travel mode may be employed to quickly disengage from the cable and return to normal operation.
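The type-to-mode lookup above reduces to a small table with a default fallback. A minimal sketch, in which the type names, mode names, and default value are illustrative assumptions:

```python
# Hypothetical obstacle-type/working-mode table; entries are examples only.
OBSTACLE_MODE_TABLE = {
    "cable": "s_line",       # weave free of flexible obstacles
    "chair_leg": "rotate",   # rotate-and-advance around thin rigid posts
    "threshold": "climb",
}
DEFAULT_MODE = "rotate"

def select_working_mode(obstacle_type: str) -> str:
    """Map a recognised obstacle type to a travel mode, falling back to a
    default for types missing from the table."""
    return OBSTACLE_MODE_TABLE.get(obstacle_type, DEFAULT_MODE)
```

The recognition model's output label would be fed straight into `select_working_mode` before path execution.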
According to the embodiments of the application, point cloud registration is used to construct the obstacle's appearance, and the escape path is formulated from the registered appearance, so that the sweeping robot can bypass the obstacle quickly and effectively.
Fig. 3 shows a flowchart of an example implementation of step 240 in fig. 2.
As shown in fig. 3, in step 310, first occlusion point information outside the field angle range of the second depth camera in the first point cloud is filtered out to obtain a corresponding first common field of view point cloud.
In step 320, second occlusion point information outside the field angle range of the first depth camera in the second point cloud is filtered out to obtain a corresponding second common field point cloud.
In step 330, an obstacle appearance is reconstructed based on the first common field of view point cloud and the second common field of view point cloud.
Through steps 310 and 320, the point cloud information lying outside the other camera's field of view is filtered out of each original point cloud, and the clouds are updated accordingly, so that the point cloud fusion of step 330 operates on data from the common field of view over the same object surface region. The two point clouds then compensate each other, yielding a better obstacle appearance.
In some more specific embodiments, the coordinate transformations between the camera coordinate systems of the first and second depth cameras and the world coordinate system may also be pre-stored in the sweeping robot's processing system. Using these transformations, the occlusion-point filtering operations and the common-field-of-view fusion and compensation operations of fig. 3 can be effectively realized.
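The filtering of steps 310–320 can be sketched as a frustum test: express one camera's points in the other camera's frame using the pre-stored extrinsics, then keep only points within that camera's angular field of view. The function name, FOV values, and transform convention below are assumptions for illustration.

```python
import numpy as np

def filter_to_common_fov(points_a, R_ab, t_ab, h_fov_deg=60.0, v_fov_deg=45.0):
    """Keep only the points of cloud A that also fall inside camera B's view
    frustum, producing A's 'common field-of-view' point cloud.
    R_ab, t_ab map camera-A coordinates into camera-B coordinates."""
    pts_b = points_a @ R_ab.T + t_ab          # express A's points in B's frame
    x, y, z = pts_b[:, 0], pts_b[:, 1], pts_b[:, 2]
    in_front = z > 0                          # behind camera B = occluded
    # Angular test against each half field of view.
    h_ok = np.abs(np.arctan2(x, z)) <= np.radians(h_fov_deg / 2)
    v_ok = np.abs(np.arctan2(y, z)) <= np.radians(v_fov_deg / 2)
    return points_a[in_front & h_ok & v_ok]
```

Running this symmetrically for both clouds leaves exactly the overlapping region that step 330 fuses.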
Fig. 4 shows a flowchart of an example of an obstacle avoidance method of a sweeping robot according to an embodiment of the present application.
As shown in fig. 4, in step 410, a planned path is determined according to the reconstructed appearance of the obstacle, so that the sweeping robot can bypass the obstacle through the planned path.
Regarding the operation of step 410, reference may be made to the related description above with reference to step 250, and further description is omitted here.
In step 420, the sweeping robot is controlled to run along the planned path. For example, the sweeping robot may follow an S-shaped planned path.
In step 430, it is detected whether the sweeping robot is still in an obstacle-stuck state after running the planned path.
In step 440, when the sweeping robot is still trapped by the obstacle, a trapped-alarm operation is performed. In some embodiments, the sweeping robot may broadcast a voice warning, such as announcing "the sweeping robot is trapped, please assist". In other embodiments, the sweeping robot may push an alert notification to a bound client.
With this embodiment, when the sweeping robot still cannot free itself using the established operating mode, the alarm action promptly reminds the user to assist the trapped robot, preventing it from remaining trapped and failing to complete the cleaning task.
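The run-then-verify flow of fig. 4 (steps 410–440) can be sketched as a small control loop. The robot interface used here (`plan_path`, `follow`, `is_stuck`, `alarm`) is a hypothetical API assumed for illustration.

```python
def attempt_escape(robot, max_attempts=1):
    """Drive the planned path, re-check the trapped state, and raise a
    trapped alarm if the robot is still stuck after all attempts."""
    for _ in range(max_attempts):
        robot.follow(robot.plan_path())   # steps 410-420: plan and execute
        if not robot.is_stuck():          # step 430: verify escape
            return True
    # Step 440: still trapped after running the planned path.
    robot.alarm("the sweeping robot is trapped, please assist")
    return False
```

Any object exposing those four methods (e.g. a simulator or the real drive controller) can be passed in as `robot`.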
In the embodiment of the present application, the relevant functional module may be implemented by a hardware processor (hardware processor).
In another aspect, an embodiment of the present application provides a machine-readable storage medium, on which a computer program is stored, where the program is executed by a processor to perform the steps of the obstacle avoidance method for a sweeping robot as described above.
The product can execute the method provided by the embodiment of the application, and has the corresponding functional modules and beneficial effects of the execution method. For technical details that are not described in detail in this embodiment, reference may be made to the methods provided in the embodiments of the present application.
As will be appreciated by one skilled in the art, embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, the description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present description may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and so forth) having computer-usable program code embodied therein.
Embodiments of the present description are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the description. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read only memory (ROM), electrically erasable programmable read only memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission medium which can be used to store information that can be accessed by a computing device. As defined herein, computer readable media do not include transitory computer readable media such as modulated data signals and carrier waves.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The above description is only an example of the present specification, and is not intended to limit the present specification. Various modifications and alterations to this description will become apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (9)

1. An obstacle avoidance method of a sweeping robot comprises the following steps:
detecting whether the sweeping robot is in a normal operation state or in an obstacle trapped state;
when the sweeping robot is in an obstacle trapped state, calling a first depth camera and a second depth camera which are positioned at two ends of the sweeping robot to respectively acquire a corresponding first depth image and a corresponding second depth image; the first depth camera and the second depth camera have crossed and non-coincident field angle intervals therebetween;
extracting a first point cloud and a second point cloud corresponding to the first depth image and the second depth image respectively;
reconstructing an obstacle appearance based on overlapping point cloud regions in the first point cloud and the second point cloud;
determining a planned path according to the reconstructed appearance of the obstacle, so that the sweeping robot can bypass the obstacle through the planned path.
2. The method of claim 1, wherein determining a planned path from the reconstructed obstacle appearance to enable the sweeping robot to bypass obstacles through the planned path comprises:
determining a target obstacle type corresponding to the reconstructed obstacle appearance based on a preset obstacle recognition model;
determining a target working mode corresponding to the type of the target obstacle according to a preset obstacle type mode table; wherein the obstacle type mode table comprises a plurality of obstacle types and corresponding working modes;
and controlling the sweeping robot to operate according to the target working mode and the planned path so as to bypass the obstacle.
3. The method of claim 1 or 2, wherein reconstructing an obstacle appearance based on overlapping point cloud regions in the first point cloud and the second point cloud comprises:
filtering, from the first point cloud, first occlusion point information lying outside the field of view of the second depth camera to obtain a corresponding first common-field point cloud;
filtering, from the second point cloud, second occlusion point information lying outside the field of view of the first depth camera to obtain a corresponding second common-field point cloud; and
reconstructing the obstacle appearance based on the first common-field point cloud and the second common-field point cloud.
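The filtering step of claim 3 keeps only points that both cameras can see. A sketch, restricted to a 2-D horizontal field-of-view check for brevity (the claimed method operates on full 3-D point clouds); the camera pose representation and function names are assumptions:

```python
import math

def in_fov(point, cam_pos, cam_heading, half_fov):
    """True if a 2-D point lies inside a camera's horizontal field of view."""
    dx, dy = point[0] - cam_pos[0], point[1] - cam_pos[1]
    angle = math.atan2(dy, dx)
    # Wrap the angular difference into [-pi, pi) before comparing.
    diff = (angle - cam_heading + math.pi) % (2 * math.pi) - math.pi
    return abs(diff) <= half_fov

def common_field_points(cloud, other_cam_pos, other_cam_heading, half_fov):
    """Drop occlusion points that fall outside the *other* camera's field
    of view, keeping only the shared (overlapping) region of the cloud."""
    return [p for p in cloud
            if in_fov(p, other_cam_pos, other_cam_heading, half_fov)]
```

Running this once per cloud, each time against the other camera's frustum, yields the two common-field point clouds the claim feeds into reconstruction.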
4. The method of any one of claims 1 to 3, wherein after determining a planned path according to the reconstructed obstacle appearance, the method further comprises:
controlling the sweeping robot to run along the planned path;
detecting whether the sweeping robot remains in the obstacle-trapped state after running the planned path; and
when the sweeping robot remains in the obstacle-trapped state, executing a trapped-alarm operation.
5. The method of claim 1, wherein detecting whether the sweeping robot is in a normal operating state or an obstacle-trapped state comprises:
acquiring the moving distance of the sweeping robot within a preset detection time window;
if the moving distance is greater than or equal to a preset distance threshold, determining that the sweeping robot is in a normal operating state; and
if the moving distance is less than the distance threshold, determining that the sweeping robot is in an obstacle-trapped state.
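The trapped-state test of claim 5 compares travel over a time window against a threshold. A minimal sketch, assuming position samples collected over the window and an accumulated path-length measure of "moving distance"; the window length and threshold values are illustrative placeholders, not figures from the patent:

```python
import math

DETECTION_WINDOW_S = 5.0     # illustrative window length
DISTANCE_THRESHOLD_M = 0.10  # illustrative minimum travel

def is_obstacle_trapped(window_positions):
    """window_positions: (x, y) samples collected over the detection window.
    The robot is considered trapped when its accumulated travel over the
    window stays below the preset distance threshold."""
    travelled = sum(
        math.hypot(x1 - x0, y1 - y0)
        for (x0, y0), (x1, y1) in zip(window_positions, window_positions[1:])
    )
    return travelled < DISTANCE_THRESHOLD_M
```

Accumulated path length (rather than net displacement) also flags a robot spinning in place against an obstacle, which a start-to-end distance check would miss.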
6. The method of claim 1, wherein the first depth camera and the second depth camera are removably connected to the sweeping robot by a first connector and a second connector, respectively.
7. The method of claim 6, wherein the first connector and the second connector are snap connectors.
8. A sweeping robot comprising:
a first depth camera;
a second depth camera, wherein the first depth camera and the second depth camera are respectively located at the two ends of the sweeping robot and have fields of view that intersect but do not coincide;
an obstacle avoidance device configured to:
detecting whether the sweeping robot is in a normal operating state or an obstacle-trapped state;
when the sweeping robot is in the obstacle-trapped state, invoking the first depth camera and the second depth camera to acquire a corresponding first depth image and a corresponding second depth image respectively;
extracting a first point cloud and a second point cloud corresponding to the first depth image and the second depth image respectively;
reconstructing an obstacle appearance based on overlapping point cloud regions in the first point cloud and the second point cloud; and
determining a planned path according to the reconstructed obstacle appearance, so that the sweeping robot can bypass the obstacle along the planned path.
9. A machine-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, carries out the steps of the method of any one of claims 1 to 7.
CN202110784179.4A 2021-07-12 2021-07-12 Sweeping robot and obstacle avoidance method thereof Withdrawn CN113397437A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110784179.4A CN113397437A (en) 2021-07-12 2021-07-12 Sweeping robot and obstacle avoidance method thereof


Publications (1)

Publication Number Publication Date
CN113397437A true CN113397437A (en) 2021-09-17

Family

ID=77685940

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110784179.4A Withdrawn CN113397437A (en) 2021-07-12 2021-07-12 Sweeping robot and obstacle avoidance method thereof

Country Status (1)

Country Link
CN (1) CN113397437A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107092252A (en) * 2017-04-11 2017-08-25 杭州光珀智能科技有限公司 A kind of robot automatic obstacle avoidance method and its device based on machine vision
CN109213137A (en) * 2017-07-05 2019-01-15 广东宝乐机器人股份有限公司 sweeping robot, sweeping robot system and its working method
CN110622085A (en) * 2019-08-14 2019-12-27 珊口(深圳)智能科技有限公司 Mobile robot and control method and control system thereof
CN110353583A (en) * 2019-08-21 2019-10-22 追创科技(苏州)有限公司 The autocontrol method of sweeping robot and sweeping robot
WO2021103987A1 (en) * 2019-11-29 2021-06-03 深圳市杉川机器人有限公司 Control method for sweeping robot, sweeping robot, and storage medium

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113838274A (en) * 2021-09-19 2021-12-24 安徽淘云科技股份有限公司 Intelligent desk and learning state monitoring method
CN113838274B (en) * 2021-09-19 2023-11-24 安徽淘云科技股份有限公司 Intelligent desk and supervision method for learning state
WO2023179207A1 (en) * 2022-03-22 2023-09-28 追觅创新科技(苏州)有限公司 Map processing method and apparatus, cleaning device, storage medium, and electronic apparatus

Similar Documents

Publication Publication Date Title
CN113397437A (en) Sweeping robot and obstacle avoidance method thereof
US11878425B1 (en) Robot task optimization based on historical task and location correlated durations
CN111209978B (en) Three-dimensional visual repositioning method and device, computing equipment and storage medium
EP3643015A1 (en) Temporal graph system to facilitate network data flow analytics
CN111197985B (en) Area identification method, path planning method, device and storage medium
CN104737085A (en) Robot and method for autonomous inspection or processing of floor areas
CN113110445B (en) Robot path planning method and device, robot and storage medium
Babu et al. An autonomous path finding robot using Q-learning
CN111714028A (en) Method, device and equipment for escaping from restricted zone of cleaning equipment and readable storage medium
Voisan et al. ROS-based robot navigation and human interaction in indoor environment
Bormann et al. Autonomous dirt detection for cleaning in office environments
Cox et al. Probabilistic data association for dynamic world modeling: A multiple hypothesis approach
CN115609594B (en) Planning method and device for mechanical arm path, upper control end and storage medium
CN111160187A (en) Method, device and system for detecting left-behind object
Hu et al. Accuracy enhancement for the front-end tracking algorithm of RGB-D SLAM
CN109491391A (en) The collision recognition method and apparatus of window wiping robot
Pellier et al. Coordinated exploration of unknown labyrinthine environments applied to the pursuit evasion problem
Kobayashi et al. Noise reduction of segmentation results using spatio-temporal filtering for robot navigation
CN111035317A (en) Method and device for detecting and processing local predicament and computing equipment
CN111552286B (en) Robot and movement control method and device thereof
He Face tracking using Kanade-Lucas-Tomasi algorithm
EP3798976B1 (en) Method and system for determining dynamism in a scene by processing a depth image
CN118172514A (en) Robot dynamic SLAM method and system for intelligent storage dynamic environment
Liu et al. DOE: a dynamic object elimination scheme based on geometric and semantic constraints
Zheng et al. Semantic map construction approach for human-robot collaborative manufacturing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20210917