CN116211168A - Operation control method and device of cleaning equipment, storage medium and electronic device - Google Patents

Operation control method and device of cleaning equipment, storage medium and electronic device Download PDF

Info

Publication number
CN116211168A
Authority
CN
China
Prior art keywords
target
area
virtual
cleaning device
exclusion zone
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111464917.3A
Other languages
Chinese (zh)
Inventor
Name withheld at the inventor's request
薄慕婷
丘伟楠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dreame Technology Suzhou Co ltd
Original Assignee
Dreame Technology Suzhou Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dreame Technology Suzhou Co ltd filed Critical Dreame Technology Suzhou Co ltd
Priority to CN202111464917.3A priority Critical patent/CN116211168A/en
Priority to PCT/CN2022/131571 priority patent/WO2023098455A1/en
Publication of CN116211168A publication Critical patent/CN116211168A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24 Floor-sweeping machines, motor-driven
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011 Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A47L11/4061 Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Inking, Control Or Cleaning Of Printing Machines (AREA)

Abstract

The application provides an operation control method and device for a cleaning device, a storage medium, and an electronic apparatus. The method includes: obtaining virtual exclusion zone information corresponding to a target area map, where the target area map is the area map to which the area to be cleaned by the cleaning device belongs, and the virtual exclusion zone information indicates a virtual exclusion zone in the area map; performing target detection through a first sensor on the cleaning device to obtain target object information, where the target object information represents a target scene object, in the current area of the cleaning device, that matches the virtual exclusion zone; and, when the cleaning device becomes trapped while cleaning the area to be cleaned, controlling the cleaning device to perform a target escape operation according to the virtual exclusion zone information and the target object information, so that after escaping the cleaning device is located outside the virtual exclusion zone. This method solves the problem in the related art that a cleaning device is easily trapped because of errors in establishing the virtual exclusion zone.

Description

Operation control method and device of cleaning equipment, storage medium and electronic device
[Technical Field]
The present application relates to the field of smart home, and in particular to an operation control method and device for a cleaning device, a storage medium, and an electronic apparatus.
[Background Art]
Currently, an application program paired with a cleaning device (e.g., a cleaning robot) can run on a user's terminal device. By configuring virtual exclusion zones on the area map displayed in the application's configuration interface, the user can set the areas in which the cleaning device is, and is not, permitted to clean. In addition, the cleaning device may establish virtual exclusion zones itself based on its history of becoming trapped.
For example, by placing virtual walls between rooms, the user can designate the rooms in which the cleaning robot is allowed to perform area cleaning and those in which it is not. When the cleaning robot becomes trapped under a piece of furniture because its top surface is stuck, it can mark the area occupied by that furniture as a virtual exclusion zone based on the trapped record.
However, the area map displayed on the terminal device can differ significantly in scale from the actual room scene, so a user-created virtual exclusion zone carries errors; a virtual exclusion zone established by the cleaning device itself also carries errors, because the information it can acquire is limited. Because of these errors in establishing the virtual exclusion zone, the cleaning device is easily trapped, which degrades the user experience.
It can be seen that the operation control methods for cleaning devices in the related art suffer from the problem that the cleaning device is easily trapped due to errors in establishing the virtual exclusion zone.
[Summary of the Invention]
The present application provides an operation control method and device for a cleaning device, a storage medium, and an electronic apparatus, so as to at least solve the problem in the related art that a cleaning device is easily trapped due to errors in establishing a virtual exclusion zone.
The purpose of the present application is achieved through the following technical solutions:
according to an aspect of the embodiments of the present application, there is provided an operation control method of a cleaning apparatus, including: obtaining virtual exclusion zone information corresponding to a target area map, wherein the target area map is an area map of an area to be cleaned of the cleaning equipment, and the virtual exclusion zone information is used for indicating a virtual exclusion zone in the area map; performing target detection through a first sensor on the cleaning equipment to obtain target object information, wherein the target object information is used for representing a target scene object matched with the virtual exclusion zone in a current area where the cleaning equipment is located; and under the condition that the cleaning equipment is trapped in the process of cleaning the area to be cleaned, controlling the cleaning equipment to execute target trapping operation according to the virtual exclusion zone information and the target object information, wherein the cleaning equipment after trapping is positioned outside the virtual exclusion zone.
In an exemplary embodiment, performing target detection through the first sensor on the cleaning device to obtain target object information includes: performing target recognition on the point cloud data collected by the first sensor to obtain the target object information, where the target object information is the object point cloud of the target scene object.
In an exemplary embodiment, performing target recognition on the point cloud data collected by the first sensor to obtain the target object information includes: performing target recognition on the point cloud data collected by the first sensor to obtain candidate object information, where the candidate object information is the object point clouds of candidate objects contained in the current area; and selecting, from the candidate objects, the target scene object that matches the virtual exclusion zone according to the position information of the candidate objects and the virtual exclusion zone information, thereby obtaining the target object information.
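The candidate-selection step above can be sketched as follows. The centroid representation, the rectangle format, and the margin are illustrative assumptions; the patent only fixes that candidates are matched to the zone by position.

```python
# Hedged sketch: select, from the candidate objects recognized in the point
# cloud, those whose positions match a virtual exclusion zone. `candidates`
# maps an object label to its estimated (x, y) centroid; the rectangle
# format and the margin value are illustrative, not the patent's.

def select_matching_objects(candidates, zone_rect, margin=0.2):
    x0, y0, x1, y1 = zone_rect
    matched = {}
    for label, (cx, cy) in candidates.items():
        # A candidate "matches" the zone if its centroid lies within the
        # zone rectangle expanded by a small margin.
        if x0 - margin <= cx <= x1 + margin and y0 - margin <= cy <= y1 + margin:
            matched[label] = (cx, cy)
    return matched
```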
In an exemplary embodiment, controlling the cleaning device to perform the target escape operation according to the virtual exclusion zone information and the target object information includes: when it is determined from the target object information that the target scene object is a scene object the cleaning device is allowed to climb over, and the cleaning device has passed through a virtual wall of the virtual exclusion zone into a region of the target area map other than the area to be cleaned, controlling the cleaning device to climb over the target scene object and return to the area to be cleaned.
In an exemplary embodiment, controlling the cleaning device to perform the target escape operation according to the virtual exclusion zone information and the target object information includes: when it is determined from the target object information that the target scene object is a scene object of a target type, and the cleaning device has entered, through a virtual wall of the virtual exclusion zone, the target object area where the target scene object is located, controlling the cleaning device to perform the target escape operation matched with the target type, where the target type includes at least one of the following: a type whose bottom does not allow the cleaning device to pass through, and a type whose distance from the wall is less than or equal to a distance threshold.
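A minimal dispatch on the two target types named in this embodiment might look like the sketch below. The strategy names are hypothetical labels for the operations described in the following embodiment, not APIs from the patent.

```python
# Illustrative dispatch of the escape operation on the detected object's
# type. The two flags mirror the claim: a bottom the device cannot pass
# under, and an object whose wall distance is at or below a threshold.

BOTTOM_BLOCKED = "bottom_blocked"  # bottom does not allow the device through
NEAR_WALL = "near_wall"            # distance to wall <= distance threshold

def choose_escape(target_types):
    # Hypothetical priority: a blocked bottom forces retracing the entry
    # track; a near-wall object triggers boundary following.
    if BOTTOM_BLOCKED in target_types:
        return "retrace_entry_track_to_outlet"
    if NEAR_WALL in target_types:
        return "follow_boundary_until_clear"
    return "default_escape"
```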
In an exemplary embodiment, controlling the cleaning device to perform the target escape operation matched with the target type includes: when the target type includes a type whose bottom does not allow the cleaning device to pass through, collecting point cloud data through a second sensor on the cleaning device to obtain target point cloud data, identifying, from the target point cloud data, an outlet matched with the movement track along which the cleaning device entered the target object area, where the size of the outlet allows the cleaning device to pass through, and controlling the cleaning device to move out of the target object area from the outlet along the movement track; and, when the target type includes a type whose distance from the wall is less than or equal to the distance threshold, controlling the cleaning device to move along a target boundary detected by a distance sensor of the cleaning device until it moves out of the target object area, where the target boundary is at least one of the following: the wall, and the boundary of the target scene object.
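The "move out from the outlet along the entry track" operation can be sketched as retracing the recorded entry trajectory while checking clearance at each waypoint. The clearance callback and device width are illustrative; the patent leaves the geometry check unspecified.

```python
# Simplified sketch: the device retraces its recorded entry trajectory in
# reverse, checking at each step that the free clearance (e.g. derived
# from second-sensor point cloud data) admits the device width. All
# widths and track values here are illustrative assumptions.

def plan_exit_path(entry_track, clearance_at, device_width):
    """entry_track: list of (x, y) waypoints from zone entry to current pose.
    clearance_at(p): free width at waypoint p.
    Returns the reversed track if every waypoint is passable, else None."""
    exit_path = list(reversed(entry_track))
    for waypoint in exit_path:
        if clearance_at(waypoint) < device_width:
            return None  # outlet too narrow: fall back to another strategy
    return exit_path
```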
In an exemplary embodiment, controlling the cleaning device to perform the target escape operation according to the virtual exclusion zone information and the target object information includes: when the cleaning device is trapped because a cliff is detected or a wheel of the cleaning device is detected to be in a dropped state, controlling the cleaning device to perform a first escape operation, where the first escape operation controls the cleaning device to leave the detected cliff or controls the wheel to leave the dropped state, and the cleaning device disregards the virtual exclusion zone while performing the first escape operation; and, after the first escape operation is performed, when the cleaning device is detected to have passed through a virtual wall of the virtual exclusion zone, controlling the cleaning device to perform a second escape operation, where the second escape operation controls the cleaning device to leave the range of the virtual exclusion zone.
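The two-stage escape described in this embodiment can be sketched with stub callables: stage one ignores the exclusion zone to clear the physical hazard, and stage two runs only if stage one left the device inside a zone. All names are illustrative.

```python
# Sketch of the two-stage escape: stage one disregards the virtual
# exclusion zone to clear a cliff / dropped-wheel state; stage two runs
# only if the device then finds itself inside a zone. Callables are stubs.

def two_stage_escape(cliff_detected, leave_cliff, inside_zone, leave_zone):
    if cliff_detected():
        leave_cliff()   # first escape: exclusion zones are disregarded
    if inside_zone():
        leave_zone()    # second escape: move back outside the zone
        return "second_escape_performed"
    return "no_second_escape_needed"
```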
In an exemplary embodiment, after target detection is performed through the first sensor on the cleaning device, the method further includes: when the boundary of the target scene object is located inside the area to be cleaned, controlling the cleaning device to clean along the boundary of the target scene object; and, when the boundary of the target scene object is located outside the area to be cleaned, controlling the cleaning device to clean along the virtual wall of the virtual exclusion zone.
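This edge-cleaning decision reduces to a simple predicate over the object's boundary points. The point-list representation is an assumption made for illustration.

```python
# Sketch of the edge-cleaning choice: follow the scene object's real
# boundary when that boundary lies inside the area to be cleaned,
# otherwise fall back to the zone's virtual wall.

def edge_cleaning_path(object_boundary, virtual_wall, in_cleaning_area):
    """object_boundary / virtual_wall: lists of (x, y) points.
    in_cleaning_area(p): True if point p lies in the area to be cleaned."""
    if all(in_cleaning_area(p) for p in object_boundary):
        return object_boundary  # follow the real object's edge
    return virtual_wall         # follow the virtual wall instead
```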
According to another aspect of the embodiments of the present application, an operation control device of a cleaning device is also provided, including: a first obtaining unit, configured to obtain virtual exclusion zone information corresponding to a target area map, where the target area map is the area map to which the area to be cleaned by the cleaning device belongs, and the virtual exclusion zone information indicates a virtual exclusion zone in the area map; a detection unit, configured to perform target detection through a first sensor on the cleaning device to obtain target object information, where the target object information represents a target scene object, in the current area of the cleaning device, that matches the virtual exclusion zone; and a first control unit, configured to control the cleaning device, when it becomes trapped while cleaning the area to be cleaned, to perform a target escape operation according to the virtual exclusion zone information and the target object information, where the cleaning device, after escaping, is located outside the virtual exclusion zone.
In an exemplary embodiment, the detection unit includes: a first recognition module, configured to perform target recognition on the point cloud data collected by the first sensor to obtain the target object information, where the target object information is the object point cloud of the target scene object.
In an exemplary embodiment, the first recognition module includes: a recognition sub-module, configured to perform target recognition on the point cloud data collected by the first sensor to obtain candidate object information, where the candidate object information is the object point clouds of candidate objects contained in the current area; and a selection sub-module, configured to select, from the candidate objects, the target scene object that matches the virtual exclusion zone according to the position information of the candidate objects and the virtual exclusion zone information, thereby obtaining the target object information.
In an exemplary embodiment, the first control unit includes: a first control module, configured to control the cleaning device to climb over the target scene object and return to the area to be cleaned when it is determined from the target object information that the target scene object is a scene object the cleaning device is allowed to climb over, and the cleaning device has passed through a virtual wall of the virtual exclusion zone into a region of the target area map other than the area to be cleaned.
In an exemplary embodiment, the first control unit includes: a second control module, configured to control the cleaning device to perform the target escape operation matched with the target type when it is determined from the target object information that the target scene object is a scene object of a target type, and the cleaning device has entered, through a virtual wall of the virtual exclusion zone, the target object area where the target scene object is located, where the target type includes at least one of the following: a type whose bottom does not allow the cleaning device to pass through, and a type whose distance from the wall is less than or equal to a distance threshold.
In an exemplary embodiment, the first control unit includes: a collection module, configured to collect point cloud data through a second sensor on the cleaning device to obtain target point cloud data when the target type includes a type whose bottom does not allow the cleaning device to pass through; a second recognition module, configured to identify, from the target point cloud data, an outlet matched with the movement track along which the cleaning device entered the target object area, where the size of the outlet allows the cleaning device to pass through; a third control module, configured to control the cleaning device to move out of the target object area from the outlet along the movement track; and a fourth control module, configured to control the cleaning device, when the target type includes a type whose distance from the wall is less than or equal to the distance threshold, to move along a target boundary detected by a distance sensor of the cleaning device until it moves out of the target object area, where the target boundary is at least one of the following: the wall, and the boundary of the target scene object.
In an exemplary embodiment, the first control unit includes: a fifth control module, configured to control the cleaning device to perform a first escape operation when the cleaning device is trapped because a cliff is detected or a wheel of the cleaning device is detected to be in a dropped state, where the first escape operation controls the cleaning device to leave the detected cliff or controls the wheel to leave the dropped state, and the cleaning device disregards the virtual exclusion zone while performing the first escape operation; and a sixth control module, configured to control the cleaning device to perform a second escape operation when, after the first escape operation is performed, the cleaning device is detected to have passed through a virtual wall of the virtual exclusion zone, where the second escape operation controls the cleaning device to leave the range of the virtual exclusion zone.
In an exemplary embodiment, the device further includes: a second control unit, configured to control the cleaning device, after target detection is performed through the first sensor on the cleaning device, to clean along the boundary of the target scene object when that boundary is located inside the area to be cleaned; and a third control unit, configured to control the cleaning device to clean along the virtual wall of the virtual exclusion zone when the boundary of the target scene object is located outside the area to be cleaned.
According to yet another aspect of the embodiments of the present application, a computer-readable storage medium is also provided, in which a computer program is stored, where the computer program is configured to perform the above operation control method of the cleaning device when run.
According to still another aspect of the embodiments of the present application, an electronic device is also provided, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor executes the above operation control method of the cleaning device through the computer program.
In the embodiments of the present application, the real escape scene is combined with the virtual exclusion zone information: virtual exclusion zone information corresponding to a target area map is obtained, where the target area map is the area map of the area to be cleaned by the cleaning device and the virtual exclusion zone information indicates a virtual exclusion zone in the area map; target detection is performed through a first sensor on the cleaning device to obtain target object information, where the target object information represents a target scene object, in the current area of the cleaning device, that matches the virtual exclusion zone; and, when the cleaning device becomes trapped while cleaning the area to be cleaned, the cleaning device is controlled to perform a target escape operation according to the virtual exclusion zone information and the target object information, so that the escaped cleaning device is located outside the virtual exclusion zone. This solves the problem that the cleaning device is easily trapped due to errors in establishing the virtual exclusion zone.
[Description of the Drawings]
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
To describe the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings required in the description of the embodiments or the prior art are briefly introduced below; for those skilled in the art, other drawings can be obtained from these drawings without inventive effort.
FIG. 1 is a schematic illustration of a hardware environment of an alternative method of operation control of a cleaning device according to an embodiment of the present application;
FIG. 2 is a flow chart of an alternative method of controlling operation of a cleaning device according to an embodiment of the present application;
FIG. 3 is a schematic diagram of an alternative virtual exclusion zone according to an embodiment of the present application;
FIG. 4 is a schematic diagram of an alternative virtual exclusion zone according to an embodiment of the present application;
FIG. 5 is a schematic diagram of yet another alternative virtual exclusion zone according to an embodiment of the present application;
FIG. 6 is a schematic view of an alternative slide rail according to an embodiment of the present application;
FIG. 7 is a block diagram of an alternative operation control device for a cleaning appliance according to an embodiment of the present application;
fig. 8 is a block diagram of an alternative electronic device according to an embodiment of the present application.
[Detailed Description]
The present application will be described in detail hereinafter with reference to the accompanying drawings in conjunction with embodiments. It should be noted that, in the case of no conflict, the embodiments and features in the embodiments may be combined with each other.
It should be noted that the terms "first," "second," and the like in the description and claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order.
According to one aspect of the embodiments of the present application, a method of controlling operation of a cleaning device is provided. Alternatively, in the present embodiment, the above-described operation control method of the cleaning device may be applied to a hardware environment constituted by the terminal device 102, the cleaning device 104, and the server 106 as shown in fig. 1. As shown in fig. 1, the terminal device 102 may be connected to the cleaning device 104 and/or the server 106 (e.g., an internet of things platform or cloud server) through a network to control the cleaning device 104, e.g., bind with the cleaning device 104, configure the cleaning function of the cleaning device 104. The cleaning device 104 may include a host and a base station (e.g., a sweeper and base station, a washer and base station) that may be connected via a network to determine a current status (e.g., a power status, an operational status, location information, etc.) of the opposite end.
The network may include, but is not limited to, at least one of: a wired network, a wireless network. The wired network may include, but is not limited to, at least one of: a wide area network, a metropolitan area network, a local area network. The wireless network may include, but is not limited to, at least one of: WiFi (Wireless Fidelity), Bluetooth, infrared. The network through which the terminal device 102 communicates with the cleaning device 104 and/or the server 106 may be the same as, or different from, the network through which the cleaning device 104 communicates with the server 106. The terminal device 102 may be, but is not limited to, a PC, a mobile phone, a tablet computer, or the like; the cleaning device 104 may include, but is not limited to, a self-cleaning robot such as an automatic mop-washing robot or a sweeping robot; and the server 106 may be a server of an internet of things platform.
The operation control method of the cleaning device according to the embodiments of the present application may be executed by the terminal device 102, the cleaning device 104, or the server 106 alone, or jointly by at least two of the terminal device 102, the cleaning device 104, and the server 106. When executed by the terminal device 102 or the cleaning device 104, the method may be performed by a client installed on it.
Taking execution by the cleaning device 104 as an example, FIG. 2 is a flow chart of an alternative operation control method of a cleaning device according to an embodiment of the present application. As shown in FIG. 2, the method may include the following steps:
step S202, obtaining virtual exclusion zone information corresponding to a target area map, wherein the target area map is an area map of an area to be cleaned of the cleaning device, and the virtual exclusion zone information is used for indicating a virtual exclusion zone in the area map.
The operation control method of the cleaning device in this embodiment can be applied to the following scenario: while the cleaning device cleans the area to be cleaned in a target region, its operation is controlled by combining the real scene with the virtual exclusion zone, thereby reducing the probability that the cleaning device becomes trapped. The target region may be an indoor area of a home; another area such as a restaurant, an office, or a factory workshop; or any other area that a cleaning device can clean. The cleaning device may be an intelligent vacuum cleaner, an intelligent sweeper that integrates sweeping and mopping, or another robot with a cleaning function, which is not limited in this embodiment.
The area that the cleaning device is to clean is the area to be cleaned. Before or after area cleaning starts, the cleaning device may acquire a target area map corresponding to the target region; the target area map is the area map to which the area to be cleaned belongs, and the area to be cleaned may be all or part of the target region. Based on the configured cleaning region information, the cleaning device can determine the area to be cleaned. The cleaning region information may be default region information (for example, cleaning all of the target region, or a specific part of it, by default), or it may be generated by the terminal device in response to a selection operation performed on the target area map and delivered to the cleaning device through the server, where the selection operation may select the sub-areas to be cleaned from the sub-areas contained in the target region.
The target area map may be an area image or area point cloud built by the cleaning device using an image acquisition component (e.g., a camera or a laser sensor), or a map obtained in another way, for example, a map received from the terminal device, which may in turn be an area image or area point cloud built by the image acquisition component of another device. This is not limited in this embodiment.
The target area map may further carry virtual exclusion zone information, which the cleaning device can also acquire. The virtual exclusion zone information indicates a virtual exclusion zone in the target area map: an area, delimited by a virtual wall, an exclusion line, or the like, that the cleaning device is prohibited from entering. The virtual wall and the exclusion line are virtual region boundaries, so physically the cleaning device can still cross them into the virtual exclusion zone. The virtual exclusion zone may be established by the user on the target area map displayed on the terminal device, or established by the cleaning device based on its history of trapped records; this embodiment does not limit how the virtual exclusion zone is established.
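Since the virtual walls only delimit a region on the map, the device needs a geometric test to know whether a pose lies inside the zone. A common representation (an assumption here, not fixed by the patent) is a polygon of virtual-wall vertices, with a standard ray-casting membership test:

```python
# Illustrative zone-membership test: the virtual exclusion zone is
# modeled as a polygon of virtual-wall vertices (an assumption), and a
# standard ray-casting test decides whether a pose lies inside it.

def point_in_zone(polygon, x, y):
    """polygon: list of (x, y) vertices of the virtual exclusion zone."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a ray cast to the right of (x, y).
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```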
For example, to prevent the cleaning device from being damaged or unable to continue working after falling down a staircase or becoming trapped under furniture, a virtual exclusion zone may be placed at the staircase entrance or around the furniture in the room area map.
Step S204: perform target detection through the first sensor on the cleaning device to obtain target object information, where the target object information represents a target scene object, in the current area of the cleaning device, that matches the virtual exclusion zone.
To avoid the cleaning device being trapped because the established virtual exclusion zone deviates from the real scene, the cleaning device can perform target detection through its first sensor and determine the scene objects in its current area. From the position information of the virtual exclusion zone and the detected position information of the scene objects, a target scene object matched with the virtual exclusion zone can be determined, and the object information of that target scene object, i.e., the target object information, is obtained.
The first sensor may be an image acquisition device or another device with an object recognition function; for example, it may be an LDS (Laser Distance Sensor) laser ranging sensor, which collects point cloud data by emitting laser light, the target object information then being determined from the acquired point cloud data. One or more first sensors may be disposed on an outer side (e.g., front, left, right, or rear), the top, or the bottom of the cleaning device, and multiple first sensors may be disposed on different sides of the cleaning device.
The current area where the cleaning device is located is the acquisition area of the first sensor, i.e., the area range the cleaning device can currently perceive. That range may belong to the area to be cleaned (with the cleaning device located inside it), may belong to other areas of the target area outside the area to be cleaned (for example, when the cleaning device has entered them by crossing the virtual exclusion zone), or may partly belong to each. This is not limited in this embodiment.
In this embodiment, the cleaning device may perform object recognition (scene object recognition) on the scene data acquired by the first sensor and determine the detected scene objects. The cleaning device may match the collected scene data against the reference object features of the various reference objects together with the virtual exclusion zone information, determine the target scene objects matched with those reference objects, and obtain the object information of all target scene objects in the current area, i.e., the target object information.
It should be noted that the plurality of reference objects to be recognized (i.e., the objects against which object recognition is performed) and the reference object features of each may be preconfigured. Each reference object may be a scene object of a predetermined object type, and the corresponding reference object features may be prestored in the cleaning device or on the server side, so that either may perform the object recognition. The predetermined object types may be the same or different for different region types. This is not limited in this embodiment.
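A toy sketch of such a preconfigured lookup, mapping region types to the predetermined object types to recognize (the region names, object names, and fallback rule are all illustrative assumptions, not from the patent):

```python
# Hypothetical preconfigured reference objects per region type, as the
# embodiment describes; all names below are illustrative only.
REFERENCE_OBJECTS = {
    "living_room": {"sliding_rail", "step", "dark_carpet", "furniture_bottom"},
    "balcony": {"sliding_rail", "step", "clothes_rack"},
}

def objects_to_recognize(region_type: str) -> set:
    """Return the predetermined object types to match in a region.

    For a region type with no dedicated entry, fall back to the union of
    all configured types (an assumed policy for the sketch).
    """
    if region_type in REFERENCE_OBJECTS:
        return REFERENCE_OBJECTS[region_type]
    return set().union(*REFERENCE_OBJECTS.values())
```

The same table could equally live on the server side, per the embodiment.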
For example, the sweeper can detect obstacles through its sensors to obtain the obstacles matched with the virtual exclusion zone, such as steps, sliding rails, and dark carpets.
In step S206, when the cleaning device is trapped while cleaning the area to be cleaned, the cleaning device is controlled to execute a target escape operation according to the virtual exclusion zone information and the target object information, such that the cleaning device, once freed, is located outside the virtual exclusion zone.
While cleaning the area to be cleaned, the cleaning device may become trapped at a position either within the area to be cleaned or elsewhere in the target area; for example, while avoiding an obstacle the cleaning device may pass through the virtual wall into another area of the target area, become stuck under furniture, or trigger its downward-looking (cliff) or wheel-drop detection. After determining that it is trapped, the cleaning device may free itself according to a configured escape strategy.
Because the virtual exclusion zone is not fused with and corrected against the real scene, the machine (i.e., the cleaning device) cannot appreciate the real purpose for which the virtual exclusion zone was established. Moreover, in a complex scene that combines a real escape scenario (obstacle crossing, steps, narrow areas, and the like) with a virtual exclusion zone, the machine on one hand treats the virtual exclusion zone as a real wall and cannot escape from a narrow area, while on the other hand escaping in the complex scene may cause the machine to mistakenly enter or pass through the exclusion zone, so the machine raises a trapped alarm. The escape schemes of cleaning devices in the related art therefore cannot recognize escape scenarios that combine a real complex scene with a virtual exclusion zone, and lack corresponding escape methods and compensation schemes.
Consequently, in a complex escape scenario consisting of a real escape scene and a virtual exclusion zone, the escape strategy of the cleaning device is imperfect, the compensation scheme after being trapped is incomplete, the overall behavior is not intelligent enough, and the escape scenario may be identified incorrectly (for example, downward-looking or drop detection is falsely triggered). In this embodiment, when escaping, the cleaning device may combine the real scene with the virtual exclusion zone information, comprehensively determine the purpose for which the virtual exclusion zone was established, determine the current escape scenario, and apply the corresponding escape strategy, thereby improving the escape behavior with respect to the virtual exclusion zone, making escape more intelligent, reducing the trapped rate of the cleaning device, and enhancing the user experience.
In this embodiment, the cleaning device may determine the current trapped scenario type according to the virtual exclusion zone information and the target object information, and be controlled to perform an escape operation matched with that type, i.e., the target escape operation. After the target escape operation is executed, the cleaning device is freed from the trapped scenario and is located outside the virtual exclusion zone on the map.
Optionally, the cleaning device may combine the real scene (e.g., the articles in the scene) with the virtual exclusion zone information, comprehensively determine the purpose of establishing the virtual exclusion zone, and identify the escape scenario corresponding to the established virtual exclusion zone, where the escape scenario may include, but is not limited to, at least one of the following:
Escape scenario 1: the machine (cleaning device) easily recognizes an obstacle it could surmount, but the crossing is blocked by the exclusion zone, i.e., the machine could easily pass over the obstacle but the virtual exclusion zone forbids doing so;
Escape scenario 2: the machine's upper surface gets stuck, or it escapes via LDS collision detection, while also needing to stay clear of the virtual wall, i.e., the machine's upper surface would get stuck inside the virtual exclusion zone or it can escape through LDS collision detection, and the escape must simultaneously keep the machine away from the virtual exclusion zone;
Escape scenario 3: a triggered downward-looking or drop detection is handled first, the virtual exclusion zone (e.g., its virtual wall) is handled next, and other obstacles are handled last.
Optionally, in combination with the identified escape scenario, the cleaning device may determine the escape operation to be performed: for example, moving away from the virtual exclusion zone; or first disregarding the virtual exclusion zone and, after leaving the obstacle or passing through the virtual exclusion zone into the area to be cleaned, moving away from it; or first disregarding the virtual exclusion zone to handle a downward-looking or drop trigger and then moving away from it; and so on. This is not limited in this embodiment.
Through steps S202 to S206, virtual exclusion zone information corresponding to the target area map is obtained, where the target area map is the area map to which the area to be cleaned by the cleaning device belongs and the virtual exclusion zone information indicates a virtual exclusion zone in that map; target detection is performed through the first sensor on the cleaning device to obtain target object information representing the target scene object in the current area that matches the virtual exclusion zone; and when the cleaning device is trapped while cleaning the area to be cleaned, the cleaning device is controlled to execute the target escape operation according to the virtual exclusion zone information and the target object information. This solves the problem in the related art that the cleaning device is easily trapped due to errors in establishing the virtual exclusion zone, reduces the trapped rate of the cleaning device, and improves the user experience.
In one exemplary embodiment, performing target detection by the first sensor on the cleaning device to obtain the target object information includes:
S11, performing target recognition on the point cloud data acquired by the first sensor to obtain the target object information, where the target object information is the object point cloud of the target scene object.
In this embodiment, the first sensor may be a point cloud sensor, i.e., a sensor for collecting point cloud data. While cleaning the area to be cleaned, the cleaning device can collect point cloud data through the point cloud sensor; the collected point cloud data may contain the point cloud data of obstacles within the acquisition range.
Optionally, data acquisition by the first sensor (e.g., a point cloud sensor) may be performed periodically, e.g., in real time with one acquisition every target time period (e.g., 1 s), or may be triggered by an event, e.g., when a collision of the cleaning device is detected and the distance between the cleaning device and the virtual exclusion zone is less than or equal to a first distance threshold.
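The two acquisition triggers just described can be sketched as a single predicate (the period and threshold values below are illustrative, not specified by the patent):

```python
# Sketch of the acquisition triggers: periodic, or event-driven on a
# collision near the exclusion zone. Numeric values are assumptions.
TARGET_PERIOD_S = 1.0            # target time period between acquisitions
FIRST_DISTANCE_THRESHOLD = 0.3   # first distance threshold, in metres

def should_acquire(elapsed_s: float, collided: bool, dist_to_zone: float) -> bool:
    """Acquire point cloud data when the target period has elapsed, or when
    the device collides while within the first distance threshold of the
    virtual exclusion zone."""
    if elapsed_s >= TARGET_PERIOD_S:
        return True
    return collided and dist_to_zone <= FIRST_DISTANCE_THRESHOLD
```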
After acquiring the point cloud data, the cleaning device may perform target recognition. For example, it may match the collected point cloud data against the reference object point clouds of the various reference objects together with the virtual exclusion zone information, determine the target scene objects matched with those reference objects, and thereby obtain the object point clouds of all target scene objects in the current area, i.e., the target object information.
By using a point cloud sensor to identify scene objects, this embodiment can improve the accuracy and convenience of scene object recognition.
In an exemplary embodiment, performing target recognition on the point cloud data acquired by the first sensor to obtain the target object information includes:
S21, performing target recognition on the point cloud data acquired by the first sensor to obtain candidate object information, where the candidate object information is the object point clouds of the candidate objects contained in the current area;
S22, selecting, from the candidate objects, a target scene object matched with the virtual exclusion zone according to the position information of the candidate objects and the virtual exclusion zone information, to obtain the target object information.
In this embodiment, when performing target recognition, the cleaning device may first perform target recognition on the collected point cloud data to obtain the candidate object information, i.e., the object information of the candidate objects identified in the current region, where each candidate object may be a scene object of a predetermined object type and its object information may be its object point cloud.
Optionally, to identify candidate objects, the cleaning device may first determine the objects to be identified through contour detection, i.e., the detected objects that may be the desired scene objects; then the object information of each object to be identified (e.g., its object point cloud) is matched with the object information of the reference objects, and the object information that matches a reference object is determined to be the object information of a candidate object, i.e., the object point cloud of that candidate object.
The cleaning device may then determine the target scene object matched with the virtual exclusion zone according to the position information of the virtual exclusion zone and the object information of the candidate objects, obtaining the target object information, which may be the object point cloud of the target scene object.
Optionally, the cleaning device may determine the target scene object matched with the virtual exclusion zone based on the position information of the virtual exclusion zone and the position information of the candidate objects, which may be included in their object information. The manner of determining the matched target scene object may include, but is not limited to, at least one of:
determining a candidate object located in the virtual exclusion zone as a target scene object matched with the virtual exclusion zone;
determining a candidate object whose distance from the boundary of the virtual exclusion zone is less than or equal to the first distance threshold as a target scene object matched with the virtual exclusion zone;
determining a candidate object whose shape matches the boundary shape of the virtual exclusion zone as a target scene object matched with the virtual exclusion zone.
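The three matching rules can be sketched as one predicate over a candidate object (the dict representation, the rectangular zone, and the 10-degree parallelism tolerance are all assumptions for illustration; the patent leaves these open):

```python
import math

def matches_exclusion_zone(obj: dict, zone: dict, dist_threshold: float = 0.3) -> bool:
    """Apply the three matching rules above to one candidate object.

    `obj` is a hypothetical dict with the candidate's centre (x, y), its
    distance to the zone boundary, and the orientation of its own boundary;
    `zone` is a rectangle with a boundary orientation. All illustrative.
    """
    # Rule 1: candidate lies inside the virtual exclusion zone.
    inside = (zone["x_min"] <= obj["x"] <= zone["x_max"]
              and zone["y_min"] <= obj["y"] <= zone["y_max"])
    # Rule 2: candidate is within the first distance threshold of the boundary.
    near = obj["boundary_dist"] <= dist_threshold
    # Rule 3 (shape match): boundary orientations nearly parallel (~10 degrees).
    angle = abs(obj["heading"] - zone["heading"]) % math.pi
    parallel = min(angle, math.pi - angle) < math.radians(10)
    return inside or near or parallel
```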
For example, at a sliding rail or step between a living room and a balcony, it is calculated whether the virtual exclusion zone and the step are nearly parallel and of similar width; if so, it can be determined that the scene object matched with the virtual exclusion zone is the sliding rail or the step.
By first identifying the candidate objects in the scene and then determining the scene object matched with the virtual exclusion zone from the identified candidates' object information and the zone's position information, this embodiment can improve the efficiency of determining the matched scene object.
In one exemplary embodiment, controlling the cleaning device to perform the target escape operation according to the virtual exclusion zone information and the target object information includes:
S31, in the case where it is determined from the target object information that the target scene object is a scene object the cleaning device is allowed to surmount, and the cleaning device has passed through the virtual wall of the virtual exclusion zone into an area of the target area map other than the area to be cleaned, controlling the cleaning device to pass over the target scene object and re-enter the area to be cleaned.
In this embodiment, if the current escape scenario is determined from the target object information to be escape scenario 1, i.e., the machine easily recognizes a surmountable obstacle whose crossing is blocked by the exclusion zone, the cleaning device may calculate its distance to the virtual exclusion zone and then determine, from the target area map and the target object information, which side of the exclusion zone it is on.
If the cleaning device is located within the area to be cleaned and has not passed through the virtual wall of the virtual exclusion zone, it may adopt a strategy of staying away from the virtual wall, and may also sweep along a specific boundary. If the cleaning device has passed through the virtual wall into a region of the target area map other than the area to be cleaned, it can be controlled to pass over the target scene object and re-enter the area to be cleaned; after entering the area to be cleaned, it may again stay away from the virtual wall and may sweep along the specific boundary.
It should be noted that the specific boundary is whichever of the virtual exclusion zone boundary and the target scene object boundary is closer to the area to be cleaned: it is the virtual exclusion zone boundary when the target scene object lies inside the virtual exclusion zone, and the target scene object boundary when the target scene object lies outside it.
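The "which side of the virtual wall" judgment can be sketched with a 2-D cross product against the wall segment (a standard geometric test; the patent does not specify how the side is computed):

```python
def side_of_wall(wall_a, wall_b, point):
    """Which side of the directed virtual-wall segment a point is on.

    Returns the sign of the 2-D cross product of (wall_b - wall_a) and
    (point - wall_a): 1 on the left of the segment, -1 on the right,
    0 exactly on the wall line. Points and endpoints are (x, y) tuples.
    """
    (ax, ay), (bx, by), (px, py) = wall_a, wall_b, point
    cross = (bx - ax) * (py - ay) - (by - ay) * (px - ax)
    return (cross > 0) - (cross < 0)
```

Comparing the sign for the device's position with the sign for a known point inside the area to be cleaned tells whether the device has crossed the wall.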
For example, at a sliding rail or step between a living room and a balcony, if the virtual exclusion zone is calculated to be approximately parallel to the step and of similar width, the escape scenario corresponding to the established virtual exclusion zone can be identified as one in which the machine (a sweeper, an example of the cleaning device) easily recognizes a surmountable obstacle blocked by the actual exclusion zone. The sweeper can calculate the distance between itself and the exclusion zone and judge which side of the zone it is on from the map information and the real-scene escape information (an example of target object information). If the machine is inside the room, it may adopt a strategy of staying away from the virtual wall; if it has passed through the virtual exclusion zone, the zone can temporarily be disregarded, and after the machine crosses the steps or sliding rail back into the room through the obstacle-crossing strategy, the stay-away-from-virtual-wall strategy is resumed.
As an example, as shown in fig. 3, while performing area cleaning the sweeper passes through the exclusion zone line and ends up under the clothes-drying rack. The sweeper can identify the scene object matched with the virtual exclusion zone as the bottom of the clothes-drying rack, and determine that the current escape scenario is escape scenario 1 and that it has passed through the virtual exclusion zone. At this point, the sweeper can cross the sliding rail back into the living room using the obstacle-crossing strategy, and then adopt the strategy of staying away from the virtual wall.
As another example, as shown in fig. 4, the area to be cleaned by the sweeper is a living room. If the sweeper passes through the virtual exclusion zone onto the balcony, it can identify the scene object matched with the virtual exclusion zone as the sliding rail between the living room and the balcony, and determine that the current escape scenario is escape scenario 1 and that it has passed through the virtual exclusion zone. At this point, the sweeper can cross the sliding rail back into the living room using the obstacle-crossing strategy, and then adopt the strategy of staying away from the virtual wall.
In this embodiment, for a scenario where the machine easily recognizes a surmountable obstacle whose crossing the exclusion zone actually blocks, the cleaning device is controlled to disregard the virtual exclusion zone once it has passed through it and to return to the area to be cleaned, which can improve the convenience and success rate of the cleaning device's escape.
In one exemplary embodiment, controlling the cleaning device to perform the target escape operation according to the virtual exclusion zone information and the target object information includes:
S41, controlling the cleaning device to execute a target escape operation matched with a target type when the target scene object is determined from the target object information to be a scene object of the target type and the cleaning device has passed through the virtual wall of the virtual exclusion zone into the target object area where the target scene object is located, where the target type includes at least one of: a type whose bottom does not allow the cleaning device to pass, and a type whose distance from a wall is less than or equal to a distance threshold.
In this embodiment, if the current escape scenario is determined from the target object information to be the aforementioned escape scenario 2, i.e., the upper surface is stuck or LDS collision escape applies while the machine must also stay away from the virtual wall, the cleaning device may determine the object type of the target scene object (i.e., the target type). The object type may be a type whose bottom does not allow the cleaning device to pass, e.g., furniture with too low a bottom, or a type whose distance from a wall is less than or equal to a distance threshold (which may be a second distance threshold), e.g., a bed close to a wall.
If the cleaning device has passed through the virtual wall of the virtual exclusion zone into the target object area where the target scene object is located, it can execute the target escape operation matched with the object type of the target scene object and thereby free itself. Different object types may call for different escape operations: the target escape operation may be preconfigured (i.e., an escape operation is configured for each object type), or the cleaning device may try several escape operations in turn and proceed based on the results. The tried operations may include, but are not limited to, at least one of: an obstacle-surmounting operation, a local navigation operation, and an edge-following operation. This is not limited in this embodiment.
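The type-to-operation dispatch could look like the following sketch (the type names, operation names, ordering, and fallback list are all illustrative assumptions):

```python
# Hypothetical mapping from the detected target type to an ordered list of
# escape operations to try, per the embodiment. Names are illustrative.
ESCAPE_STRATEGIES = {
    "bottom_blocks_passage": ["local_navigation", "edge_following"],
    "near_wall": ["edge_following", "local_navigation"],
}
# For an unconfigured type, try all operations in a default order.
DEFAULT_STRATEGY = ["obstacle_surmounting", "local_navigation", "edge_following"]

def escape_plan(target_type: str) -> list:
    """Return the escape operations to attempt, in order, for a target type."""
    return ESCAPE_STRATEGIES.get(target_type, DEFAULT_STRATEGY)
```

A controller would attempt each operation in order and stop once the device is free.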
For example, along the outer contour edges of various furniture bottoms, the sweeper can calculate the position of the point cloud obstacle identified by its sensor and the position of the exclusion zone, and identify the current escape scenario as: the upper surface is easily stuck or LDS collision escape applies, while the machine must stay away from the virtual wall. From the point cloud contour, the sweeper can judge the furniture type and, combining the extent of the trapped area, free itself by matching the escape strategy with modules such as local navigation and edge following.
By adopting the escape operation corresponding to the object type of the scene object matched with the virtual exclusion zone, in combination with the extent of the trapped area, this embodiment can improve the success rate of the cleaning device's escape.
In one exemplary embodiment, controlling the cleaning device to execute the target escape operation matched with the target type includes at least one of:
S51, when the target type includes the type whose bottom does not allow the cleaning device to pass: acquiring point cloud data through a second sensor on the cleaning device to obtain target point cloud data; identifying, from the target point cloud data, an exit matching the movement trajectory along which the cleaning device entered the target object area, the size of the exit allowing the cleaning device to pass; and controlling the cleaning device to move out of the target object area through the exit along that movement trajectory;
S52, when the target type includes the type whose distance from the wall is less than or equal to the distance threshold, controlling the cleaning device to move along a target boundary detected by a distance sensor of the cleaning device until it has moved out of the target object area, where the target boundary is at least one of: the wall and the boundary of the target scene object.
If the target type includes the type whose bottom does not allow the cleaning device to pass (e.g., a wardrobe bottom that is high around the edges and low in the middle, under which the sweeper cannot pass), the cleaning device may navigate locally, either following its historical movement trajectory or re-planning a new trajectory back into the area to be cleaned. When planning the movement trajectory, the cleaning device can acquire point cloud data through the second sensor to obtain the target point cloud data. The second sensor may be the same sensor as the first sensor (e.g., a point cloud sensor) or a different one.
From the target point cloud data, the cleaning device can identify a movement path wide enough for it to pass and move out of the target object area along the determined trajectory. Alternatively, it may return to the area to be cleaned along its historical movement trajectory: it identifies from the target point cloud data an exit matching the trajectory along which it entered the target object area (i.e., the historical movement trajectory), the identified exit being large enough for the cleaning device to pass, and moves out of the target object area through that exit along the historical trajectory.
For example, after entering a furniture bottom, the sweeper can run recognition on the point cloud data collected by the point cloud sensor, determine within its local range an exit or passage wide enough for it to pass through, and move out of the furniture bottom through the determined exit or passage.
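A toy version of the exit search, on a 2-D range scan rather than a full point cloud (the scan format, chord test, and margin are simplifying assumptions; a real implementation would reason over the cloud geometry):

```python
import math

def find_exit_bearings(scan, robot_width, margin=0.05):
    """Find angular sectors in a 2-D range scan wide enough for the robot.

    `scan` is a list of (bearing_rad, range_m) returns sorted by bearing.
    A sector between two consecutive returns is deemed passable when the
    chord at the nearer range exceeds the robot width plus a margin.
    Returns a list of (centre_bearing, clearance) pairs.
    """
    exits = []
    for (a1, r1), (a2, r2) in zip(scan, scan[1:]):
        chord = 2.0 * min(r1, r2) * math.sin((a2 - a1) / 2.0)
        if chord >= robot_width + margin:
            exits.append(((a1 + a2) / 2.0, chord))
    return exits
```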
If the target type includes the type whose distance from the wall is less than or equal to the distance threshold, the space in which the cleaning device can move is small, similar to a narrow passage (i.e., a narrow area); the cleaning device can then be controlled to move along the wall by edge detection until it has moved out of the target object area (into the area to be cleaned). Optionally, a distance sensor (e.g., an LDS laser ranging sensor) may be disposed on the cleaning device; it ranges by emitting a detection signal to detect an obstacle such as a wall, and the cleaning device can move along the wall detected by the distance sensor.
Optionally, if the bottom of the target scene object is low, its boundary may also be detected by the distance sensor, and the cleaning device may move along that detected boundary until it has moved out of the target object area. The distance sensor may be the same sensor as the first sensor and/or the second sensor, or a different one.
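As a toy illustration of the edge-following motion, one control step of a proportional steering law is sketched below (the setpoint, gain, and clamp are invented for the sketch; the patent does not describe the device's actual control law):

```python
def edge_follow_step(measured_dist, setpoint=0.05, gain=2.0, max_turn=0.5):
    """One control step of edge-following along a detected boundary.

    Steers toward the boundary when the measured lateral distance exceeds
    the setpoint and away when too close; the turn rate (rad/s) is clamped.
    A toy proportional controller, not the device's actual control law.
    """
    turn = gain * (measured_dist - setpoint)
    return max(-max_turn, min(max_turn, turn))
```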
By controlling the cleaning device to escape through local navigation and/or edge-following based on the type of the scene object, this embodiment can improve the success rate of the cleaning device's escape.
In one exemplary embodiment, controlling the cleaning device to perform the target escape operation according to the virtual exclusion zone information and the target object information includes:
S61, when the cleaning device is trapped because a cliff is detected or its wheels are in a dropped state, controlling the cleaning device to execute a first escape operation, where the first escape operation controls the cleaning device to leave the detected cliff or brings the wheels out of the dropped state, and while executing the first escape operation the cleaning device disregards the virtual exclusion zone;
S62, after the first escape operation is executed, controlling the cleaning device to execute a second escape operation, where the second escape operation controls the cleaning device to leave the range of the virtual exclusion zone when it is detected that the cleaning device has passed through the virtual wall of the virtual exclusion zone.
The cleaning device may be provided with a downward-looking sensor for cliff detection and/or a drop sensor for drop detection, i.e., for detecting the drop state of the cleaning device's wheels. The cleaning device may perform downward-looking detection and/or drop detection through these sensors to determine whether a downward-looking trigger (the downward-looking sensor fires) or a drop trigger (the drop sensor fires, with the wheels in a dropped state) has occurred.
If downward-looking or drop detection is triggered, the current escape scenario can be determined to be escape scenario 3: the triggered downward-looking or drop is handled first, the virtual exclusion zone next, and other obstacles last. The cleaning device may handle the downward-looking or drop trigger with priority, disregarding the virtual exclusion zone during this time, and execute the first escape operation, which controls the cleaning device to leave the detected cliff or brings the wheels out of the dropped state.
The first escape operation may be matched to the downward-looking or drop trigger. If downward-looking is triggered, the cleaning device may perform an escape operation such as moving forward, backing up, or rotating (left, right, or in place). If a drop is triggered, it may perform a backing-up operation, a rotation operation, or the like. The first escape operation is not limited in this embodiment.
In some scenarios, downward-looking or drop detection may be falsely triggered: for example, some dark objects (e.g., dark carpets) reflect light weakly and may be misjudged as a cliff, and some objects with narrow grooves (e.g., sliding rails) may also cause false triggers. To improve escape efficiency, the cleaning device may determine whether the downward-looking or drop trigger is false. If it is, the cleaning device may ignore it; if it is not, the cleaning device may execute the first escape operation.
Alternatively, the amplitude of the reflected wave of the ultrasonic wave is different for cliffs and dark objects. If the downward looking is triggered, the cleaning device can perform object recognition by sending ultrasonic waves and the like, and whether the downward looking is triggered by mistake can be determined based on reflected waves of the ultrasonic waves. If the drop is triggered, the cleaning device can determine the outline, the shape and the like of the scene object which is triggered to look down based on the point cloud data acquired by the point cloud sensor, determine whether the drop is triggered by mistake, and if the scene object which is triggered to drop is determined to be an obstacle which allows the cleaning device to pass over, determine that the drop is triggered by mistake, otherwise, determine that the drop is not triggered by mistake.
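The two false-trigger checks above can be sketched as follows. The amplitude threshold, groove dimensions, and wheel radius are invented for illustration; the patent does not specify concrete values.

```python
# A look-down is treated as falsely triggered when the ultrasonic echo is
# strong (a dark carpet absorbs light but still reflects sound); a fall is
# treated as falsely triggered when the sensed groove is a narrow, shallow
# feature the device may pass over. All threshold values are assumptions.

AMPLITUDE_THRESHOLD = 0.3   # assumed normalized echo amplitude separating object from cliff
WHEEL_RADIUS_M = 0.03       # assumed wheel radius in meters

def is_false_look_down(echo_amplitude):
    """Strong echo -> a surface is present, so the look-down was a false trigger."""
    return echo_amplitude >= AMPLITUDE_THRESHOLD

def is_false_fall(groove_width_m, groove_depth_m):
    """e.g. a sliding-door track: narrower than the wheel diameter and shallow."""
    return groove_width_m < 2 * WHEEL_RADIUS_M and groove_depth_m < WHEEL_RADIUS_M
```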
For example, when cleaning a toilet, the sweeper triggers a look-down when it moves to the position shown in fig. 5; the sweeper can then disregard the virtual exclusion zone and handle the look-down with priority.
For another example, when the sweeper moves onto the slide rail shown in fig. 6, the depth of the groove in the middle of the rail may trigger a look-down or cause a wheel to fall; the sweeper can then determine, via ultrasonic waves or the perceived point cloud, whether the look-down or fall was falsely triggered.
After the first escape operation is performed, the cleaning device may determine its current position information and, based on the current position information and the position information of the virtual exclusion zone, determine the positional relationship between the cleaning device and the virtual exclusion zone. If it is determined that the cleaning device has passed through a virtual wall of the virtual exclusion zone (and possibly entered the zone), a second escape operation may be performed, which is used to control the cleaning device to leave the range of the virtual exclusion zone. In performing the second escape operation, the cleaning device may determine the current escape scenario (e.g., escape scenario 1 or escape scenario 2) and, based on the determined scenario, perform the corresponding escape operation, i.e., the second escape operation.
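The position test above amounts to checking whether the device's current position lies within the virtual exclusion zone. A minimal sketch, assuming the zone is an axis-aligned rectangle (the patent does not restrict its shape):

```python
def inside_exclusion_zone(pos, zone):
    """pos = (x, y); zone = (x_min, y_min, x_max, y_max), an assumed rectangle."""
    x, y = pos
    x_min, y_min, x_max, y_max = zone
    return x_min <= x <= x_max and y_min <= y <= y_max

def needs_second_escape(pos, zone):
    # After the first escape operation, a device that ended up inside the
    # zone performs the second escape operation to leave the zone's range.
    return inside_exclusion_zone(pos, zone)
```

For non-rectangular zones a general point-in-polygon test would be needed instead.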
For example, on dark carpets or recessed tracks, the machine is prone to falsely triggering a look-down or fall. The sweeper can perform escape processing in the priority order of handling the look-down/fall first, the virtual wall next, and other escape scenarios last. The sweeper first judges whether the look-down/fall was falsely triggered: if so, the trigger is ignored; if not, the look-down/fall is handled with priority, and after the escape succeeds, an escape strategy is designed in combination with the range of the exclusion zone.
Through this embodiment, the safety of device operation can be improved, and by handling the look-down/fall first, the virtual wall next, and other escape scenarios last, the escape efficiency of the device can be improved.
In one exemplary embodiment, the method further comprises, after the target detection by the first sensor on the cleaning device:
s71, controlling the cleaning equipment to clean along the boundary of the target scene object under the condition that the boundary of the target scene object is positioned in the to-be-cleaned area;
and S72, controlling the cleaning equipment to clean along the virtual wall of the virtual exclusion zone in the case that the boundary of the target scene object is located outside the to-be-cleaned area.
In the present embodiment, after target detection by the first sensor, the cleaning apparatus may clean the area to be cleaned based on the positional relationship among the area to be cleaned, the virtual exclusion zone, and the target scene object. When performing area cleaning, the cleaning device may adopt a strategy of staying away from the virtual wall and sweeping along a specific boundary.
The specific boundary is whichever of the boundary of the virtual exclusion zone and the boundary of the target scene object is closer to the area to be cleaned. If the boundary of the target scene object is located within the area to be cleaned, the specific boundary is the boundary of the target scene object, and the cleaning device may clean the area along it; if the boundary of the target scene object is located outside the area to be cleaned, the specific boundary is the boundary of the virtual exclusion zone, i.e. its virtual wall, and the cleaning device may clean the area along the virtual wall of the virtual exclusion zone.
According to this embodiment, cleaning the area to be cleaned based on the positional relationship among the area to be cleaned, the virtual exclusion zone, and the target scene object can improve the rationality of area cleaning and reduce the probability of the device being trapped.
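The boundary-selection rule can be sketched as follows, again assuming a rectangular area to be cleaned purely for brevity; the string return values are illustrative labels, not identifiers from the patent.

```python
def select_cleaning_boundary(object_boundary, area):
    """object_boundary: list of (x, y) points; area: (x_min, y_min, x_max, y_max).
    Returns which boundary the device should sweep along."""
    x_min, y_min, x_max, y_max = area
    inside = all(x_min <= x <= x_max and y_min <= y <= y_max
                 for x, y in object_boundary)
    # Sweep along the object's own boundary when it lies inside the area
    # to be cleaned; otherwise fall back to the zone's virtual wall.
    return "target_object_boundary" if inside else "virtual_wall"
```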
It should be noted that, for simplicity of description, the foregoing method embodiments are all expressed as a series of action combinations, but it should be understood by those skilled in the art that the present application is not limited by the order of actions described, as some steps may be performed in other order or simultaneously in accordance with the present application. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily required in the present application.
From the description of the above embodiments, it will be clear to a person skilled in the art that the method according to the above embodiments may be implemented by means of software plus the necessary general hardware platform, or of course by means of hardware, but in many cases the former is the preferred implementation. Based on such understanding, the technical solution of the present application may be embodied, in essence or in the part contributing to the prior art, in the form of a software product stored in a storage medium (such as ROM (Read-Only Memory)/RAM (Random Access Memory), a magnetic disk, or an optical disc), including instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a network device, etc.) to perform the method described in the embodiments of the present application.
According to still another aspect of the embodiments of the present application, there is also provided an operation control device of a cleaning apparatus for implementing the operation control method of a cleaning apparatus described above. Fig. 7 is a block diagram of an operation control device of an alternative cleaning apparatus according to an embodiment of the present application, and as shown in fig. 7, the device may include:
A first obtaining unit 702, configured to obtain virtual exclusion zone information corresponding to a target area map, where the target area map is an area map to which an area to be cleaned of the cleaning device belongs, and the virtual exclusion zone information is used to indicate a virtual exclusion zone in the area map;
the detection unit 704 is connected to the first acquisition unit 702, and is configured to perform target detection through a first sensor on the cleaning device, so as to obtain target object information, where the target object information is used to represent a target scene object that matches with the virtual exclusion zone in the current area where the cleaning device is located;
the first control unit 706 is connected to the detection unit 704, and is configured to control the cleaning device to perform a target escaping operation according to the virtual exclusion zone information and the target object information when the cleaning device is trapped in the cleaning process of the cleaning area, where the cleaning device after escaping is located outside the virtual exclusion zone.
It should be noted that, the first obtaining unit 702 in this embodiment may be configured to perform the above-mentioned step S202, the detecting unit 704 in this embodiment may be configured to perform the above-mentioned step S204, and the first control unit 706 in this embodiment may be configured to perform the above-mentioned step S206.
Through the above modules, virtual exclusion zone information corresponding to a target area map is obtained, wherein the target area map is the area map to which the area to be cleaned by the cleaning equipment belongs, and the virtual exclusion zone information is used to indicate a virtual exclusion zone in the area map; target detection is performed through a first sensor on the cleaning equipment to obtain target object information, wherein the target object information is used to represent a target scene object matched with the virtual exclusion zone in the current area where the cleaning equipment is located; and when the cleaning equipment is trapped while cleaning the area to be cleaned, the cleaning equipment is controlled to execute a target escape operation according to the virtual exclusion zone information and the target object information. This solves the problem in the related art that the cleaning equipment is easily trapped due to errors in establishing the virtual exclusion zone, reduces the trapping rate of the cleaning equipment, and improves the user experience.
In one exemplary embodiment, the detection unit includes:
the first identification module is used for carrying out target identification on the point cloud data acquired by the first sensor to obtain target object information, wherein the target object information is the object point cloud of the target scene object.
In one exemplary embodiment, the first identification module includes:
the identification sub-module is used for carrying out target identification on the point cloud data acquired by the first sensor to obtain candidate object information, wherein the candidate object information is object point clouds of candidate objects contained in the current area;
and the selecting sub-module is used for selecting a target scene object matched with the virtual exclusion zone from the candidate object according to the position information and the virtual exclusion zone information of the candidate object to obtain target object information.
In one exemplary embodiment, the first control unit includes:
and the first control module is used for controlling the cleaning device to pass through the target scene object to enter the area to be cleaned, in the case that the target scene object is determined, according to the target object information, to be a scene object that allows the cleaning device to pass through, and the cleaning device has passed through the virtual wall of the virtual exclusion zone into an area of the target area map other than the area to be cleaned.
In one exemplary embodiment, the first control unit includes:
the second control module is used for controlling the cleaning device to execute a target escape operation matched with the target type, in the case that the target scene object is determined to be a scene object of a target type according to the target object information and the cleaning device enters the target object area where the target scene object is located through the virtual wall of the virtual exclusion zone, wherein the target type includes at least one of the following: a type whose bottom does not allow the cleaning device to pass through, and a type whose distance from the wall is less than or equal to a distance threshold.
In one exemplary embodiment, the first control unit includes:
the acquisition module is used for acquiring point cloud data through a second sensor on the cleaning equipment to obtain target point cloud data, in the case that the target type includes a type whose bottom does not allow the cleaning equipment to pass through; the second identification module is used for identifying, according to the target point cloud data, an outlet matched with the movement track along which the cleaning equipment entered the target object area, wherein the size of the outlet allows the cleaning equipment to pass through; the third control module is used for controlling the cleaning equipment to move out of the target object area through the outlet along the movement track;
a fourth control module, for controlling the cleaning device to move along a target boundary detected by a distance sensor of the cleaning device until it moves out of the target object area, wherein the target boundary is at least one of the following: a wall, and the boundary of the target scene object.
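The exit-finding behavior handled by these modules (for objects whose underside the device cannot pass) might look like the following sketch. The device width, the gap representation, and all helper names are assumptions introduced for illustration only.

```python
DEVICE_WIDTH_M = 0.34  # assumed device width in meters

def find_exit(entry_track, gaps):
    """entry_track: list of (x, y) poses recorded while entering the area;
    gaps: candidate openings found in the point cloud, each shaped like
    {"center": (x, y), "width": w}. Returns the passable gap nearest to
    where the device entered, or None if none is wide enough."""
    passable = [g for g in gaps if g["width"] >= DEVICE_WIDTH_M]
    if not passable:
        return None  # fall back to following the wall / object boundary
    ex, ey = entry_track[0]  # the device got in here, so an exit is likely nearby
    return min(passable,
               key=lambda g: (g["center"][0] - ex) ** 2 + (g["center"][1] - ey) ** 2)
```

When `find_exit` returns None, the fourth control module's boundary-following behavior takes over.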
In one exemplary embodiment, the first control unit includes:
a fifth control module, configured to control the cleaning device to perform a first escape operation when the cleaning device is trapped due to detection of a cliff or detection of a wheel of the cleaning device being in a falling state, where the first escape operation is used to control the cleaning device to leave the detected cliff or control the wheel to leave the falling state, and during the performing of the first escape operation, the cleaning device disregards the virtual exclusion zone;
And the sixth control module is used for controlling the cleaning device to execute a second escape operation under the condition that the cleaning device is detected to pass through the virtual wall of the virtual exclusion zone after the first escape operation is executed, wherein the second escape operation is used for controlling the cleaning device to leave the range of the virtual exclusion zone.
In an exemplary embodiment, the above apparatus further includes:
a second control unit for controlling the cleaning device to clean along the boundary of the target scene object in a case where the boundary of the target scene object is located within the area to be cleaned after the target detection by the first sensor on the cleaning device;
and a third control unit for controlling the cleaning device to clean along the virtual wall of the virtual exclusion zone in case that the boundary of the target scene object is located outside the area to be cleaned.
It should be noted that the above modules are the same as examples and application scenarios implemented by the corresponding steps, but are not limited to what is disclosed in the above embodiments.
It should be noted that the above modules may be implemented in software or in hardware as part of the apparatus shown in fig. 1, where the hardware environment includes a network environment.
According to yet another aspect of embodiments of the present application, there is also provided a storage medium. Alternatively, in the present embodiment, the above-described storage medium may be used to execute the program code of the operation control method of the cleaning apparatus of any one of the above-described embodiments of the present application.
Alternatively, in this embodiment, the storage medium may be located on at least one network device of the plurality of network devices in the network shown in the above embodiment.
Alternatively, in the present embodiment, the storage medium is configured to store program code for performing the steps of:
s1, obtaining virtual exclusion zone information corresponding to a target area map, wherein the target area map is an area map of an area to be cleaned of cleaning equipment, and the virtual exclusion zone information is used for indicating a virtual exclusion zone in the area map;
s2, performing target detection through a first sensor on the cleaning equipment to obtain target object information, wherein the target object information is used for representing a target scene object matched with the virtual exclusion zone in the current area where the cleaning equipment is located;
s3, under the condition that the cleaning equipment is trapped in the process of cleaning the area to be cleaned, controlling the cleaning equipment to execute target escaping operation according to the virtual exclusion zone information and the target object information, wherein the cleaning equipment after escaping is located outside the virtual exclusion zone.
Alternatively, specific examples in the present embodiment may refer to examples described in the above embodiments, which are not described in detail in the present embodiment.
Alternatively, in the present embodiment, the storage medium may include, but is not limited to: various media capable of storing program code, such as a USB flash drive, ROM, RAM, a removable hard disk, a magnetic disk, or an optical disk.
According to still another aspect of the embodiments of the present application, there is also provided an electronic device for implementing the operation control method of the cleaning apparatus described above, which may be a server, a terminal, or a combination thereof.
Fig. 8 is a block diagram of an alternative electronic device, according to an embodiment of the present application, including a processor 802, a communication interface 804, a memory 806, and a communication bus 808, as shown in fig. 8, wherein the processor 802, the communication interface 804, and the memory 806 communicate with each other via the communication bus 808, wherein,
a memory 806 for storing a computer program;
the processor 802, when executing the computer program stored on the memory 806, performs the following steps:
s1, obtaining virtual exclusion zone information corresponding to a target area map, wherein the target area map is an area map of an area to be cleaned of cleaning equipment, and the virtual exclusion zone information is used for indicating a virtual exclusion zone in the area map;
S2, performing target detection through a first sensor on the cleaning equipment to obtain target object information, wherein the target object information is used for representing a target scene object matched with the virtual exclusion zone in the current area where the cleaning equipment is located;
s3, under the condition that the cleaning equipment is trapped in the process of cleaning the area to be cleaned, controlling the cleaning equipment to execute target escaping operation according to the virtual exclusion zone information and the target object information, wherein the cleaning equipment after escaping is located outside the virtual exclusion zone.
Alternatively, in the present embodiment, the communication bus may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The communication bus may be classified as an address bus, a data bus, a control bus, or the like. For ease of illustration, only one thick line is shown in fig. 8, but this does not mean there is only one bus or one type of bus. The communication interface is used for communication between the electronic device and other equipment.
The memory may include RAM or non-volatile memory, such as at least one disk memory. Optionally, the memory may also be at least one storage device located remotely from the aforementioned processor.
As an example, the memory 806 may include, but is not limited to, the first acquisition unit 702, the detection unit 704, and the first control unit 706 of the operation control device described above. Other module units of the operation control device may also be included, but are not limited thereto, and are not described in detail in this example.
The processor may be a general purpose processor and may include, but is not limited to: CPU (Central Processing Unit ), NP (Network Processor, network processor), etc.; but also DSP (Digital Signal Processing, digital signal processor), ASIC (Application Specific Integrated Circuit ), FPGA (Field-Programmable Gate Array, field programmable gate array) or other programmable logic device, discrete gate or transistor logic device, discrete hardware components.
Alternatively, specific examples in this embodiment may refer to examples described in the foregoing embodiments, and this embodiment is not described herein.
It will be understood by those skilled in the art that the structure shown in fig. 8 is only illustrative, and the device implementing the operation control method of the cleaning device may be a terminal device, such as a smart phone (e.g., an Android mobile phone, an iOS mobile phone, etc.), a tablet computer, a palm computer, a mobile internet device (Mobile Internet Devices, MID), a PAD, etc. The structure shown in fig. 8 does not limit the structure of the electronic device. For example, the electronic device may also include more or fewer components (e.g., network interfaces, display devices, etc.) than shown in fig. 8, or have a different configuration than shown in fig. 8.
Those of ordinary skill in the art will appreciate that all or part of the steps in the various methods of the above embodiments may be implemented by a program for instructing a terminal device to execute in association with hardware, the program may be stored in a computer readable storage medium, and the storage medium may include: flash disk, ROM, RAM, magnetic or optical disk, etc.
The foregoing embodiment numbers of the present application are merely for description and do not represent the advantages or disadvantages of the embodiments.
The integrated units in the above embodiments may be stored in the above-described computer-readable storage medium if implemented in the form of software functional units and sold or used as separate products. Based on such understanding, the technical solution of the present application may be embodied in essence or a part contributing to the prior art or all or part of the technical solution in the form of a software product stored in a storage medium, including several instructions to cause one or more computer devices (which may be personal computers, servers or network devices, etc.) to perform all or part of the steps of the methods described in the various embodiments of the present application.
In the foregoing embodiments of the present application, the descriptions of the embodiments are emphasized, and for a portion of this disclosure that is not described in detail in this embodiment, reference is made to the related descriptions of other embodiments.
In several embodiments provided in the present application, it should be understood that the disclosed client may be implemented in other manners. The above-described embodiments of the apparatus are merely exemplary; the division of the units is merely a logical function division, and there may be other divisions in actual implementation: for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed between the components may be through some interfaces, units, or modules, and may be in electrical or other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The foregoing is merely a preferred embodiment of the present application and it should be noted that modifications and adaptations to those skilled in the art may be made without departing from the principles of the present application and are intended to be comprehended within the scope of the present application.

Claims (11)

1. A method of controlling operation of a cleaning apparatus, comprising:
obtaining virtual exclusion zone information corresponding to a target area map, wherein the target area map is an area map of an area to be cleaned of the cleaning equipment, and the virtual exclusion zone information is used for indicating a virtual exclusion zone in the area map;
performing target detection through a first sensor on the cleaning equipment to obtain target object information, wherein the target object information is used for representing a target scene object matched with the virtual exclusion zone in a current area where the cleaning equipment is located;
and under the condition that the cleaning equipment is trapped in the process of cleaning the area to be cleaned, controlling the cleaning equipment to execute a target escape operation according to the virtual exclusion zone information and the target object information, wherein the cleaning equipment after escaping is located outside the virtual exclusion zone.
2. The method of claim 1, wherein the target detection by the first sensor on the cleaning device yields target object information, comprising:
and carrying out target recognition on the point cloud data acquired by the first sensor to obtain the target object information, wherein the target object information is the object point cloud of the target scene object.
3. The method of claim 2, wherein the performing object recognition on the point cloud data acquired by the first sensor to obtain the object information includes:
performing target recognition on the point cloud data acquired by the first sensor to obtain candidate object information, wherein the candidate object information is object point clouds of candidate objects contained in the current area;
and selecting the target scene object matched with the virtual exclusion zone from the candidate object according to the position information of the candidate object and the virtual exclusion zone information, and obtaining the target object information.
4. The method of claim 1, wherein controlling the cleaning device to perform a target escape operation based on the virtual exclusion zone information and the target object information comprises:
controlling the cleaning device to pass through the target scene object to enter the area to be cleaned under the condition that the target scene object is determined, according to the target object information, to be a scene object that allows the cleaning device to pass through and the cleaning device has passed through the virtual wall of the virtual exclusion zone into an area of the target area map other than the area to be cleaned.
5. The method of claim 1, wherein controlling the cleaning device to perform a target escape operation based on the virtual exclusion zone information and the target object information comprises:
and under the condition that the target scene object is determined to be a scene object of a target type according to the target object information, and the cleaning equipment enters a target object area where the target scene object is located through a virtual wall of the virtual exclusion zone, controlling the cleaning equipment to execute the target escape operation matched with the target type, wherein the target type includes at least one of the following: a type whose bottom does not allow the cleaning device to pass through, and a type whose distance from the wall is less than or equal to a distance threshold.
6. The method of claim 5, wherein the controlling the cleaning device to perform the target escape operation that matches the target type comprises:
acquiring point cloud data through a second sensor on the cleaning equipment to obtain target point cloud data under the condition that the target type includes a type whose bottom does not allow the cleaning equipment to pass through; identifying, according to the target point cloud data, an outlet matched with the movement track along which the cleaning equipment entered the target object area, wherein the size of the outlet allows the cleaning equipment to pass through; and controlling the cleaning equipment to move out of the target object area through the outlet along the movement track;
controlling the cleaning device to move along a target boundary detected by a distance sensor of the cleaning device until it moves out of the target object area, wherein the target boundary is at least one of the following: a wall, and the boundary of the target scene object.
7. The method of claim 1, wherein controlling the cleaning device to perform a target escape operation based on the virtual exclusion zone information and the target object information comprises:
controlling the cleaning device to execute a first escape operation under the condition that the cleaning device is trapped due to the detection of a cliff or the detection that the wheels of the cleaning device are in a falling state, wherein the first escape operation is used for controlling the cleaning device to leave the detected cliff or controlling the wheels to leave the falling state, and the cleaning device disregards the virtual exclusion zone in the process of executing the first escape operation;
and after the first escape operation is performed, controlling the cleaning device to perform a second escape operation under the condition that the cleaning device is detected to pass through the virtual wall of the virtual exclusion zone, wherein the second escape operation is used for controlling the cleaning device to leave the range of the virtual exclusion zone.
8. The method of any one of claims 1 to 7, wherein after the target detection by the first sensor on the cleaning device, the method further comprises:
controlling the cleaning equipment to clean along the boundary of the target scene object under the condition that the boundary of the target scene object is positioned in the to-be-cleaned area;
and controlling the cleaning equipment to clean along the virtual wall of the virtual exclusion zone under the condition that the boundary of the target scene object is located outside the area to be cleaned.
9. An operation control device of a cleaning apparatus, comprising:
a first obtaining unit, configured to obtain virtual exclusion zone information corresponding to a target area map, where the target area map is the area map to which the area to be cleaned by the cleaning device belongs, and the virtual exclusion zone information is used to indicate a virtual exclusion zone in the area map;
the detection unit is used for carrying out target detection through a first sensor on the cleaning equipment to obtain target object information, wherein the target object information is used for representing a target scene object matched with the virtual exclusion zone in the current area where the cleaning equipment is located;
and the first control unit is used for controlling the cleaning equipment to execute target escaping operation according to the virtual exclusion zone information and the target object information under the condition that the cleaning equipment is trapped in the process of cleaning the area to be cleaned, wherein the cleaning equipment after escaping is positioned outside the virtual exclusion zone.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium comprises a stored program, wherein, when run, the program performs the method of any one of claims 1 to 7.
11. An electronic device comprising a memory and a processor, characterized in that a computer program is stored in the memory, and the processor is arranged to execute the method of any one of claims 1 to 7 by means of the computer program.
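The control logic claimed above (the exclusion-zone escape of claim 7 and the boundary-versus-virtual-wall selection of claim 8) can be illustrated with a minimal sketch. This is not the patent's disclosed implementation: the rectangular zone model and all names (`ExclusionZone`, `escape_target`, `choose_cleaning_path`) are illustrative assumptions.

```python
from dataclasses import dataclass


@dataclass
class ExclusionZone:
    """Illustrative axis-aligned virtual exclusion zone on the area map."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max


def escape_target(zone: ExclusionZone, x: float, y: float,
                  margin: float = 0.1) -> tuple:
    """Sketch of a target escape operation: move a trapped device to just
    outside the nearest virtual wall of the exclusion zone."""
    if not zone.contains(x, y):
        return (x, y)  # already outside the zone; nothing to do
    # Distance to each of the four virtual walls, paired with the exit point
    # just beyond that wall.
    candidates = [
        (x - zone.x_min, (zone.x_min - margin, y)),
        (zone.x_max - x, (zone.x_max + margin, y)),
        (y - zone.y_min, (x, zone.y_min - margin)),
        (zone.y_max - y, (x, zone.y_max + margin)),
    ]
    return min(candidates, key=lambda c: c[0])[1]


def choose_cleaning_path(boundary_inside_area: bool) -> str:
    """Claim-8-style selection: follow the target object's boundary when it
    lies inside the area to be cleaned, otherwise follow the virtual wall."""
    return "object_boundary" if boundary_inside_area else "virtual_wall"
```

A second escape operation as in claim 7 would repeat `escape_target` whenever the device is detected crossing back through a virtual wall of the same zone.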
CN202111464917.3A 2021-12-02 2021-12-02 Operation control method and device of cleaning equipment, storage medium and electronic device Pending CN116211168A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111464917.3A CN116211168A (en) 2021-12-02 2021-12-02 Operation control method and device of cleaning equipment, storage medium and electronic device
PCT/CN2022/131571 WO2023098455A1 (en) 2021-12-02 2022-11-12 Operation control method, apparatus, storage medium, and electronic apparatus for cleaning device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111464917.3A CN116211168A (en) 2021-12-02 2021-12-02 Operation control method and device of cleaning equipment, storage medium and electronic device

Publications (1)

Publication Number Publication Date
CN116211168A true CN116211168A (en) 2023-06-06

Family

ID=86568311

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111464917.3A Pending CN116211168A (en) 2021-12-02 2021-12-02 Operation control method and device of cleaning equipment, storage medium and electronic device

Country Status (2)

Country Link
CN (1) CN116211168A (en)
WO (1) WO2023098455A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117452955A (en) * 2023-12-22 2024-01-26 珠海格力电器股份有限公司 Control method, control device and cleaning system of cleaning equipment
CN117478714A (en) * 2023-11-09 2024-01-30 南京特沃斯清洁设备有限公司 Cleaning equipment control method and device based on Internet of things

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108829115B (en) * 2018-10-09 2019-01-29 上海岚豹智能科技有限公司 A kind of motion control method and its calculating equipment of robot
CN109394086A (en) * 2018-11-19 2019-03-01 珠海市微半导体有限公司 A kind of walk on method, apparatus and chip based on trapped clean robot
TWI700568B (en) * 2019-01-23 2020-08-01 燕成祥 Virtual wall device and robot and control method thereof
CN111568306A (en) * 2019-02-19 2020-08-25 北京奇虎科技有限公司 Cleaning method and device based on cleaning robot, electronic equipment and storage medium
CN111714028A (en) * 2019-03-18 2020-09-29 北京奇虎科技有限公司 Method, device and equipment for escaping from restricted zone of cleaning equipment and readable storage medium
KR102305206B1 (en) * 2019-07-11 2021-09-28 엘지전자 주식회사 Robot cleaner for cleaning in consideration of floor state through artificial intelligence and operating method thereof
CN112137509A (en) * 2020-09-24 2020-12-29 江苏美的清洁电器股份有限公司 Virtual forbidden zone setting method and device and cleaning robot
CN112890692A (en) * 2021-02-08 2021-06-04 美智纵横科技有限责任公司 Method and device for setting cleaning forbidden zone, cleaning equipment and storage medium
CN113469000B (en) * 2021-06-23 2024-06-14 追觅创新科技(苏州)有限公司 Regional map processing method and device, storage medium and electronic device

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117478714A (en) * 2023-11-09 2024-01-30 南京特沃斯清洁设备有限公司 Cleaning equipment control method and device based on Internet of things
CN117478714B (en) * 2023-11-09 2024-03-08 南京特沃斯清洁设备有限公司 Cleaning equipment control method and device based on Internet of things
CN117452955A (en) * 2023-12-22 2024-01-26 珠海格力电器股份有限公司 Control method, control device and cleaning system of cleaning equipment
CN117452955B (en) * 2023-12-22 2024-04-02 珠海格力电器股份有限公司 Control method, control device and cleaning system of cleaning equipment

Also Published As

Publication number Publication date
WO2023098455A1 (en) 2023-06-08

Similar Documents

Publication Publication Date Title
US10102429B2 (en) Systems and methods for capturing images and annotating the captured images with information
KR102490996B1 (en) A robotic cleaning device that changes its operating speed based on the environment
US10350762B2 (en) Autonomously moving body, movement controlling method, and recording medium storing movement controlling program
CN110605713B (en) Robot positioning method, robot, and storage medium
CN116211168A (en) Operation control method and device of cleaning equipment, storage medium and electronic device
CN110989630B (en) Self-moving robot control method, device, self-moving robot and storage medium
JP2019528535A (en) Mobile robot and control method thereof
US20210007572A1 (en) Mobile robot using artificial intelligence and controlling method thereof
KR102548936B1 (en) Artificial intelligence Moving robot and control method thereof
US20210370511A1 (en) Cleaning robot and task performing method therefor
CN115151172A (en) Control of autonomous mobile robot
CN111714028A (en) Method, device and equipment for escaping from restricted zone of cleaning equipment and readable storage medium
CN113475977A (en) Robot path planning method and device and robot
CN110687903A (en) Mobile robot trapped judging method and device and motion control method and device
KR20190101326A (en) Method for dividing moving space and moving robot for moving divided moving space
CN113786125A (en) Operation method, self-moving device and storage medium
KR102022877B1 (en) Apparatus of detecting and removing condensation and mold
CN109947094B (en) Travel method, self-moving device and storage medium
US20220147050A1 (en) Methods and devices for operating an intelligent mobile robot
CN113741441A (en) Operation method and self-moving equipment
CN112927278A (en) Control method, control device, robot and computer-readable storage medium
WO2023142711A1 (en) Robot control method and device, and robot and storage medium
CN117281433A (en) Cleaning equipment and mapping method and system thereof as well as storage medium
KR102500525B1 (en) Moving robot
CN118058658A (en) Movement control method of cleaning robot and cleaning robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination