WO2024021111A1 - Cleaning robot control method, and processing, generation, area division and exploration method, device and system


Info

Publication number
WO2024021111A1
WO2024021111A1 (application PCT/CN2022/109209)
Authority
WO
WIPO (PCT)
Prior art keywords
cleaning
area
preset
cleaning robot
robot
Prior art date
Application number
PCT/CN2022/109209
Other languages
English (en)
Chinese (zh)
Inventor
龚鼎
王宇谦
王锦涛
肖思仪
黄靖文
沈晓倩
杨永森
Original Assignee
云鲸智能(深圳)有限公司
云鲸智能创新(深圳)有限公司
Priority date
Filing date
Publication date
Application filed by 云鲸智能(深圳)有限公司, 云鲸智能创新(深圳)有限公司 filed Critical 云鲸智能(深圳)有限公司
Priority to PCT/CN2022/109209 priority Critical patent/WO2024021111A1/fr
Publication of WO2024021111A1 publication Critical patent/WO2024021111A1/fr

Classifications

    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00: Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24: Floor-sweeping machines, motor-driven
    • A47L11/28: Floor-scrubbing machines, motor-driven

Definitions

  • the present application relates to the field of cleaning technology, and in particular to a control method, processing, generation, area division, exploration method, device, system and storage medium of a cleaning robot.
  • Cleaning robots can be used to automatically clean floors, and their application scenarios can include household indoor cleaning, large-scale place cleaning, etc.
  • existing cleaning robots generally provide basic cleaning modes, such as whole-house sweeping, whole-house mopping, etc.
  • a single cleaning mode can hardly meet user needs.
  • users often need to manually set at least one of the following for different rooms: different cleaning modes, the cleaning parameters of each room to be cleaned, the cleaning frequency, etc., in order to achieve the expected cleaning effect. This type of cleaning robot is still not smart enough.
  • This application provides a control method, processing, generation, area division, exploration method, device, system and storage medium for a cleaning robot, aiming to solve technical problems such as the single cleaning mode of the cleaning robot being difficult to meet user needs and not intelligent enough.
  • embodiments of the present application provide a control method for a cleaning robot, including:
  • the cleaning robot is controlled to clean at least part of the non-carpet area in the cleaning task map through at least a mopping element.
  • embodiments of the present application provide a control method for a cleaning robot, including:
  • the cleaning robot is controlled to perform maintenance.
  • embodiments of the present application provide a method for controlling a cleaning robot, which is used to control the cleaning robot to clean a preset cleaning area, including:
  • the cleaning robot is controlled to perform a boundary leak-filling cleaning task on the preset cleaning area.
  • embodiments of the present application provide a control device for a cleaning robot, where the control device includes a memory and a processor;
  • the memory is used to store computer programs
  • the processor is configured to execute the computer program and when executing the computer program, implement:
  • embodiments of the present application provide a cleaning system, including:
  • the cleaning robot includes a walking unit, a mopping part and a brushing part, the walking unit is used to drive the cleaning robot to move, the mopping part and the brushing part are used to clean the ground;
  • a base station, which is at least used to clean the mopping parts of the cleaning robot.
  • embodiments of the present application provide a method for processing a cleaning image of a cleaning device, which is used to generate a cleaning image after the cleaning device performs a cleaning task and completes cleaning of one or at least two preset cleaning areas with a cleaning piece, including:
  • a cleaning image is generated based on the degree of dirtiness corresponding to one or at least two preset cleaning areas.
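The mapping from per-area dirtiness to a cleaning image described in the two items above can be sketched as follows. This is a minimal illustration rather than the patent's implementation; the area names and the 0-100 dirtiness scale are assumptions.

```python
def generate_cleaning_image(area_dirtiness):
    """Map each preset cleaning area's dirtiness (assumed 0-100 scale)
    to a display shade that a UI layer could use to colour the
    corresponding region of the cleaning image."""
    def shade(dirtiness):
        if dirtiness >= 70:
            return "dark"    # heavily soiled area
        if dirtiness >= 30:
            return "medium"  # moderately soiled area
        return "light"       # nearly clean area

    return {area: shade(d) for area, d in area_dirtiness.items()}


image = generate_cleaning_image({"kitchen": 85, "hallway": 40, "bedroom": 10})
# kitchen rendered dark, hallway medium, bedroom light
```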
  • embodiments of the present application provide a processing device for cleaning images of a cleaning device, where the processing device includes a memory and a processor;
  • the memory is used to store computer programs
  • the processor is configured to execute the computer program and, when executing the computer program, implement the steps of the aforementioned method for processing a cleaning image of a cleaning device.
  • embodiments of the present application provide a cleaning equipment system, including:
  • the cleaning equipment includes a movement mechanism and a cleaning piece, the movement mechanism is used to drive the cleaning equipment to move, so that the cleaning piece cleans the preset cleaning area;
  • a base station, which is at least used to clean the cleaning parts of the cleaning equipment.
  • embodiments of the present application provide a cleaning equipment system, including:
  • the cleaning equipment includes a movement mechanism, cleaning parts and a maintenance mechanism.
  • the movement mechanism is used to drive the cleaning equipment to move so that the cleaning parts clean the preset cleaning area.
  • the maintenance mechanism is used to clean the cleaning parts; and,
  • embodiments of the present application provide a method for generating a visual interface, including:
  • a visual interface is generated based on the acquired execution items of all or part of the cleaning equipment, and the visual interface is used to indicate all or part of the execution items of the cleaning equipment through animation.
  • embodiments of the present application provide a device for generating a visual interface, where the generating device includes a memory and a processor;
  • the memory is used to store computer programs
  • the processor is configured to execute the computer program and implement the steps of the aforementioned visual interface generation method when executing the computer program.
  • embodiments of the present application provide a cleaning equipment system, including:
  • the cleaning equipment includes a movement mechanism and an actuator, the movement mechanism is used to drive the cleaning equipment to move, so that the actuator performs cleaning;
  • the aforementioned generating device.
  • embodiments of the present application provide a cleaning area division method for a cleaning robot, including:
  • the dividing line of the room is determined so that the dividing line and the boundary of the room form at least two preset cleaning areas, where either:
  • the workload of each preset cleaning area is less than or equal to the upper limit of the workload value range and greater than or equal to the lower limit of the workload value range; or
  • the workload of exactly one of the preset cleaning areas is less than the lower limit, and that under-limit preset cleaning area is cleaned after the other preset cleaning areas.
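The workload rule above can be sketched as a small splitting routine. This is a hedged illustration of one way to satisfy the constraint (every area within [lower, upper], or exactly one under-limit area scheduled last); the patent does not prescribe this particular algorithm.

```python
import math


def divide_by_workload(total, lower, upper):
    """Split a room's total workload into preset cleaning areas.

    Returns (areas, leftover): every value in `areas` lies within
    [lower, upper]; `leftover` is either None or the workload of the
    single under-limit area, which is cleaned after the others.
    """
    # smallest number of areas that could each stay within the range
    n = max(1, math.ceil(total / upper))
    if total >= n * lower:
        # an even split keeps every area inside [lower, upper]
        return [total / n] * n, None
    # otherwise one under-limit leftover area is unavoidable
    full = [upper] * (n - 1)
    return full, total - upper * (n - 1)


areas, leftover = divide_by_workload(17, lower=8, upper=10)
# two areas of 8.5 each, no under-limit leftover
```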
  • embodiments of the present application provide a control method for a cleaning robot, including:
  • the cleaning robot is controlled to clean the room according to the preset cleaning area.
  • embodiments of the present application provide a control device for a cleaning robot, where the control device includes a memory and a processor;
  • the memory is used to store computer programs
  • the processor is configured to execute the computer program and implement the steps of the foregoing method when executing the computer program.
  • embodiments of the present application provide a cleaning system, including:
  • the cleaning robot includes a movement mechanism and cleaning parts, the movement mechanism is used to drive the cleaning robot to move, so that the cleaning parts clean the floor;
  • a base station for at least maintaining the cleaning robot
  • this application provides a ground medium exploration method, including:
  • when the cleaning robot detects a preset ground medium, it obtains its own status information and determines an edge exploration mode according to the status information, wherein the edge exploration mode includes an inner edge exploration mode and an outer edge exploration mode;
  • the preset ground medium is explored along its edge according to the determined edge exploration mode to obtain the contour of the preset ground medium.
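The mode selection described above can be sketched as follows. This is an assumption-laden illustration: the status fields used here (`on_medium`, `mop_attached`) are hypothetical examples, not the patent's actual criteria for choosing between inner and outer edge exploration.

```python
def choose_edge_exploration_mode(status):
    """Pick an edge exploration mode from the robot's status.

    `status` is a dict of hypothetical state flags; the decision
    criteria below are illustrative assumptions only.
    """
    if status.get("on_medium"):
        # robot already stands on the medium: trace the contour from inside
        return "inner"
    if status.get("mop_attached"):
        # a wet mop should not cross onto the medium: trace from outside
        return "outer"
    return "inner"
```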
  • the present application also provides a cleaning robot.
  • the cleaning robot includes a processor, a memory, and a computer program stored on the memory and executable by the processor, wherein, when the computer program is executed by the processor, the steps of the above ground medium exploration method are implemented.
  • embodiments of the present application provide a computer-readable storage medium.
  • the computer-readable storage medium stores a computer program.
  • when the computer program is executed by a processor, the processor implements the steps of the above method.
  • Embodiments of the present application provide a control method, processing, generation, area division, exploration method, device, system and storage medium for a cleaning robot.
  • the method includes: obtaining a cleaning task map; determining whether the cleaning task map includes a carpet area, and, when it does, controlling the cleaning robot to clean the carpet in the carpet area with a brushing component; and controlling the cleaning robot to clean at least part of the non-carpet area in the cleaning task map with a mopping component. By automatically brushing the carpet area in the cleaning task map and at least mopping the non-carpet areas, the user does not need to set different cleaning methods for different areas, which improves the intelligence of the cleaning robot.
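The dispatch summarised above (brush carpet areas, mop at least the non-carpet areas) can be sketched as a short planning routine. The map representation, a dict from area name to surface type, is an assumption for illustration.

```python
def plan_cleaning(task_map):
    """Assign a cleaning action to each area of the cleaning task map.

    `task_map` maps an area name to its surface type; carpet areas are
    brushed and all other areas are mopped, without any per-area user setup.
    """
    actions = []
    for area, surface in task_map.items():
        if surface == "carpet":
            actions.append((area, "brush"))
        else:
            actions.append((area, "mop"))
    return actions
```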
  • Figure 1 is a schematic flowchart of a control method for a cleaning robot provided by an embodiment of the present application
  • Figure 2 is a schematic diagram of a cleaning system in an embodiment
  • Figure 3 is a schematic block diagram of a cleaning robot in an embodiment
  • Figure 4 is a schematic structural diagram of a cleaning robot in one embodiment
  • Figure 5 is a schematic structural diagram of a base station in an embodiment
  • Figure 6 is a schematic block diagram of a base station in an embodiment
  • Figure 7 is a schematic diagram of an inner edge exploration scenario provided by an embodiment of the present application.
  • Figure 8 is a schematic diagram of an outer edge exploration scenario provided by an embodiment of the present application.
  • Figure 9 is a schematic flowchart of a control method for a cleaning robot provided by an embodiment of the present application.
  • Figure 10 is a schematic diagram of workload per unit area in an embodiment
  • Figure 11 is a schematic flow chart of a control method in an embodiment
  • Figure 12 is a schematic flowchart of a control method for a cleaning robot provided by an embodiment of the present application.
  • Figure 13 is a schematic structural diagram of a cleaning robot in one embodiment
  • Figure 14 is a schematic diagram of the cleaning blind area during edge cleaning movement in one embodiment
  • Figures 15 and 16 are schematic diagrams of a cleaning robot performing a boundary leak-filling cleaning task in some embodiments
  • Figure 17 is a schematic diagram of a narrow area in an embodiment
  • Figures 18 to 20 are schematic diagrams of cleaning robots performing boundary leak-filling cleaning tasks in other embodiments.
  • Figures 21 and 22 are schematic diagrams of the cleaning blind area when cleaning along the columnar body in some embodiments.
  • Figure 23 is a schematic diagram of the boundary leak-filling cleaning strategy corresponding to cleaning along the columnar body in one embodiment
  • Figure 24 is a schematic block diagram of a control device of a cleaning robot provided by an embodiment of the present application.
  • Figure 25 is a schematic flowchart of a method for processing images cleaned by a cleaning device according to an embodiment of the present application
  • Figure 26 is a schematic block diagram of a cleaning equipment system provided by an embodiment of the present application.
  • Figure 27 is a schematic block diagram of a cleaning equipment system provided by another embodiment of the present application.
  • Figure 28 is a schematic diagram of changes in the degree of dirtiness of the mopping parts in one embodiment
  • Figure 29 is a schematic diagram of the preset cleaning area and image area provided by the embodiment of the present application.
  • Figure 30 is a schematic diagram of a cleaning image in an embodiment
  • Figure 31 is a schematic diagram of a cleaning image according to an embodiment
  • Figure 32 is a schematic diagram of another cleaning image according to an embodiment
  • Figure 33 is a schematic diagram of another cleaning image according to an embodiment
  • Figure 34 is a schematic diagram of a cleaning image in an embodiment
  • Figure 35 is a schematic diagram of a cleaning image according to an embodiment
  • Figure 36 is a schematic diagram of another cleaning image according to an embodiment
  • Figure 37 is a schematic diagram of another cleaning image according to an embodiment
  • Figure 38 is a schematic diagram of a cleaning image in an embodiment
  • Figure 39 is a schematic diagram of a cleaning image in an embodiment
  • Figure 40 is a schematic diagram of a cleaning image according to an embodiment
  • Figure 41 is a schematic diagram of another cleaning image according to an embodiment
  • Figure 42 is a schematic diagram of another cleaning image according to an embodiment
  • Figure 43 is a schematic diagram of a cleaning image in an embodiment
  • Figure 44 is a schematic diagram of a cleaning image in an embodiment
  • Figure 45 is a schematic diagram of a cleaning image according to an embodiment
  • Figure 46 is a schematic diagram of a room and a room area provided by an embodiment of the present application.
  • Figure 47 is a schematic diagram of a room cleaning image in one embodiment
  • Figure 48 is a schematic diagram of a track cleaning image in an embodiment
  • Figure 49 is a schematic diagram of a track cleaning image in an embodiment
  • Figure 50 is a cleaning image related to an embodiment of the present application.
  • Figure 51 is a schematic block diagram of a cleaning image processing device provided by a cleaning device according to an embodiment of the present application.
  • Figure 52 is a schematic flowchart of a method for generating a visual interface provided by an embodiment of the present application.
  • Figure 53 is a schematic diagram of a visual interface in an embodiment
  • Figure 54 is a schematic diagram of a visual interface in an embodiment
  • Figure 55 is a schematic diagram of a visual interface in an embodiment
  • Figure 56 is a schematic diagram of a visual interface in an embodiment
  • Figure 57 is a schematic diagram of a visual interface in an embodiment
  • Figure 58 is a schematic diagram of a visual interface in an embodiment
  • Figure 59 is a schematic diagram of a visual interface in an embodiment
  • Figure 60 is a schematic flowchart of a method for generating a visual interface provided by another embodiment of the present application.
  • Figure 61 is a schematic diagram of a pop-up window in an embodiment
  • Figure 62 is a schematic diagram of a pop-up window in an embodiment
  • Figure 63 is a schematic diagram of a picture corresponding to the first execution item involved in an embodiment
  • Figure 64 is a schematic diagram of saving identification in an embodiment
  • Figure 65 is a schematic block diagram of a visual interface generation device provided by an embodiment of the present application.
  • Figure 66 is a schematic block diagram of a cleaning equipment system provided by an embodiment of the present application.
  • Figure 67 is a schematic flowchart of a cleaning area division method for a cleaning robot provided by an embodiment of the present application.
  • Figure 68 is a schematic diagram of a room in the cleaning task map in one embodiment
  • Figure 69 is a schematic diagram of the workload per unit area in an embodiment
  • Figure 70 is a schematic diagram of different partitioning schemes in an embodiment
  • Figure 71 is a schematic diagram of an obstacle-dense area in an embodiment
  • Figure 72 is a schematic diagram of the movement trajectory of the cleaning robot when there is no dense obstacle area
  • Figure 73 is a schematic diagram of the movement trajectory of the cleaning robot when obstacle-dense areas are aggregated
  • Figure 74 is a schematic flowchart of a cleaning method for mopping parts provided by an embodiment of the present application.
  • Figure 75 is a schematic flow chart of the steps of a ground medium exploration method provided by an embodiment of the present application.
  • Figure 76 is a schematic diagram of a cleaning robot provided by an embodiment of the present application performing inner edge exploration and performing a first predetermined action on a preset ground medium;
  • Figure 77 is a schematic diagram of a scene in which a cleaning robot performs inner edge exploration on a preset ground medium according to an embodiment of the present application;
  • Figure 78 is a schematic structural diagram of a cleaning robot provided by an embodiment of the present application.
  • Figure 79 is a schematic diagram of a scene in which a cleaning robot performs outer edge exploration on a preset ground medium according to an embodiment of the present application;
  • Figure 80 is a schematic diagram of another scene in which a cleaning robot according to an embodiment of the present application performs outer edge exploration on a preset ground medium;
  • Figure 81 is a schematic diagram of a scene in which a cleaning robot determines the contour of a preset ground medium by connecting contour points according to an embodiment of the present application;
  • Figure 82 is a schematic diagram of a scene for fitting the contour of a preset ground medium provided by an embodiment of the present application.
  • Figure 83 is a schematic diagram of a scene in which a cleaning robot according to an embodiment of the present application determines the outline of a preset ground medium by performing graphic matching processing on outline points;
  • Figure 84 is a schematic diagram of a scene in which a cleaning robot merges adjacent media areas provided by an embodiment of the present application;
  • Figure 85 is a schematic diagram of the trajectory of a cleaning robot cleaning preset ground media provided by an embodiment of the present application.
  • Figure 86 is a schematic diagram of a cleaning robot using different trajectories to explore preset ground media provided by an embodiment of the present application;
  • Figure 87 is a schematic flow chart of the steps of a ground medium exploration method provided by an embodiment of the present application.
  • Figure 88 is a schematic structural block diagram of a cleaning robot provided by an embodiment of the present application.
  • FIG. 1 is a schematic flowchart of a control method for a cleaning robot provided by an embodiment of the present application.
  • the control method of the cleaning robot can be applied in a cleaning system to control the cleaning robot in the system so that the cleaning robot performs cleaning tasks and cleans the area corresponding to the cleaning task map.
  • the area corresponding to the cleaning task map can be any area to be cleaned such as a family space, a room unit of a family space, a partial area of a room unit, a large place, or a part of a large place.
  • the area corresponding to the cleaning task map can refer to the larger area that is cleaned for the first time, such as the entire room unit; it can also refer to the area that needs to be cleaned again after the first cleaning of the larger area, such as the area along the wall in the room unit, or the obstacle area.
  • the cleaning task map can be established by exploring the current space in response to mapping instructions, or it can be updated based on obstacles, carpets, etc. identified by the cleaning robot during the cleaning process; optionally, the cleaning task map may be a map of a cleaning area specified by the user. For example, in response to the user selecting one or more rooms on the map, the one or more rooms are determined as the cleaning task map; or, in response to the user circling a cleaning area on the map, for example a partial area of one or more rooms, that partial area is determined as the cleaning task map. It is certainly not limited to this.
  • the cleaning system includes one or more cleaning robots 100 and one or more base stations 200 .
  • the base station 200 is used in conjunction with the cleaning robot 100.
  • the base station 200 can charge the cleaning robot 100, and the base station 200 can provide a parking position for the cleaning robot 100, etc.
  • the base station 200 can also clean the mopping member 110 of the cleaning robot 100, where the mopping member 110 is used to mop the floor.
  • the cleaning system also includes a control device 300.
  • the control device 300 can be used to implement the steps of the cleaning robot control method according to the embodiment of the present application.
  • the robot controller 104 of the cleaning robot 100 and/or the base station controller 206 of the base station 200 can be used as the control device 300 alone or in combination to implement the steps of the cleaning robot control method according to the embodiment of the present application; in other cases,
  • the cleaning system includes a separate control device 300 for implementing the steps of the cleaning robot control method in the embodiment of the present application.
  • the control device 300 can be provided on the cleaning robot 100 or can be provided on the base station 200; of course, it is not limited thereto.
  • the control device 300 may be a device other than the cleaning robot 100 and the base station 200, such as a home smart terminal, a master control device, etc.
  • the cleaning robot 100 can be used to automatically mop the floor.
  • the application scenarios of the cleaning robot 100 can be household indoor cleaning, large-scale place cleaning, etc.
  • FIG. 3 is a schematic block diagram of the cleaning robot 100 in an embodiment.
  • the cleaning robot 100 includes a robot body, a driving motor 102, a sensor unit 103, a robot controller 104, a battery 105, a walking unit 106, a robot memory 107, a robot communication unit 108, a robot interaction unit 109, a mopping part 110, a charging part, etc.
  • the mopping member 110 is used for mopping the ground, and the number of the mopping member 110 may be one or more.
  • the mopping member 110 is, for example, a mop.
  • the mopping member 110 is disposed at the bottom of the robot body, specifically at a rear position of the bottom of the robot body.
  • a driving motor 102 is provided inside the robot body. Two rotating shafts extend from the bottom of the robot body, and the mopping member 110 is sleeved on the rotating shafts. The driving motor 102 can drive the rotating shaft to rotate, so that the rotating shaft drives the mopping member 110 to rotate.
  • the cleaning robot 100 further includes a brushing part 120 , and the brushing part 120 includes a side brushing part 121 and/or a middle sweeping part 122 .
  • the cleaning robot 100 is a cleaning robot that integrates sweeping and mopping.
  • the brushing part 120 and the mopping part 110 can work together.
  • for example, the brushing part 120 and the mopping part 110 work at the same time, or the brushing part 120 and the mopping part 110 work alternately, etc.;
  • the brushing and sweeping part 120 and the mopping part 110 can also work separately, that is, the brushing and sweeping part 120 performs cleaning work alone, or the mopping part 110 performs mopping work alone.
  • the side brushing part 121 sweeps dust and other dirt from the outside to the middle area, and the middle sweeping part 122 continues to sweep the dirt in the middle area to the dust collector.
  • the number of side brush members 121 is not limited. As shown in Figure 4, the cleaning robot 100 has two side brush members 121 arranged on the left and right sides. Alternatively, only one side brush member 121 is arranged on the left or right side.
  • the brushing and sweeping member 120 can be disposed on the front side of the mopping member 110. Therefore, when the brushing and sweeping member 120 and the mopping member 110 work together, the cleaning robot 100 can sweep at the front and mop at the rear. Compared with disposing the brushing member 120 behind the mopping member 110, this arrangement prevents the brushing member 120 from being wetted by the wet area left by the mopping member 110, and also prevents a dirty brushing member 120 from soiling the mopped area.
  • the walking unit 106 is a component related to the movement of the cleaning robot 100 and is used to drive the cleaning robot 100 to move so that the mopping member 110 and/or the brushing member 120 mops the ground.
  • the robot controller 104 is disposed inside the robot body, and the robot controller 104 is used to control the cleaning robot 100 to perform specific operations.
  • the robot controller 104 may be, for example, a central processing unit (CPU) or a microprocessor (Microprocessor).
  • the robot controller 104 is electrically connected to components such as the battery 105, robot memory 107, drive motor 102, walking unit 106, sensor unit 103, and robot interaction unit 109 to control these components.
  • the battery 105 is provided inside the robot body, and the battery 105 is used to provide power to the cleaning robot 100 .
  • the robot body is also provided with a charging component, which is used to obtain power from an external device to charge the battery 105 of the cleaning robot 100 .
  • the robot memory 107 is provided on the robot body, and a program is stored in the robot memory 107. When the program is executed by the robot controller 104, corresponding operations are implemented.
  • the robot communication unit 108 is provided on the robot body. The robot communication unit 108 is used to allow the cleaning robot 100 to communicate with external devices.
  • the cleaning robot 100 can communicate with the terminal and/or the base station 200 through the robot communication unit 108 .
  • the base station 200 is a cleaning device used in conjunction with the cleaning robot 100 .
  • the sensor unit 103 provided on the robot body includes various types of sensors, such as lidar, collision sensor, distance sensor, drop sensor, counter, and gyroscope.
  • the lidar is set on the top of the robot body.
  • the surrounding environment information can be obtained, such as the distance and angle of obstacles relative to the lidar, etc.
  • cameras can also be used instead of lidar. By analyzing the obstacles in the images captured by the cameras, the distance and angle of the obstacles relative to the camera can also be obtained.
  • the collision sensor includes a collision housing and a trigger sensor. When the cleaning robot 100 collides with an obstacle through the collision housing, the collision housing moves toward the inside of the cleaning robot 100 and compresses an elastic buffer.
  • after the collision housing moves a certain distance into the cleaning robot 100, it contacts the trigger sensor, which is triggered to generate a signal that can be sent to the robot controller 104 in the robot body for processing. After hitting the obstacle, the cleaning robot 100 moves away from the obstacle, and under the action of the elastic buffer, the collision housing moves back to its original position.
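The bumper behaviour described above can be sketched as a small mapping from housing travel to a controller-facing event. The trigger distance is an assumed value, not one given in the patent.

```python
TRIGGER_DISTANCE_MM = 3  # assumed inward travel needed to reach the trigger sensor


def handle_bumper(travel_mm, trigger_mm=TRIGGER_DISTANCE_MM):
    """Map the collision housing's inward travel to an event.

    Once the housing has moved at least `trigger_mm` into the body, the
    trigger sensor fires and the controller is told to back away; below
    that, the elastic buffer simply returns the housing to its position.
    """
    if travel_mm >= trigger_mm:
        return "collision"  # signal sent to the robot controller
    return "none"
```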
  • the distance sensor can specifically be an infrared detection sensor, which can be used to detect the distance from the obstacle to the distance sensor.
  • the distance sensor is arranged on the side of the robot body, so that the distance value from the obstacle located near the side of the cleaning robot 100 to the distance sensor can be measured by the distance sensor.
  • the distance sensor may also be an ultrasonic ranging sensor, a laser ranging sensor, a depth sensor, etc.
  • the drop sensor is provided at the bottom edge of the robot body. When the cleaning robot 100 moves to the edge of the ground, the drop sensor can detect that the cleaning robot 100 is at risk of falling from a high place, and corresponding anti-fall reactions are performed, such as the cleaning robot 100 stopping its movement or moving away from the falling position.
  • the gyroscope is used to detect the rotation angle of the cleaning robot 100, thereby determining the orientation of the cleaning robot 100.
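Determining orientation from the gyroscope amounts to integrating its angular-rate samples over time. The sketch below illustrates this under assumed units (degrees per second, a fixed sample period); the actual sensor fusion in the robot is unspecified.

```python
def integrate_heading(initial_deg, rates_dps, dt_s):
    """Accumulate gyroscope angular-rate samples into a heading.

    `rates_dps` are angular rates in degrees/second sampled every
    `dt_s` seconds; the result is wrapped into [0, 360).
    """
    heading = initial_deg
    for rate in rates_dps:
        heading = (heading + rate * dt_s) % 360.0
    return heading
```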
  • the robot interaction unit 109 is provided on the robot body, and the user can interact with the cleaning robot 100 through the robot interaction unit 109 .
  • the robot interaction unit 109 includes, for example, switch buttons, speakers, microphones, touch switches/screens and other components.
  • the user can control the cleaning robot 100 to start or stop working by pressing the switch button or the touch switch/screen, and the touch screen can also display the working status information of the cleaning robot.
  • the cleaning robot 100 can play prompt sounds to the user through the speaker, obtain the user's control instructions through the microphone, or locate the user's location by obtaining the user's voice.
  • the cleaning robot 100 described in the embodiment of the present application is only a specific example and does not constitute a specific limitation to the cleaning robot 100 of the embodiment of the present application.
  • the cleaning robot 100 of the embodiment of the present application can also be implemented in other specific ways.
  • the cleaning robot may have more or fewer components than the cleaning robot 100 shown in FIG. 1 .
  • FIG. 5 is a schematic three-dimensional diagram of the base station 200 in this embodiment
  • FIG. 6 is a schematic block diagram of the base station 200 in this embodiment.
  • the base station 200 is used in conjunction with the cleaning robot 100.
  • the base station 200 can charge the cleaning robot 100, and the base station 200 can provide a parking position for the cleaning robot 100, etc.
  • the base station 200 can also clean the mopping member 110 of the cleaning robot 100.
  • the mopping member 110 is used for mopping the ground.
  • the base station 200 in this embodiment of the present application includes a base station body 202, a cleaning tank 203 and a water tank (not shown).
  • the cleaning tank 203 is provided on the base station body 202, and the cleaning tank 203 is used to clean the mopping member 110 of the cleaning robot.
  • the cleaning ribs 2031 provided on the cleaning tank 203 can scrape and clean the mopping member 110 .
  • the base station main body 202 is provided with an entry slot 205, and the entry slot 205 leads to the cleaning tank 203.
  • the cleaning robot 100 can drive into the base station 200 through the entry slot 205, so that the cleaning robot 100 can park at a preset parking position on the base station 200.
  • the water tank is provided in the base station body 202, and specifically includes a clean water tank and a sewage tank. The clean water tank is used to store clean water, and the sewage tank is used to collect the sewage produced by cleaning the mopping member 110.
  • when the cleaning robot 100 is parked on the base station 200, the mopping member 110 of the cleaning robot 100 is accommodated in the cleaning tank 203.
  • the clean water tank provides clean water to the cleaning tank 203, and the clean water is used to clean the mopping member 110.
  • the base station body 202 is provided with a top cover (not shown), and the user can take out the water tank from the base station body 202 by opening the top cover.
  • the water tank can be connected to a water inlet pipe (such as a tap water pipe) and a sewage pipe (such as a drainage pipe), in which case the water tank can be fixed in the base station body 202; in other embodiments, one or both of the clean water tank and the sewage tank may not be provided in the base station 200.
  • the water inlet pipe may directly provide clean water to the cleaning tank 203, and the dirty sewage after cleaning the mop member 110 may also be directly discharged from the sewage pipe.
  • the base station 200 further includes a dirt detection device 210 .
  • the dirt detection device 210 is used to detect the degree of dirt of the mop element 110 .
  • the dirt detection device 210 includes at least one of the following: a visual sensor and a sewage detection sensor.
  • the image or color information of the mopping element 110 can be obtained by the visual sensor, and the degree of dirtiness of the mopping element 110 can be determined from this information; for example, the darker the grayscale of the surface of the mopping element 110, the dirtier the mopping element.
  • the sewage detection sensor can obtain detection information of the sewage produced by cleaning the mopping member 110, and the degree of dirtiness of the mopping member 110 can be determined based on this detection information. Optionally, the sewage detection sensor includes at least one of the following: a visible light detection sensor, an infrared detection sensor, and a total dissolved solids detection sensor. For example, the infrared detection sensor collects turbidity information of the sewage, the visible light detection sensor collects chromaticity information of the sewage, and the total dissolved solids detection sensor collects conductivity information of the sewage; the degree of dirtiness of the mopping member can then be determined based on one or more of the turbidity, chromaticity, and conductivity information. For example, the greater the turbidity of the sewage or the higher its conductivity, the dirtier the mopping member.
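The fusion of the sewage readings described above can be sketched as follows. This is an illustrative example only: the embodiment does not specify how turbidity, chromaticity and conductivity are combined, so the linear weights and the threshold below are hypothetical assumptions.

```python
# Hypothetical sketch: combine normalized sewage sensor readings into a
# single dirtiness score. Weights and threshold are illustrative only.

def dirtiness_score(turbidity, chromaticity, conductivity,
                    weights=(0.5, 0.25, 0.25)):
    """Each reading is assumed normalized to [0, 1]; higher turbidity
    or conductivity means a dirtier mopping member."""
    wt, wc, wk = weights
    return wt * turbidity + wc * chromaticity + wk * conductivity

def mop_is_dirty(turbidity, chromaticity, conductivity, threshold=0.6):
    """Decide whether the mopping member needs another wash cycle."""
    return dirtiness_score(turbidity, chromaticity, conductivity) >= threshold
```

In practice the base station could rerun the wash cycle until `mop_is_dirty` returns `False` or a retry limit is hit.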
  • the base station 200 may also include a base station controller 206, a base station communication unit 207, a base station memory 208, a water pump 209, a base station interaction unit 220, and the like.
  • the base station controller 206 is provided inside the base station main body 202, and the base station controller 206 is used to control the base station 200 to perform specific operations.
  • the base station controller 206 may be, for example, a central processing unit (Central Processing Unit, CPU) or a microprocessor. The base station controller 206 is electrically connected to the base station communication unit 207, the base station memory 208, the water pump 209 and the base station interaction unit 220.
  • the base station memory 208 is provided on the base station main body 202.
  • the base station memory 208 stores programs, which implement corresponding operations when executed by the base station controller 206.
  • the base station memory 208 is also used to store parameters for use by the base station 200.
  • the base station memory 208 includes but is not limited to disk memory, CD-ROM, optical memory, etc.
  • the water pump 209 is provided inside the base station body 202.
  • one water pump 209 is used to draw clean water from the clean water tank into the cleaning tank 203, and another water pump 209 is used to collect the sewage produced by cleaning the mopping member 110 into the sewage tank.
  • when the water inlet pipe provides cleaning water to the cleaning tank 203 directly, the supply of cleaning water to the cleaning tank 203 can be controlled through a solenoid valve on the water inlet pipe.
  • the base station communication unit 207 is provided on the base station body 202 and is used to communicate with external devices, such as connecting to a WI-FI router to communicate with a terminal, or communicating with the cleaning robot 100 .
  • the base station interaction unit 220 is used to interact with users.
  • the base station interaction unit 220 includes, for example, a display screen and control buttons.
  • the display screen and control buttons are provided on the base station body 202.
  • the display screen is used to display information to the user, and the control buttons are used for the user to press to control the startup or shutdown of the base station 200, etc.
  • the base station body 202 is also provided with a power supply component, and the cleaning robot is provided with a charging component.
  • the charging component of the cleaning robot 100 comes into contact with the power supply component of the base station 200, so that the base station 200 can charge the cleaning robot 100.
  • the electric energy of the base station 200 may come from commercial power.
  • the cleaning robot 100 cleans the floor of the room.
  • the cleaning robot 100 automatically drives to the base station 200 .
  • the cleaning robot 100 enters the base station 200 through the entry slot 205 on the base station 200 and stops at a preset parking position on the base station 200 .
  • the charging component on the cleaning robot 100 is electrically connected to the power supply component on the base station 200.
  • the base station 200 obtains power from the commercial power and charges the battery 105 of the cleaning robot 100 through the power supply component and the charging component. After the cleaning robot 100 completes charging as needed, it drives away from the base station 200 and continues to clean the room floor.
  • the cleaning robot 100 can be used to mop the floor.
  • the cleaning robot 100 mops the room floor for a period of time.
  • the cleaning robot 100 drives to the base station 200 .
  • the cleaning robot 100 enters the base station 200 through the entry slot 205 on the base station 200 and stops at a preset parking position on the base station 200 .
  • the mopping part 110 of the cleaning robot 100 is accommodated in the cleaning tank 203.
  • the cleaning water in the clean water tank in the base station 200 flows to the cleaning tank 203 and wets the mopping member 110 through the liquid inlet structure on the cleaning tank 203; at the same time, the wet mopping member 110 scrapes against the protruding cleaning ribs 2031 in the cleaning tank, thereby cleaning the mopping member 110.
  • the sewage produced by cleaning the mopping member 110 flows out of the cleaning tank 203 through the drainage structure on the cleaning tank, and is collected into the sewage tank under the action of the water pump 209.
  • the aforementioned base station 200 is only a specific example and does not constitute a specific limitation to the base station in the embodiment of the present application.
  • the base station in the embodiment of the present application may also be implemented in other specific ways.
  • the base station in the embodiment of the present application may not include the water tank; the base station main body can be connected to the tap water pipe and the drain pipe, so that tap water from the tap water pipe is used to clean the mopping member 110 of the cleaning robot 100.
  • the sewage produced by cleaning the mopping member 110 flows out of the cleaning tank 203 and is discharged from the base station 200 through the drainage pipe.
  • the base station may have more or fewer components than the base station 200 shown in FIG. 5 .
  • the control method of a cleaning robot includes steps S110 to S130.
  • the cleaning task map is used to represent the area to be cleaned or the sub-area to be cleaned within the area to be cleaned, and the sub-area to be cleaned is the uncleaned area within the area to be cleaned.
  • the area to be cleaned can be any area to be cleaned such as a family space, a room unit of a family space, a part of a room unit, a large place or a part of a large place.
  • the sub-area to be cleaned may be an area after the cleaning robot performs edge cleaning in the area to be cleaned.
  • a pre-stored cleaning task map can be obtained, for example, a cleaning task map pre-stored in the cleaning robot, the base station, the user terminal or a server; or the cleaning task map can be obtained after the cleaning robot detects the area to be cleaned, for example, through one or more sensors among the lidar, the visual sensor, the inertial measurement unit and the collision sensor; or the cleaning task map can be obtained while the area to be cleaned is being cleaned, for example, by cleaning along the edge of the area to be cleaned and obtaining the cleaning task map based on the cleaning trajectory of the edge part.
  • when the area to be cleaned is a home space that includes multiple room units, the cleaning robot cleans along the edge of each room unit in the space to be cleaned; alternatively, a map input by the user through the terminal can be obtained from the user terminal.
  • the cleaning task map can be obtained through any one or more of the above methods.
  • the acquired cleaning task map can be integrated and corrected through any method.
  • the cleaning robot normally completes several (at least one) room cleaning tasks. During this period, the cleaning robot can complete the creation of the map, divide and process the rooms, and obtain the floor material information of each room; for example, the user can set the material information of each room through the APP.
  • when the cleaning robot completes the cleaning of the whole house, it can complete the information of the current map during the cleaning process, such as the number and area of the current carpets, and the carpet area corresponding to each carpet can be marked on the map.
  • steps of map data analysis may be performed.
  • the area to be cleaned based on the cleaning task map may be called a preset cleaning area.
  • a room may be a preset cleaning area, or a room may have multiple preset cleaning areas; of course, it is not limited thereto.
  • a preset cleaning area includes one room and at least part of another room.
  • the preset cleaning area can be divided and determined based on the user's operation on the cleaning task map, or based on preset area dividing rules.
  • the graphical characteristics of all current rooms are analyzed, such as length, width and area, the number and area of carpets in each room, and the size and distribution of obstacles; based on these graphical characteristics, areas are merged, split and sorted to delineate the areas to be cleaned, which may also be called preset cleaning areas.
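The merge/split/sort step can be sketched as a simple rule over room sizes. The size limits and the halving rule below are illustrative assumptions; the embodiment does not prescribe concrete dividing rules.

```python
# Hypothetical sketch of the merge/split/sort step over room features.
# max_area / min_area and the halving split are illustrative assumptions.

def plan_preset_cleaning_areas(rooms, max_area=30.0, min_area=4.0):
    """rooms: list of (name, area_m2). Split oversized rooms into two
    halves, merge undersized areas into the previous one, then sort the
    resulting preset cleaning areas largest-first."""
    areas = []
    for name, area in rooms:
        if area > max_area:                 # split a large room in two
            areas += [(f"{name}-a", area / 2), (f"{name}-b", area / 2)]
        else:
            areas.append((name, area))
    merged = []
    for name, area in areas:
        if area < min_area and merged:      # merge a tiny area into previous
            prev_name, prev_area = merged.pop()
            merged.append((f"{prev_name}+{name}", prev_area + area))
        else:
            merged.append((name, area))
    return sorted(merged, key=lambda r: r[1], reverse=True)
```

A real planner would also use the carpet count and obstacle distribution mentioned above, not only the area.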
  • statistical analysis of historical cleaning parameters may be performed. For example, the time records of recent room cleanings can be counted to adjust the cleaning parameters according to the cleaning frequency, including but not limited to the ground pressure of the mopping member, the humidity, and the type and amount of cleaning fluid; or the cleaning parameter settings of recent room cleanings can be counted to estimate the current degree of room dirtiness, to facilitate subsequent dynamic adjustment of the cleaning parameters.
  • the hosting mode can be started based on user operations.
  • the user can trigger the hosting mode through a human-computer interaction unit such as a base station, cleaning robot, user terminal or smart speaker, including but not limited to triggering by key presses, voice, etc.
  • when initiating the hosting mode, the user may be notified through the human-computer interaction unit.
  • the hosting mode can be started when the user needs sweeping and mopping and carpet cleaning.
  • initiating the hosting mode based on user operations may also be referred to as the user's one-click management cleaning mode.
  • the cleaning robot can at least automatically clean the carpet in the carpet area by brushing and sweeping, and clean at least part of the non-carpet areas in the cleaning task map by mopping; there is no need to repeatedly set the cleaning mode based on room conditions. For example, by learning the user's daily cleaning habits, the best cleaning strategy that the current cleaning robot should implement is given, to achieve a better cleaning effect, higher cleaning efficiency, or a balance between cleaning efficiency and cleaning effect; with fewer user operations, the robot's cleaning scheduling logic is more in line with user expectations, and the overall cleaning effect is better.
  • the cleaning robot can autonomously determine carpet cleaning decisions (including whether to clean the carpet, when to clean it, etc.) and the corresponding execution strategy; the cleaning robot can determine, based on historical cleaning data during sweeping and mopping, a control strategy for residue re-sweeping; the cleaning robot can determine the path planning strategy and the dirty re-mopping area selection strategy, such as a path selection strategy that prevents pollution and the sequential processing of dirty re-mopping, for example, area selection based on a real-time dirt map, area selection based on room setting information, and re-mopping area selection based on historical cleaning frequency; the cleaning robot can also generate a cleaning report after completing the dirty re-mopping.
  • the cleaning robot can first judge whether to perform carpet cleaning based on whether there is a carpet in the area to be cleaned, and can judge whether to perform edge leak repair in this cleaning based on the length of time since the last edge leak repair.
  • when the hosting mode is started, the user can be informed through the human-computer interaction unit that the cleaning robot will first perform the carpet cleaning task and then perform the simultaneous sweeping and mopping task; alternatively, the simultaneous sweeping and mopping task can be performed first, followed by the carpet cleaning task.
  • the carpet area is not limited to areas containing carpets; it may also include areas with floor mats or other mats, or other areas that are unsuitable for wet mopping or that the user has set to not allow wet mopping.
  • such an area may be called a carpet area, and the carpet includes carpets, foot mats, children's climbing mats, and mats laid on the ground.
  • the carpet may also be other specially treated media laid on the ground that the cleaning robot needs to handle specially when it encounters them; this is not restricted here.
  • in this embodiment, the carpet area mainly refers to the area where carpets/floor mats are laid.
  • by controlling the cleaning robot to clean the carpet/floor mat in the carpet area through the brushing member, it is possible to prevent the mopping member from wetting the carpet and to prevent a wet carpet from being contaminated with dirt and breeding bacteria.
  • by automatically determining whether the cleaning task map includes a carpet area, and cleaning the carpet in the carpet area by brushing only when the cleaning task map includes a carpet area, the cleaning robot is more intelligent, and the user does not need to separately set the working mode of the cleaning robot for the carpet area.
  • controlling the cleaning robot to clean the carpet in the carpet area through a brushing element includes: controlling the cleaning robot to move to the carpet area; and cleaning the carpet in the carpet area after moving to the carpet area.
  • controlling the cleaning robot to move to the carpet area includes: controlling the mopping member to be in a raised position when the cleaning robot reaches the edge of the carpet area; and controlling the walking unit of the cleaning robot to operate, so that the robot moves above the carpet driven by the walking unit.
  • the mopping member is controlled to be in the raised position, so that the cleaning robot moves onto the carpet with the mopping member lifted, which can prevent the mopping member from wetting the carpet.
  • when the cleaning robot detects the carpet through the sensor unit, it can be determined that the cleaning robot has reached the edge of the carpet area; for example, when the current detection data of sensors such as collision sensors and/or ultrasonic sensors includes preset features, it can be determined that the cleaning robot has reached the edge of the carpet area; the preset features can be determined from the detection data of such sensors collected when the cleaning robot is at the edge of the carpet area. For example, it is determined that the cleaning robot has reached the edge of the carpet area when the visual sensor detects that the distance to the edge of the carpet area is less than or equal to a preset distance.
  • whether the cleaning robot reaches the edge of the carpet area can be determined based on the current position of the cleaning robot and the boundary of the carpet area in the cleaning task map.
  • it is not limited to this.
  • it can be determined based on the detection data of the sensor unit whether it has reached the edge of the carpet area.
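The map-based check described above (comparing the robot's current position with the carpet boundary in the cleaning task map) can be sketched as a distance test. For simplicity, the sketch assumes the carpet area is stored as an axis-aligned rectangle; a real map could store an arbitrary polygon.

```python
# Sketch of an edge-reached test against a carpet area stored in the map.
# The rectangle simplification and the 0.05 m threshold are assumptions.
import math

def distance_to_rect_edge(px, py, xmin, ymin, xmax, ymax):
    """Distance from the robot position (px, py) to the rectangle boundary."""
    dx = max(xmin - px, 0.0, px - xmax)
    dy = max(ymin - py, 0.0, py - ymax)
    if dx > 0.0 or dy > 0.0:
        return math.hypot(dx, dy)        # outside the rectangle
    return min(px - xmin, xmax - px,     # inside: distance to nearest side
               py - ymin, ymax - py)

def at_carpet_edge(px, py, carpet_rect, threshold=0.05):
    """True when the robot is within `threshold` metres of the carpet edge."""
    return distance_to_rect_edge(px, py, *carpet_rect) <= threshold
```

The same test works whether the robot approaches the carpet from outside or is already partially over it.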
  • the brush member 120 includes a side brush member 121 .
  • the side brush member 121 includes one or more bundles of bristles. When the side brush member 121 rotates, the bristles sweep dust and other dirt from the outside of the cleaning robot 100 to the middle area.
  • the side brush member 121 includes two bundles of bristles, and the angle between the two bundles of bristles is greater than 0 degrees and less than 180 degrees. When the side brush member 121 rotates to a preset angle, the bristles of the side brush member 121 all extend toward the interior of the cleaning robot 100.
  • controlling the cleaning robot to move to the carpet area includes: controlling the side brush to be in a retracted position when the cleaning robot reaches the edge of the carpet area, so that the bristles of the side brush extend toward the interior of the cleaning robot.
  • the side brush is controlled to be in the retracted position, so that the bristles of the side brush extend toward the inside of the cleaning robot rather than toward the carpet; this can prevent the bristles from contacting the edge of the carpet when the robot moves onto it, which would deform and fork the bristles, thereby extending the service life of the side brush.
  • controlling the cleaning robot to move to the carpet area includes: controlling the side brush to be in a free-rotating state when the cleaning robot reaches the edge of the carpet area.
  • the side brush is in a free-rotating state. For example, the power to the drive motor of the side brush can be cut off, or the drive motor can be decoupled from the side brush, so that the side brush can rotate freely under external resistance. When the cleaning robot moves above the carpet with the side brush in the free-rotating state, even if the bristles of the side brush extend toward the carpet, once the bristles deform to a certain extent they can rotate freely out of contact with the carpet and restore their original shape; this can prevent the bristles from contacting the edge of the carpet and becoming deformed or forked, thereby extending the service life of the side brush.
  • controlling the cleaning robot to move to the carpet area includes: controlling the side brush to be in a raised position when the cleaning robot reaches the edge of the carpet area.
  • the cleaning robot moves above the carpet when the side brush is in the raised position, which can prevent the bristles of the side brush from contacting the edge of the carpet and cause the bristles to deform and bifurcate, thereby extending the service life of the side brush.
  • when the cleaning robot reaches the edge of the carpet area, the cleaning robot is controlled to stop moving, the mopping member is controlled to be in the raised position, and the side brush member is controlled to be in the retracted position, the free-rotating state, or the raised position; the cleaning robot is then controlled to move above the carpet. This can prevent the mopping member and the side brush member from going up onto the carpet before switching to the corresponding state.
  • controlling the cleaning robot to clean the carpet in the carpet area through a brushing element includes: when the cleaning robot reaches the edge of the carpet area, controlling the cleaning robot to explore the carpet along the edge to obtain the outline of the carpet; according to the outline of the carpet, controlling the cleaning robot to clean the carpet through the brushing member in a first arcuate path; and, according to the outline of the carpet, controlling the cleaning robot to clean the carpet through the brushing member in a second arcuate path, wherein the second arcuate path is orthogonal to the first arcuate path.
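The two orthogonal arcuate (back-and-forth) passes can be sketched as waypoint generation. The line spacing and the use of the carpet's bounding box are assumptions; a real implementation would clip the path to the explored carpet outline.

```python
# Sketch: generate two orthogonal arcuate passes over the carpet's
# bounding box. Spacing and bounding-box simplification are assumptions.

def arcuate_path(xmin, ymin, xmax, ymax, spacing, vertical=False):
    """Waypoints of one back-and-forth sweep; vertical=True rotates it 90 deg."""
    if vertical:  # sweep columns instead of rows by swapping axes
        pts = arcuate_path(ymin, xmin, ymax, xmax, spacing)
        return [(x, y) for y, x in pts]
    path, y, forward = [], ymin, True
    while y <= ymax:
        x0, x1 = (xmin, xmax) if forward else (xmax, xmin)
        path += [(x0, y), (x1, y)]
        forward = not forward
        y += spacing
    return path

def clean_carpet_paths(bbox, spacing=0.2):
    """First pass plus an orthogonal second pass, as described above."""
    first = arcuate_path(*bbox, spacing)
    second = arcuate_path(*bbox, spacing, vertical=True)
    return first, second
```

Running the second pass at 90 degrees to the first lifts carpet pile flattened by the first pass, which is presumably why the embodiment specifies orthogonal paths.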
  • the edge exploration is inner edge exploration or outer edge exploration.
  • controlling the cleaning robot to explore the carpet along the edge includes: controlling the cleaning robot to explore the inside of the carpet along the edge. That is, the inner edge exploration mode is used to perform edge exploration on the carpet.
  • controlling the cleaning robot to explore the carpet along the edge includes: controlling the cleaning robot to explore the outer edge of the carpet. That is, the outer edge exploration mode is used to perform edge exploration on the carpet.
  • the outer edge exploration mode means that the cleaning robot performs edge exploration on the outside of the carpet, and the orthographic projection of the geometric center of the cleaning robot does not fall within the orthographic projection of the carpet.
  • the inner edge exploration mode means that the cleaning robot explores the carpet edge on the inside of the carpet; the cleaning robot moves on the inside of the carpet, and the trajectory formed by the orthographic projection of its geometric center at least partially coincides with the orthographic projection of the carpet.
  • when the cleaning robot is in the outer edge exploration mode, the overlap between the robot's orthographic projection and the carpet's orthographic projection is less than 50% of the robot's orthographic projection, thereby reducing the extent to which the mopping member wets or soils the carpet area; when the cleaning robot is in the inner edge exploration mode and edge exploration is performed on the inside of the carpet, the overlap between the orthographic projection of the cleaning robot and the orthographic projection of the carpet is greater than or equal to 50% of the orthographic projection of the robot.
  • the integrity of the carpet contour determined by the cleaning robot after exploring the carpet using the inner edge exploration mode is higher. For example, if there are obstacles placed around the carpet, and the cleaning robot uses the outer edge exploration mode to explore the carpet, the obstacles may hinder the cleaning robot's exploration behavior, making it impossible to obtain the complete outline of the carpet.
  • when the edge exploration mode is the outer edge exploration mode, the cleaning robot explores the outer edge of the carpet, which can reduce the degree to which the mopping member wets or stains the carpet. For example, if the cleaning robot is in mopping mode and adopts the inner edge exploration mode when it detects the carpet, it will wet a large area of, or even contaminate, the carpet.
  • FIG. 7 is a schematic diagram of a scene in which a cleaning robot 100 performs inner edge exploration mode exploration on the carpet 2 according to an embodiment of the present application.
  • when the cleaning robot 100 enters the carpet area from a non-carpet area and detects the carpet 2 for the first time, the cleaning robot 100 is controlled to perform an inner edge exploration task to obtain the contour points of the carpet 2, until the cleaning robot 100 reaches the position of the first contour point 20 again.
  • the first contour point 20 position is the position where the cleaning robot 100 obtains the first contour point. It can be understood that when the cleaning robot 100 reaches the first contour point position again, it means that the cleaning robot has returned to the position where the first contour point 20 was detected, and the exploration path of the carpet has formed a closed loop, that is, the cleaning robot has completed the exploration of the carpet.
  • when the cleaning robot 100 reaches the first contour point 20 again, it is not limited to the cleaning robot 100 returning to a position exactly coinciding with the first contour point 20; when the distance between the cleaning robot 100 and the first contour point 20 is less than a preset distance threshold, it can also be determined that the cleaning robot 100 has reached the first contour point 20 again.
  • the coordinates of the cleaning robot may be determined by the coordinates of the sensor that detects the carpet.
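The loop-closure condition described above (exploration ends when the robot is back within a preset distance of the first contour point) can be sketched as follows; the minimum-point guard against stopping immediately after starting, and the threshold value, are added assumptions.

```python
# Sketch of the loop-closure test for edge exploration. The 0.15 m
# threshold and the min_points guard are illustrative assumptions.
import math

def exploration_complete(position, first_point, contour,
                         threshold=0.15, min_points=10):
    """True once enough contour points have been collected and the robot
    is back within `threshold` metres of the first contour point; exact
    coincidence with the first point is not required."""
    if len(contour) < min_points:   # guard: don't stop right after starting
        return False
    dx = position[0] - first_point[0]
    dy = position[1] - first_point[1]
    return math.hypot(dx, dy) < threshold
```

The exploration loop would call this after each new contour point is recorded.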
  • FIG. 8 is a schematic diagram of a scene in which a cleaning robot 100 according to an embodiment of the present application performs an outer edge exploration mode on the carpet 2 .
  • the cleaning robot 100 detects the carpet for the first time, and the current position of the cleaning robot 100 is recorded as the first contour point 20. If there are no obstacles at the edge of the carpet 2 area, the cleaning robot 100 can return to the first contour point 20 by moving around the edge of the carpet area, or the exploration task is completed when the distance between the cleaning robot 100 and the first contour point 20 is less than the preset distance threshold.
  • when the cleaning task map includes a carpet area, for example, it is first determined whether the room that currently needs to be cleaned has a carpet; if there are carpets, the cleaning robot can be controlled to clean the multiple carpets in order of their distance from the cleaning robot, from near to far; if there is no carpet, carpet cleaning is not performed.
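The near-to-far rule for ordering multiple carpets can be sketched with a simple sort by straight-line distance; using carpet centres as the distance reference is an assumption.

```python
# Sketch of the near-to-far carpet ordering rule. Carpet centres as
# the distance reference are a simplifying assumption.
import math

def order_carpets_near_to_far(robot_pos, carpet_centers):
    """Return carpet centres sorted by straight-line distance to the robot."""
    return sorted(carpet_centers,
                  key=lambda c: math.hypot(c[0] - robot_pos[0],
                                           c[1] - robot_pos[1]))
```

A fuller implementation might instead use actual travel distance on the map, since an obstacle can make the nearest carpet by straight line the farthest to reach.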
  • when no carpet is detected in the carpet area, such as when the carpet has been removed, the cleaning robot can be controlled to search for the carpet within a preset range of the carpet area (such as 1 square meter); when a carpet is detected within the preset range, the cleaning robot can be controlled to explore the carpet along its edge to obtain the outline of the carpet, clean the carpet according to the outline, and update the carpet area in the cleaning task map according to the outline; when no carpet is detected within the preset range, the carpet area can be removed from the cleaning task map.
  • when the cleaning robot detects a carpet, such as a newly added carpet or a moved carpet, it can explore the detected carpet along its edges to obtain the outline of the carpet, clean the carpet according to the outline, and update the carpet area in the cleaning task map according to the outline.
  • controlling the cleaning robot to clean the carpet in the carpet area through a brushing element includes: controlling the cleaning robot to run the side brush member and the walking unit, so that the cleaning robot moves on the carpet driven by the walking unit.
  • when the cleaning robot is cleaning on the carpet, it moves on the carpet driven by the walking unit, for example in an arcuate path, and at the same time runs the side brush to sweep dust and other dirt on the carpet from the outside of the cleaning robot to the middle area.
  • the brushing and sweeping parts include a middle sweeping part, and controlling the cleaning robot to clean the carpet in the carpet area through the brushing and sweeping parts includes: controlling the cleaning robot to run the middle sweeping part and the walking unit, so that the cleaning robot moves on the carpet driven by the walking unit.
  • the middle sweeping part can be controlled to sweep the dirt in the middle area to the vacuum device of the cleaning robot, or the middle sweeping part can be controlled to raise the dirt in the middle area while the vacuum device is controlled to collect the dirt through negative pressure.
  • the fan speed of the vacuum device is increased to a strong gear, such as the maximum gear, to improve the cleaning effect on the carpet.
  • the driving speed of the cleaning robot when moving on the carpet to clean it is lower than the driving speed of the cleaning robot when moving toward the carpet, so as to improve the cleaning effect on the carpet.
  • when the side brush triggers overcurrent protection, the cleaning robot is controlled to switch the side brush to the free-rotating state. For example, overcurrent protection is triggered when the drive motor of the side brush rotates at a preset speed and the current of the drive motor exceeds a preset current.
  • when the middle sweeping part or the walking unit is entangled with fluff or hair on the carpet, or overcurrent protection is triggered for other reasons, the middle sweeping part or the walking unit can be controlled to stop running and to start running again after a preset time.
  • when the number of times the middle sweeping part or the walking unit triggers overcurrent protection exceeds a first threshold, or the number of times the walking unit slips exceeds a second threshold, the cleaning robot is controlled to perform at least one of the following operations: stop cleaning the carpet, mark the carpet area as a prohibited area, and provide an exception prompt. For example, in these cases it can be determined that the cleaning robot is not suitable for cleaning the carpet,
  • for example because the pile of the carpet is too long or there is too much hair on it; the robot can stop cleaning the carpet and mark the carpet area as a prohibited area; when the carpet area is marked as a prohibited area, it will not be cleaned in subsequent cleaning tasks;
  • an exception prompt can also be provided, so that the user can check the carpet according to the prompt; the prohibited-area mark can also be removed through manual settings, for example after the hair on the carpet has been removed, so that the cleaning robot can again clean the carpet in that carpet area.
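The fault-handling policy above can be sketched as a small decision function (a minimal illustration; the function name, action labels, and threshold values are assumptions, not taken from this disclosure):

```python
def handle_carpet_faults(overcurrent_count, slip_count,
                         overcurrent_threshold=3, slip_threshold=5):
    """Decide how to react to repeated faults while cleaning a carpet.

    When either fault counter exceeds its threshold, the robot stops
    cleaning the carpet, marks the carpet area as a prohibited area,
    and raises an exception prompt for the user (values illustrative).
    """
    actions = []
    if overcurrent_count > overcurrent_threshold or slip_count > slip_threshold:
        actions = ["stop_cleaning_carpet", "mark_prohibited_area",
                   "exception_prompt"]
    return actions
```

With these illustrative defaults, a fourth overcurrent event would trigger all three responses, while counts at or below the thresholds trigger none.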
  • the embodiment of the present application does not limit the order of the step of cleaning the carpet in step S120 and the step of cleaning the non-carpet area in step S130.
  • the carpet can be cleaned first and then the non-carpet area, the non-carpet area can be cleaned first and then the carpet, or the non-carpet area can be cleaned after several carpets have been cleaned, with the remaining carpets cleaned afterwards.
  • non-carpet areas are cleaned before carpets.
  • the non-carpet areas are first cleaned according to the cleaning order of the non-carpet areas by brushing and mopping at the same time, that is, sweeping and mopping simultaneously.
  • when a carpet is encountered, the robot can explore along the outside of the carpet to determine the outline of the carpet and then continue cleaning the non-carpet areas; after all the non-carpet areas are cleaned, the mopping part is maintained, for example cleaned and dried, and the carpets are then cleaned according to the carpet-area cleaning order. Cleaning the carpet after the mopping part is dry avoids wetting the carpet.
  • the non-carpet area is usually the majority of the area that the cleaning robot needs to clean, that is, the main area, or the area where dirt is easier to see with the naked eye.
  • cleaning the carpet requires raising the fan power and consumes a lot of power; limited by the battery life of the cleaning robot, this affects the cleaning of the non-carpet areas. For example, it may cause the cleaning robot to stop for charging and maintenance before the non-carpet areas are cleaned, or even before it starts cleaning them, which makes the user feel that the cleaning robot stopped working before the main work was completed; this does not meet user expectations and the user experience is poor.
  • the cleaning task map includes multiple carpets
  • all the carpets are cleaned first, and then the non-carpet areas; compared with cleaning the non-carpet areas first, maintaining the mopping part, and then cleaning the carpets, this reduces the workload of maintaining the mopping part during the cleaning process, such as drying it. For example, cleaning the carpet while the mopping part is dry prevents wetting the carpet.
  • carpet cleaning can be turned off or on in the carpet cleaning options. If carpet cleaning is turned off, the cleaning robot is not allowed to move onto the carpet during the cleaning task; for example, when the cleaning robot reaches the edge of the carpet area, it explores along the outside of the carpet to obtain the outline of the carpet and then cleans the area around the carpet. If carpet cleaning is turned on, after cleaning the carpet in the carpet area, the robot can perform carpet-avoidance actions, such as bypassing the carpet, when cleaning at least part of the non-carpet area in the cleaning task map.
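The area-ordering and carpet on/off behaviour described above can be sketched as follows (a hedged illustration; the area representation and function name are assumptions):

```python
def plan_cleaning_order(areas, carpet_cleaning_enabled=True,
                        carpets_first=False):
    """Order carpet and non-carpet areas for one cleaning task.

    `areas` is a list of (name, kind) tuples with kind 'carpet' or
    'floor'. If carpet cleaning is disabled, carpets are skipped
    entirely (the robot would instead trace their outline and avoid
    them, which is not modeled here). Otherwise either all carpets are
    cleaned first (dry mop) or all floors are cleaned first.
    """
    carpets = [a for a in areas if a[1] == "carpet"]
    floors = [a for a in areas if a[1] == "floor"]
    if not carpet_cleaning_enabled:
        return floors
    return carpets + floors if carpets_first else floors + carpets
```

Cleaning all carpets first corresponds to the embodiment in which mop maintenance (drying) is not needed mid-task; floors first corresponds to cleaning the main area before raising fan power for carpets.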
  • controlling the cleaning robot to clean non-carpet areas in the cleaning task map at least through mopping elements includes: wetting the mopping elements; controlling movement of the cleaning robot to the non-carpet area; control the cleaning robot to clean the non-carpet area through the brushing member and the mopping member.
  • the cleaning robot is controlled to move to the base station, and the base station is controlled to wet the mopping member.
  • the mopping member can also be cleaned to moisten the mopping member; of course, it is not limited to this.
  • the cleaning robot is provided with a water tank, and the water in the water tank can be supplied to the mopping member to moisten the mopping member.
  • the non-carpet area is cleaned by the brushing part and the mopping part together, that is, the non-carpet area is brushed and mopped at the same time; in other words, the cleaning robot is controlled to perform a simultaneous sweep-and-mop task.
  • contamination of the carpet can be reduced to a greater extent; for example, this prevents a wet mopping part that has adsorbed dirt from contaminating the carpet while the carpet is being cleaned.
  • the mopping parameters when cleaning the non-carpet area can be determined according to the ground material of the non-carpet area; and the mopping member can be moistened or cleaned according to the mopping parameters.
  • the humidity of the mopping part is adjusted; and/or, according to the mopping parameters, the cleaning robot is controlled to adjust the ground pressure of the mopping part when cleaning the non-carpet area.
  • the ground pressure of the mopping member can be adjusted by adjusting the height at which the mopping member is lowered relative to the robot body.
  • for tile floors, for example, the ground pressure is 12 newtons (N) and the mopping part is kept at normal humidity or relatively wet; for wooden floors, the ground pressure is 5 N and the mopping part is kept drier.
  • the ground pressure is 12 N, and the humidity of the mop parts is normal.
  • the cleaning task map includes floor material information of at least one area, such as wooden floor, tile floor, etc.
  • the ground material information input by the user can be obtained through the map management interface, such as the ground material of each room; or the ground material information can be obtained when the cleaning robot explores the ground and builds the map; for example, a ground image can be captured by a visual sensor and the ground material identified based on the characteristics of the ground image.
  • the mopping parameters corresponding to different ground materials can be preset, or can be set by the user; for example, when the user clicks on an area of a certain ground material, the mopping parameters corresponding to that material are displayed, and the user can modify the displayed mopping parameters.
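The per-material mopping parameters with user overrides might be modeled as a simple lookup, loosely following the tile (12 N, normal humidity) and wooden floor (5 N, drier) examples given earlier; all names and values here are illustrative assumptions:

```python
# Illustrative defaults loosely following the examples in the text:
# tile -> 12 N ground pressure, normal humidity; wood -> 5 N, drier mop.
DEFAULT_MOP_PARAMS = {
    "tile": {"pressure_n": 12, "humidity": "normal"},
    "wood": {"pressure_n": 5, "humidity": "dry"},
}

def mop_params_for(material, user_overrides=None):
    """Return mopping parameters for a floor material.

    User-adjusted parameters (e.g. edited in the map management
    interface) take precedence over the preset defaults; unknown
    materials fall back to a generic default.
    """
    params = dict(DEFAULT_MOP_PARAMS.get(
        material, {"pressure_n": 12, "humidity": "normal"}))
    if user_overrides:
        params.update(user_overrides)
    return params
```

A cleaning-task-level override (the "parameters set in the cleaning task prevail" rule) would simply be passed as `user_overrides` with higher priority.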
  • if mopping parameters are also set in the cleaning task, the mopping parameters set in the cleaning task prevail.
  • controlling the cleaning robot to clean non-carpet areas in the cleaning task map at least through the mopping part further includes: when the non-carpet area includes a target area, after the mopping part has been cleaned, controlling the cleaning robot to move to the target area and mopping the target area with the mopping part.
  • the target area is an area that needs to be repeatedly mopped.
  • the base station cleans the mopping parts or the cleaning robot autonomously cleans the mopping parts.
  • after the floor has been swept and mopped, if at least some areas, that is, the target areas, are still not clean and need to be mopped again, they can be mopped by the mopping part; since larger dirt has already been removed by then, it is sufficient to only mop the target area, which reduces energy consumption.
  • the brushing part includes a middle sweeping part, and when the cleaning robot is controlled to move to the target area, both the middle sweeping part and the mopping part are controlled to be in a raised position; this prevents the cleaned mopping part from being contaminated when passing over floor that has not been mopped or has not been mopped clean, and/or prevents loss of water from the wet mopping part, which would reduce the cleaning effect on the target area.
  • the movement trajectory of the cleaning robot is the same as the trajectory along which it previously left the target area, and the floor along this trajectory is mopped by the mopping part. It should be noted that "the same" includes exactly the same or substantially the same. For example, after the cleaning robot repeatedly mops the target area and then leaves it, dirt on the mopping part may contaminate the already-mopped floor along the movement path; by cleaning the mopping part and then mopping the floor along the movement path, the mopped floor can be kept clean.
  • controlling the cleaning robot to clean non-carpet areas in the cleaning task map at least through mopping elements includes: cleaning the non-carpet areas in an arcuate trajectory.
  • the side brush on one side works to sweep dirt from the uncleaned area toward the middle area, and the middle sweeping part then sweeps the dirt from the middle area to the vacuum device.
  • the carpet may be detected when cleaning the non-carpet area, for example, through a sensor unit.
  • the carpet is explored along the edge, the carpet area corresponding to the carpet is determined, and the carpet area is updated into the cleaning task map.
  • the cleaning robot can be controlled to clean the carpet in the carpet area with the brushing part immediately; or the cleaning robot can be controlled to clean the carpet in the carpet area with the brushing part after the non-carpet area has been cleaned, that is, the non-carpet areas can be cleaned first, ensuring a user-friendly experience.
  • for the step of cleaning the carpet in the carpet area, reference can be made to the description of the aforementioned step S120, which will not be repeated here.
  • the control method further includes: controlling the cleaning robot to lift the mopping part when a first preset condition is met; and controlling the cleaning robot to lift the brushing part when a second preset condition is met.
  • after controlling the cleaning robot to lift the brushing part and/or the mopping part, the cleaning robot is controlled to clean the position where the brushing part and/or the mopping part was lifted;
  • dirt on the brushing part and/or the mopping part may fall to the ground when they are lifted;
  • cleaning that position can reduce or eliminate the contamination.
  • the first preset condition includes at least one of the following: determining that the cleaning robot is performing a navigation task, determining that the robot performs a crossing action, determining that the cleaning robot is on the carpet, determining that the cleaning robot enters the base station ;
  • the second preset condition includes at least one of the following: determining that the cleaning robot is performing a navigation task, or determining that the cleaning robot has entered the base station.
  • for example, when the cleaning robot finishes cleaning a non-carpet area with the mopping part and starts moving toward the base station, when it finishes cleaning a non-carpet area with the mopping part and starts moving toward another non-carpet area, when it starts to cross a carpet, and/or when it enters the base station, the cleaning robot is controlled to lift the mopping part. Similarly, when the cleaning robot finishes cleaning a non-carpet area with the brushing and mopping parts and starts moving toward the base station, when it finishes cleaning a non-carpet area with the brushing and mopping parts and starts moving toward another non-carpet area, and/or when it enters the base station, the cleaning robot is controlled to lift the brushing part.
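The first and second preset conditions can be sketched as event sets (a minimal model; the event names are assumptions, not taken from this disclosure):

```python
def actuators_to_lift(event):
    """Map a robot event to the actuators that should be lifted.

    The first preset condition (lift the mopping part) covers
    navigation tasks, crossing a carpet, and entering the base
    station; the second (lift the brushing part) covers navigation
    tasks and entering the base station.
    """
    lift_mop_events = {"navigate_to_base", "navigate_to_area",
                       "cross_carpet", "enter_base"}
    lift_brush_events = {"navigate_to_base", "enter_base"}
    lifted = []
    if event in lift_mop_events:
        lifted.append("mop")
    if event in lift_brush_events:
        lifted.append("brush")
    return lifted
```

For instance, crossing a carpet lifts only the mop (the middle sweep may still work on the carpet), while entering the base station lifts both.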
  • the action states of actuators such as the brushing part and/or the mopping part can be determined based on information such as the cleaning robot's current cleaning object and movement trajectory, for example which parts should be lifted and which should not; controlling the corresponding actuators accordingly can achieve at least one of the following effects: improving the cleaning intelligence of the cleaning robot, improving cleaning efficiency, preventing contamination of the ground, preventing contamination of the cleaned mopping part, and increasing the service life of the mopping part. Of course, it is not limited to this.
  • controlling the cleaning robot to clean the position where the brushing part and/or the mopping part was lifted includes: controlling the cleaning robot to retreat a preset distance, rotate, and raise the power or rotation speed of the fan in the vacuum device so as to suck up the dirt dropped when the brushing part and/or the mopping part was lifted. For example, when the robot returns to the base station after sweeping and mopping, a small amount of residue may fall from the brushing part and/or the mopping part to the ground; the cleaning robot can be controlled to take a half step back and rotate in place while keeping the fan on, cleaning up the dirt that fell on the ground.
  • the control method of a cleaning robot includes: obtaining a cleaning task map; determining whether the cleaning task map includes a carpet area, and when it does, controlling the cleaning robot to clean the carpet in the carpet area through the brushing part; and controlling the cleaning robot to clean at least part of the non-carpet area in the cleaning task map at least through the mopping part. Automatically brushing the carpet according to the carpet area in the cleaning task map and at least mopping the non-carpet areas does not require the user to set different cleaning methods for different areas, which improves the intelligence of the cleaning robot.
  • the floor is swept and mopped simultaneously with the brushing and mopping parts.
  • it can be determined whether the edge of a room has gone uncleaned for a preset period of time, such as one week; if so, a missed-spot filling action is performed when cleaning the room along the edge; otherwise, the room is cleaned along the edge without the missed-spot filling action; after cleaning the edge of the room, the floor of the room is cleaned.
  • the ground pressure and mopping humidity can be set according to the ground material.
  • the cleaning robot can be controlled to clean the position where the brushing part and/or the mopping part was lifted, to prevent repeated contamination of the ground.
  • the preset cleaning area includes a target area that needs to be repeatedly mopped based on the degree of dirt in the preset cleaning area; and at least part of the target area may be repeatedly mopped.
  • an image indicating changes in the degree of dirtiness of each area (which may be called a dirt heat map) is output, and/or an image indicating a cleaning trajectory is output.
  • a visual interface of the task execution, such as an animation or a short video, may also be output.
  • in the hosted mode, first determine whether there is a carpet. If there is, lift actuators such as the brushing part and the mopping part and then drive out of the base station; after driving out of the base station, the side brush works while the middle sweeping part and the mopping part remain lifted, and the robot navigates to the carpet; after reaching the edge of the carpet, tuck the bristles of the side brush under the robot body or let the side brush rotate freely, lower the middle sweeping part, control it to start working, and then move onto the carpet; while on the carpet, the side brush can rotate freely; after cleaning the current carpet, navigate to the next carpet and clean it, until all carpets are cleaned, then navigate back to the base station.
  • the side brush keeps working while the middle sweeping part and the mopping part remain in the raised state; each cleaning actuator is stowed away when the robot arrives at the base station.
  • after arriving at the cleaning tank of the mopping part, the side brush and the middle sweeping part are kept in the stowed/raised state, and the mopping part is lowered so that it aligns with the cleaning tank for cleaning.
  • after the cleaning of the mopping part is completed, the mopping part is lifted while the brushing part remains raised, and the robot drives out of the base station; after driving out of the base station, the middle sweeping part and the mopping part are kept in the lifted state, the robot navigates to the task point, and the mopping part is lowered after arriving at the task point so as to clean at least the non-carpet areas of the floor.
  • the cleaning of the floor may include cleaning of each area on the floor, such as block cleaning, and may also include edge cleaning or edge leakage cleaning.
  • while cleaning the floor, when the cleaning conditions of the mopping part are met, for example when the cleaned floor area or the cleaning time reaches a preset value, the robot navigates back to the base station; it also navigates back when its power is insufficient or when the cleaning task is completed; the base station can clean the mopping part and charge the cleaning robot.
  • the degree of dirtiness of the floor that has been mopped can be determined based on the degree of dirtiness of the mopping part; when the mopped floor is still relatively dirty, the robot can navigate back to that floor for repeated mopping.
  • the mopping part can be cleaned at the base station; after the cleaning is completed, the mopping part can be dried and the cleaning robot charged, and all actuators can also be stowed away.
  • Cleaning robots can be used to automatically clean floors, and their application scenarios can include household indoor cleaning, large-scale place cleaning, etc.
  • the cleaning robot can mop the floor through the mopping part. After mopping for a period of time, the mopping part often becomes dirty or the robot's power becomes insufficient, and the robot needs to return to the base station to clean the mopping part or recharge; currently, the cleaning robot usually stops cleaning and returns to the base station immediately upon receiving a return-for-maintenance instruction, and sometimes needs to return to the stopping point to continue cleaning after maintenance.
  • the cleaning strategy of going back and forth to the stop point reduces the cleaning efficiency to a certain extent.
  • FIG. 9 is a schematic flowchart of a control method for a cleaning robot provided by an embodiment of the present application.
  • the control method of the cleaning robot can be applied in a cleaning system to control the cleaning robot in the system so that the cleaning robot performs cleaning tasks and cleans the area corresponding to the cleaning task map.
  • the area corresponding to the cleaning task map can be any area to be cleaned such as a family space, a room unit of a family space, a partial area of a room unit, a large place, or a part of a large place.
  • the area corresponding to the cleaning task map can refer to the larger area that is cleaned first, such as the entire room unit; it can also refer to an area that needs to be cleaned after the first cleaning of the larger area, such as the area against the wall in the room unit, or the obstacle area.
  • the cleaning task map can be established by exploring the current space in response to a mapping instruction, or it can be updated based on obstacles, carpets, etc. identified by the cleaning robot during the cleaning process; optionally, the cleaning task map may be a map of a cleaning area specified by the user. For example, in response to the user selecting a cleaning area on the map, such as one or more rooms, the one or more rooms are determined as the cleaning task map; or, in response to the user circling a cleaning area on the map, such as a partial area of one or more rooms, the partial area is determined as the cleaning task map; it is certainly not limited to this.
  • the cleaning system includes one or more cleaning robots 100 and one or more base stations 200 .
  • the base station 200 is used in conjunction with the cleaning robot 100, at least for maintaining the cleaning robot; for example, the base station 200 can charge the cleaning robot 100, the base station 200 can provide a docking position for the cleaning robot 100, etc.
  • the base station 200 can also clean or replace the cleaning parts of the cleaning robot 100.
  • the cleaning parts may include brushing parts, such as side brushes and middle brushes. The brushing parts are used to sweep the ground to remove garbage or dirt on the ground.
  • the cleaning part may also include a mopping part, and the mopping part is used to mop the ground to clean stains on the ground.
  • the cleaning system also includes a control device 300.
  • the control device 300 can be used to implement the steps of the cleaning robot control method according to the embodiment of the present application.
  • the robot controller of the cleaning robot 100 and/or the base station controller of the base station 200 can serve as the control device 300 alone or in combination, for implementing the steps of the method in the embodiment of the present application.
  • the cleaning system includes a separate control device 300 for implementing the steps of the method in the embodiment of the present application.
  • the control device 300 can be provided on the cleaning robot 100 or on the base station 200; of course, it is not limited thereto.
  • the control device 300 may be a device other than the cleaning robot 100 and the base station 200, such as a home smart terminal, a master control device, etc.
  • the cleaning robot 100 can be used to automatically clean the floor.
  • the application scenarios of the cleaning robot 100 can be household indoor cleaning, large-scale place cleaning, etc.
  • control method of the cleaning robot according to the embodiment of the present application includes steps S110 to S130.
  • the amount of dirt that the cleaning robot's mopping parts can absorb is limited. When the mopping parts absorb more dirt, the cleaning effect on the ground is poor.
  • the mopping part needs to be maintained, such as cleaned or replaced, to ensure that it has a better cleaning effect after maintenance; for example, the cleaning robot returns to the base station to clean or replace the mopping part; optionally, maintenance of the mopping part can also be completed by the cleaning robot itself without returning to the base station; for example, the cleaning robot has its own water tank and can directly clean the mopping part, or it has its own mopping-part replacement device and can directly replace the mopping part.
  • the power of the cleaning robot is also limited; if the power drops to a preset level while cleaning the floor, the robot needs to return to the base station for charging. The amount of water in the water tank on the cleaning robot used to supply water to the mopping part is also limited; when the water is insufficient, the robot can return to the base station to add water. The dust box on the cleaning robot used to collect dirt can likewise hold only a limited amount of dirt; when it contains a large amount, maintenance is also required.
  • the first maintenance instruction is generated according to a user's maintenance control operation.
  • the user performs a maintenance control operation before going to bed, such as performing the maintenance control operation on an interactive unit of a cleaning robot, a base station or a user terminal, and generating the first maintenance instruction according to the maintenance control operation.
  • the maintenance rules are determined according to the user's maintenance control operation; for example, if the cleaning robot is to be maintained at nine o'clock every night, the first maintenance instruction is generated at nine o'clock every night. Of course, it is not limited to this; for example, the cleaning robot can be maintained every six hours.
  • the first maintenance instruction is generated based on the time since the last maintenance. For example, when the time since the last maintenance is six hours, the first maintenance instruction is generated to periodically control the cleaning robot to perform maintenance. Or the first maintenance instruction can be generated according to the working time of the cleaning robot; for example, the cleaning robot generates the first maintenance instruction every time it cleans for 10 minutes.
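The time-based generation rules for the first maintenance instruction (six hours since the last maintenance, or ten minutes of working time, per the examples above) can be sketched as follows; the parameter names are illustrative assumptions:

```python
def first_maintenance_due(hours_since_last_maintenance=None,
                          minutes_worked_since_last=None,
                          interval_hours=6.0, work_limit_minutes=10.0):
    """Periodic trigger for the first maintenance instruction.

    Returns True when either the elapsed time since the last
    maintenance reaches the interval (six hours in the example) or the
    accumulated working time reaches the limit (ten minutes in the
    example).
    """
    if hours_since_last_maintenance is not None and \
            hours_since_last_maintenance >= interval_hours:
        return True
    if minutes_worked_since_last is not None and \
            minutes_worked_since_last >= work_limit_minutes:
        return True
    return False
```

A fixed-time rule ("nine o'clock every night") would be an additional clock-based check layered on top of the same trigger.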
  • obtaining the first maintenance instruction includes: obtaining the workload completed by the cleaning robot when the cleaning robot cleans the preset cleaning area; and generating the first maintenance instruction when the workload completed by the cleaning robot reaches a first workload threshold.
  • the first maintenance instruction is generated when the workload completed by the cleaning robot reaches the first workload threshold to control the cleaning robot to perform maintenance, so that the maintained cleaning robot can have a better cleaning effect.
  • the workload includes at least one of the following: the amount of dirt adsorbed by the cleaning robot's mopping part when mopping the floor, the power consumed by the cleaning robot when cleaning the floor, the time the cleaning robot spends cleaning the floor, the amount of water consumed when the cleaning robot cleans the floor, the amount of dirt collected when the cleaning robot cleans the floor, the amount of sewage collected when the cleaning robot cleans the floor, the area of the floor cleaned by the cleaning robot, and the path length of the floor cleaned by the cleaning robot.
  • the first workload threshold N1 can be determined based on at least one of the performance parameters of the cleaning robot, such as the battery life, the clean water tank capacity, the dust collection tank capacity, the sewage tank capacity, and the maximum dirt value d_max of the mopping part.
  • the first workload threshold can be determined based on the relationship between the cleaning effect of the mopping part and the area of floor cleaned by the cleaning robot; for example, after maintenance, the cleaning robot is controlled to clean the floor, and when the mopping effect on the ground becomes poor, the first workload threshold is determined according to the area of floor cleaned by the cleaning robot up to that point.
  • when the workload includes the amount of water consumed when the cleaning robot cleans the floor, the first workload threshold can be determined based on the capacity of the clean water tank; when the workload includes the amount of sewage collected when the cleaning robot cleans the floor, the first workload threshold may be determined based on the capacity of the sewage tank.
  • the first workload threshold may be determined based on the relationship between the cleaning effect of the mopping part and the amount of dirt adsorbed by the mopping part when mopping the floor; for example, when the floor is mopped after the mopping part has been maintained, and the mopping effect on the ground becomes very poor, the first workload threshold is determined based on the dirt value d of the dirt adsorbed by the mopping part at that point. Of course, it is not limited to this; the first workload threshold can also be determined based on the maximum dirt value d_max of the mopping part, where d_max is an empirical value that can be measured in a laboratory, for example.
  • the first workload threshold may be set so that after the workload completed by the cleaning robot reaches it, the robot can still work for a period of time, such as cleaning part of the floor, and can still perform at least one maintenance afterward; for example, after the power consumed in cleaning the floor reaches the corresponding first workload threshold, the remaining power of the cleaning robot can still support the robot in returning to the base station for charging.
  • the first workload threshold is smaller than the second workload threshold, for example, the first workload threshold is 0.6 to 0.8 times the second workload threshold, and is certainly not limited thereto.
  • when the workload completed by the cleaning robot reaches the first workload threshold, it can still work for a period of time; when the workload completed by the cleaning robot reaches the second workload threshold, the work is forcibly ended and maintenance is performed.
  • the second workload threshold N2 is an absolute cleaning (backwash) threshold; when the amount of dirt adsorbed by the cleaning robot's mopping part while mopping the floor reaches this threshold, it is determined that the mopping part has reached the maximum dirt value d_max and can no longer clean the floor by mopping, so the mopping part must be maintained before mopping continues.
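The relationship between the first workload threshold N1 (0.6 to 0.8 times N2) and the absolute threshold N2 = d_max can be sketched for the adsorbed-dirt workload (a hedged illustration; the state labels are assumptions):

```python
def maintenance_state(dirt, d_max, ratio=0.7):
    """Classify the mopping part by adsorbed dirt against two thresholds.

    N1 (the first workload threshold) is a fraction of N2 (the
    absolute threshold, here taken as d_max): reaching N1 schedules
    maintenance while work may continue for a while; reaching N2
    forcibly ends the work. The 0.7 ratio sits within the 0.6-0.8
    range mentioned above and is illustrative.
    """
    n2 = d_max
    n1 = ratio * n2
    if dirt >= n2:
        return "force_maintenance"
    if dirt >= n1:
        return "maintenance_due"
    return "ok"
```

The gap between N1 and N2 is what lets the robot finish a partial area and still reach the base station before the mop is saturated.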
  • the preset cleaning area can be obtained by dividing the cleaning task map, that is, by dividing the area to be cleaned, for example into rooms.
  • the area to be cleaned can be any area to be cleaned such as a family space, a room unit of a family space, a part of a room unit, a large place or a part of a large place.
  • a room may be a preset cleaning area, or a room may have multiple preset cleaning areas; of course, it is not limited thereto.
  • a preset cleaning area includes one room and at least part of another room.
  • the preset cleaning area can be determined based on the user's division on the cleaning task map, or the division can be determined based on preset area division rules.
  • obtaining the workload completed by the cleaning robot when the cleaning robot cleans the preset cleaning area includes: obtaining the workload of each unit area in the preset cleaning area; and determining the workload completed by the cleaning robot based on the workload of the unit areas that the cleaning robot has cleaned.
  • the graphical characteristics of the room where the preset cleaning area is located and/or the room identification of the room can be obtained, and the workload of each unit area in the preset cleaning area can be determined based on the graphical characteristics and/or the room identification of the room.
  • the unit area can be a grid in the cleaning task map, but of course it is not limited to this.
  • the unit area can be an area of any size, such as an area of 0.5 square meters or 1 square meter; the unit area may be a rectangle or a square, and of course is not limited thereto; for example, it may be a parallelogram.
  • the workload of the unit area can be determined according to the distance between the unit area and the boundaries of the room and/or the obstacles in the room.
  • the workload of the unit area is negatively correlated with the distance between the unit area and the boundary of the room and/or obstacles in the room.
  • the environment map of the preset cleaning area contains multiple grids.
  • One grid is a unit area, or multiple grids are combined into a unit area.
  • the distance from each grid in the environment map to the obstacle is calculated to determine the workload of each unit area.
  • when multiple grids are combined into a unit area, the maximum distance, the minimum distance, or the average of the distances between the multiple grids and the obstacle can be used to determine the workload of the unit area.
  • the workload of the unit area includes the amount of dirt absorbed by the cleaning robot's mopping member when mopping the unit area;
  • the amount of dirt is inversely related to the distance between the unit area and the boundaries of the room and/or the obstacles in the room.
  • the workload of the unit area includes the amount of dirt collected when the cleaning robot cleans the unit area; the amount of dirt collected when the cleaning robot cleans the unit area is inversely related to the distance between the unit area and the boundaries of the room and/or the obstacles in the room.
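The grid-distance workload computation described above can be sketched as follows. This is only an illustrative sketch, not the claimed method: the function names, the brute-force nearest-obstacle search, and the `scale / (1 + d)` form of the negative correlation are assumptions.

```python
from math import hypot

def grid_obstacle_distances(grid_cells, obstacle_cells):
    """For each grid cell, compute the distance to the nearest obstacle
    or boundary cell (brute force, for illustration only)."""
    return [min(hypot(gx - ox, gy - oy) for (ox, oy) in obstacle_cells)
            for (gx, gy) in grid_cells]

def unit_area_workload(distances, aggregate="min", scale=10.0):
    """Workload negatively correlated with distance: areas closer to walls
    or obstacles tend to accumulate more dirt, hence more work.
    When multiple grids form one unit area, aggregate their distances
    by minimum, maximum, or average, as described above."""
    if aggregate == "min":
        d = min(distances)
    elif aggregate == "max":
        d = max(distances)
    else:  # average of the grid distances
        d = sum(distances) / len(distances)
    return scale / (1.0 + d)  # larger distance -> smaller workload
```

For instance, a unit area touching a wall (distance 0) gets the maximum workload, while one far from any boundary gets a proportionally smaller one.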
  • A represents the obstacle
  • the thick black line represents the wall
  • the workload of the unit area includes the amount of dirt absorbed by the cleaning robot's mopping member when mopping the unit area;
  • the amount of dirt is positively related to the dirt level of the unit area.
  • the workload of the unit area includes the amount of dirt collected by the cleaning robot when cleaning the unit area; the amount of dirt collected is positively correlated with the dirt level of the unit area.
  • the graphical characteristics of the room include the distribution of dirt in the room (such as a dirt heat map), that is, the dirt level of each unit area; the workload of each unit area can be determined based on the dirt level of that unit area in the room. For example, the workload of the unit area is positively correlated with the degree of dirt of the unit area.
  • the distribution of dirt in the room can be determined based on the detection results of the cleaning robot's sensors, such as vision sensors; or a separate image sensor can be used to photograph the floor in the room and identify the captured images to determine the distribution of dirt in the room; or, after the cleaning robot completes cleaning different areas in the room, the dirt status of different areas can be determined based on detection of the sewage produced when cleaning the mopping part, or based on the amount of dirt in the dust box.
  • the distribution of dirt in the room can be the current distribution of dirt in the room, or it can also be the distribution of dirt in the room in historical data.
  • the amount of dirt adsorbed or collected when mopping the floor can reflect the workload of the cleaning robot more accurately than measures such as cleaned area or path length. For example, when the floor is less dirty, the mopping part can clean a larger area; if only the cleaned area is used as the workload to trigger maintenance, the cleaning robot will be maintained more often and cleaning efficiency will be reduced. Controlling the cleaning robot to perform maintenance according to the amount of dirt adsorbed or collected when mopping can reduce the number of maintenance operations and thereby improve cleaning efficiency.
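The efficiency argument above can be illustrated by comparing the two triggering policies on a lightly soiled floor. The numbers, thresholds, and function names here are hypothetical, chosen only to demonstrate that a dirt-based trigger fires less often than an area-based one when the floor is not very dirty.

```python
def maintenance_count_area_based(areas_cleaned, area_threshold):
    """Trigger maintenance every time the accumulated cleaned area
    reaches a fixed area threshold (the 'fixed cleaning area' policy)."""
    cleaned, count = 0.0, 0
    for a in areas_cleaned:
        cleaned += a
        if cleaned >= area_threshold:
            count += 1
            cleaned = 0.0  # mop washed, start accumulating again
    return count

def maintenance_count_dirt_based(dirt_amounts, dirt_threshold):
    """Trigger maintenance only when the dirt adsorbed by the mopping
    part reaches the first workload threshold N1."""
    adsorbed, count = 0.0, 0
    for d in dirt_amounts:
        adsorbed += d
        if adsorbed >= dirt_threshold:
            count += 1
            adsorbed = 0.0  # mop washed
    return count
```

On a lightly soiled floor (small dirt per unit area), the dirt-based policy returns fewer maintenance trips than the area-based policy over the same sequence of unit areas.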
  • the magnitude of the workload of the preset cleaning area is less than or equal to a second workload threshold, and the second workload threshold is greater than the first workload threshold, so that when the cleaning robot cleans the preset cleaning area, the magnitude of the workload of the cleaning robot does not exceed the second workload threshold.
  • the graphical characteristics of the room can be obtained, and the graphical characteristics include the boundaries of the room; according to the second workload threshold and the graphical characteristics, the dividing line of the room is determined, so that the dividing line and the boundary of the room form at least two preset cleaning areas, and the magnitude of the workload of each preset cleaning area is less than or equal to the second workload threshold. In this way, the cleaning robot can interrupt the current cleaning task after completing the workload of a preset cleaning area, achieving both a better cleaning effect and higher cleaning efficiency when cleaning the room according to the preset cleaning areas.
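The threshold-bounded division can be sketched as a greedy split over one-dimensional column strips of the room grid. This is an illustrative sketch, not the claimed division method: the column model and the greedy rule are assumptions, and it presumes no single column's workload exceeds N2.

```python
def divide_room_by_workload(column_workloads, n2):
    """Greedily place dividing lines between column strips so that each
    resulting preset cleaning area has a total workload <= N2
    (the second workload threshold)."""
    areas, current, total = [], [], 0.0
    for w in column_workloads:
        if total + w > n2 and current:
            areas.append(current)       # close the current preset area
            current, total = [], 0.0    # start a new one at this column
        current.append(w)
        total += w
    if current:
        areas.append(current)
    return areas
```

Each returned sublist is one preset cleaning area; by construction its summed workload stays within N2, so the robot can finish an area before a forced backwash becomes necessary.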
  • determining the task execution status of the cleaning robot includes: in response to the first maintenance instruction, determining the tasks that have been performed and/or the tasks that have not been performed in the current cleaning task of the cleaning robot.
  • the current cleaning task of the cleaning robot is to clean the preset cleaning area.
  • determining the task execution status of the cleaning robot includes determining at least one of the following: determining the progress of the current cleaning task of the cleaning robot, determining the type of tasks that the cleaning robot has performed, determining the type of tasks that have not been performed, and determining the workload of the tasks that have not been performed.
  • when the first maintenance instruction is obtained, the cleaning robot is not directly controlled to perform maintenance; instead, the task execution status of the cleaning robot is determined in response to the first maintenance instruction, and the cleaning robot is controlled to perform maintenance according to the task execution status.
  • controlling the cleaning robot to perform maintenance according to the task execution status includes: controlling the cleaning robot to perform maintenance according to at least one of the following: the progress of the current cleaning task of the cleaning robot, the type of tasks that the cleaning robot has performed, the type of tasks not performed by the cleaning robot, and the workload of the tasks not performed.
  • controlling the cleaning robot to perform maintenance includes: generating a second maintenance instruction to directly control the cleaning robot to perform maintenance, or generating the second maintenance instruction after the cleaning robot completes at least part of the unperformed tasks, to control the cleaning robot to perform maintenance.
  • the maintenance timing of the cleaning robot can have a certain loose adjustment range to reduce the impact of cleaning robot maintenance on cleaning efficiency.
  • controlling the cleaning robot to perform maintenance includes but is not limited to at least one of the following: controlling the cleaning robot to return to the base station, where the base station performs maintenance on the cleaning robot; having the cleaning robot perform maintenance on its own, such as cleaning the mopping part; prompting the user to perform maintenance on the cleaning robot, for example, sending prompt information to the user terminal to prompt the user to maintain the dust box on the cleaning robot that collects dirt.
  • controlling the cleaning robot to perform maintenance according to the task execution status includes: when the task execution status indicates that the cleaning robot has completed the current cleaning task, immediately controlling the cleaning robot to perform maintenance. For example, when the cleaning robot has completed a preset cleaning area and the workload completed by the cleaning robot reaches the first workload threshold, the cleaning robot can be immediately controlled to perform maintenance; after the maintenance is completed, the cleaning robot can also be controlled to clean again.
  • after the maintenance is completed, the above preset cleaning area can be cleaned again, or other preset cleaning areas in the cleaning task map can be cleaned.
  • controlling the cleaning robot to perform maintenance according to the task execution status includes: when it is determined that the workload of the unexecuted tasks is less than or equal to the preset workload, controlling the cleaning robot to perform maintenance after completing the unexecuted tasks.
  • the preset workload can be determined based on the difference between the second workload threshold and the first workload threshold.
  • the preset workload is 0.6 to 1 times the difference, such as 0.8 times or 0.9 times.
  • when the first maintenance instruction is obtained and the current cleaning task of the cleaning robot has not been completed, for example, there are still several unit areas in the preset cleaning area that have not been mopped, it is determined whether the workload of the unexecuted tasks is less than or equal to the preset workload, for example, whether the sum of the workloads of those unit areas is less than or equal to the preset workload.
  • the cleaning robot is controlled to perform maintenance after completing the unexecuted tasks; optionally, after the maintenance is completed, the cleaning robot does not need to return to the preset cleaning area to continue the current cleaning task, so as to reduce the number of maintenance operations of the cleaning robot.
  • controlling the cleaning robot to perform maintenance according to the task execution status includes: when the workload of the unexecuted tasks is greater than the preset workload, and the type of the unexecuted tasks includes the cleaning robot moving to the door of the room, controlling the cleaning robot to perform maintenance after it moves to the door of the room.
  • this matches the cleaning trajectory of the preset cleaning area with the trajectory for returning to the base station for maintenance (such as the backwash trajectory).
  • when the cleaning robot reaches the door of the room according to the planned path and then returns to the base station for maintenance, the path consumption of returning to the base station can be reduced.
  • At least part of the path of the cleaning robot is a path moving toward the doorway of the room.
  • if the cleaning robot is moving along the at least part of the path toward the door of the room, that is, the type of task that the cleaning robot has not performed is moving along the at least part of the path toward the door of the room, the cleaning robot can continue to move at least part of the remaining distance along that path toward the door (reaching the door or arriving near it), clean the area corresponding to that part of the path, and then generate the second maintenance instruction to control the cleaning robot to perform maintenance.
  • if the area on the path moving to the door of the room has not been cleaned, at least part of that area may be cleaned; after the maintenance is completed, the cleaning robot either returns to the room to continue cleaning that area, or the remaining part of the path does not need to be cleaned.
  • when the first maintenance instruction is obtained while the cleaning robot is moving along the at least part of the path toward the door of the room, that is, the type of task the cleaning robot has performed is moving toward the door along the at least part of the path, and the type of the unexecuted task is to continue moving to the door along that path, the cleaning robot can continue to move and clean along the path to the door of the room until the workload completed by the cleaning robot reaches the second workload threshold, or until the cleaning robot moves and cleans to the doorway of the room; the second maintenance instruction is then generated to control the cleaning robot to perform maintenance.
  • when the first maintenance instruction is obtained while the cleaning robot is moving toward the doorway of the room along the at least part of the path, and the paths of the unexecuted tasks include both a path moving toward the doorway of the room and a path moving away from the doorway, the cleaning robot can be controlled to move toward the door rather than away from it, so as to reduce the number of turns the cleaning robot makes between the door and the interior of the room and improve cleaning efficiency.
  • controlling the cleaning robot to perform maintenance according to the task execution status includes: when the type of the unexecuted tasks includes the cleaning robot cleaning along the edge of an obstacle, controlling the cleaning robot to perform maintenance after it finishes cleaning along the edge of the obstacle. After the maintenance is completed, the cleaning robot does not need to return to the obstacle again, which can save path consumption and improve cleaning efficiency.
  • when the type of the unexecuted task is that the cleaning robot cleans along the edge of the obstacle, and the executed task has already covered part of the edge of the obstacle, the cleaning robot is controlled to perform maintenance after it finishes cleaning along the edge of the obstacle.
  • if the cleaning robot has not yet started cleaning along the edge of the obstacle, it can be maintained before the edge cleaning begins; that is, the cleaning robot can be directly controlled to perform maintenance and then continue to perform the unexecuted tasks after the maintenance.
  • the method further includes: when it is determined that the cleaning robot is to be controlled to perform maintenance according to the task execution status, outputting prompt information, the prompt information being used to prompt the user that the cleaning robot is about to perform maintenance.
  • the prompt information is output when the second maintenance instruction is generated.
  • the prompt information is output through an interactive unit of a cleaning robot, a base station or a user terminal.
  • the control method of a cleaning robot includes: obtaining a first maintenance instruction; in response to the first maintenance instruction, determining the task execution status of the cleaning robot; and controlling the cleaning robot to perform maintenance according to the task execution status. By controlling the cleaning robot to perform maintenance with a control strategy corresponding to its task execution status, the maintenance timing of the cleaning robot can have a certain loose adjustment range, reducing the impact of cleaning robot maintenance on cleaning efficiency.
  • the control method of the cleaning robot includes: generating the first maintenance instruction; when the workload of the unexecuted tasks is less than or equal to the preset workload, controlling the cleaning robot to complete the unexecuted tasks and generating the second maintenance instruction after they are completed; when the workload of the unexecuted tasks is greater than the preset workload and the unexecuted tasks include the cleaning robot moving to the door of the room, controlling the cleaning robot to move to the door, and generating the second maintenance instruction when the workload reaches the second workload threshold or the cleaning robot reaches the door; and controlling the cleaning robot to perform maintenance according to the second maintenance instruction.
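The flow summarized above can be sketched as a small decision function. This is a hedged illustration, not the claimed control method: the task representation, the 0.8 ratio used for the preset workload (one point within the 0.6 to 1 times range mentioned earlier), and the return labels are all assumptions for demonstration.

```python
def on_first_maintenance_instruction(remaining_tasks, n1, n2, ratio=0.8):
    """Decide how to respond to the first maintenance instruction, which is
    issued once the completed workload reaches the first threshold N1.
    remaining_tasks: list of (task_type, workload) for unexecuted tasks."""
    preset_workload = ratio * (n2 - n1)  # assumed 0.8x the threshold gap
    remaining = sum(w for _, w in remaining_tasks)
    if remaining == 0:
        return "maintain_now"                 # current task already complete
    if remaining <= preset_workload:
        return "finish_then_maintain"         # finish remaining tasks first
    if any(t == "move_to_door" for t, _ in remaining_tasks):
        return "clean_to_door_then_maintain"  # stop at the door or at N2
    return "maintain_now"
```

The second threshold N2 still acts as a hard cap: whatever branch is taken, the robot is forcibly maintained once its completed workload reaches N2.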
  • when the amount of work completed by the cleaning robot reaches the second workload threshold, the cleaning robot is forcibly controlled to perform maintenance; for example, this can ensure that the cleaning robot is still able to return to the base station for maintenance.
  • the control method of a cleaning robot includes: when the cleaning robot cleans a preset cleaning area, obtaining the workload completed by the cleaning robot; and when the magnitude of the workload completed by the cleaning robot reaches the first workload threshold, controlling the cleaning robot to perform maintenance.
  • existing cleaning robots are usually controlled to return to the base station for maintenance after covering a fixed cleaning area.
  • the degree of dirt in different areas differs. If the fixed cleaning area is set large, the cleaning capability of the cleaning robot will be insufficient to finish a dirty area well, and the cleaning effect will be greatly affected; if the fixed cleaning area is set small, then when the floor is not very dirty, the cleaning robot will return to the base station for maintenance while it still has ample cleaning capability, and cleaning efficiency will suffer. That is, returning to the base station for maintenance according to a fixed cleaning area cannot take into account both cleaning effect and cleaning efficiency.
  • Embodiments of the present application can achieve both cleaning effect and cleaning efficiency by obtaining the workload completed by the cleaning robot and controlling the cleaning robot to perform maintenance when the magnitude of the completed workload reaches the first workload threshold.
  • the workload includes at least one of the following: the amount of dirt adsorbed by the cleaning robot's mopping part when mopping the floor, the power consumed by the cleaning robot when cleaning the floor, the time the cleaning robot spends cleaning the floor, the amount of water consumed when the cleaning robot cleans the floor, the amount of dirt collected when the cleaning robot cleans the floor, and the amount of sewage collected when the cleaning robot cleans the floor.
  • obtaining the workload completed by the cleaning robot includes: obtaining the graphical characteristics of the room where the preset cleaning area is located and/or the room identification of the room, and determining the amount of work performed by the cleaning robot according to the graphical characteristics and/or the room identification. Exemplarily, the workload of each unit area in the room is determined based on the graphical characteristics and/or the room identification, and the workload completed by the cleaning robot is determined from the workloads of the unit areas it has cleaned. For the step of determining the workload of a unit area, reference may be made to the description of the foregoing embodiments, which will not be repeated here.
  • the preset cleaning areas are divided according to the workload of each unit area in the room.
  • one or more rooms can be divided into one or more preset cleaning areas according to the workload of each unit area in the room or rooms.
  • a workload value range can be obtained, and the dividing line of the room is determined based on the workload value range and the workload of each unit area in the room, so that the dividing line and the boundary of the room form one or more preset cleaning areas.
  • the upper limit of the workload value range may be used as the first workload threshold.
  • the task execution status of the cleaning robot is determined, and based on the task execution status, the cleaning robot is controlled to perform maintenance.
  • reference may be made to the description of the control method of the cleaning robot shown in Figure 9, which will not be repeated here.
  • the method further includes: determining the cleaning order of the plurality of preset cleaning areas based on the workload of each unit area in the plurality of preset cleaning areas. For example, when the cleaning requirements of the cleaning mode are higher (such as a deep cleaning mode), the preset cleaning area with the larger sum of unit-area workloads is cleaned first; when the cleaning requirements of the cleaning mode are lower (such as a quick cleaning mode), the preset cleaning area with the smaller sum of unit-area workloads is cleaned first.
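The ordering rule above can be sketched as follows; the function name and mode labels are assumptions for illustration, not the patented implementation.

```python
def order_preset_areas(area_unit_workloads, mode):
    """Order preset cleaning areas by the sum of their unit-area workloads:
    'deep' mode cleans the heaviest areas first, 'quick' mode the lightest."""
    totals = [(sum(units), i) for i, units in enumerate(area_unit_workloads)]
    return [i for _, i in sorted(totals, reverse=(mode == "deep"))]
```

For example, with per-area unit workloads `[[1, 2], [5, 5], [3]]`, deep mode visits the area summing to 10 first, while quick mode leaves it for last.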
  • the method further includes: determining a cleaning path when cleaning the preset cleaning area based on the workload of each unit area in the preset cleaning area.
  • the cleaning path may preferentially cover unit areas with a larger workload to achieve an obvious cleaning effect faster; or it may preferentially cover unit areas with a smaller workload, so that a mopping member that has adsorbed more dirt does not reduce the cleaning effect in the remaining areas.
  • it can also be determined according to the cleaning mode whether the cleaning path preferentially covers the unit areas with a large workload or those with a small workload. For example, in the deep cleaning mode, the cleaning path preferentially covers the unit areas with a large workload; of course, it is not limited to this.
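A minimal sketch of workload-prioritized coverage within one preset cleaning area, under assumed names; a real planner would also weigh geometry and travel cost, which this illustration ignores.

```python
def plan_cleaning_path(unit_workloads, prioritize="heavy"):
    """Return a visiting order over unit areas. 'heavy' covers high-workload
    units first (faster visible effect, e.g. deep cleaning mode); 'light'
    covers low-workload units first so a heavily soiled mopping member does
    not degrade the cleaning of later areas."""
    return sorted(range(len(unit_workloads)),
                  key=lambda i: unit_workloads[i],
                  reverse=(prioritize == "heavy"))
```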
  • Cleaning robots can be used to automatically clean floors, and their application scenarios can include household indoor cleaning, large-scale place cleaning, etc.
  • during cleaning, the body of the cleaning robot is usually controlled to maintain a predetermined distance from the wall or obstacle because of structural design limitations of the cleaning mechanism.
  • when cleaning along room walls or obstacle areas, there is therefore an area between the cleaning robot and the room wall or obstacle that cannot be reached by the cleaning structure; that is, when the cleaning robot moves along the room wall or obstacle at the predetermined distance, there are areas that exceed the cleaning reach of the cleaning structure.
  • FIG. 12 is a schematic flowchart of a control method for a cleaning robot provided by an embodiment of the present application.
  • the control method of the cleaning robot can be applied in a cleaning system to control the cleaning robot in the system so that the cleaning robot performs cleaning tasks and cleans the area corresponding to the cleaning task map.
  • the area corresponding to the cleaning task map can be any area to be cleaned such as a family space, a room unit of a family space, a partial area of a room unit, a large place, or a part of a large place.
  • the area corresponding to the cleaning task map can refer to the larger area that is cleaned first, such as the entire room unit; it can also refer to the area that still needs cleaning after the first pass over the larger area, such as the area along the wall in the room unit, or the obstacle area.
  • the cleaning system includes one or more cleaning robots 100 and one or more base stations 200 .
  • the base station 200 is used in conjunction with the cleaning robot 100.
  • the base station 200 can charge the cleaning robot 100, and the base station 200 can provide a parking position for the cleaning robot 100, etc.
  • the base station 200 can also clean the mopping member 110 of the cleaning robot 100, where the mopping member 110 is used to mop the floor.
  • the cleaning system also includes a control device 300.
  • the control device 300 can be used to implement the steps of the cleaning robot control method according to the embodiment of the present application.
  • the robot controller of the cleaning robot 100 and/or the base station controller of the base station 200 can be used alone or in combination as the control device 300 to implement the steps of the cleaning robot control method according to the embodiment of the present application; in other embodiments , the cleaning system includes a separate control device 300 for implementing the steps of the cleaning robot control method in the embodiment of the present application.
  • the control device 300 can be provided on the cleaning robot 100, or can be provided on the base station 200; of course, it is not limited to this; for example, the control device 300 may be a device other than the cleaning robot 100 and the base station 200, such as a home smart terminal, a master control device, etc.
  • the cleaning robot 100 includes a robot body, a driving motor, a sensor unit, a robot controller, a battery, a walking unit, a robot memory, a robot communication unit, a robot interaction unit, a mopping member 110, a charging component, and the like.
  • the mopping member 110 is used to mop the ground, and the number of the mopping member 110 may be one or more.
  • the mopping member 110 is, for example, a mop.
  • the mopping member 110 is disposed at the bottom of the robot body, specifically at a front position of the bottom of the robot body.
  • a driving motor is provided inside the robot body, and two rotating shafts protrude from the bottom of the robot body; the mopping member 110 is sleeved on the rotating shafts. The driving motor drives the rotating shafts to rotate, and the rotating shafts drive the mopping member 110 to rotate.
  • the cleaning robot 100 further includes a brushing part 120 , and the brushing part 120 includes a side brushing part 121 and/or a middle sweeping part 122 .
  • the cleaning robot 100 is a cleaning robot that integrates sweeping and mopping.
  • the brushing part 120 and the mopping part 110 can work together.
  • the brushing part 120 and the mopping part 110 work at the same time, and the brushing part 120 and the mopping part 110 continue to work alternately, etc. ;
  • the brushing and sweeping part 120 and the mopping part 110 can also work separately, that is, the brushing and sweeping part 120 performs cleaning work alone, or the mopping part 110 performs mopping work alone.
  • the side brushing part 121 sweeps dust and other dirt from the outside to the middle area, and the middle sweeping part 122 continues to sweep the dirt in the middle area to the dust collector.
  • the number of side brushes 121 is not limited. As shown in FIG. 13 , the cleaning robot 100 has two side brushes 121 arranged on the left and right sides. Alternatively, only one side brush 121 is arranged on the left or right side.
  • the brushing member 120 can be disposed on the front side of the mopping member 110, so that when the brushing member 120 and the mopping member 110 work together, the cleaning robot 100 sweeps in front and mops behind. Compared with disposing the brushing member 120 behind the mopping member 110, this prevents the brushing member 120 from being wetted by the wet area left by the mopping member 110, and also prevents a dirty brushing member 120 from soiling the area already mopped.
  • the walking unit is a component related to the movement of the cleaning robot 100 and is used to drive the cleaning robot 100 to move so that the mopping member 110 and/or the brushing member 120 mops the ground.
  • the robot controller is arranged inside the robot body, and is used to control the cleaning robot 100 to perform specific operations.
  • the robot controller may be, for example, a central processing unit (Central Processing Unit, CPU) or a microprocessor (Microprocessor).
  • the robot controller is electrically connected to components such as batteries, robot memory, drive motors, walking units, sensor units, and robot interaction units to control these components.
  • the battery is provided inside the robot body, and the battery is used to provide power to the cleaning robot 100 .
  • the robot body is also provided with a charging component, which is used to obtain power from an external device to charge the battery of the cleaning robot 100 .
  • the robot memory is set on the robot body, and the robot memory stores programs. When the program is executed by the robot controller, the corresponding operations are implemented.
  • the robot communication unit is provided on the robot body. The robot communication unit is used to allow the cleaning robot 100 to communicate with external devices.
  • the cleaning robot 100 can communicate with the terminal and/or communicate with the base station 200 through the robot communication unit.
  • the base station 200 is a cleaning device used in conjunction with the cleaning robot 100 .
  • the sensor unit provided on the robot body includes various types of sensors, such as lidar, collision sensor, distance sensor, drop sensor, counter, and gyroscope.
  • the lidar is set on the top of the robot body.
  • the surrounding environment information can be obtained, such as the distance and angle of obstacles relative to the lidar, etc.
  • cameras can also be used instead of lidar. By analyzing the obstacles in the images captured by the cameras, the distance and angle of the obstacles relative to the camera can also be obtained.
  • the crash sensor includes a crash housing and a trigger sensor. When the cleaning robot 100 collides with an obstacle through the collision housing, the collision housing moves toward the inside of the cleaning robot 100 and compresses the elastic buffer.
  • after the collision housing moves a certain distance into the cleaning robot 100, it contacts the trigger sensor, which is triggered to generate a signal that can be sent to the robot controller in the robot body for processing. After hitting the obstacle, the cleaning robot 100 moves away from the obstacle, and under the action of the elastic buffer, the collision housing moves back to its original position.
  • the distance sensor can specifically be an infrared detection sensor, which can be used to detect the distance from the obstacle to the distance sensor.
  • the distance sensor is arranged on the side of the robot body, so that the distance value from the obstacle located near the side of the cleaning robot 100 to the distance sensor can be measured by the distance sensor.
  • the distance sensor may also be an ultrasonic ranging sensor, a laser ranging sensor, a depth sensor, etc.
  • the drop sensor is provided at the bottom edge of the robot body. When the cleaning robot 100 moves to the edge of the ground, the drop sensor can detect that the cleaning robot 100 is at risk of falling from a high place, thereby performing corresponding anti-fall reactions, such as cleaning robot 100 stops moving, or moves away from the falling position, etc.
  • the gyroscope is used to detect the rotation angle of the cleaning robot 100, thereby determining the orientation of the cleaning robot 100.
  • the robot interaction unit is provided on the robot body, and the user can interact with the cleaning robot 100 through the robot interaction unit.
  • the robot interaction unit includes, for example, switch buttons, speakers, microphones, touch switches/screens and other components.
  • the user can control the cleaning robot 100 to start or stop working by pressing the switch button or touch switch/screen, and can also display the working status information of the cleaning robot through the touch screen.
  • the cleaning robot 100 can play prompt sounds to the user through the speaker, obtain the user's control instructions through the microphone, or locate the user's location by obtaining the user's voice.
  • the cleaning robot 100 described in the embodiment of the present application is only a specific example and does not constitute a specific limitation to the cleaning robot 100 of the embodiment of the present application.
  • the cleaning robot 100 of the embodiment of the present application can also be implemented in other specific ways.
  • the cleaning robot may have more or fewer components than the cleaning robot 100 shown in FIG. 1 .
  • the control method of a cleaning robot includes steps S410 to S420.
  • one or at least two preset frequencies for the boundary leak-filling cleaning task are obtained, and based on the preset frequencies, the cleaning robot is controlled to perform boundary leak-filling cleaning tasks on the preset cleaning area.
  • the preset cleaning area includes room boundaries and/or outlines of obstacles, where room boundaries include, for example, walls, steps, and thresholds, and obstacles include, for example, cabinets, beds, sofas, tables, and chairs, though of course they are not limited to these.
  • the outline of obstacles connected with walls, steps, thresholds, etc. can also be determined as the room boundary.
  • when the cleaning robot performs the boundary leak-filling cleaning task, it can perform leak-filling cleaning on the room boundaries, and can also perform leak-filling cleaning on the contour boundaries of obstacles.
  • the outline of the obstacle can be the outline of the orthographic projection of the obstacle (such as the outline of the orthographic projection of a sofa or bed), or it can be the actual outline of the obstacle that the cleaning robot can touch when moving around the obstacle (such as the outlines of table legs or chair legs).
  • controlling the cleaning robot to perform a boundary leak-filling cleaning task according to the preset frequency includes: controlling the cleaning robot, according to the preset frequency, to move along the room boundary and/or the obstacle outline and perform a leak-filling cleaning action, wherein the leak-filling cleaning action includes at least one of the following: rotating by changing the angular velocity, turning by changing the angular velocity and the linear velocity, and moving forward or backward by changing the linear velocity; during the edge cleaning movement, the cleaning robot maintains a predetermined distance from the room boundary or the obstacle outline while cleaning the preset cleaning area.
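The leak-filling cleaning actions enumerated above can be sketched as short velocity-command sequences. This is only an illustrative sketch: the function name, the (linear, angular) command representation, and all numeric values are assumptions, not part of the patent.

```python
# Hypothetical sketch of the three leak-filling cleaning actions.
# Commands are modeled as (linear_m_s, angular_rad_s) pairs; values
# are illustrative assumptions, not the patent's parameters.

def leak_filling_action(kind: str):
    """Return a short sequence of (linear, angular) velocity commands."""
    if kind == "rotate":            # change the angular velocity only
        return [(0.0, 0.8), (0.0, 0.0)]
    if kind == "turn":              # change angular and linear velocity
        return [(0.15, 0.6), (0.15, 0.0)]
    if kind == "forward_backward":  # change the linear velocity only
        return [(0.2, 0.0), (-0.2, 0.0), (0.0, 0.0)]
    raise ValueError(f"unknown leak-filling action: {kind}")
```

Each sequence ends (or passes through) a command that restores a steady state, so the robot can resume the edge cleaning movement afterwards.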
  • when the cleaning robot cleans room boundaries, such as areas close to a wall, the coverage area of the mopping member is limited and missed cleaning easily occurs. Please refer to Figure 14.
  • Figure 14 is a schematic diagram of the cleaning robot performing an edge cleaning movement on the preset cleaning area with the mopping member 110, such as cleaning along the room boundary or obstacle outline in the preset cleaning area. When the cleaning robot travels along the room boundary or obstacle outline and cleans the nearby area with a cleaning member such as the mopping member 110, the edge of the cleaning member maintains a predetermined distance L from the room boundary or obstacle outline; this predetermined distance produces a cleaning blind area, that is, an area not cleaned by the cleaning member.
  • Figure 14 shows a schematic diagram of the cleaning blind area in a working scenario in which the cleaning robot mops along a straight line, that is, cleans along a straight room boundary or obstacle. However, the working scenario of the cleaning robot is not limited to mopping along a straight line; it can also include at least one of the following: brushing along inner corners, mopping along inner corners, brushing along outer corners, mopping along outer corners, brushing along a column, and mopping along a column. Cleaning blind areas will also arise in these scenarios.
  • Embodiments of the present application can control the cleaning robot to perform a boundary leak-filling cleaning task to clean at least part of the cleaning blind area and improve the cleaning effect on the preset cleaning area; for example, it can at least improve the cleaning effect on room boundaries and areas near obstacles.
  • Figure 15 shows a schematic diagram of a cleaning robot performing a boundary leak-filling cleaning task in one embodiment.
  • every time the cleaning robot travels a certain distance, it rotates clockwise or counterclockwise by a preset angle or any angle, so that the edge of a cleaning member such as the mopping member 110 is closer to the room boundary or obstacle outline; this expands the movement trajectory of the mopping member 110 and increases its coverage area, so as to clean at least part of the cleaning blind area and improve the cleaning effect on areas such as room boundaries.
  • FIG. 16 is a schematic diagram of a cleaning robot performing a boundary leak-filling cleaning task in another embodiment.
  • the preset cleaning area includes the outline of a room boundary or an obstacle
  • controlling the cleaning robot to perform a boundary leak-filling cleaning task includes controlling the cleaning robot to perform an arcuate cleaning movement on the preset cleaning area, and performing a turning action when the cleaning robot travels in a straight line to the room boundary or the obstacle outline, so that the coverage of the cleaning members of the cleaning robot covers at least part of the cleaning blind area, where the cleaning blind area is the area between the boundary line of the cleaning member's coverage range and the room boundary and/or obstacle outline when the cleaning robot performs the edge cleaning movement.
  • whether the cleaning robot has traveled to the room boundary or obstacle outline can be determined by whether the distance between the cleaning robot and the room boundary or obstacle outline reaches a distance threshold while the cleaning robot moves in a straight line, or by whether the cleaning robot collides with the room boundary or obstacle while moving in a straight line. As shown in Figure 16, the cleaning robot approaches the room boundary or obstacle outline along the arcuate trajectory; when it travels to a distance equal to the distance threshold from the room boundary or obstacle outline, or collides with the room boundary or obstacle, it makes a turn so that the edge of a cleaning member such as the mopping member 110 gets closer to the room boundary or obstacle outline, thereby cleaning at least part of the cleaning blind area. Optionally, the turning movement is not limited to turning counterclockwise; the cleaning robot may also turn clockwise. For example, the distance threshold may be determined based on the radius of the cleaning robot, the cleaning range of cleaning members such as the mopping member 110, and the like.
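The turn decision during the arcuate cleaning movement might be sketched as follows. This is a hedged illustration: `turn_distance_threshold`, its derivation from the robot radius and mop reach, and `should_turn` are hypothetical names and logic inferred from the text, not the patent's implementation.

```python
# Illustrative sketch of the U-turn trigger during the arcuate cleaning
# movement: turn when the distance to the room boundary / obstacle outline
# drops to a threshold, or when a collision is detected.

def turn_distance_threshold(robot_radius: float, mop_reach: float) -> float:
    """Assumed derivation: leave just enough clearance that the mop edge
    can still reach the boundary line (clamped at zero)."""
    return max(robot_radius - mop_reach, 0.0)

def should_turn(distance_to_boundary: float, collided: bool,
                threshold: float) -> bool:
    """Turn on collision, or when the measured distance reaches the
    threshold while traveling in a straight line toward the boundary."""
    return collided or distance_to_boundary <= threshold
```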
  • when cleaning each room, the cleaning robot may be controlled to first perform an edge cleaning movement along the room boundary while performing the leak-filling cleaning action, and then perform an arcuate cleaning movement; alternatively, the arcuate cleaning movement may be performed for all rooms first, followed by edge cleaning movements along the boundaries of each room with the leak-filling cleaning action; it is also possible to perform the arcuate cleaning movement in the room and control the cleaning robot to make the turn of the arcuate path when it reaches a certain distance from the room boundary.
  • in this way, the mopping member located at the tail of the robot can mop at least part of the cleaning blind area near the room boundary without an additional edge cleaning movement.
  • the method before controlling the cleaning robot to perform an arcuate cleaning movement on the preset cleaning area, the method further includes: controlling the cleaning robot to perform an edge cleaning movement on the preset cleaning area.
  • the edge cleaning movement may be used to determine the room boundaries and/or obstacle outlines of the preset cleaning area. For example, as shown in Figure 16, when boundary leak-filling is performed by turning around during the arcuate cleaning movement, an edge cleaning movement can first be performed in the room (without performing the leak-filling cleaning action), and then, during the arcuate cleaning movement, the cleaning robot is controlled to perform a U-turn when it reaches the distance threshold from the room boundary or collides with the room boundary, so as to clean at least part of the cleaning blind area.
  • the area covered by the arcuate cleaning movement does not deduct the area covered by the edge cleaning movement; that is, when the cleaning robot travels along the arcuate trajectory until its distance to the room boundary or obstacle outline equals the distance threshold, or until it collides with the room boundary or obstacle, it turns counterclockwise or clockwise, and at least part of the cleaning blind area can be cleaned. Two arcuate cleaning movements may also be performed, with the arcuate trajectories of the two movements orthogonal to each other, to ensure that most or all of the missed mopping area, that is, the cleaning blind area, is cleaned at least once.
  • controlling the cleaning robot to perform a boundary leak-filling cleaning task according to the preset frequency may include controlling the cleaning robot to perform the boundary leak-filling cleaning task at a preset time interval, such as once every 7 days; it may also include controlling the cleaning robot to perform a boundary leak-filling cleaning task on a preset cleaning area, such as a room, after a preset number of ordinary cleaning tasks have been performed, for example, after a room has been cleaned 7 times, boundary leak-filling cleaning is performed during the 8th cleaning. Of course, it is not limited to this; for example, after the preset number of cleaning tasks has been performed in the room, it may be determined whether the time interval since the last boundary leak-filling cleaning is greater than or equal to 7 days, and if so, the cleaning robot is controlled to perform the boundary leak-filling cleaning task, which can prevent the accumulation of dirt caused by room boundaries going uncleaned for a long time.
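The three scheduling rules described above (a fixed time interval, a fixed count of ordinary cleanings, and the combined rule) might be sketched like this; the function names and default values of 7 are assumptions taken from the text's example, not the patent's API.

```python
# Illustrative scheduling sketch for the boundary leak-filling cleaning task.

def due_by_interval(days_since_last: float, period_days: float = 7.0) -> bool:
    """Rule 1: perform the task at a preset time interval (e.g. weekly)."""
    return days_since_last >= period_days

def due_by_count(cleanings_since_last: int, period_count: int = 7) -> bool:
    """Rule 2: after a preset number of ordinary cleanings, e.g. a room
    cleaned 7 times gets leak-filling cleaning on the 8th run."""
    return cleanings_since_last >= period_count

def due_combined(cleanings_since_last: int, days_since_last: float,
                 period_count: int = 7, period_days: float = 7.0) -> bool:
    """Combined rule: after the preset number of cleanings, also require
    the interval since the last leak-filling cleaning to be long enough."""
    return (due_by_count(cleanings_since_last, period_count)
            and due_by_interval(days_since_last, period_days))
```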
  • the embodiment of the present application controls the cleaning robot to perform the boundary leak-filling cleaning task on the preset cleaning area according to the preset frequency, reducing or eliminating dirt near room boundaries or obstacle outlines while taking into account both cleaning efficiency and cleaning effect; the cleaning robot is thus more intelligent and the user experience is better. For example, the preset frequency of the boundary leak-filling cleaning task may be lower than the frequency of performing only edge cleaning movements.
  • the method further includes: obtaining the object type of the boundary leak-filling cleaning object; and controlling the cleaning robot to perform a boundary leak-filling cleaning task on the preset cleaning area according to the preset frequency includes: determining a preset frequency corresponding to the object type of the boundary leak-filling cleaning object, and controlling the cleaning robot to perform a boundary leak-filling cleaning task on the preset cleaning area according to the preset frequency corresponding to the object type.
  • controlling the cleaning robot to perform a boundary leak-filling cleaning task on the preset cleaning area includes controlling the cleaning robot to perform boundary leak-filling cleaning on the boundary leak-filling cleaning object, that is, cleaning at least part of the cleaning blind area near that object. It can be understood that each object type corresponds to a preset frequency, and a boundary leak-filling cleaning object corresponds to at least one object type. When a boundary leak-filling cleaning object corresponds to only one object type, it corresponds to only one preset frequency; when it corresponds to at least two object types, it corresponds to at least two preset frequencies, and boundary leak-filling cleaning can be performed on it according to at least two different preset frequencies. For example, if one pair of sides of a coffee table belongs to one object type and the other pair belongs to another object type, the cleaning robot can perform boundary leak-filling cleaning on the two pairs of sides at different frequencies. By determining the corresponding preset frequency according to the object type of the boundary leak-filling cleaning object, both the cleaning efficiency of the cleaning robot and the cleaning effect of the preset cleaning area can be taken into account.
  • the preset frequency includes a first preset frequency and a second preset frequency
  • the object types include suspended obstacles and non-suspended obstacles.
  • suspended obstacles are invisible to the radar (cannot be observed by the radar) but can trigger changes in the output signal of the collision sensor on the radar, that is, obstacles that can be detected by the collision sensor on the radar, such as bookcases, coffee tables, and sofas with low space at the bottom; for example, the bottom edge of a sofa detected by the collision sensor on the radar can be determined to be a suspended obstacle.
  • when the cleaning robot cleans along a suspended obstacle, it can clean the ground along the outer contour of the orthographic projection of the obstacle; at this time, the area near the outer contour of the obstacle is not cleaned.
  • by controlling the cleaning robot to perform boundary leak-filling cleaning along the suspended obstacle, the cleaning blind area near the outer contour of the suspended obstacle can be reduced or eliminated; performing boundary leak-filling cleaning along the suspended obstacle includes the cleaning robot cleaning the floor along the outer contour of the orthographic projection of the obstacle while performing the leak-filling cleaning action.
  • the cleaning range under the suspended obstacle can be determined based on the output signal of the collision sensor on the radar, or combined with the visual signal of a visual sensor, to determine the cleaning blind area near the outer contour of the suspended obstacle, and the cleaning robot is then controlled to clean the determined range.
  • non-suspended obstacles are obstacles that can be observed by radar and may collide with the main body of the cleaning robot, such as a box bed.
  • controlling the cleaning robot to perform a boundary leak-filling cleaning task on the preset cleaning area according to the preset frequency corresponding to the object type includes: when it is determined that the boundary leak-filling cleaning object is a suspended obstacle, controlling the cleaning robot according to the first preset frequency to perform boundary leak-filling cleaning along the suspended obstacle; when it is determined that the boundary leak-filling cleaning object is a non-suspended obstacle, controlling the cleaning robot according to the second preset frequency to perform a boundary leak-filling cleaning task on the non-suspended obstacle.
  • the second preset frequency is different from the first preset frequency.
  • the first preset frequency is higher than the second preset frequency.
  • Dirt near suspended obstacles such as the bottom edge of the sofa is more likely to be noticed than dirt near non-suspended obstacles.
  • by cleaning the dirt near suspended obstacles at a higher frequency, a better cleaning effect can be achieved; by cleaning dirt near non-suspended obstacles at a lower frequency, cleaning efficiency can be improved.
  • the preset frequency includes a third preset frequency and a fourth preset frequency
  • the object types include discrete obstacles and aggregated obstacles.
  • discrete obstacles represent boundary leak cleaning objects with relatively scattered distribution of obstacles.
  • Discrete obstacles are, for example, cartons with no other items around in the living room.
  • Aggregated obstacles represent boundary leak cleaning objects with relatively concentrated distribution.
  • aggregated obstacles are, for example, the table and chair legs of a dining table and chairs clustered together in the kitchen, around which obstacle avoidance is required, but are certainly not limited to this.
  • controlling the cleaning robot to perform a boundary leak-filling cleaning task on the preset cleaning area according to the preset frequency corresponding to the object type includes: when it is determined that the boundary leak-filling cleaning object is a discrete obstacle, controlling the cleaning robot according to the third preset frequency to perform boundary leak-filling cleaning along the discrete obstacle; when it is determined that the boundary leak-filling cleaning object is an aggregated obstacle, controlling the cleaning robot according to the fourth preset frequency to perform a boundary leak-filling cleaning task on the aggregated obstacle.
  • the third preset frequency and the fourth preset frequency are different.
  • the third preset frequency is higher than the fourth preset frequency. Since users may be more sensitive to dirt in cleaning blind areas near discrete obstacles, performing boundary leak-filling cleaning on discrete obstacles at a higher frequency can achieve better cleaning results; performing boundary leak-filling cleaning on aggregated obstacles at a lower frequency can improve cleaning efficiency.
  • the boundary leak cleaning object corresponds to at least two of the object types, and one of the object types corresponds to one of the preset frequencies.
  • the boundary leak cleaning object can correspond to both suspended obstacles and discrete obstacles.
  • the suspended obstacles correspond to the first preset frequency and the discrete obstacles correspond to the third preset frequency.
  • in this case, the boundary leak-filling cleaning object corresponds to both the first preset frequency and the third preset frequency. Controlling the cleaning robot to perform a boundary leak-filling cleaning task on the preset cleaning area according to the preset frequency corresponding to the object type includes: performing boundary leak-filling cleaning on the preset cleaning area according to the highest preset frequency among the at least two preset frequencies corresponding to the object types, that is, controlling the cleaning robot to perform the boundary leak-filling cleaning task on the boundary leak-filling cleaning object according to the higher of the first preset frequency and the third preset frequency. Choosing a more appropriate frequency for the boundary leak-filling cleaning task not only ensures the overall cleaning effect of the room but also ensures the overall cleaning efficiency of the room, providing a better user experience.
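The rule of taking the highest preset frequency among an object's types can be sketched as below. Only the max-selection rule comes from the text; the frequency table values (times per month) and the type names are made-up assumptions.

```python
# Hypothetical frequency table; values are illustrative, not from the patent.
PRESET_FREQUENCY = {
    "suspended": 4,      # first preset frequency
    "non_suspended": 1,  # second preset frequency
    "discrete": 2,       # third preset frequency
    "aggregated": 1,     # fourth preset frequency
}

def effective_frequency(object_types):
    """A cleaning object with several object types is cleaned at the
    highest of its types' preset frequencies."""
    return max(PRESET_FREQUENCY[t] for t in object_types)
```

For example, an object that is both a suspended obstacle and a discrete obstacle would be cleaned at the higher of the first and third preset frequencies.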
  • the method further includes: obtaining the environment type of the area where the boundary leak-filling cleaning object is located; and controlling the cleaning robot to perform a boundary leak-filling cleaning task on the preset cleaning area according to the preset frequency includes: controlling the cleaning robot to perform a boundary leak-filling cleaning task on the preset cleaning area according to a preset frequency corresponding to the environment type.
  • the environment type includes but is not limited to at least one of the following: public area, non-public area, narrow area, and non-narrow area.
  • the preset frequency includes a fifth preset frequency and a sixth preset frequency
  • the environment type of the area where the boundary leak cleaning object is located includes public areas and non-public areas.
  • public areas can include living rooms, dining rooms, kitchens, balconies, walkways and other areas shared by members.
  • non-public areas include bedrooms, studies, and other areas with strong privacy. Whether each room is a public area or a non-public area can be determined by the cleaning robot through automatic recognition, such as based on the type of furniture, or it can be set by the user.
  • controlling the cleaning robot to perform a boundary leak-filling cleaning task on the preset cleaning area according to the preset frequency corresponding to the environment type includes: when it is determined that the environment type of the area where the boundary leak-filling cleaning object is located is a public area, controlling the cleaning robot according to the fifth preset frequency to perform a boundary leak-filling cleaning task on the room boundary of the public area and/or the outlines of the obstacles in the public area; when it is determined that the environment type of the area where the boundary leak-filling cleaning object is located is a non-public area, controlling the cleaning robot according to the sixth preset frequency to perform a boundary leak-filling cleaning task on the room boundary of the non-public area and/or the outlines of the obstacles in the non-public area.
  • the sixth preset frequency is different from the fifth preset frequency.
  • the fifth preset frequency is higher than the sixth preset frequency.
  • the preset frequency includes a seventh preset frequency and an eighth preset frequency
  • the environment type of the area where the boundary leak cleaning object is located includes a narrow area and a non-narrow area.
  • when the cleaning robot travels in a preset area and the sum of the distances between the cleaning robot and the obstacles and/or room boundaries on its left and right sides is less than or equal to a predetermined value L0, that is, when l1 + l2 ≤ L0, the preset area is determined to be a narrow area; when the sum of the distances between the cleaning robot and the obstacles and/or room boundaries on its left and right sides is greater than the predetermined value, the preset area is a non-narrow area.
  • the entire preset cleaning area that the cleaning robot needs to clean may be a narrow area or a non-narrow area, or some partial areas of the entire preset cleaning area may be narrow areas while other partial areas are non-narrow areas.
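The narrow-area test described above (l1 + l2 ≤ L0) can be written directly; the function name is an assumption for illustration.

```python
def is_narrow_area(l1: float, l2: float, l0: float) -> bool:
    """Narrow area per the text: the sum of the left clearance l1 and the
    right clearance l2 does not exceed the predetermined value L0."""
    return l1 + l2 <= l0
```

For example, with L0 = 0.6 m, clearances of 0.2 m and 0.3 m make the area narrow, while 0.5 m and 0.4 m do not.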
  • controlling the cleaning robot to perform a boundary leak-filling cleaning task on the preset cleaning area according to the preset frequency corresponding to the environment type includes: when it is determined that the environment type of the area where the boundary leak-filling cleaning object is located is a narrow area, controlling the cleaning robot according to the seventh preset frequency to perform a boundary leak-filling cleaning task on the room boundary and/or obstacle outlines in the narrow area; when it is determined that the environment type of the area where the boundary leak-filling cleaning object is located is a non-narrow area, controlling the cleaning robot according to the eighth preset frequency to perform a boundary leak-filling cleaning task on the room boundary and/or obstacle outlines in the non-narrow area. Specifically, the eighth preset frequency is different from the seventh preset frequency.
  • the seventh preset frequency is higher than the eighth preset frequency, that is, the boundary leak-filling cleaning frequency of narrow areas is higher than that of non-narrow areas. Since users are highly sensitive to dirt in cleaning blind areas near obstacles and/or room boundaries in narrow areas, cleaning dirt in narrow areas at a higher frequency can achieve a better cleaning effect, while cleaning dirt in non-narrow areas at a lower frequency can improve cleaning efficiency.
  • the area where the boundary leak-filling cleaning object is located may correspond to at least two of the environment types, with each environment type corresponding to one preset frequency; that is, the area where the boundary leak-filling cleaning object is located corresponds to at least two preset frequencies. For example, the fifth preset frequency, the sixth preset frequency, the seventh preset frequency, and the eighth preset frequency are all different.
  • the environment type of the area where the boundary leak cleaning object is located may be both a public area and a narrow area.
  • controlling the cleaning robot to perform a boundary leak-filling cleaning task according to the preset frequency corresponding to the environment type includes: according to the highest preset frequency among at least two preset frequencies corresponding to the environment type, The cleaning robot is controlled to perform a boundary leak-filling cleaning task on a boundary leak-filling cleaning object in the preset cleaning area.
  • the cleaning robot performs the boundary leak-filling cleaning task on the boundary leak-filling cleaning object of the preset cleaning area according to a more appropriate frequency, which on the one hand ensures the overall cleaning effect of the room and on the other hand also ensures the overall cleaning efficiency of the room, so the user experience is better.
  • control method may include: obtaining the object type of the boundary leak cleaning object; and obtaining the environment type of the area where the boundary leak cleaning object is located.
  • controlling the cleaning robot to perform a boundary leak-filling cleaning task on the preset cleaning area according to the preset frequency includes: determining the preset frequency corresponding to the object type of the boundary leak-filling cleaning object; determining the preset frequency corresponding to the environment type of the area where the boundary leak-filling cleaning object is located; and controlling the cleaning robot to perform a boundary leak-filling cleaning task on the preset cleaning area according to the highest preset frequency among the preset frequencies corresponding to the object type and the environment type.
  • when the boundary leak-filling cleaning object corresponds to multiple object types and/or environment types, the highest preset frequency among the preset frequencies corresponding to those object types and/or environment types is used to perform the boundary leak-filling cleaning task on the boundary leak-filling cleaning object.
  • the first preset frequency, the second preset frequency, the third preset frequency, the fourth preset frequency, the fifth preset frequency, the sixth preset frequency, the seventh preset frequency, and the eighth preset frequency are all different.
  • for example, when the preset cleaning area is a narrow area, and the narrow area includes both suspended obstacles and non-suspended obstacles, the higher preset frequency of the first preset frequency and the seventh preset frequency can be used to perform boundary leak-filling cleaning tasks on the suspended obstacles in the narrow area, and the higher preset frequency of the second preset frequency and the seventh preset frequency can be used to perform boundary leak-filling cleaning tasks on the non-suspended obstacles in the narrow area.
  • for another example, when the preset cleaning area is a narrow area in a non-public area, and the narrow area includes both suspended obstacles and non-suspended obstacles, the highest preset frequency among the first preset frequency, the sixth preset frequency, and the seventh preset frequency is used to perform a boundary leak-filling cleaning task on the suspended obstacles in the narrow area, and the higher preset frequency of the sixth preset frequency and the seventh preset frequency is used to perform a boundary leak-filling cleaning task on the non-suspended obstacles in the narrow area.
  • the preset frequency of the boundary leak-filling cleaning task can be determined according to the multiple types of boundary leak-filling cleaning objects in the room, and the boundary leak-filling cleaning task can be performed on each boundary leak-filling cleaning object according to the determined preset frequency. Selecting a more appropriate frequency for different types of boundary leak-filling cleaning objects in the same room space ensures the overall cleaning effect of the room on the one hand and the overall cleaning efficiency of the room on the other, so the user experience is better.
  • the method further includes: obtaining a working scenario of the cleaning robot, the working scenario including at least one of the following: brushing along an inner corner, mopping along an inner corner, brushing along an outer corner, mopping along an outer corner, brushing along a column, mopping along a column, and mopping along a straight line.
  • controlling the cleaning robot to perform a boundary leak cleaning task includes: selecting a corresponding boundary leak cleaning strategy according to the working scenario of the cleaning robot; controlling the cleaning robot to perform boundary leak cleaning according to the boundary leak cleaning strategy Cleaning tasks.
  • the boundary leak-filling cleaning strategy is to control the cleaning robot to perform leak-filling cleaning actions so as to perform leak-filling cleaning along the edges of the boundary leak-filling cleaning objects.
  • Boundary leakage cleaning strategies include rotation strategy, retreat cleaning strategy, and tangential cleaning strategy.
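Selecting a boundary leak-filling cleaning strategy per working scenario might be sketched as a simple lookup. The mapping below is an assumption assembled from the surrounding text (rotation for inner corners and straight-line mopping, backward for outer corners); mapping the column scenarios to the tangential strategy is a guess, since the text names that strategy without detailing it.

```python
# Hypothetical scenario-to-strategy table; the pairings are illustrative
# assumptions, only the scenario and strategy names come from the text.
STRATEGY_BY_SCENARIO = {
    "brush_inner_corner": "rotation",
    "mop_inner_corner": "rotation",
    "brush_outer_corner": "backward",   # the text also allows "turning"
    "mop_outer_corner": "backward",
    "brush_column": "tangential",       # assumed pairing
    "mop_column": "tangential",         # assumed pairing
    "mop_straight_line": "rotation",
}

def select_strategy(scenario: str) -> str:
    """Pick the boundary leak-filling strategy for the current scenario."""
    return STRATEGY_BY_SCENARIO[scenario]
```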
  • the cleaning robot cleans along the inner angle of the room boundary or obstacle (the inner angle can be an inner angle greater than 0 degrees and less than 180 degrees, and the illustration is a 90-degree right angle).
  • the cleaning robot can be controlled to rotate according to the rotation cleaning strategy to bring the air outlet of the fan close to the inner corner.
  • the airflow from the air outlet lifts up the dirt at the inner corner, which is then swept up by the cleaning tool.
  • an air outlet for a fan is provided at the rear of the right side of the cleaning robot. The right side of the cleaning robot cleans along the boundaries of the room or the outline of obstacles.
  • when cleaning an inner corner, the cleaning robot is controlled to rotate counterclockwise in place so that the airflow from the air outlet blows toward the inner corner and lifts the dirt on the ground in the inner corner toward the front of the cleaning robot, so that the cleaning robot's sweeping and vacuuming devices can absorb the dirt.
  • the cleaning robot can be controlled according to the rotation cleaning strategy to travel along the first side of the inner corner; when it reaches the inner corner and contacts the second side, it rotates in place so as to mop at least part of the inner corner, and then travels along the second side of the inner corner.
  • the cleaning robot cleans along an outer corner of a room boundary or an obstacle (the outer angle may be any angle greater than 180 degrees; the illustration shows a 270-degree outer angle).
  • an optional boundary leak-filling cleaning strategy is the retreat cleaning strategy: for example, after the cleaning robot turns (such as turns right) from the first side of the outer corner to the second side, the cleaning robot is controlled to retreat a preset distance and then move forward again (shown by the dotted arrow) to clean the area missed during the turn.
  • Another optional boundary leak-filling cleaning strategy is the turning strategy: before turning (such as turning right) from the first side of the outer corner to the second side, the cleaning robot first travels a predetermined distance past the corner, where the predetermined distance may, for example, be approximately equal to the radius of the cleaning robot; the cleaning robot is then controlled to turn in place (such as rotating about a point between the running wheels), and then to move along the second side of the outer corner so that the brushing element contacts the second side of the outer corner.
  • an optional boundary leak-filling cleaning strategy is the retreat cleaning strategy: for example, before the cleaning robot turns (such as turns left) from the first side of the outer corner to the second side, the cleaning robot is controlled to advance a preset distance and then retreat the same preset distance to complete the leak-filling sweeping action.
  • Another optional boundary leak-filling cleaning strategy is the turning strategy, which is the same as the turning strategy used when brushing along outer corners with the brushing piece, and will not be repeated here.
  • Figure 21 shows the cleaning robot brushing and cleaning along a columnar body.
  • Figure 22 shows the cleaning robot mopping and cleaning along a columnar body.
  • the gray ring surrounding the columnar body represents the cleaning blind area, such as a missed-sweep area; the columnar body includes, but is not limited to, cylinders, table legs, and chair legs.
  • Figure 23 shows the boundary leak-filling cleaning strategy corresponding to the working scenario of brushing or mopping along a columnar body in one embodiment. It may be called the tangential cleaning strategy: for example, the cleaning robot is controlled to move along the tangential direction of the columnar body.
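The scenario-to-strategy selection described above can be sketched as a simple lookup. This is only an illustration, not the patented implementation; the scenario and strategy names below are assumptions, and the outer-corner scenarios could equally map to the turning strategy.

```python
# Illustrative sketch: choosing a boundary leak-filling cleaning strategy
# from the detected working scenario. Names are assumed for illustration.
SCENARIO_TO_STRATEGY = {
    "brush_inner_corner": "rotation",
    "mop_inner_corner": "rotation",
    "brush_outer_corner": "retreat",   # the turning strategy is an alternative
    "mop_outer_corner": "retreat",
    "brush_column": "tangential",
    "mop_column": "tangential",
    "mop_straight_edge": "retreat",
}

def select_strategy(scenario: str) -> str:
    """Return the boundary leak-filling strategy for a working scenario."""
    try:
        return SCENARIO_TO_STRATEGY[scenario]
    except KeyError:
        raise ValueError(f"unknown working scenario: {scenario}")
```

The robot would then execute the motion primitive (rotate in place, retreat and advance, or follow the tangent) associated with the returned strategy.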
  • the method further includes: obtaining a cleaning task map; controlling the cleaning robot according to the cleaning task map to clean the preset cleaning area at least with the mopping member; controlling the cleaning robot to perform the boundary leak-filling cleaning task according to at least one preset frequency; and, when a carpet is detected, controlling the cleaning robot to explore along the edge of the carpet to obtain the carpet's outline, and adding the carpet area corresponding to the carpet to the cleaning task map according to that outline.
  • For example, the cleaning robot can first be controlled according to the cleaning task map to mop (or brush and mop simultaneously) the non-carpet areas in the preset cleaning area.
  • during this process, the cleaning robot can also be controlled according to at least one preset frequency to perform the boundary leak-filling cleaning task so as to clean at least part of the cleaning blind area;
  • the carpet can be detected while cleaning the non-carpet areas, for example through a sensor unit; when a carpet is detected, the cleaning robot is controlled to explore along its edge, determine the carpet area corresponding to the carpet, and update the carpet area into the cleaning task map.
  • the method further includes: when the carpet cleaning conditions are met, controlling the cleaning robot to clean the carpet in the carpet area with the brushing member according to the carpet area in the cleaning task map. For example, after all non-carpet areas in the cleaning task map have been mopped, or all non-carpet areas in the preset cleaning area have been cleaned, it is determined that the carpet cleaning conditions are met, and the cleaning robot is controlled to clean the carpet in the carpet area with the brushing member.
  • for example, when the carpet cleaning switch (a button on the base station or the cleaning robot, or a virtual switch on the user terminal) is turned on, the brushing member cleans the carpet in the carpet area.
  • after all non-carpet areas have been cleaned, maintenance is performed on the mopping member, such as cleaning and drying it, and the carpets are then cleaned in the carpet-area cleaning order; cleaning the carpet after the mopping member has dried avoids wetting the carpet.
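The mop-first, carpet-last ordering described above can be sketched as a small planner. The area names and the action vocabulary below are assumptions for illustration only:

```python
def plan_cleaning(areas):
    """Order actions per the described workflow: mop non-carpet areas first
    (deferring carpets detected along the way), then maintain and dry the
    mopping member, then brush-clean the deferred carpet areas."""
    actions = []
    carpets = []
    for area in areas:
        if area["carpet"]:
            carpets.append(area["name"])  # deferred until mopping is done
        else:
            actions.append(("mop", area["name"]))
    if carpets:
        actions.append(("maintain_mop", "clean_and_dry"))
        actions += [("brush", name) for name in carpets]
    return actions
```

Deferring the carpets until after mop maintenance reflects the stated goal of not wetting the carpet with a damp mopping member.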
  • the control method of a cleaning robot includes: obtaining one or at least two preset frequencies of the boundary leak-filling cleaning task, and controlling the cleaning robot to perform the boundary leak-filling cleaning task on the preset cleaning area according to the preset frequencies.
  • By controlling the cleaning robot to perform boundary leak-filling cleaning tasks, the cleaning effect on the preset cleaning area can be improved.
  • By controlling the cleaning robot to perform boundary leak-filling cleaning tasks according to the preset frequency, both cleaning efficiency and cleaning effect can be taken into account.
  • As a result, the cleaning robot is more intelligent and the user experience is better.
  • FIG. 24 is a schematic block diagram of the control device 300 provided by the embodiment of the present application.
  • the control device 300 includes a processor 301 and a memory 302.
  • processor 301 and the memory 302 are connected through a bus 303, such as an I2C (Inter-integrated Circuit) bus.
  • the processor 301 may be a micro-controller unit (Micro-controller Unit, MCU), a central processing unit (Central Processing Unit, CPU) or a digital signal processor (Digital Signal Processor, DSP), etc.
  • the memory 302 may be a Flash chip, a read-only memory (ROM, Read-Only Memory) disk, an optical disk, a USB flash drive, a removable hard disk, or the like.
  • the processor 301 is configured to run a computer program stored in the memory 302, and implement the steps of the cleaning robot control method of any of the foregoing embodiments when executing the computer program.
  • the robot controller 104 of the cleaning robot 100 and/or the base station controller 206 of the base station 200 can serve, alone or in combination, as the control device 300 to implement the control method of the cleaning robot according to the embodiment of the present application.
  • the control device 300 can be provided on the cleaning robot 100, or it can be provided on the base station 200; of course, it is not limited thereto.
  • the control device 300 can be a device other than the cleaning robot 100 and the base station 200, such as a home smart terminal, a master control device, etc.
  • For example, the control device 300 on the base station 200 (such as the base station controller 206) or the control device 300 on the cleaning robot 100 (such as the robot controller 104) is used to implement the steps of any cleaning robot control method in the embodiment of the present application; of course, it is not limited thereto, and, for example, the control device 300 on the base station 200 alone can be used to implement the steps of the control method of the cleaning robot in the embodiment of the present application.
  • the embodiment of the present application also provides a base station, which is at least used to clean the mopping parts of the cleaning robot.
  • the base station also includes a control device 300, such as the base station controller 206, for implementing the steps of the control method of the cleaning robot in the embodiment of the present application.
  • the embodiment of the present application also provides a cleaning robot, which includes a control device 300, such as the robot controller 104, for implementing the steps of the control method of the cleaning robot in the embodiment of the present application.
  • Figure 2 is a schematic diagram of a cleaning system provided by an embodiment of the present application.
  • the cleaning system includes:
  • the cleaning robot 100 includes a walking unit 106 and a mopping part 110, and may also include a brushing and sweeping part 120.
  • the walking unit 106 is used to drive the cleaning robot 100 to move so that the mopping part 110 mops the ground; the brushing and sweeping part 120 includes a side brush component 121 and/or a middle sweep component 122;
  • the base station 200 is at least used to clean or replace the mopping member 110 of the cleaning robot 100; and/or the base station 200 includes a dirt detection device to detect the degree of dirt of the mopping member 110 of the cleaning robot 100; and
  • Embodiments of the present application also provide a computer-readable storage medium.
  • the computer-readable storage medium stores a computer program.
  • when the computer program is executed by a processor, the processor can implement the steps of the method of any of the above embodiments.
  • the computer-readable storage medium may be an internal storage unit of the control device described in any of the preceding embodiments, such as a hard disk or memory of the control device.
  • the computer-readable storage medium may also be an external storage device of the control device, such as a plug-in hard disk, a smart media card (Smart Media Card, SMC), a secure digital (SD) card, or a flash card equipped on the control device.
  • Cleaning equipment, such as a cleaning robot, can be applied to household indoor cleaning, cleaning of large places, and other scenarios.
  • However, existing cleaning equipment cannot detect the degree of dirt on the cleaned floor while cleaning the floor, and thus cannot reflect the cleaning workload of the cleaning equipment, which affects the user's experience of using the cleaning equipment.
  • FIG. 25 is a schematic flowchart of a method for processing images cleaned by a cleaning device according to an embodiment of the present application.
  • the method for processing cleaning images of cleaning equipment can be applied in a cleaning equipment system to generate and display cleaning images of cleaning equipment in the system, so as to visualize the cleaning workload of the cleaning equipment.
  • the preset cleaning area can be any area to be cleaned, such as a home space, a room unit of a home space, part of a room unit, a large place, or part of a large place. From another perspective, the preset cleaning area can refer to the larger area that is cleaned first, such as an entire room unit, or to an area that needs further cleaning after the first pass over the larger area, such as a wall-edge area or an obstacle area within the room unit.
  • a cleaning equipment system (or may be called a cleaning system or a cleaning robot system) includes one or more cleaning equipment 100 , one or more base stations 200 , and a processing device 300 .
  • the cleaning device 100 includes a movement mechanism and cleaning parts.
  • the movement mechanism of the cleaning equipment 100 is used to drive the cleaning equipment 100 to move so that the cleaning parts clean the preset cleaning area.
  • the movement mechanism is, for example, a walking unit used to drive the cleaning robot to move.
  • as the movement mechanism drives the cleaning device 100 to move, the cleaning parts contact the preset cleaning area and clean it during the movement of the cleaning device 100.
  • the base station 200 is used in conjunction with the cleaning device 100.
  • the base station 200 can charge the cleaning device 100, the base station 200 can provide a docking location to the cleaning device 100, and so on.
  • the base station 200 can also clean the cleaning parts of the cleaning device 100 .
  • the cleaning equipment system includes one or more cleaning equipment 100 and a processing device 300 .
  • the cleaning device 100 includes a movement mechanism, cleaning parts and a maintenance mechanism.
  • the movement mechanism is used to drive the cleaning equipment 100 to move so that the cleaning parts clean the preset cleaning area
  • the maintenance mechanism is used to clean the cleaning parts.
  • the processing device 300 may be used to implement the steps of the method for processing a cleaning image by a cleaning device according to the embodiment of the present application.
  • the cleaning device 100 is provided with a device controller for controlling the cleaning device 100.
  • the base station 200 is provided with a base station controller for controlling the base station 200.
  • the device controller of the cleaning device 100 and/or the base station controller of the base station 200 can serve as the processing device 300 alone or in combination, for implementing the steps of the cleaning image processing method of the cleaning device according to the embodiment of the present application; in other embodiments, the cleaning equipment system includes a separate processing device 300 for implementing the steps of the cleaning device cleaning image processing method of the embodiment of the present application.
  • the processing device 300 may be disposed on the cleaning device 100, or it may be disposed on the base station 200; of course, it is not limited to this.
  • the processing device 300 can be a device other than the cleaning device 100 and the base station 200, such as a home smart terminal, a master control device, etc.
  • the cleaning device 100 can be used to automatically clean a preset cleaning area.
  • the application scenarios of the cleaning device 100 can be household indoor cleaning, large-scale place cleaning, etc.
  • the cleaning components of the cleaning device 100 include at least one of a mopping component and a vacuum suction component, and are certainly not limited thereto.
  • the cleaning equipment 100 or the base station 200 further includes a dirt detection device, and the dirt detection device is used to detect the dirt degree of the cleaning parts.
  • the dirt detection device includes at least one of the following: a visual sensor and a sewage detection sensor.
  • the image or color information of the cleaning piece can be obtained by the visual sensor, and the degree of dirtiness of the cleaning piece can be determined based on that image or color information.
  • for example, the closer the dirt inside a cleaning piece such as the dust collector is to the edge of the dust collector, the greater the degree of dirtiness of the dust collector.
  • the sewage detection sensor can obtain detection information of the sewage produced by cleaning the cleaning parts, such as the mopping parts, and the degree of contamination of the mopping parts can be determined based on this detection information. Optionally, the sewage detection sensor includes at least one of the following: a visible light detection sensor, an infrared detection sensor, and a total dissolved solids detection sensor. For example, the infrared detection sensor collects the turbidity value of the sewage, the visible light detection sensor collects the chromaticity value of the sewage, and the total dissolved solids detection sensor collects the water conductivity value of the sewage. The degree of dirtiness of the mopping parts can then be determined based on one or more of the turbidity value, chromaticity value, and water conductivity value; for example, the greater the turbidity value or the water conductivity of the sewage, the greater the degree of dirt on the mopping parts.
  • the method of determining the degree of contamination of the cleaning parts of the cleaning device 100 is not limited to this.
  • the cleaning image processing method of the cleaning equipment is used to generate a cleaning image after the cleaning equipment completes cleaning of one or at least two preset cleaning areas through cleaning parts.
  • the method includes steps S110 to S120.
  • Step S110: After the cleaning equipment cleans a preset cleaning area once using a cleaning piece, obtain the degree of dirt corresponding to the preset cleaning area.
  • the preset cleaning area may be an area to be cleaned divided by the cleaning device based on a task map.
  • the task map can be established by the cleaning equipment exploring the current space in response to a mapping instruction, or updated by the cleaning equipment based on obstacles, carpets, and the like identified during cleaning. Optionally, the task map may be a map of a cleaning area specified by the user: for example, in response to the user selecting one or more rooms on the map, the one or more rooms are determined as the task map; or, in response to the user circling a cleaning area on the map, such as a partial area of one or more rooms, that partial area is determined as the task map. It is certainly not limited to this.
  • the preset cleaning area may be determined based on the room in the task map and/or the workload threshold of the cleaning equipment.
  • the workload of each preset cleaning area is less than or equal to the workload threshold, where the workload threshold is used to instruct the cleaning equipment to interrupt the current cleaning task and move to the base station for maintenance before the completed workload exceeds the workload threshold.
  • A cleaning task refers to a task in which the cleaning equipment, in response to a cleaning instruction, cleans all preset cleaning areas corresponding to the task map. For example, one room can be one preset cleaning area, or one room may contain multiple preset cleaning areas; it is certainly not limited thereto, and, for example, one preset cleaning area may include one room and at least part of another room.
  • the preset cleaning area can be determined based on the user's division on the task map, or the division can be determined based on preset area division rules.
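One plausible way to realize such a division rule, assuming each room reports a scalar workload (for example, floor area) and no preset cleaning area may exceed the workload threshold, is a greedy split; this is a sketch under those assumptions, not the patented method:

```python
def split_into_areas(room_workloads, threshold):
    """Greedily split each room's workload into preset cleaning areas,
    each with workload <= threshold, so the robot can return to the base
    station for maintenance between areas."""
    areas = []
    for room, workload in room_workloads:
        remaining = workload
        part = 1
        while remaining > 0:
            chunk = min(remaining, threshold)
            areas.append((f"{room}-{part}", chunk))  # hypothetical area naming
            remaining -= chunk
            part += 1
    return areas
```

A room below the threshold stays a single preset cleaning area; a large room is split so that the robot interrupts for maintenance before its mopping member is saturated.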
  • obtaining the degree of contamination corresponding to the preset cleaning area includes: after the cleaning device completes cleaning of the preset cleaning area with the cleaning piece, obtaining the degree of contamination of the cleaning piece, and determining the degree of dirt corresponding to the preset cleaning area according to the degree of dirt of the cleaning piece. For example, after the cleaning device completes mopping the preset cleaning area with the mopping member, the degree of contamination of the mopping member is obtained.
  • the cleaning component includes a mopping component.
  • the mopping part is, for example, a mop.
  • suppose the mop starts mopping from the moment it is washed and continues until the dirt value of the mopping part reaches its maximum,
  • the cleaning robot moves forward at a constant speed without repeatedly mopping the same spot,
  • and the dirt is uniformly distributed on the ground (over an infinitely large area);
  • then the relationship between the amount of dirt collected by the mopping piece, that is, the dirt value d of the mopping part, and the mopping time is shown in Figure 28.
  • the cleaning equipment can also be controlled to move to the base station for maintenance, such as cleaning the mopping parts or replacing them with cleaned mopping parts; alternatively, the maintenance mechanism of the cleaning equipment can be controlled to maintain the cleaning equipment, such as cleaning the mopping parts or replacing them with cleaned mopping parts.
  • the maximum dirt value d_max of the mopping part is an empirical value, which can be measured in a laboratory, for example.
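The exact curve of Figure 28 is not reproduced in this text. As one assumed illustration only, a dirt value that grows with mopping time and levels off at the empirical maximum d_max could be modeled as an exponential saturation; both the functional form and the rate constant k are assumptions, not taken from the application:

```python
import math

def mop_dirt_value(t, d_max, k):
    """Assumed saturating model of the mop's dirt value over time:
    d(t) = d_max * (1 - exp(-k * t)).
    t: mopping time; d_max: empirical maximum dirt value; k: assumed rate."""
    return d_max * (1.0 - math.exp(-k * t))
```

Under this model d(0) = 0 for a freshly washed mop, d(t) increases monotonically, and d(t) never exceeds d_max, matching the qualitative behavior described above.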
  • the cleaning component includes a dust suction component.
  • the dust collector has a certain dirt-holding space. When the amount of dirt sucked in by the dust collector reaches the maximum of this dirt-holding space, the dust collector can no longer take in more dirt, and the vacuum cleaning effect on the floor becomes very poor.
  • the maximum amount of dirt on the dust collector is an empirical value, which can be measured in a laboratory, for example.
  • the dirt degree corresponding to the preset cleaning area is positively correlated with the dirt degree of the mopping part; that is, the greater the dirt degree of the mopping part, the dirtier the preset cleaning area.
  • the degree of dirt corresponding to the preset cleaning area is positively related to the degree of dirt of the dust collector; that is, the greater the degree of dirt of the dust collector, the dirtier the preset cleaning area. When the amount of dirt inhaled by the dust collector equals its maximum dirt amount, it can be determined that the preset cleaning area is very dirty, and it is more likely that, after cleaning of the preset cleaning area is completed, for example after vacuuming of the preset cleaning area is completed, there is still dirt in the preset cleaning area that has not been sucked in by the dust collector.
  • the degree of contamination of the preset cleaning area may be determined based on the degree of contamination of the mopping member and/or the degree of contamination of the vacuum suction member.
  • the dirt level of the cleaning piece is obtained through a dirt detection device on the base station or cleaning equipment, such as a visual sensor.
  • For example, the darker the color of the mopping piece, the greater its degree of dirtiness.
  • the degree of contamination of the mopping part can also be obtained through a visual sensor mounted on the cleaning equipment and facing the mopping member; or the degree of contamination of the suction part can be obtained through a visual sensor mounted on the cleaning equipment and facing the inside of the suction part, so as to determine the degree of contamination of the cleaning parts of the cleaning device.
  • obtaining the degree of contamination of the cleaning part includes: when cleaning the mopping part, obtaining detection information of sewage used to clean the mopping part; and determining the degree of contamination of the mopping part based on the detection information.
  • the dirt detection device includes a sewage detection sensor.
  • the sewage detection sensor is used to detect sewage after cleaning the mop piece, such as detecting one or more of the turbidity information, color information, and water conductivity information of the sewage.
  • the amount of dirt removed from the mopping part can be determined from the turbidity value, the chromaticity value, or the water conductivity of the sewage;
  • when the turbidity, chromaticity, or water conductivity of the sewage is greater, the sewage after cleaning the mopping part is dirtier, and the amount of dirt washed off the mopping part is larger.
  • the turbidity value, the chromaticity value, and the water conductivity of the sewage can all be used to characterize the amount of dirt cleaned from the mopping parts, that is, the degree of dirtiness of the mopping parts; each of them is positively correlated with, or has a corresponding relationship to, the dirt elution value, the dirt amount, or the dirt degree.
  • For example, suppose the turbidity detected for the sewage from the first cleaning of the mopping part is 1 NTU, and the dirt elution value or dirt amount corresponding to 1 NTU is 100, while the turbidity detected for the sewage from the second cleaning of the mopping part is 2 NTU, and the dirt elution value or dirt amount corresponding to 2 NTU is 200. It can then be judged that the amount of dirt washed off the mopping part in the first cleaning is less than the amount washed off in the second cleaning; that is, the mopping part was less dirty at the first cleaning than at the second cleaning.
  • the chromaticity value of the sewage and the water conductivity of the sewage have the same kind of correspondence with the dirt elution value or the dirt amount, and this will not be repeated here.
  • the degree of contamination can be represented by a numerical value, for example by any one of the turbidity value of the sewage, the chromaticity value of the sewage, the water conductivity of the sewage, the dirt amount, and the dirt elution value. For example, if the turbidity of the sewage after cleaning the mopping part is 1 NTU, the degree of dirtiness of the mopping part can be expressed as 1; or, if a turbidity of 1 NTU corresponds to a degree of dirtiness of 100, the degree of dirtiness of the mopping part is 100.
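Continuing the worked example above (1 NTU corresponding to a dirtiness of 100, 2 NTU to 200), a linear mapping from turbidity to dirtiness can be assumed; the scale factor is an assumption extrapolated from that example, not a value stated in the application:

```python
def dirt_degree_from_turbidity(ntu, scale=100.0):
    """Map sewage turbidity (NTU) to a dirtiness value, assuming the
    linear correspondence of the worked example (1 NTU -> 100)."""
    return ntu * scale
```

Because the mapping is monotonic, comparing the mapped values of two cleanings preserves the comparison of their turbidities, which is all the judgment in the example requires.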
  • the detection value of the sewage detection sensor can be obtained at intervals, and the dirt amounts corresponding to the detection values can be accumulated according to the time and/or the amount of water used to clean the mopping member to obtain an accumulated dirt amount, where the amount of water can be determined based on the amount of cleaning water provided to the cleaning tank and/or the amount of waste water discharged.
  • a mopping element cleaning operation performed between two floor cleaning operations can be used as a mopping element cleaning task.
  • the mopping element cleaning task includes, for example, the process of cleaning the mopping element after cleaning one preset cleaning area and before cleaning another preset cleaning area, and may also include the process of cleaning the mopping element after the cleaning task ends, wherein the condition for ending the cleaning task is determining that the dirt values of all areas of the task map are less than the corresponding dirt amount thresholds.
  • the mopping part cleaning task includes one or more stage tasks. In each stage task, clean water is provided to the cleaning tank of the base station, or directly to the mopping parts, to clean the mopping parts; the sewage produced by cleaning the mopping parts is then discharged from the cleaning tank or recycled into a sewage tank, where the sewage tank may be installed on the base station or on the cleaning robot. This process may run once or be cycled multiple times; alternatively, clean water may be supplied to clean the mopping parts while the resulting sewage is discharged or recycled at the same time. Of course, it is not limited to this; for example, while cleaning water is provided to the cleaning tank, the sewage from cleaning the mopping member may be discharged intermittently.
  • the time and/or amount of water required to clean the mopping part may be the same or different for different stage tasks. According to the time and/or amount of water corresponding to one or more stage tasks of the mopping part cleaning task, the dirt amounts corresponding to the detection values obtained during the execution of each stage task are accumulated to obtain the accumulated dirt amount.
  • Determining the degree of dirt of the mopping part according to the detection information includes: accumulating the dirt amounts corresponding to the detection information according to the time and/or the amount of water used to clean the mopping part, where the amount of water can be determined by the amount of water provided to the cleaning tank and/or the amount of waste water discharged.
  • the detection information is, for example, the sewage turbidity.
  • the cumulative dirt amount d can be determined based on the detection information of one or more samplings and the water volume in each sampling interval, expressed as follows:
  • d = T_1·l_1 + T_2·l_2 + … + T_n·l_n, that is, d = Σ_{i=1}^{n} T_i·l_i,
  • where T_i represents the sewage turbidity T of the i-th sampling, l_i represents the water volume between two samplings, i is any value among 1, 2, …, n, and n is the total number of samplings.
  • determining the degree of contamination of the mopping element based on the detection information includes: estimating the degree of contamination of the mopping element based on a single piece of detection information. For example, after the supply of clean water to the cleaning tank is stopped, the sewage is discharged, the sewage turbidity is detected once during the discharge process, and the volume of the discharged sewage is obtained; the product of the sewage turbidity and the water volume can then be determined as the cumulative dirt amount d. Of course, it is not limited to this.
  • the turbidity of the sewage can also be detected multiple times during the sewage discharge process, and the product of the average, maximum, or minimum of the multiple detected turbidity values and the water volume is determined as the cumulative dirt amount d.
  • the amount of dirt corresponding to the detection information is accumulated according to the time and/or amount of water used to clean the mopping member, and the accumulated result of the amount of dirt represents the amount of dirt cleaned from the mopping member.
  • the amount of dirt can be called the dirt elution value.
  • the dirt elution value of the mopping part cleaning task may be determined based on the dirt elution values of one or more stage tasks of the mopping part cleaning task; for example, the dirt elution values of all stage tasks of the mopping part cleaning task are accumulated to obtain the dirt elution value of the whole mopping part cleaning task.
  • for each stage task, the detection information of the sewage can be obtained once or multiple times, and the dirt elution value of the stage task is determined based on the one or more pieces of detection information; for example, the dirt elution value of the stage task is determined as the product of the average value of multiple pieces of detection information and the water volume of that stage task.
  • the degree of contamination of the mopping part may be determined based on the dirt elution value of one or more stage tasks, or the dirt elution value of the mopping part cleaning task.
  • the degree of dirt of the mopping part is determined based on the dirt elution value of the first stage task in the mopping part cleaning task: the greater the dirt elution value of the first stage task, the greater the degree of dirtiness of the mopping part; or the degree of dirtiness of the mopping part is determined based on the maximum or average of the dirt elution values of multiple stage tasks: the greater the maximum or average, the greater the degree of dirtiness of the mopping part.
  • the degree of dirt corresponding to the preset cleaning area may likewise be determined based on the dirt elution value of one or more stage tasks, or the dirt elution value of the mopping part cleaning task.
  • the mopping member cleaning task is performed, the dirt elution values of all stages of the mopping member cleaning task are accumulated to obtain the dirt elution value of the task, and that dirt elution value is determined as the degree of dirt corresponding to the preset cleaning area.
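The strategies above for turning stage dirt elution values into a degree of dirt can be sketched as follows; the function names and the `strategy` labels are illustrative assumptions.

```python
# Hypothetical sketch: deriving a degree of dirt from stage dirt elution values.

def task_elution_value(stage_values):
    # Dirt elution value of the whole cleaning task: accumulate all stages.
    return sum(stage_values)

def degree_of_dirt(stage_values, strategy="first"):
    if strategy == "first":  # a greater first-stage value means a dirtier mopping part
        return stage_values[0]
    if strategy == "max":
        return max(stage_values)
    if strategy == "avg":
        return sum(stage_values) / len(stage_values)
    # Default: the whole-task accumulation is used as the degree of dirt.
    return task_elution_value(stage_values)
```

The same value can also serve as the degree of dirt corresponding to the preset cleaning area, as described above.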
  • Step S120: Generate a cleaning image according to the degree of dirt corresponding to one or at least two preset cleaning areas.
  • the step of generating a cleaning image according to the degree of dirtiness corresponding to the preset cleaning area further includes: determining the preset cleaning areas whose degree of dirtiness is greater than or equal to a preset dirtiness threshold, and generating the cleaning image according to those preset cleaning areas. For example, if, after the cleaning device has cleaned multiple preset cleaning areas, at least one preset cleaning area has a degree of contamination below the preset threshold, the cleaning image may omit that area, so as to more intuitively reflect the cleaning effect of the cleaning equipment on the preset cleaning areas.
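The threshold filtering just described can be sketched in a few lines; the function name and the sample values are illustrative assumptions.

```python
# Hypothetical sketch: keep only preset cleaning areas whose degree of
# dirtiness is at or above the preset dirtiness threshold.

def areas_for_cleaning_image(dirt_by_area, threshold):
    return {area: d for area, d in dirt_by_area.items() if d >= threshold}
```

Areas below the threshold are simply left out of the generated cleaning image.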
  • the cleaning image may be generated after the cleaning equipment completes cleaning of all preset cleaning areas in the task map, or after the cleaning equipment completes one cleaning of at least one preset cleaning area in the task map; the cleaning image is not limited here.
  • the cleaning image includes an image area corresponding to a preset cleaning area.
  • generating a cleaning image based on the degree of dirt corresponding to the preset cleaning area includes: determining the target filling icon of the image area according to the value range of the degree of dirt corresponding to the preset cleaning area, and marking the image area according to the target filling icon; the target filling icons corresponding to different value ranges are also different.
  • the target fill icon may include at least one of color, line, shadow, pattern, numerical value or other fill icon.
  • the target filling icon may be preset or may be set by the user, which is not limited here. It will be understood that the target fill icon can be expanded accordingly.
  • the value range of the degree of contamination corresponding to the preset cleaning area is determined, and the target filling icon of the image area is determined based on that value range. For example, the higher the degree of dirt corresponding to the preset cleaning area, the darker the color in the target fill icon, or the denser the lines in the target fill icon.
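The range-to-icon mapping can be sketched as follows. The numeric ranges and colour names are illustrative assumptions; the disclosure only requires that higher dirtiness maps to a darker fill.

```python
# Hypothetical mapping from the value range of the degree of dirt to a
# target fill icon (here a colour); a higher degree of dirt maps to a
# darker colour.

DIRT_RANGES = [
    (0, 300, "light-grey"),
    (300, 600, "grey"),
    (600, float("inf"), "dark-grey"),
]

def target_fill_icon(degree_of_dirt):
    for low, high, colour in DIRT_RANGES:
        if low <= degree_of_dirt < high:
            return colour
    raise ValueError("degree of dirt must be non-negative")
```

The same lookup could return a line density, shadow, pattern, or numeric label instead of a colour.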
  • determining the target filling icon of the image area according to the value range of the degree of dirt corresponding to the preset cleaning area includes: when the preset cleaning area has been cleaned more than once, determining the target filling icon of the corresponding image area according to the value range of at least one degree of dirt corresponding to the preset cleaning area.
  • for example, the degree of dirt corresponding to the preset cleaning area can be obtained after every cleaning; if the preset cleaning area has been cleaned 5 times, a degree of dirtiness is obtained each time, that is, 5 degrees of dirtiness are obtained, and the target filling icon of the corresponding image area is determined by any one of the 5 degrees of dirtiness, or by the cumulative value of at least 2 of them.
  • alternatively, if the preset cleaning area has been cleaned 5 times but a degree of dirtiness was obtained for only 2 of the cleanings, 2 degrees of dirtiness are obtained, and the target filling icon of the corresponding image area is determined by either of the 2 degrees of dirtiness, or by their cumulative value.
  • the remaining number of cleanings of the preset cleaning area can be predicted based on its corresponding degree of contamination, for example, 4 remaining cleanings. When the predicted number of remaining cleanings is greater than one, the degree of dirtiness of the preset cleaning area after only some of the cleanings can be obtained to generate a cleaning image; alternatively, the degree of contamination corresponding to each cleaning may be obtained based on the actual number of cleanings, and a cleaning image generated from the degree of contamination corresponding to each cleaning.
  • generating a cleaning image according to a degree of dirt corresponding to a preset cleaning area includes: generating a first cleaning image according to the degree of dirt corresponding to a first target preset cleaning area, wherein the first target preset cleaning area is one of at least two preset cleaning areas, the first cleaning image includes the image areas corresponding to all preset cleaning areas, at least the image area corresponding to the first target preset cleaning area is marked with a target filling icon, and the target filling icon is determined according to the most recently obtained degree of dirt corresponding to the first target preset cleaning area.
  • the processing method further includes: the image area corresponding to the non-first target preset cleaning area does not identify a target filling icon; or it identifies a preset target filling icon; or its target filling icon is determined according to the most recently acquired degree of dirt corresponding to the non-first target preset cleaning area; wherein the non-first target preset cleaning area is a preset cleaning area other than the first target preset cleaning area.
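The first-cleaning-image structure just described can be sketched as follows; the function name and the dictionary layout are illustrative assumptions. Every image area is present, but only areas with a recorded degree of dirt carry a fill.

```python
# Hypothetical sketch of a "first cleaning image": one entry per image area;
# the fill of each already-cleaned area comes from its most recently
# obtained degree of dirt, and not-yet-cleaned areas have no fill icon.

def first_cleaning_image(all_areas, latest_dirt):
    # latest_dirt maps each already-cleaned area to its most recent degree of dirt.
    return {
        area: {"fill": latest_dirt.get(area)}  # None = no target filling icon
        for area in all_areas
    }
```

After each cleaning of a target area, its entry in `latest_dirt` is updated and a new image generated, so earlier areas keep their previous fills.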
  • each preset cleaning area corresponds to the image area in the clean image one-to-one.
  • the preset cleaning area A1 corresponds to the image area a1
  • the preset cleaning area A2 corresponds to the image area a2, and so on.
  • the cleaning equipment first cleans the preset cleaning area A1.
  • the first target cleaning area is the preset cleaning area A1
  • the first cleaning image a includes all image areas a1-a4 corresponding to the preset cleaning areas A1-A4, and the image area a1 corresponding to the preset cleaning area A1 is filled with color; the fill color of the image area a1 is determined by the degree of dirt corresponding to the preset cleaning area A1.
  • the image areas a2-a4 corresponding to the non-first target cleaning areas A2-A4 do not identify target filling icons.
  • a first cleaning image b can be generated according to the degree of dirt corresponding to the preset cleaning area A2.
  • the first target cleaning area is the preset cleaning area A2
  • the first cleaning image b includes all image areas a1-a4 corresponding to the preset cleaning areas A1-A4.
  • the image area a1 corresponding to the preset cleaning area A1 is filled with color, and the image area a2 corresponding to the preset cleaning area A2 is filled with color; the color filled in the image area a1 is determined by the most recently obtained degree of dirt corresponding to the preset cleaning area A1, that is, the color filled in the image area a1 in the first cleaning image b is the same as the color filled in the image area a1 in the first cleaning image a.
  • the color filled in the image area a2 is determined by the degree of dirt corresponding to the preset cleaning area A2.
  • the image areas a3 and a4 corresponding to A3 and A4 do not identify target filling icons.
  • the image areas corresponding to the non-first target areas in the first cleaning image a and the first cleaning image b may not be marked with any target filling icon, or may be marked with a preset marked filling icon, which is not limited here.
  • the first cleaning image b retains the target filling icon identified in the image area a1 corresponding to the preset cleaning area A1 in the first cleaning image a; because the preset cleaning area A1 and the preset cleaning area A2 are cleaned as separate tasks, the degree of dirtiness corresponding to the preset cleaning area A2 does not affect the degree of dirtiness corresponding to the preset cleaning area A1, so in the first cleaning image b the target filling icon identified in the image area a2 does not affect the target filling icon identified in the image area a1.
  • the target fill pattern identified in the image area a1 in the first cleaning image b is the same as the target fill pattern identified in the image area a1 in the first cleaning image a.
  • a first cleaning image c can be generated according to the degree of dirt corresponding to the preset cleaning area A3, as shown in Figure 30(c)
  • a first cleaning image d can be generated according to the degree of dirt corresponding to the preset cleaning area A4, as shown in Figure 30(d).
  • the first cleaning image is generated when the degree of dirt corresponding to the first target preset cleaning area is obtained, that is, every time a cleaning of a preset cleaning area is completed (that is, every time a dirt detection is completed), a first cleaning image is displayed.
  • the first cleaning image a, the first cleaning image b, the first cleaning image c, and the first cleaning image d are displayed after one cleaning of the preset cleaning areas A1, A2, A3, and A4 is completed, respectively; in some embodiments, a first cleaning image is generated after the cleaning task is completed, for example, only a first cleaning image is displayed after completing the cleaning task, such as by clicking the corresponding label on the screen to selectively display the first cleaning image, as shown in Figure 31.
  • the user can select to display any one of the first cleaning images a-d by clicking any label on the screen as shown in numbers 1 to 5 in Figure 31.
  • in some embodiments, after completing the cleaning task, at least two first cleaning images are generated sequentially or simultaneously, for example, after completing the cleaning task, at least two first cleaning images are displayed sequentially or simultaneously; the user can selectively display the corresponding first cleaning image by clicking at least two labels on the screen, shown as numbers 1 to 5 in Figure 31; or, as shown in Figure 32, the user can click the "Clean Image" icon on the screen so that multiple first cleaning images are displayed in sequence; or, as shown in Figure 33, multiple first cleaning images are displayed on the screen by clicking the icon on the screen.
  • the cleaning process of the cleaning equipment can be reflected based on at least one first cleaning image.
  • the actual cleaning sequence of each preset cleaning area by the cleaning equipment and the corresponding degree of dirt of each preset cleaning area can be reflected; for example, in Figure 30, the first cleaning image a reflects that the cleaning equipment first starts to clean the preset cleaning area A1, and the target filling icon identified in the image area a1 reflects the degree of dirt of the preset cleaning area A1 before the cleaning equipment performed that cleaning.
  • when the cleaning equipment performs a cleaning task, the cleaning equipment cleans the preset cleaning areas A1, A2, A3, and A4 in the cleaning order A1-A2-A3-A4; of course, the cleaning order of the preset cleaning areas is not limited to this. The A1-A2-A3-A4 order is explained below. Referring to Figure 34, the cleaning equipment first cleans the preset cleaning area A1 once; then, based on the degree of dirt corresponding to the preset cleaning area A1 obtained after that cleaning, a first cleaning image a is generated, as shown in Figure 34(a).
  • the preset cleaning area A1 is the first target cleaning area
  • the first cleaning image a includes all image areas a1-a4 corresponding to the preset cleaning areas A1-A4, and the image area a1 corresponding to the preset cleaning area A1 is filled with color; the color filled in the image area a1 is determined by the degree of dirt corresponding to the preset cleaning area A1. If, after this cleaning of the preset cleaning area A1 is completed, the cleaning equipment needs to clean the preset cleaning area A1 a second time, the preset cleaning area A1 remains the first target cleaning area, and a first cleaning image b is generated from the degree of dirt corresponding to the preset cleaning area A1 after the second cleaning, as shown in Figure 34(b). The first cleaning image b contains all image areas a1-a4 corresponding to the preset cleaning areas A1-A4, and the color filled in the image area a1 corresponding to the preset cleaning area A1 is determined by the degree of dirt obtained the second time.
  • the change in the target filling icon identified in the image area a1 reflects that the preset cleaning area A1 has been cleaned twice, and reflects the change in the cleaning effect of the preset cleaning area A1.
  • if the cleaning equipment completes cleaning a preset cleaning area, that is, after the degree of dirt corresponding to the preset cleaning area is lower than the preset dirt level threshold, the cleaning equipment continues to clean other preset cleaning areas, and generates at least one first cleaning image according to the degrees of dirt corresponding to those other preset cleaning areas.
  • the degree of dirt corresponding to the preset cleaning area A1 is lower than the preset dirt level threshold, and the cleaning equipment continues to clean the preset cleaning area A2.
  • at this time, the preset cleaning area A2 is the first target cleaning area, and the first cleaning image d can be generated according to the degree of dirt of the preset cleaning area A2, as shown in Figure 34(d); and so on, at least one first cleaning image is generated from the degrees of contamination of the preset cleaning area A3 and the preset cleaning area A4 respectively, so that the cleaning process of the cleaning equipment, such as its cleaning of each preset cleaning area, can be reflected by the at least one first cleaning image.
  • the preset cleaning areas A2-A4 are non-first target areas, and the image areas a2-a4 corresponding to the preset cleaning areas A2-A4 do not identify target filling icons, to indicate that the preset cleaning areas A2-A4 have not yet been cleaned.
  • in the first cleaning image d, the preset cleaning area A1 is a non-first target area, and the target filling icon filled in the image area a1 corresponding to the preset cleaning area A1 is the same as the target filling icon of the image area a1 in the first cleaning image c, to indicate that the preset cleaning area A1 is no longer cleaned.
  • the image areas a2-a4 in the first cleaning images a-c may also identify preset target filling icons to indicate that the preset cleaning areas A2-A4 have not been cleaned, and the image area a1 in the first cleaning image d may identify no target filling icon, or identify the preset target filling icon, to indicate that the preset cleaning area A1 is no longer cleaned, which will not be described in detail here.
  • the first cleaning image is generated when the degree of dirt corresponding to the first target preset cleaning area is obtained, that is, every time a cleaning of a preset cleaning area is completed (that is, every time a dirt detection is completed), a first cleaning image is displayed.
  • the first cleaning image a, the first cleaning image b, the first cleaning image c, and the first cleaning image d are each displayed after the corresponding cleaning of the preset cleaning areas A1 and A2 is completed.
  • At least two first cleaning images can be generated sequentially or simultaneously after the cleaning task is completed, that is, multiple first cleaning images are displayed sequentially or simultaneously after the cleaning task is completed, for example, as shown in Figure 35
  • the user can selectively display the corresponding first cleaning image by clicking on the labels on the screen, numbers 1 to 5 in Figure 35; or, as shown in Figure 36, multiple first cleaning images can be displayed in sequence by clicking on the "Clean Image" icon on the screen; or, as shown in Figure 37, multiple first cleaning images can be displayed on the screen by clicking on the icon on the screen.
  • At least one first cleaning image is dynamically displayed as the animation or short video plays, or the changing process of the preset cleaning area corresponding to multiple first cleaning images is displayed through the animation or short video.
  • the first cleaning images a-d can be displayed in sequence through an animation or short video; the target filling icons of the image areas a1, a2, a3, and a4 can also be shown changing over time through an animation or short video, in which the target filling icon identified in each image area is determined according to the degree of dirt corresponding to each preset cleaning area, so that the cleaning process of the cleaning equipment can be displayed through the animation or short video, such as the actual cleaning sequence of each preset cleaning area and the corresponding degree of dirt after each preset cleaning area is cleaned.
  • for example, the sequential display of the first cleaning images a-d reflects that the cleaning equipment cleans the preset cleaning areas A1, A2, A3, and A4 in the order A1-A2-A3-A4, and the changes in the target filling icons identified in the image areas a1-a4 of the first cleaning images a-d can also reflect the changes in the cleaning effect of the preset cleaning areas A1-A4, which helps the user understand the cleaning process of the cleaning equipment and the degree of dirtiness of the preset cleaning areas during the cleaning process.
  • multiple first cleaning images can be displayed in sequence through an animation or short video; the changes over time of the target filling icons identified in the image areas a1, a2, and so on can also be displayed through an animation or short video, reflecting the number of times the cleaning equipment cleans the preset cleaning areas, the change in the degree of dirtiness after each cleaning of each preset cleaning area, and the cleaning sequence of the preset cleaning areas, which helps users understand the cleaning process of the cleaning equipment and the process by which a preset cleaning area gradually becomes cleaner after multiple cleanings.
  • the animation or short video can be generated and displayed in real time during the cleaning process of the cleaning equipment, or can be generated and displayed after the cleaning process of the cleaning equipment is completed to reproduce the cleaning process of the cleaning equipment. There are no restrictions here.
  • generating a cleaning image according to the degree of dirt corresponding to the preset cleaning area includes: determining at least one second target preset cleaning area, where the second target preset cleaning area is a preset cleaning area that has been cleaned i times, i being an integer greater than or equal to 1; and generating the i-th second cleaning image according to at least one target degree of dirtiness, wherein the target degree of dirtiness is the degree of dirt corresponding to the second target preset cleaning area obtained after its i-th cleaning, the i-th second cleaning image includes the image areas corresponding to all preset cleaning areas, each image area corresponding to a second target preset cleaning area is marked with a target filling icon, and the target filling icon of each such image area is determined according to the target degree of contamination obtained after the i-th cleaning of the corresponding second target preset cleaning area.
  • the processing method also includes: the image area corresponding to the non-second target preset cleaning area does not identify a target filling icon; or it identifies a preset target filling icon; or its target filling icon is determined based on the most recently obtained degree of dirt corresponding to the non-second target preset cleaning area; wherein the non-second target preset cleaning area is a preset cleaning area other than the second target preset cleaning area.
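The i-th second cleaning image can be sketched as follows, assuming the degrees of dirt obtained after each cleaning are recorded per area; the function name and data layout are illustrative, not part of the disclosure.

```python
# Hypothetical sketch of the i-th "second cleaning image": target areas are
# those cleaned at least i times; other areas fall back to their most
# recently obtained degree of dirt, or carry no fill icon at all.

def second_cleaning_image(dirt_history, i):
    # dirt_history maps each area to the list of degrees of dirt obtained
    # after its 1st, 2nd, ... cleanings.
    image = {}
    for area, history in dirt_history.items():
        if len(history) >= i:
            image[area] = history[i - 1]   # second target area: i-th degree
        elif history:
            image[area] = history[-1]      # fall back to last known degree
        else:
            image[area] = None             # never cleaned: no fill icon
    return image
```

Calling this for i = 1, 2, 3, ... yields the staged sequence of second cleaning images described above.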
  • after the cleaning equipment completes the cleaning task, if the cleaning equipment has cleaned the preset cleaning area A1 three times, the preset cleaning area A2 three times, the preset cleaning area A3 once, and the preset cleaning area A4 twice, then the target dirtiness levels corresponding to all preset cleaning areas that have been cleaned once can be determined, such as the degrees of dirtiness obtained after each of the preset cleaning areas A1-A4 is cleaned once, and the first second cleaning image generated from them, such as the second cleaning image a, as shown in Figure 38(a) or Figure 39(a). The second cleaning image a includes all image areas a1-a4 corresponding to the preset areas A1-A4, each marked with a target filling icon determined by the corresponding dirtiness level of the preset cleaning areas A1-A4, to indicate the dirtiness levels before the preset cleaning areas A1-A4 were each cleaned once.
  • similarly, the second second cleaning image, such as the second cleaning image b shown in Figure 38(b) or Figure 39(b), can be generated from the target dirtiness levels corresponding to all preset cleaning areas that have been cleaned twice, such as the degrees of dirtiness obtained after each of the preset cleaning areas A1, A2, and A4 is cleaned a second time. The second cleaning image b includes all image areas a1-a4 corresponding to the preset areas A1-A4, and the target filling icons marked in the image areas a1, a2, and a4 are determined according to the corresponding target degrees of contamination, to indicate the degrees of contamination of the preset cleaning areas A1, A2, and A4 after each has been cleaned twice. Because the preset cleaning area A3 has not been cleaned a second time, it does not belong to the second target preset cleaning areas determined when generating the second second cleaning image b; the target filling icon identified in the image area a3 is determined from the degree of dirt obtained after one cleaning of the preset cleaning area A3, to show that the degree of dirt of the preset cleaning area A3 has not changed and thereby indicate that it has not been cleaned a second time, as shown in Figure 38(b); or no target filling icon may be marked, or the preset target filling icon may be marked, to indicate that the preset cleaning area A3 has not been cleaned a second time, as shown in Figure 39(b), which will not be repeated here.
  • a second cleaning image c is generated based on the degrees of dirt obtained by the cleaning equipment after completing the third cleaning of the preset cleaning area A1 and the preset cleaning area A2, as shown in Figure 38(c) or Figure 39(c), to indicate the degrees of dirtiness of the preset cleaning areas A1 and A2 after each has been cleaned three times.
  • the workload of the cleaning equipment can be reflected based on one second cleaning image, such as the amount of dirt cleaned by the cleaning equipment from each preset cleaning area, and the working process of the cleaning equipment can be reflected based on at least two second cleaning images, such as Changes in the corresponding dirtiness of each preset cleaning area after multiple cleanings.
  • a second cleaning image is generated after the cleaning task is completed, that is, only one second cleaning image is displayed after the cleaning task is completed, such as by clicking a label on the screen; the labels are, for example, "1st time", "2nd time", and "3rd time" in Figure 40, and the corresponding second cleaning image is selectively displayed, so that any one of the second cleaning images a-c can be selectively displayed, as shown in Figure 40.
  • at least two second cleaning images are generated sequentially or simultaneously, that is, after completing the cleaning task, at least two second cleaning images are displayed sequentially or simultaneously; for example, the user can click an icon on the screen, such as "Clean Image" in Figure 41, so that multiple second cleaning images are displayed in sequence, as shown in Figure 41; or at least two second cleaning images can be displayed on the screen at the same time, with the second cleaning images a-c displayed sequentially or simultaneously.
  • the i-th second cleaning image is generated based on at least one target degree of contamination.
  • for example, the cleaning equipment cleans each of the preset cleaning areas A1, A2, A3, and A4 once according to a predetermined cleaning sequence, then cleans a second time those preset cleaning areas that need a second cleaning, and so on, until the degree of dirtiness corresponding to each of the preset cleaning areas A1, A2, A3, and A4 is less than the dirtiness threshold. The first second cleaning image, such as the second cleaning image a, is generated after the cleaning equipment has cleaned each of the preset cleaning areas A1, A2, A3, and A4 once; the second second cleaning image, such as the second cleaning image b, is generated after the preset cleaning areas that need a second cleaning, for example A1, A2, and A4, have been cleaned a second time; and so on, multiple second cleaning images are generated in sequence to display, in stages, the degree of dirtiness corresponding to each preset cleaning area and the changes in the degree of dirtiness of each preset cleaning area.
  • At least one second cleaning image is dynamically displayed as the animation or short video plays, or the changing process of the preset cleaning area corresponding to multiple second cleaning images is displayed through the animation or short video.
  • for example, the second cleaning image a, the second cleaning image b, and the second cleaning image c can be displayed sequentially through an animation or short video to show the changes in the degrees of dirtiness corresponding to the preset cleaning areas A1, A2, A3, and A4 after at least one cleaning, which helps users understand the cleaning process of the cleaning equipment and the process by which the preset cleaning areas gradually become cleaner.
  • the animation or short video can be generated in real time while the cleaning equipment cleans each preset cleaning area, or can be generated after the cleaning equipment completes cleaning of each preset cleaning area to reproduce the cleaning process of the cleaning equipment, which is not limited here.
  • generating a cleaning image based on the degree of dirt corresponding to the preset cleaning area includes: generating a third cleaning image based on the accumulated amount of the degrees of dirt obtained for the preset cleaning area. For example, each time a cleaning of a preset cleaning area is completed, a third cleaning image is generated; each third cleaning image contains the image areas corresponding to all preset cleaning areas, and the target filling icon filling the image area corresponding to each preset cleaning area is determined by the accumulated value of the degrees of dirtiness corresponding to that preset cleaning area.
  • when the cleaning equipment performs a cleaning task, the cleaning equipment cleans each of the preset cleaning areas A1, A2, A3, and A4 in the cleaning sequence A1-A2-A3-A4; of course, the cleaning order of the preset cleaning areas is not limited to this. The A1-A2-A3-A4 order is explained below.
  • a third cleaning image a can be generated based on the degree of dirt corresponding to the preset cleaning area A1 obtained after the cleaning equipment cleans the preset cleaning area A1 for the first time, as shown in Figure 43(a); the third cleaning image a includes all image areas a1-a4 corresponding to the preset cleaning areas A1-A4, the image area a1 corresponding to the preset cleaning area A1 is filled with the value 500, and the value filled in the image area a1 is determined by the degree of dirt corresponding to the first cleaning of the preset cleaning area A1. If, after this cleaning of the preset cleaning area A1, the cleaning equipment needs to clean the preset cleaning area A1 a second time, a third cleaning image b can be generated based on the accumulated value of the two degrees of dirt corresponding to the preset cleaning area A1 over the two cleanings, as shown in Figure 43(b).
  • the third cleaning image b contains the corresponding values of all preset cleaning areas A1-A4.
  • the value 800 filled in the image area a1-a4 of the image area a1 corresponding to the preset cleaning area A1 is determined by the degree of dirtiness corresponding to the preset cleaning area A1 obtained for the second time and the preset cleaning area A1 obtained for the first time.
  • the accumulated value of the degree of dirt is determined. It can be understood that after the preset cleaning area A1 completes the third cleaning, the third cleaning image c is generated according to the accumulated values of the three degrees of dirt corresponding to the preset cleaning area A1, as shown in Figure 43(c) ), the preset image area a1 corresponding to the cleaning area A1 is filled with a value of 900.
• the third cleaning image d includes the image areas a1-a4 corresponding to all preset cleaning areas A1-A4; the value 900 filled in the image area a1 corresponding to the preset cleaning area A1 is determined by the accumulated value of its three degrees of dirt, and the value 500 filled in the image area a2 corresponding to the preset cleaning area A2 is determined by its single degree of dirt.
• the image area corresponding to a preset cleaning area that has not yet been cleaned either carries no target filling icon or carries a preset target filling icon. For example, the image areas a2-a4 in the third cleaning images a-c may carry no target filling icon, or may carry a preset target filling icon; this is not limited here.
• multiple third cleaning images are generated, so that the process of the cleaning equipment cleaning a preset cleaning area can be reflected by at least two third cleaning images, such as the number of times the cleaning equipment has cleaned the preset cleaning area and the change in the amount of dirt cleaned during that process. The degree of dirt of each preset cleaning area before being cleaned, and the accumulated cleaning amount of the cleaning equipment for each preset cleaning area, can also be displayed based on the last third cleaning image.
• the filling value 900 of the image area a1 corresponding to the preset cleaning area A1 is determined by the accumulated degree of dirt after three cleanings of A1; the filling value 900 of the image area a2 corresponding to the preset cleaning area A2 is determined by the accumulated degree of dirt after three cleanings of A2; the filling value 100 of the image area a3 corresponding to the preset cleaning area A3 is determined by the accumulated degree of dirt after one cleaning of A3; and the filling value 400 of the image area a4 corresponding to the preset cleaning area A4 is determined by the accumulated degree of dirt after two cleanings of A4.
• after each preset cleaning area has been cleaned a different number of times, the difference in the cumulative amount of dirt cleaned by the cleaning equipment in each area helps the user perceive the different levels of dirt in each preset cleaning area and improves the user's perception of the cleaning ability of the cleaning equipment.
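As a minimal illustrative sketch of the accumulation described above (the dict-based accumulator, area names, per-pass values, and the snapshot "image" are assumptions for demonstration, not the patent's actual data structures), each cleaning pass adds its degree of dirt to the area's running total and a third cleaning image is emitted covering every preset cleaning area:

```python
# Track the cumulative degree of dirt per preset cleaning area and emit a
# snapshot ("third cleaning image") after every completed cleaning pass.
def record_cleaning(accumulated, area, dirtiness):
    """Add one pass's dirtiness to an area's running total and
    return a snapshot covering every preset cleaning area."""
    accumulated[area] = accumulated.get(area, 0) + dirtiness
    # Areas not yet cleaned carry no fill value (None) in the snapshot.
    return {a: accumulated.get(a) for a in ("A1", "A2", "A3", "A4")}

accumulated = {}
image_a = record_cleaning(accumulated, "A1", 500)   # first pass over A1
image_b = record_cleaning(accumulated, "A1", 300)   # second pass: 500 + 300
image_c = record_cleaning(accumulated, "A1", 100)   # third pass: 800 + 100

print(image_a["A1"], image_b["A1"], image_c["A1"])  # 500 800 900
```

The per-pass values 500, 300 and 100 are chosen so the running totals match the 500/800/900 fills of the Figure 43 example.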
• the target filling icon may be a numerical fill or a color fill.
• a third cleaning image a can be generated based on the degree of dirt corresponding to the preset cleaning area A1 obtained after the cleaning equipment cleans A1 for the first time. The third cleaning image a includes the image areas a1-a4 corresponding to all preset cleaning areas A1-A4, and the image area a1 corresponding to A1 is filled with a color determined by the degree of dirt of the first cleaning of A1. Filling the icon with color can likewise display the degree of dirt of each preset cleaning area before being cleaned and the accumulated cleaning amount of the cleaning equipment for that area.
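Where the target filling icon is a color fill, one simple scheme (an assumption, since the patent does not fix a color mapping) interpolates linearly between a "clean" and a "dirty" RGB color; the endpoint colors and the maximum scale value are illustrative:

```python
# Map an accumulated degree of dirt onto a fill color by linear
# interpolation between assumed "clean" and "dirty" RGB endpoints.
def dirtiness_to_color(value, max_value=1000,
                       clean=(230, 240, 255), dirty=(120, 60, 20)):
    t = min(max(value / max_value, 0.0), 1.0)  # clamp to [0, 1]
    return tuple(round(c + (d - c) * t) for c, d in zip(clean, dirty))

print(dirtiness_to_color(0))     # the "clean" endpoint color
print(dirtiness_to_color(1000))  # the "dirty" endpoint color
```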
• the process of gradually changing the target filling icon of the image area corresponding to the preset cleaning area in the third cleaning image can be displayed dynamically, to reflect the amount of dirt cleaned from the preset cleaning area, and also to reflect the accumulation of the dirt elution value as the mop piece cleans the preset cleaning area.
• multiple third cleaning images can be displayed sequentially, in the order in which they were generated, through an animation or short video, so that the cleaning process of the cleaning equipment can be presented, for example the accumulation of dirt as the cleaning equipment cleans a preset cleaning area. This helps the user understand the cleaning process of the cleaning equipment and, after each preset cleaning area has been cleaned at least once, the accumulated cleaning amount of each area according to its degree of dirt. The animation or short video may be generated in real time while the cleaning equipment cleans each preset cleaning area, or may be generated as a reproduction of the cleaning process after the cleaning equipment has finished cleaning each preset cleaning area; this is not limited here.
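The sequential playback described above can be sketched as a loop over the stored frames in generation order; the `render` callback stands in for an assumed display routine, and the frame contents reuse the 500/800/900 values from the earlier example:

```python
# Replay generated cleaning images as animation frames, in generation order.
def play(frames, render):
    for index, frame in enumerate(frames):
        render(index, frame)

shown = []
play([{"A1": 500}, {"A1": 800}, {"A1": 900}],
     lambda i, f: shown.append((i, f["A1"])))
print(shown)  # [(0, 500), (1, 800), (2, 900)]
```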
  • the cleaning image includes a room area, and the room area corresponds to one or at least two preset cleaning areas.
• generating a cleaning image according to the degree of dirt corresponding to the preset cleaning area includes: determining the target filling icon of the room area according to the degree of dirt corresponding to the one or at least two preset cleaning areas in the room area.
  • the cleaning equipment cleans at least one room according to the task map.
• Room R includes one or more preset cleaning areas B. After cleaning room R, the cleaning equipment can generate a cleaning image that includes the room area r corresponding to room R. It can be understood that the room area r corresponds to the preset cleaning area B, and the target filling icon of the room area r is determined according to the degree of dirt corresponding to the preset cleaning area B.
• Room R1 includes the preset cleaning areas B1, B2 and B3. The cleaning equipment can generate a cleaning image after cleaning room R1; the cleaning image includes the room area r1 corresponding to room R1, and r1 corresponds to the preset cleaning areas B1, B2 and B3. The target filling icon of the room area r1 is determined based on the degrees of dirt corresponding to the preset cleaning areas B1, B2 and B3.
• determining the target filling icon of the room area based on the degree of dirt corresponding to the one or at least two preset cleaning areas corresponding to the room area includes: determining the target filling icon of the room area based on any one of the average degree of dirt, the total degree of dirt, the maximum degree of dirt, or the degree of dirt of any single preset cleaning area among the at least two preset cleaning areas corresponding to the room area. That is, the average, total or maximum of the degrees of dirt corresponding to the preset cleaning areas in the room can be calculated, or the degree of dirt of any one preset cleaning area can be taken, to determine the target filling icon of the room area. For example, please refer to Figure 47.
• when the cleaning equipment performs a cleaning task and cleans the preset cleaning areas B1, B2 and B3 included in room R1 once each, for example: the degrees of dirt corresponding to B1, B2 and B3 can be accumulated and then divided by the number of preset cleaning areas cleaned in this cleaning, so as to determine the target filling icon of the room area r1 corresponding to room R1 according to the average degree of dirt of the preset cleaning areas in room R1; or the degrees of dirt corresponding to B1, B2 and B3 can be accumulated, so as to determine the target filling icon of r1 based on the total degree of dirt of all preset cleaning areas in room R1; or the maximum of the degrees of dirt can determine the target filling icon of r1; or one of the degrees of dirt corresponding to B1, B2 and B3 can be selected at random, so as to determine the target filling icon of r1 based on the degree of dirt corresponding to any preset cleaning area in room R1.
• the target filling icon of the room area is determined by any one of the average degree of dirt, the total degree of dirt, the maximum degree of dirt, or the degree of dirt of any single preset cleaning area. For example, if the degrees of dirt of the preset cleaning areas B1, B2 and B3 after the first cleaning are 500, 600 and 100 respectively, the average of the three values is 400, the total is 1200, and the maximum is 600; the target filling icon of the room area r1 corresponding to room R1 can then be determined based on any one of the five values, namely the average 400, the total 1200, or the individual degrees of dirt 500, 600 and 100, to generate the first cleaning image containing the room area r1. If the degrees of dirt of B1 and B2 after the second cleaning are 100 and 200 respectively, the average of the two values is 150, the total is 300, and the maximum is 200; the target filling icon of r1 can then be determined based on any one of the values 150, 300, 200 or 100, to generate the second cleaning image containing the room area r1.
• multiple room cleaning images, corresponding in number to the degrees of dirt obtained for the room area, may be generated based on the multiple degrees of dirt corresponding to the room area. It can be understood that the determination of any one of the average degree of dirt, the total degree of dirt, the maximum degree of dirt, or the degree of dirt of any preset cleaning area can refer to the foregoing, and will not be described again here.
• the calculation can also be based on the sums of the degrees of dirt of the preset cleaning areas corresponding to the room area: any one of the average of those sums, their total, their maximum, or the sum for any single preset cleaning area is used to determine the target filling icon of the room area and generate a cleaning image containing the room area. For example, suppose the cleaning equipment cleans the preset cleaning area B1 in room R1 twice, cleans B2 twice, and cleans B3 once. If the degrees of dirt obtained for B1 from its two cleanings are 500 and 100, the sum for B1 is 600; if the degrees of dirt obtained for B2 from its two cleanings are 600 and 200, the sum for B2 is 800; and the degree of dirt obtained for B3 from its one cleaning is 100. The sums corresponding to B1, B2 and B3 are therefore 600, 800 and 100: the average of these three values is 500, the total is 1500, and the maximum is 800. The target filling icon of the room area r1 corresponding to room R1 can then be determined based on any one of the five values, namely the average 500, the total 1500, or the per-area sums 600, 800 and 100, to generate a cleaning image containing the room area r1.
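The room-level aggregation options above can be sketched as one function; the `mode` keyword, dict inputs and function name are illustrative assumptions, and the inputs reuse the summed per-area values 600, 800 and 100 from the example:

```python
# Derive a room's fill value from the (summed) degree of dirt of each
# preset cleaning area it contains: average, total, maximum, or the
# value of any single named area.
def room_fill_value(area_dirtiness, mode="average"):
    values = list(area_dirtiness.values())
    if mode == "average":
        return sum(values) / len(values)
    if mode == "total":
        return sum(values)
    if mode == "maximum":
        return max(values)
    if mode in area_dirtiness:           # the dirtiness of any single area
        return area_dirtiness[mode]
    raise ValueError(f"unknown aggregation mode: {mode}")

# Summed degrees of dirt per area in room R1 from the example above.
sums = {"B1": 600, "B2": 800, "B3": 100}
print(room_fill_value(sums, "average"))  # 500.0
print(room_fill_value(sums, "total"))    # 1500
print(room_fill_value(sums, "maximum"))  # 800
```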
• the processing method also includes: obtaining the node positions of the sequence in which the cleaning equipment performs the cleaning task, the node positions including at least one of a starting position, an interruption position and an end position; and determining that the area covered by the cleaning trajectory connecting two node positions that are adjacent in the cleaning sequence is one of the preset cleaning areas.
• when the cleaning equipment performs a cleaning task on the room according to the task map, the cleaning equipment interrupts the cleaning task for maintenance according to a workload threshold, where the workload threshold includes a cleaning area threshold, a power consumption threshold, a water consumption threshold, or a mopping threshold.
• the target filling icon of the image area corresponding to the area covered by the cleaning trajectory in the cleaning image is determined by the degree of dirt corresponding to the area covered by that cleaning trajectory.
• for example, the cleaning equipment cleans room R1, starting from the starting position O1 in room R1 and interrupting at the interruption position O2. The degree of dirt corresponding to the area covered by the cleaning trajectory S1 connecting the starting position O1 and the interruption position O2 is obtained, and the first cleaning image is generated: the image area s1 in the cleaning image corresponds to the area covered by the cleaning trajectory S1, and the target filling icon of the image area s1 is determined according to the degree of dirt corresponding to the area covered by S1.
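The segmentation of a cleaning run at its node positions can be sketched by pairing nodes that are adjacent in the cleaning sequence; each pair delimits one cleaning trajectory (S1, S2, ...). The string-based node labels are placeholders for whatever position representation the system uses:

```python
# Cut an ordered list of node positions (start, interruptions, end)
# into adjacent pairs; each pair delimits one cleaning trajectory.
def trajectory_segments(node_positions):
    return [(node_positions[i], node_positions[i + 1])
            for i in range(len(node_positions) - 1)]

nodes = ["O1(start)", "O2(interrupt)", "O3(interrupt)", "O4(end)"]
print(trajectory_segments(nodes))  # three segments: S1, S2, S3
```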
• similarly, the second and third cleaning images can be generated based on the degrees of dirt of the areas covered by the cleaning trajectories S2 and S3. If the cleaning equipment cleans the area covered by a cleaning trajectory a second time, the target filling icon of the image area corresponding to that area can be determined based on the degree of dirt obtained for the area after the second cleaning, to generate a cleaning image.
• the working process of the cleaning equipment can thus be reflected by at least one cleaning image generated according to the degrees of dirt corresponding to the areas covered by the cleaning trajectories, for example by highlighting the cleaning trajectory of the cleaning equipment, or by showing at least one of the degrees of dirt corresponding to the areas covered by different cleaning trajectories and the changes in those degrees of dirt after multiple cleanings.
• alternatively, any one of the average, the total or the maximum of the degrees of dirt corresponding to the areas covered by the cleaning trajectories after the i-th cleaning, or the degree of dirt of any single such area, can be used to determine the target filling icons of all image areas corresponding to the areas covered by the cleaning trajectories after the i-th cleaning, to generate the i-th cleaning image.
• for example, the cleaning equipment cleans room R1 and room R2. After completing the cleaning of room R1, three cleaning trajectories are formed, namely S1-S3: the areas covered by S1 and S2 were each cleaned twice, the area covered by S3 was cleaned once, and the corresponding degree of dirt was obtained after each cleaning of the areas covered by S1-S3. If the degrees of dirt obtained after the areas covered by S1-S3 were cleaned for the first time are 500, 600 and 100 respectively, the average of the three values is 400 and the total is 1200.
• the target filling icons of the image areas s1-s3 corresponding to the areas covered by the cleaning trajectories S1-S3 can then be determined based on any one of the five values, namely the average 400, the total 1200, or the individual degrees of dirt 500, 600 and 100. If the degrees of dirt obtained after the areas covered by S1 and S2 were cleaned for the second time are 100 and 200 respectively, the average of the two values is 150 and the total is 300; the target filling icons of the image areas s1 and s2 corresponding to the areas covered by S1 and S2 can then be determined based on any one of the four values 150, 300, 100 and 200.
• the changes in the overall degree of dirt shown on the cleaning image for preset cleaning areas that have been cleaned different numbers of times highlight the working process of the cleaning equipment.
• the cleaning area covered by a cleaning trajectory can be appropriately expanded according to preset rules, to make the area covered by the trajectory more obvious and convenient for the user to observe, thereby improving the user experience.
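The patent does not fix the "preset rules" for this expansion; one assumed rule, sketched below, is a one-cell dilation of the trajectory's covered cells on an occupancy grid in the four cardinal directions:

```python
# Expand the grid cells covered by a cleaning trajectory by one cell in
# each cardinal direction (an assumed preset rule), clipped to the grid.
def expand_coverage(cells, width, height):
    expanded = set(cells)
    for x, y in cells:
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < width and 0 <= ny < height:
                expanded.add((nx, ny))
    return expanded

track = {(1, 1), (2, 1)}  # cells covered by a cleaning trajectory
print(sorted(expand_coverage(track, 5, 5)))
```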
• the cleaning image may also be called a dirtiness heat map. Optionally, the processing method also includes: generating an animation or short video based on the generated cleaning images.
• an animation or short video can be generated based on the plurality of cleaning images generated above, and the cleaning images can be played frame by frame.
  • FIG. 50 is a cleaning image related to an embodiment of the present application.
  • a cleaning image is displayed according to the user's selection operation.
• the user can determine the cleaning image to be output by selecting a different cleaning count, and in response to the cleaning count selected by the user, the cleaning image corresponding to that count is output and displayed. The user can be prompted in various ways to select among the previously generated cleaning images, so that the user can understand the cleaning effect of the cleaning equipment on the floor at different cleaning stages.
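The selection-driven display can be sketched as a lookup from the user's chosen cleaning count to the stored image; the dict-based store and names are illustrative assumptions:

```python
# Index generated cleaning images by cleaning count so a user's
# selection of "the N-th cleaning" returns the image to display.
images_by_count = {1: "image-after-1st-clean",
                   2: "image-after-2nd-clean",
                   3: "image-after-3rd-clean"}

def image_for_selection(count, store):
    try:
        return store[count]
    except KeyError:
        raise ValueError(f"no cleaning image recorded for cleaning #{count}")

print(image_for_selection(2, images_by_count))  # image-after-2nd-clean
```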
• the output cleaning image also includes cleaning information corresponding to the cleaning task performed by the cleaning equipment, such as the cleaned area and the cleaning duration, so that the user can understand the working process of the cleaning equipment, thereby improving the user's experience of using the cleaning equipment.
• the method for processing cleaning images of cleaning equipment includes: after the cleaning equipment cleans a preset cleaning area once through a cleaning piece, obtaining the degree of dirt corresponding to the preset cleaning area; and generating a cleaning image based on the degrees of dirt corresponding to one or at least two preset cleaning areas, to visualize the cleaning workload of the cleaning equipment, thereby improving the user's experience of using the cleaning equipment.
  • FIG. 51 is a schematic block diagram of a processing device 300 for cleaning images of a cleaning device provided by an embodiment of the present application.
  • the processing device 300 includes a processor 301 and a memory 302.
• the processor 301 and the memory 302 are connected through a bus 303, such as an I2C (Inter-Integrated Circuit) bus.
• the processor 301 may be a microcontroller unit (MCU), a central processing unit (CPU) or a digital signal processor (DSP), etc.
• the memory 302 may be a Flash chip, a read-only memory (ROM), a magnetic disk, an optical disk, a USB flash drive or a removable hard disk, etc.
  • the processor 301 is used to run a computer program stored in the memory 302, and when executing the computer program, implement the steps of the aforementioned method for processing a cleaning image with a cleaning device.
  • the processor 301 is used to run a computer program stored in the memory 302, and implement the following steps when executing the computer program:
  • a cleaning image is generated based on the degree of dirtiness corresponding to one or at least two preset cleaning areas.
  • Figure 26 is a schematic diagram of a cleaning equipment system provided by an embodiment of the present application.
  • the cleaning equipment system includes:
• the cleaning equipment 100 includes a movement mechanism and cleaning parts; the movement mechanism is used to drive the cleaning equipment 100 to move, so that the cleaning parts clean the preset cleaning area;
• Base station 200: the base station 200 is at least used to clean the cleaning parts of the cleaning equipment 100;
  • Figure 27 is a schematic diagram of a cleaning equipment system provided by an embodiment of the present application.
  • the cleaning equipment system includes:
• the cleaning equipment 100 includes a movement mechanism, cleaning parts and a maintenance mechanism; the movement mechanism is used to drive the cleaning equipment 100 to move so that the cleaning parts clean the preset cleaning area, and the maintenance mechanism is used to clean the cleaning parts; and,
  • the cleaning device 100 includes at least one of a cleaning robot, a handheld cleaning device, and other cleaning devices.
  • the cleaning device 100 can clean the cleaning parts by itself, for example, the cleaning device 100 includes a maintenance mechanism.
• when the cleaning equipment 100 cannot clean the cleaning parts by itself, the cleaning equipment system also includes a base station 200, where the base station 200 is at least used to clean the cleaning parts of the cleaning equipment.
• the cleaning device 100 is provided with a device controller, and the base station 200 is provided with a base station controller. The device controller of the cleaning device 100 and/or the base station controller of the base station 200 can be used, individually or in combination, as the processing device 300 to implement the steps of the method in the embodiments of the present application; in other embodiments, the cleaning system includes a separate processing device 300 used to implement the steps of the method in the embodiments of the present application.
• the processing device 300 can be provided on the cleaning device 100, or may be installed on the base station 200; of course, it is not limited thereto.
  • the processing device 300 may be a device other than the cleaning device 100 and the base station 200, such as a home smart terminal, a master control device, etc.
  • Embodiments of the present application also provide a computer-readable storage medium.
  • the computer-readable storage medium stores a computer program.
  • the processor can implement the steps of the above method.
• the computer-readable storage medium may be an internal storage unit of the processing device described in any of the preceding embodiments, such as a hard disk or memory of the processing device. The computer-readable storage medium may also be an external storage device of the processing device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card or a flash card equipped on the processing device.
  • the processing device 300 can be used to implement the steps of the cleaning image processing method of the cleaning device in the embodiment of the present application.
  • Cleaning equipment can be used to automatically clean the floor.
  • the application scenarios can be household indoor cleaning, large-scale place cleaning, etc.
• in the related art, cleaning equipment cannot display its cleaning process, making it impossible for users to understand the cleaning process of the cleaning equipment, thereby affecting the user's experience of using the cleaning equipment.
  • Figure 52 is a schematic flowchart of a method for generating a visual interface provided by an embodiment of the present application.
  • the method for generating a visual interface can be applied in a cleaning equipment system to display the cleaning process of the cleaning equipment in the system, so as to realize the visualization of the cleaning process of the cleaning equipment.
  • the cleaning process of the cleaning equipment may be that the cleaning equipment cleans the area to be cleaned.
  • the area to be cleaned may be any area to be cleaned such as a family space, a room unit of a family space, a partial area of a room unit, a large place, or a part of a large place.
• the area to be cleaned can refer to the larger area that is cleaned for the first time, such as the entire room unit; it can also refer to an area that needs to be filled in after the first cleaning of the larger area, such as wall areas or obstacle areas in the room unit.
• the user may not be aware of the switching of the cleaning device's execution items during actual use and thus cannot understand the cleaning process of the cleaning device, which affects the user's experience. It is therefore desirable to visualize the cleaning process of the cleaning equipment: to present the cleaning information included in that process so that users can understand it, and to record the historical cleaning process of the cleaning equipment, thereby improving the user's satisfaction with the cleaning equipment. Furthermore, the user can adjust the cleaning tasks of the cleaning equipment based on its historical cleaning process, to better fit the user's living habits and further improve the experience. To this end, a method for generating a visual interface is provided.
  • the method for generating a visual interface includes steps S110 to S120.
  • Step S110 Obtain all or part of the completed execution items of the cleaning equipment.
  • the cleaning device is, for example, at least one of a cleaning robot, a handheld cleaning device, and other devices for cleaning the area to be cleaned.
  • the cleaning equipment can record the execution items executed. Therefore, all or part of the completed execution items of the cleaning equipment can be obtained based on the record of the execution items executed by the cleaning equipment.
• Step S120: Generate a visual interface based on all or part of the acquired execution items of the cleaning equipment.
  • the visual interface is used to indicate all or part of the completed execution items of the cleaning equipment through animation.
• the execution items of the cleaning equipment when performing the cleaning task may include at least one of moving, performing a specific execution action, and performing a specific execution action while moving.
• for example, a visual interface can be generated based on all of the execution items currently completed by the cleaning equipment, or based on part of the execution items currently completed; likewise, a visual interface can be generated based on all of the execution items completed by the cleaning equipment, or based on part of them.
  • the execution items used to generate the visual interface can be the execution items set by the manufacturer by default, the execution items selected by the user, or the execution items randomly selected by the cleaning robot. There are no restrictions here.
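The generation of the visual interface from recorded execution items can be sketched as a dispatch over the three item kinds named above (moving, a specific execution action, or an action performed while moving); the record format and directive names are assumptions for illustration:

```python
# Turn a recorded list of execution items into animation directives
# for the visual interface.
def animate(execution_items):
    directives = []
    for item in execution_items:
        if item["kind"] == "move":
            directives.append(("draw_trajectory", item["path"]))
        elif item["kind"] == "action":
            directives.append(("show_action", item["name"]))
        elif item["kind"] == "action_while_moving":
            directives.append(("show_action_along_path",
                               item["name"], item["path"]))
    return directives

log = [{"kind": "move", "path": ["p1", "p2"]},
       {"kind": "action_while_moving", "name": "mop", "path": ["p2", "p3"]}]
print(animate(log))
```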
• generating a visual interface based on all or part of the acquired execution items of the cleaning equipment, where the visual interface indicates the completed execution items of the cleaning equipment through animation, includes: when the execution item is moving, obtaining the preset map and the movement path of the cleaning equipment while moving, and generating a visual interface based on the preset map and the movement path; the visual interface is used to indicate, through animation, the movement trajectory of the cleaning equipment on the preset map.
• the preset map can be established by the cleaning equipment exploring the current space in response to a mapping instruction, or can be updated by the cleaning equipment based on obstacles, carpets, etc. identified during the cleaning process. Optionally, the preset map may include a user-specified area to be cleaned: for example, in response to the area to be cleaned selected by the user, such as one or more rooms, those rooms are determined as the preset map; or, in response to the areas to be cleaned circled by the user on a default map, such as partial areas of one or more rooms, those partial areas are determined as the preset map. Of course, it is not limited to this.
• a visual interface is generated based on the preset map and the movement path, wherein the visual interface generates an animation of the movement trajectory on the preset map based on the movement path of the cleaning equipment. For example, as shown in Figure 53(a), a dotted line corresponding to the movement trajectory is dynamically generated on the preset map.
• the visual interface may also omit the dotted line corresponding to the movement trajectory: the cleaning equipment can still be seen moving, but no trajectory line appears on the interface, and the user observes the movement trajectory through the movement of the cleaning equipment itself. Indicating the movement trajectory of the cleaning equipment on the preset map through animation visualizes the movement of the cleaning equipment during the cleaning process, which helps improve the user's experience of using the cleaning equipment.
  • A visual interface is generated based on the acquired execution items of all or part of the cleaning equipment, and the visual interface is used to indicate, through animation, the execution items completed by the cleaning equipment in whole or in part. This includes: when the execution item is performing a specific execution action, obtaining the specific execution action of the cleaning equipment and generating the visual interface based on that action, where the visual interface indicates the specific execution action of the cleaning equipment through animation.
  • When the cleaning equipment determines that the execution item is a specific execution action, the animation shows the cleaning equipment performing that action. When the user watches the animation on the visual interface, the user can judge the execution item of the cleaning equipment from the action shown, which helps visualize the specific execution actions of the cleaning equipment and improves the user's experience of using it.
  • After generating the visual interface according to the specific execution action, the method further includes: obtaining the degree of dirtiness corresponding to the area covered by the cleaning equipment performing the specific execution action; and determining, according to that degree of dirtiness, the display effect on the visual interface of the area covered by the specific execution action.
  • the specific execution actions include sweeping, mopping, sweeping and mopping at the same time, or other specific execution actions.
  • After the cleaning equipment performs the specific execution action, it can determine the degree of dirtiness corresponding to the covered area, and the display effect of that area can then be dynamically generated on the visual interface according to this degree of dirtiness.
  • Exemplarily, the display effect of the area covered by the specific execution action of the cleaning equipment is dynamically displayed on the visual interface, as shown in the gray area in Figure 54.
  • The display effect is, for example, the depth of a color, where the dirtier the preset cleaning area, the darker the color.
  • For example, the display effect of the area covered by the specific execution action of the cleaning equipment gradually changes from a lighter or transparent color to the darker color corresponding to the degree of dirtiness of the preset cleaning area. When watching the animation on the visual interface, users can thus understand the degree of dirtiness of the preset cleaning area through the display effect of the covered area, which helps visualize the degree of dirtiness determined by the cleaning equipment during cleaning and improves the user's experience.
  • The display effect of the area covered by the specific execution actions of the cleaning equipment is not limited to this. The display effect may also include the density of lines, the depth of shadows, the density of patterns, the size of numerical values, or other display effects, which are not limited here.
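The "darker color for a dirtier area" effect can be sketched as a simple mapping. This is an illustrative assumption: the 0..1 dirtiness scale, the gray ramp, and the function names are not from the patent.

```python
# Illustrative sketch: map a dirtiness level to a gray shade and fill the
# covered map cells with it, mirroring the display effect described above.

def dirtiness_to_gray(dirtiness: float) -> str:
    """Map a dirtiness level in [0, 1] to a CSS-style gray hex color:
    0.0 -> near-white (clean), 1.0 -> dark gray (very dirty)."""
    level = min(max(dirtiness, 0.0), 1.0)
    # interpolate the channel value from 230 (light) down to 60 (dark)
    channel = round(230 - level * (230 - 60))
    return f"#{channel:02x}{channel:02x}{channel:02x}"

def render_covered_area(cells, dirtiness: float):
    """Assign the dirtiness color to every covered grid cell."""
    color = dirtiness_to_gray(dirtiness)
    return {cell: color for cell in cells}

print(dirtiness_to_gray(0.0))  # -> #e6e6e6
print(dirtiness_to_gray(1.0))  # -> #3c3c3c
```

The same scalar could equally drive line density, shadow depth, or a numeric label, matching the alternative display effects listed above.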
  • A visual interface is generated based on the acquired execution items of all or part of the cleaning equipment, and the visual interface is used to indicate, through animation, the execution items completed by the cleaning equipment in whole or in part. This includes: when the execution item is performing a specific execution action while moving, obtaining the preset map, the movement path of the cleaning equipment, and the specific execution action, where the cleaning equipment performs the specific execution action while moving along the movement path; and generating the visual interface based on the preset map, the movement path, and the specific execution action. The visual interface is used to indicate, through animation, the movement trajectory of the cleaning equipment on the preset map and the specific execution action performed while moving.
  • For the specific principles and implementation of the preset map, reference may be made to the above, which will not be repeated here.
  • That is, the execution item is performing a specific execution action while moving along the movement path. An animation of the cleaning equipment moving on the preset map according to the movement path is displayed on the visual interface to indicate its movement trajectory. When users watch the animation, they can judge the execution items of the cleaning equipment through its movement trajectory on the preset map and the specific execution actions performed while moving, which helps visualize the movement and the specific execution actions of the cleaning equipment and improves the user's experience.
  • After generating the visual interface based on the preset map, movement path, and specific execution action, the method further includes: obtaining the degree of dirtiness corresponding to the area covered by the cleaning equipment performing the specific execution action while moving; and determining, according to that degree of dirtiness, the display effect on the visual interface of the covered area.
  • Performing a specific execution action while moving may be sweeping while moving, mopping while moving, sweeping and mopping simultaneously while moving, or another specific execution action performed while moving.
  • After the cleaning equipment performs the specific execution action while moving, it can determine the degree of dirtiness corresponding to the covered area, and the display effect of that area can then be dynamically generated and displayed on the visual interface according to this degree of dirtiness.
  • The display effect is, for example, the depth of a color, where the dirtier the preset cleaning area, the darker the color.
  • Exemplarily, the movement trajectory of the cleaning equipment on the preset map is generated on the visual interface, as shown by the dotted line in Figure 53(b). As the cleaning equipment performs the specific execution action and moves, the display effect of the covered area gradually changes from a lighter or transparent color to the darker color corresponding to the degree of dirtiness of the preset cleaning area, as shown in the gray area in Figure 53(b), so that the user can understand the degree of dirtiness while watching the visual interface.
  • The display effect of the area covered by the specific execution action and movement of the cleaning equipment is not limited to this. The display effect may also include the density of lines, the depth of shadows, the density of patterns, the size of numerical values, or other display effects.
  • Alternatively, while the animation is playing, in response to the movement of the cleaning equipment, the visual interface may show the cleaning equipment moving without any trajectory appearing on the interface; the user observes the trajectory through the movement of the cleaning equipment itself. There is no restriction here.
  • Refer to Figure 54 in conjunction with Figure 53(b). When the cleaning equipment finishes an execution item, as shown in Figure 54, for example when the cleaning equipment completes mopping while moving across the living room floor, the area covered by the action and movement is the living room. According to the display effect of the living room, the user can be prompted that the living room needs to be mopped again; for example, the text "Prepare for repeated cleaning" can be displayed to prompt the user, thereby improving the user's experience.
  • The animation includes a character identifier, where the character identifier is used to refer to the cleaning equipment. A visual interface is generated based on the execution items, and the visual interface is used to indicate, through animation, the execution items completed by the cleaning equipment in whole or in part, including: determining the execution action corresponding to the character identifier according to the execution item, where the animation is used to indicate the execution action corresponding to the character identifier.
  • The character identifier includes at least one of a character image, a text identifier, and an actuator identifier.
  • Determining the execution action corresponding to the character identifier according to the execution item includes: when the execution item of the cleaning equipment is moving, the animation displays the movement of at least one of a preset character image, text identifier, and actuator identifier; when the execution item of the cleaning equipment is performing a specific execution action, the animation displays at least one of the execution-action character image, execution-action text identifier, and execution-action actuator identifier corresponding to the specific execution action; and when the execution item of the cleaning equipment is performing a specific execution action while moving, the animation displays at least one of the execution-action character image, execution-action text identifier, and execution-action actuator identifier corresponding to the specific execution action, and displays it moving.
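The three cases above (moving, acting in place, acting while moving) amount to a small dispatch from the reported execution item to an animation descriptor. The sketch below is illustrative only: the enum values and descriptor fields are assumptions, not the patent's scheme.

```python
# Illustrative sketch: choose the animation for the character identifier
# from the robot's reported execution item.

from enum import Enum, auto
from typing import Optional

class ExecutionItem(Enum):
    MOVING = auto()
    SPECIFIC_ACTION = auto()               # e.g. sweeping in place
    SPECIFIC_ACTION_WHILE_MOVING = auto()  # e.g. sweeping along a path

def animation_for(item: ExecutionItem, action: Optional[str] = None) -> dict:
    """Return a small animation descriptor: which identifier to animate,
    whether it moves across the map, and which action clip (if any) plays."""
    if item is ExecutionItem.MOVING:
        # moving only: show the identifier travelling, no action clip
        return {"identifier": "character_image", "moves": True, "clip": None}
    if item is ExecutionItem.SPECIFIC_ACTION:
        # action in place: play the clip at the current position
        return {"identifier": "character_image", "moves": False, "clip": action}
    # action while moving: play the clip and move the identifier
    return {"identifier": "character_image", "moves": True, "clip": action}

print(animation_for(ExecutionItem.SPECIFIC_ACTION_WHILE_MOVING, "sweeping"))
```

In a fuller version, `identifier` could also name a text or actuator identifier, or a combination, matching the "at least one of" wording above.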
  • The character image is, for example, a Q-version whale spirit. The animation shows the Q-version whale spirit moving along a movement path from its current position on the preset map to a target position, where the movement path of the Q-version whale spirit is determined based on the movement path of the cleaning equipment; for example, please refer to Figure 55.
  • When the execution item of the cleaning equipment is moving to the base station, if the base station is included in the preset map displayed on the visual interface, the location of the base station is determined as the target position, and the Q-version whale spirit is dynamically displayed moving there through animation.
  • The text identifier is, for example, the text "Cleaning Robot". The animation shows the text "Cleaning Robot" moving along a movement path from its current position on the preset map to the target position.
  • The actuator identifier is, for example, a motion mechanism such as a wheel. The animation shows the wheel rotating while moving along a movement path from its current position on the preset map to the target position, and the movement trajectory of the wheel is dynamically generated on the preset map as the wheel moves, or no trajectory is generated.
  • The execution-action character image is, for example, a Q-version whale spirit carrying a Q-version cleaning tool. For example, when the execution item is sweeping, the animation shows the Q-version whale spirit carrying a Q-version broom alternately performing sweeping actions toward the left and toward the right at the current position.
  • The execution-action text identifier is, for example, the text "The robot is sweeping the floor". For example, as shown in Figure 56, the current position of the cleaning equipment is represented by a simple diagram such as a black dot, and the text "The robot is sweeping the floor" is displayed in the animation and played cyclically from left to right at the current position.
  • When the execution item of the cleaning equipment is performing a specific execution action, such as sweeping, the execution-action actuator identifier is, for example, a side brush; for example, the animation shows the side brush rotating at its current position.
  • The use of the execution-action character image, execution-action text identifier, and execution-action actuator identifier is not limited to this.
  • For example, when the execution item is sweeping while moving, the animation shows the execution-action character image, such as the Q-version whale spirit carrying the Q-version broom, alternately performing sweeping actions toward the left and toward the right while moving from the current position to the target position, and dynamically generates its movement trajectory on the preset map as it moves, or only moves dynamically on the preset map without generating a trajectory.
  • Alternatively, the animation shows the execution-action text identifier, such as the text "The robot is sweeping the floor" played cyclically from left to right, while moving from the current position to the target position, and dynamically generates the movement trajectory of the text on the preset map as it moves, or only moves without generating a trajectory.
  • Alternatively, the animation shows the execution-action actuator identifier, such as the side brush, rotating while moving from the current position to the target position, and dynamically generates the movement trajectory of the side brush on the preset map as it moves, or only moves dynamically on the preset map.
  • The animation may also display a combination of at least two of the character image, text identifier, and actuator identifier.
  • For example, the animation displays the execution-action text identifier, such as the text "The robot is sweeping the floor" played cyclically from left to right, together with the execution-action actuator identifier, such as the side brush rotating, while both move from the current position to the target position; as they move, their movement trajectory is dynamically generated on the preset map, or they only move without generating a trajectory.
  • As another example, the animation displays the execution-action character image, such as the Q-version whale spirit carrying the Q-version mop, together with the execution-action actuator identifier, such as the mop, moving to the base station, and dynamically generates their movement trajectory as they move, or only moves without generating a trajectory. When the Q-version whale spirit carrying the Q-version mop and the mop are at the position corresponding to the base station, the animation shows the Q-version whale spirit rubbing and cleaning the Q-version mop and rotating the mop.
  • Alternatively, the execution-action character identifier, execution-action text identifier, and execution-action actuator identifier can be displayed at a fixed position on the visual interface. For example, the execution-action character identifier, such as the Q-version whale spirit carrying a Q-version broom, maintains a walking-and-sweeping action at that position; the execution-action text identifier, such as "The robot is sweeping the floor", scrolls and jumps at that position; and the execution-action actuator identifier, for example the side brush, rotates and jumps in place within the range corresponding to that position, to indicate that the execution items of the cleaning equipment include performing a specific execution action and moving.
  • Optionally, over time, the movement trajectory of the cleaning equipment is dynamically generated on the visual interface to indicate its movement. At least one of the execution-action character identifier, execution-action text identifier, and execution-action actuator identifier on the visual interface can thus indicate that the execution item of the cleaning equipment is performing a specific execution action while moving, without affecting the visual interface.
  • The character identifier also includes a body identifier. When the execution item of the cleaning equipment is performing a specific execution action while moving, the animation displays at least one of the execution-action character image, execution-action text identifier, and execution-action actuator identifier corresponding to the specific execution action, and displays the body identifier moving.
  • At least one of the execution-action character image, execution-action text identifier, and execution-action actuator identifier can be combined with the body identifier.
  • Exemplarily, the animation displays the execution-action text identifier of the cleaning equipment, for example, "The weather is really nice; I am going to sweep 100 square meters and work for two hours." Texts such as "sweep 100 square meters" and "work for two hours" can be updated based on the information corresponding to the cleaning task performed by the cleaning equipment.
  • The animation also displays the body identifier; for example, a top view of the cleaning equipment is used as its body identifier, which moves, and the movement trajectory of the body identifier is dynamically generated as it moves.
  • Of course, the body identifier can also be combined with the execution-action character image and the execution-action actuator identifier to animate that the execution item of the cleaning equipment is performing a specific execution action while moving. There is no limitation on how the animation displays at least one of the execution-action character image, execution-action text identifier, and execution-action actuator identifier performing the specific execution action and moving together with the body identifier, and it is certainly not limited to this.
  • For example, at least one of the execution-action character image, execution-action text identifier, and execution-action actuator identifier can follow the body identifier as it moves, for example being displayed simultaneously with the body identifier in the same area; or the body identifier need not move, as shown in Figure 59, with at least one of the execution-action character image, execution-action text identifier, and execution-action actuator identifier displayed in a different area from the body identifier, jumping and performing the specific execution action within a region of the visual interface to indicate that the cleaning equipment is performing the execution item and moving.
  • When the execution item of the cleaning equipment is performing a specific execution item while moving, the body identifier may not move, for example maintaining a jumping action in a fixed area of the visual interface, while the movement trajectory of the cleaning equipment is dynamically generated on the interface to indicate its movement.
  • At least one of the execution-action character identifier, execution-action text identifier, execution-action actuator identifier, and body identifier on the visual interface can thus indicate that the execution item of the cleaning equipment is performing a specific execution action while moving, and there is no restriction on how at least one of these identifiers is displayed in the animation on the visual interface.
  • For example, when the execution item is cleaning the mopping piece, the animation can display a text identifier such as "Little Whale Spirit is carrying the mop home to clean the mop".
  • The animation can also display the mop-cleaning character image, such as the Q-version whale spirit carrying the Q-version mop moving to the base station, and when the Q-version whale spirit carrying the Q-version mop is at the base station, the animation shows the Q-version whale spirit rubbing and cleaning the Q-version mop.
  • The animation can also display the mop-cleaning actuator identifier, such as the mop moving to the base station, and when the mop is at the base station, the animation shows flowing water rinsing the mop.
  • At least two of the above mop-cleaning text identifier, mop-cleaning character image, and mop-cleaning actuator identifier can be displayed in combination, which is not limited here.
  • When the specific execution action is one other than moving to the base station and cleaning the mopping piece, at least one of the corresponding execution-action text identifier, execution-action character image, and execution-action actuator identifier can likewise be displayed through animation.
  • For example, when the execution item is receiving an instruction, the animation may display a text identifier for receiving the instruction, such as "I am off to work hard", which is not limited here.
  • The specific execution actions include at least one of mopping, sweeping, sweeping and mopping at the same time, cleaning carpets, cleaning mopping pieces, repeated cleaning, edge leak repair, alternate extension of side brushes, obstacle exploration, charging midway during a task, obstacle crossing, getting out of trouble, receiving instructions, and cleaning a specific area, but is certainly not limited to this.
  • When the specific execution action is mopping, the animation displays at least one of the mopping character image, mopping text identifier, and mopping actuator identifier; when the specific execution action is sweeping, the animation displays at least one of the sweeping character image, sweeping text identifier, and sweeping actuator identifier; when the specific execution action is sweeping and mopping at the same time, the animation displays at least one of the sweeping-and-mopping character image, sweeping-and-mopping text identifier, and sweeping-and-mopping actuator identifier; when the specific execution action is cleaning a carpet, the animation displays at least one of the carpet-cleaning character image, carpet-cleaning text identifier, and carpet-cleaning actuator identifier.
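The per-action cases above can be collected in a lookup table that pairs each specific execution action with its character image, text identifier, and actuator identifier. The asset names and the fallback entry below are illustrative assumptions, not from the patent.

```python
# Illustrative sketch: per-action animation assets, mirroring the cases
# listed above (mopping, sweeping, sweep-and-mop, carpet cleaning).

ANIMATION_ASSETS = {
    "mopping": ("whale_with_mop", "The robot is mopping the floor",
                "mop_rotating"),
    "sweeping": ("whale_with_broom", "The robot is sweeping the floor",
                 "side_brush_rotating"),
    "sweep_and_mop": ("whale_with_broom_and_mop",
                      "The robot is sweeping and mopping at the same time",
                      "side_brush_and_mop_rotating"),
    "clean_carpet": ("whale_on_carpet", "The robot is cleaning the carpet",
                     "mid_brush_rotating"),
}

def assets_for(action: str):
    """Return (character image, text identifier, actuator identifier) for an
    action, falling back to a generic set for unlisted actions."""
    return ANIMATION_ASSETS.get(
        action, ("whale_generic", "The robot is working", "body_moving"))

print(assets_for("sweeping")[1])  # -> The robot is sweeping the floor
```

A table like this also makes the "preset, later expanded, or user-set" flexibility described below straightforward: new actions are added as new entries.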
  • Exemplarily, when the specific execution action is sweeping, the animation displays a sweeping character image, such as a Q-version whale spirit carrying a Q-version broom alternately performing sweeping actions toward the left and toward the right; the animation displays the sweeping text identifier, for example, "The robot is sweeping the floor"; and the animation displays the sweeping actuator identifier, for example, a rotating side brush.
  • At least two of the aforementioned sweeping character image, sweeping text identifier, and sweeping actuator identifier may be displayed in combination.
  • Exemplarily, when the specific execution action is mopping, the animation displays the mopping character image, for example, a Q-version whale spirit carrying a Q-version mop alternately performing mopping actions toward the left and toward the right; the animated mopping text identifier is, for example, "The robot is mopping the floor"; and the mopping actuator identifier is, for example, a rotating mopping cloth.
  • At least two of the aforementioned mopping character image, mopping text identifier, and mopping actuator identifier may be displayed in combination.
  • Exemplarily, when the specific execution action is sweeping and mopping at the same time, the animation displays the sweeping-and-mopping character image, for example, the Q-version whale spirit carrying the Q-version mop and the Q-version broom performing the mopping and sweeping actions at the same time; the animated sweeping-and-mopping text identifier is, for example, "The robot is sweeping and mopping at the same time"; and the sweeping-and-mopping actuator identifier is, for example, the side brush and the mop rotating simultaneously.
  • At least two of the aforementioned sweeping-and-mopping character image, sweeping-and-mopping text identifier, and sweeping-and-mopping actuator identifier may be displayed in combination.
  • Exemplarily, when the specific execution action is cleaning a carpet, the animation displays a carpet-cleaning character image, such as a Q-version whale spirit carrying a Q-version broom appearing on the carpet to perform the sweeping action; the animation displays the carpet-cleaning text identifier, for example, "The robot is cleaning the carpet"; and the carpet-cleaning actuator identifier is, for example, at least one of a rotating side brush, a middle-sweep motion, or the mop gradually disappearing.
  • At least two of the aforementioned carpet-cleaning character image, carpet-cleaning text identifier, and carpet-cleaning actuator identifier may be displayed in combination.
  • When the specific execution action is another action, the animation can dynamically display at least one of the corresponding character image, text identifier, and actuator identifier, so as to prompt the user with the execution action corresponding to the execution item of the cleaning equipment.
  • At least one of the character image, text identifier, and actuator identifier corresponding to a specific execution action may be preset, may be expanded and updated later, or may be set by the user, which is not limited here.
  • The actuator identifier includes at least one of a side brush identifier, a mopping piece identifier, and a middle sweep identifier.
  • The animation displaying the execution-action actuator identifier corresponding to the specific execution action includes: when the specific execution action is mopping, the animation displays the mopping actuator identifier as the mopping piece identifier and shows the mopping piece identifier rotating; when the specific execution action is sweeping, the animation shows the sweeping actuator identifier as at least one of the middle sweep identifier and the side brush identifier, and displays at least one of the middle sweep identifier moving and the side brush identifier moving; when the specific execution action is sweeping and mopping at the same time, the animation shows the sweeping-and-mopping actuator identifier as at least one of the mopping piece identifier, the middle sweep identifier, and the side brush identifier, and displays at least one of the mopping piece identifier moving, the middle sweep identifier moving, and the side brush identifier moving.
  • For example, when the specific execution action is mopping, the animation may show the mopping piece identifier dropping to contact the ground and then rotating and mopping, and the position of the mopping piece identifier gradually changes from a slightly dull gray to a cleaner appearance as the identifier rotates, to remind the user that the execution item of the cleaning equipment is mopping.
  • When the specific execution action is sweeping, the cleaning equipment can select at least one of the middle sweep and the side brush to clean the floor according to the material of the ground. The animation can, for example, show the middle sweep identifier moving after it contacts the ground, with the dirt at its position gradually disappearing as it moves; and/or show the side brush identifier moving after it unfolds and contacts the ground, with the dirt at its position gradually disappearing as it moves, prompting the user that the cleaning equipment is sweeping the floor.
  • When the specific execution action is sweeping and mopping at the same time, the cleaning equipment can control at least one side brush to extend alternately. The animation can, for example, show the mopping piece identifier moving after it drops to contact the ground, with the dirt at its position gradually disappearing as it moves; and/or show the middle sweep identifier moving after it contacts the ground, with the dirt at its position gradually disappearing as it moves; and/or show at least one side brush identifier alternately unfolding to contact the ground and retracting away from the ground, moving after it unfolds and contacts the ground, with the dirt at the position of each side brush identifier gradually disappearing as the corresponding identifier moves, to remind the user that the cleaning equipment is sweeping and mopping at the same time.
  • When sweeping and mopping at the same time, the cleaning equipment can also control the side brushes to extend alternately to avoid the side brushes contaminating floor that has already been mopped.
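The alternating side-brush extension described above can be sketched as a tick-driven toggle. This is purely illustrative: the two-brush layout and the per-tick alternation are assumptions about one possible way to animate (or drive) the behavior, not the patent's control scheme.

```python
# Illustrative sketch: alternate which of two side brushes is extended on
# each animation tick, so only one contacts the ground at a time.

def side_brush_states(tick: int) -> dict:
    """Return which of two side brushes is extended at a given tick."""
    return {"left_extended": tick % 2 == 0, "right_extended": tick % 2 == 1}

for t in range(4):
    print(t, side_brush_states(t))
```

Exactly one brush is extended per tick, which matches the goal of keeping a brush clear of freshly mopped floor.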
  • the cleaning equipment can select at least one of a middle sweep and a side brush according to the material of the carpet to clean the carpet.
  • the animation can, for example, show the middle sweep mark moving after it lowers to contact the carpet, with the dirt at the position of the middle sweep mark gradually disappearing as the mark moves; and/or show at least one side brush logo alternately unfolding to contact the carpet and retracting away from the carpet.
  • the cleaning device can also lift the mopping member when cleaning the carpet, to prevent the mopping member from wetting the carpet.
  • the specific execution actions corresponding to the execution items of the cleaning equipment may be preset, may be expanded and updated at a later stage, or may be set by the user themselves, which is not limited here.
  • the character identifier includes an organism identifier.
  • the execution action corresponding to the character identification is determined according to the execution item, and the animation is used to indicate the execution action corresponding to the character identification; for example, when the execution action of the cleaning equipment is movement, the animation displays the movement of the organism identification.
  • the animation can, for example, show the organism identification moving, and dynamically generate the movement trajectory of the organism identification as it moves.
  • FIG. 60 is a schematic flowchart of a method for generating a visual interface provided by another embodiment of the present application.
  • the method for generating a visual interface includes steps S210 to S230.
  • Step S210 Obtain all or part of the completed execution items of the cleaning equipment.
  • Step S220: Generate a visual interface based on all or part of the acquired completed execution items of the cleaning equipment; the visual interface is used to indicate, through animation, all or part of the completed execution items of the cleaning equipment.
  • step S210 and step S220 can refer to the specific principles and implementation methods of the aforementioned step S110 and step S120, and will not be described again here.
  • Step S230 Display a prompt of the first execution item on the visual interface, where the first execution item is an execution item that meets the preset conditions.
  • the preset conditions include any of the following: cleaning carpets, cleaning mopping parts, repeated cleaning, edge leak repair, alternate extension of side brushes, obstacle exploration, charging during task execution, obstacle crossing, escape, receiving instructions, and specific area cleaning.
  • a prompt for the first execution item is displayed on the visual interface.
  • the preset condition can also be customized by the user, and the corresponding execution item set by the user is the first execution item.
  • the time when the prompt of the first execution item is displayed on the visual interface is not earlier than the time when the visual interface indicates the first execution item through animation.
  • the animation of the first execution item is displayed on the visual interface while a prompt of the first execution item is displayed at the same time; or the prompt of the first execution item is displayed during the animation of the first execution item, so as to highlight that the execution item of the cleaning equipment is the first execution item.
  • displaying a prompt for the first execution item on the visual interface includes: displaying a pop-up window on the visual interface; and displaying at least one of copywriting, pictures, and animation corresponding to the first execution item in the pop-up window.
  • the execution item of the cleaning equipment is highlighted as the first execution item.
  • the copy corresponding to the first execution item displayed in the pop-up window may be, for example, information corresponding to the execution item.
  • the copy corresponding to the execution item displayed in the pop-up window includes, for example, "Little Whale Spirit is carrying the mop home and cleaning the mop!"
  • of course, it is not limited to this; please refer to Figure 62.
  • the copywriting corresponding to the first execution item displayed in the pop-up window may include, for example, at least one of the number of executions corresponding to the first execution item, the cleaning time corresponding to the first execution item, and the cleaning area corresponding to the first execution item.
  • the copywriting in the pop-up window includes, for example, "I'm out to work hard!". Of course, it is not limited to this; for example, it can also include "I'm going to sweep 100 square meters and work for two hours!", and so on, thereby highlighting through copywriting that the execution item of the cleaning equipment is the first execution item.
  • the picture corresponding to the first execution item displayed in the pop-up window may include, for example, a picture of at least one of the environment in which the cleaning equipment is located and the role identification of the cleaning equipment.
  • the picture displayed in the pop-up window may include, for example, a Q-version whale spirit carrying a Q-version mop near the base station.
  • displaying the animation corresponding to the first execution item in the pop-up window may include any one of a preset first execution item animation and an animation corresponding to the first execution item displayed on the visual interface.
  • for displaying the preset first-execution-item animation in the pop-up window, please refer to FIG. 61 to FIG. 62: when the first execution item is cleaning the mopping part and moving, the preset first-execution-item animation displayed in the pop-up window can show the Q-version whale spirit moving to the base station and cleaning the mopping part in the base station, but of course it is not limited to this.
  • in some embodiments, the animation corresponding to the first execution item displayed on the visual interface is shown in the pop-up window; for example, the animation corresponding to the first execution item is displayed at a certain position or part of the visual interface, and the pop-up window can enlarge the display of that position or part. The animation corresponding to the first execution item is, for example, an enlarged display of the movement trajectory of the cleaning equipment corresponding to the first execution item and/or of the animation displayed at that position or part of the visual interface, so that the animation highlights that the execution item of the cleaning equipment is the first execution item.
  • displaying the pop-up window on the visual interface includes setting the background transparency of the pop-up window within a preset transparency range, so that the pop-up window does not completely block the animation in the visual interface, or so that the pop-up window is highlighted.
  • the background transparency of the pop-up window can be set to a lower value within the preset transparency range to highlight the pop-up window and thereby highlight that the execution item of the cleaning equipment is the first execution item.
  • the cleaning process of the cleaning device is, for example, a process in which the cleaning device cleans a room designated by the user.
  • the tasks performed by the cleaning equipment in cleaning the room designated by the user include sweeping and moving, mopping and moving, and cleaning the mop parts.
  • the animation is displayed on the visual interface according to the execution order of the cleaning equipment. For example, the animation first shows the Q-version whale spirit carrying the Q-version broom alternately performing sweeping actions toward the left and toward the right while moving, dynamically generating the corresponding sweeping movement trajectory as the whale spirit moves; the sweeping movement trajectory can, for example, cover the entire room. Then the animation shows the Q-version whale spirit carrying the Q-version mop alternately performing mopping actions toward the left and toward the right while moving, dynamically generating the corresponding mopping movement trajectory as the whale spirit moves. Then the animation shows that when the Q-version whale spirit carrying the Q-version mop, alternately mopping toward the left and the right, moves to a certain position, it stays at that position and rubs and cleans the Q-version mop.
  • at this time, a pop-up window pops up on the visual interface, in which an enlarged display shows the Q-version whale spirit carrying the mop rubbing and cleaning the Q-version mop, and so on, to visually display the cleaning process of the cleaning equipment.
  • the visual interface displays animations of multiple execution items at the same time.
  • the animation simultaneously shows the Q-version whale spirit carrying the Q-version broom and the Q-version whale spirit carrying the Q-version mop: the whale spirit carrying the broom alternately performs sweeping actions toward the left and the right, while the whale spirit carrying the mop follows behind, alternately performing mopping actions toward the left and the right. When the whale spirit carrying the Q-version mop moves to a certain position, it stays at that position and rubs and cleans the Q-version mop.
  • at this time, a pop-up window pops up on the visual interface, in which an enlarged display shows the Q-version whale spirit rubbing and cleaning the Q-version mop, thereby intuitively showing the user, through at least one of the animations and pop-up windows on the visual interface, the movement trajectory and execution items involved in the cleaning process of the cleaning equipment.
  • the visual interface can also display animations based on all or part of the execution items currently executed by the cleaning equipment, and dynamically update the animation as the execution items of the cleaning equipment increase, which is not limited here.
  • the generation method also includes: obtaining the duration of the animation in the visual interface; when the duration of the animation in the visual interface is greater than the preset animation duration threshold, compressing the duration of the animation in the visual interface.
  • the duration of the animation in the visual interface needs to be controlled accordingly; for example, when the duration of the animation in the visual interface is greater than the preset animation duration threshold, the duration of the animation is compressed. Compressing the duration of the animation may mean, for example, speeding up the playback speed of the animation.
  • the duration of the animation in the visual interface and the duration of the pop-up window can also be controlled according to the preset animation duration threshold, so as to increase the user's interest in watching the animation, visualize the cleaning process of the cleaning equipment, and improve the user's experience of using the cleaning equipment.
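The duration control described above can be sketched as follows; this is a minimal illustration, assuming a playback-rate approach in which an over-long animation is sped up so that its displayed duration equals the preset animation duration threshold (the function name and the 30-second default are hypothetical, not taken from the original).

```python
def compressed_playback_rate(animation_seconds: float,
                             max_seconds: float = 30.0) -> float:
    """Playback-speed multiplier for the visual-interface animation.

    Returns 1.0 when the animation already fits within the preset
    animation duration threshold; otherwise returns a multiplier > 1
    so that the sped-up animation plays in exactly max_seconds.
    """
    if animation_seconds <= max_seconds:
        return 1.0
    return animation_seconds / max_seconds
```

For example, a 60-second animation with a 30-second threshold would be played at double speed, so the user still sees the whole cleaning process without the animation overstaying its welcome.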
  • the visual interface also includes a sharing identifier and/or a saving identifier.
  • the sharing identifier is used to allow the user to share the visual interface, and the saving identifier is used to allow the user to save the visual interface; the saving identifier is, for example, a download identifier.
  • the animation in the visual interface can be shared or saved, for example, in the form of a video, so as to conveniently meet the user's need to share and/or save the visual interface, thereby improving the user's experience of using the cleaning equipment.
  • FIG. 65 is a schematic block diagram of a visual interface generation device 200 provided by an embodiment of the present application.
  • the generating device 200 includes a processor 201 and a memory 202 .
  • the processor 201 and the memory 202 are connected through a bus 203, such as an I2C (Inter-integrated Circuit) bus.
  • the processor 201 may be a micro-controller unit (Micro-controller Unit, MCU), a central processing unit (Central Processing Unit, CPU) or a digital signal processor (Digital Signal Processor, DSP), etc.
  • the memory 202 may be a Flash chip, a read-only memory (ROM, Read-Only Memory) disk, an optical disk, a U disk or a mobile hard disk, etc.
  • the processor 201 is configured to run a computer program stored in the memory 202, and implement the steps of the aforementioned visual interface generation method when executing the computer program.
  • the processor 201 is used to run a computer program stored in the memory 202, and implement the following steps when executing the computer program:
  • a visual interface is generated based on all or part of the acquired execution items of the cleaning equipment, and the visual interface is used to indicate all or part of the execution items of the cleaning equipment through animation.
  • Figure 66 is a schematic diagram of a cleaning equipment system provided by an embodiment of the present application.
  • the cleaning equipment system includes:
  • the cleaning equipment 100 includes a movement mechanism and an actuator, the movement mechanism is used to drive the cleaning equipment 100 to move, so that the actuator performs cleaning;
  • the generation device 200 can be used to implement the steps of the method for generating a visual interface in the embodiment of the present application.
  • the cleaning equipment includes at least one of a cleaning robot, a handheld cleaning equipment, and other cleaning equipment.
  • the cleaning device 100 can, for example, clean the execution mechanism that performs the cleaning task by itself.
  • the cleaning device 100 cannot, for example, clean the execution mechanism that performs the cleaning task by itself.
  • the cleaning equipment system also includes one or more base stations. The base station is used in conjunction with the cleaning device 100 to at least clean the actuator of the cleaning device.
  • the cleaning equipment system includes a separate generating device 200 for implementing the steps of the visual interface method of the embodiment of the present application.
  • the generating device 200 can be provided on the cleaning equipment 100 or on the base station; of course, it is not limited to this.
  • the generating device 200 can be a device other than the cleaning equipment 100 and the base station, such as a home smart terminal or a master control device; in other embodiments, the cleaning equipment 100 is provided with a device controller and the base station is provided with a base station controller, and the device controller and/or the base station controller may serve, alone or in combination, as the generating device 200 for implementing the steps of the visual interface generating method in the embodiment of the present application.
  • Embodiments of the present application also provide a computer-readable storage medium.
  • the computer-readable storage medium stores a computer program.
  • the processor can implement the steps of the above method.
  • the computer-readable storage medium may be an internal storage unit of the generation device described in any of the preceding embodiments, such as a hard disk or memory of the generation device.
  • the computer-readable storage medium may also be an external storage device of the generating device, such as a plug-in hard disk, a smart media card (Smart Media Card, SMC), a secure digital (SD) card, or a flash card (Flash Card) equipped on the generating device.
  • Cleaning robots can be used to automatically clean floors, and their application scenarios can include household indoor cleaning, large-scale place cleaning, etc.
  • the cleaning robot can clean the floor through cleaning parts. After cleaning for a period of time, the cleaning robot often runs out of power or its cleaning parts become dirty, and it needs to return to the base station for maintenance, such as charging the robot or cleaning the cleaning parts. Usually the cleaning robot is limited to returning to the base station only after completing the cleaning of one room; however, this does not allow each room to be cleaned well, and in some cases it also reduces the cleaning efficiency of the cleaning robot.
  • for example, when the area of a certain room is relatively large, the cleaning ability of the cleaning parts cannot meet the needs of cleaning that area, so the cleaning effect is poor; or the power of the cleaning robot cannot support cleaning the whole area, causing the robot to return to the base station for charging before completing the area, after which it takes a long time to charge before returning to continue cleaning, which also lowers the cleaning efficiency.
  • conversely, some area division methods make the area of each region too small, so the robot must return to the base station for maintenance after every region, making returns to the base station more frequent and lowering the cleaning efficiency.
  • FIG. 67 is a schematic flowchart of a cleaning area dividing method for a cleaning robot provided by an embodiment of the present application.
  • the cleaning area dividing method of the cleaning robot can be applied in a cleaning system for processes such as dividing a room in a to-be-cleaned area to obtain a preset cleaning area.
  • the area to be cleaned can be any area to be cleaned such as a family space, a room unit of a family space, a part of a room unit, a large place or a part of a large place.
  • the cleaning system includes one or more cleaning robots 100 and one or more base stations 200 .
  • the base station 200 is used in conjunction with the cleaning robot 100, at least for maintaining the cleaning robot; for example, the base station 200 can charge the cleaning robot 100, the base station 200 can provide a docking position for the cleaning robot 100, etc.
  • the base station 200 can also clean the cleaning parts of the cleaning robot 100.
  • the cleaning parts may include brushing parts, such as side brushes and middle brushes. The brushing parts are used to sweep the ground to remove garbage or dust on the ground.
  • the cleaning part may also include a mopping part, and the mopping part is used for mopping the ground to clean stains on the ground.
  • the cleaning system also includes a control device 300.
  • the control device 300 can be used to implement the steps of the cleaning area dividing method of the cleaning robot and/or the steps of the control method of the cleaning robot according to the embodiment of the present application.
  • the robot controller of the cleaning robot 100 and/or the base station controller of the base station 200 can serve as the control device 300 alone or in combination, for implementing the steps of the method in the embodiment of the present application.
  • the cleaning system includes a separate control device 300 for implementing the steps of the method in the embodiment of the present application.
  • the control device 300 can be provided on the cleaning robot 100 or on the base station 200; of course, it is not limited thereto.
  • the control device 300 may be a device other than the cleaning robot 100 and the base station 200, such as a home smart terminal, a master control device, etc.
  • the cleaning robot 100 can be used to automatically clean the floor.
  • the application scenarios of the cleaning robot 100 can be household indoor cleaning, large-scale place cleaning, etc.
  • the cleaning area dividing method of the cleaning robot according to the embodiment of the present application includes steps S110 to S130.
  • the cleaning task map includes Room 1, Room 2, and Room 3; when obtaining the graphical characteristics of the room in step S110, the graphical characteristics of one or more rooms can be obtained.
  • the rooms include rooms separated by scanning by the cleaning robot, and also include rooms edited by users (such as splitting and merging).
  • a room can be understood as an area defined on the cleaning task map by tangible boundaries such as walls, steps, and obstacles (such as box beds); it can also include rooms defined by users. A user-defined room is, for example, part of an area enclosed by walls, such as an open kitchen or dining room within the living room; or a user-defined room can be obtained by merging multiple areas defined by tangible boundaries, such as treating the living room and balcony as one room.
  • the cleaning robot can obtain the boundaries of the room through radar, vision sensors, distance sensors, etc. to determine the graphical characteristics of the room; alternatively, when the cleaning robot performs edge cleaning of the area to be cleaned, the boundary of the room is determined based on the edge-cleaning trajectory, and the graphical characteristics of the room are determined based on the boundary of the room.
  • the initial cleaning task map is obtained after the cleaning robot detects the area to be cleaned.
  • the edited cleaning task map can be obtained according to the user's editing operation on the initial cleaning task map, such as merging rooms or splitting rooms.
  • the workload value range includes an upper limit value and a lower limit value, and the upper limit value is greater than the lower limit value.
  • the amount of dirt that the cleaning robot's mopping parts can absorb is limited; when the mopping parts have absorbed more dirt, the cleaning effect on the floor is poor, and if the robot does not return to the base station for maintenance, such as cleaning or replacing the mopping parts, the floor cannot be mopped clean. The power of the cleaning robot is also limited; if the power is insufficient while cleaning the floor, the robot cannot be guaranteed to return to the base station for charging. The amount of water in the water tank that supplies the mopping parts is also limited; when the water is insufficient, the cleaning effect on the floor is likewise poor. The amount of dirt that the dust box on the cleaning robot can hold is also limited; when it holds a large amount of dirt, the cleaning effect on the floor is also poor.
  • in order to ensure that the mopping part retains its cleaning ability, a mopping-part maintenance task is inserted in the middle of the cleaning task, so that the cleaning robot navigates back to the base station, cleans or replaces the mopping part, and then continues the cleaning task of the area to be cleaned.
  • the cleaning robot can be controlled to return to the base station for maintenance in time before the workload reaches the upper limit of the workload value range, thereby improving the cleaning effect on the ground.
  • the maintenance of the mopping parts can also be completed by the cleaning robot without returning to the base station.
  • the cleaning robot has its own water tank and can directly clean the mopping parts, or the cleaning robot has its own mopping-part replacement device and can directly replace the mopping parts.
  • obtaining the workload value range includes determining, according to the user's selection operation, the workload value range used to divide the preset cleaning area from among at least two preset workload value ranges.
  • for example, when the workload includes the area of the floor to be cleaned, the at least two preset workload value ranges include 5-7, 10-12, and 14-16, in square meters; the user can determine the workload value range for dividing the preset cleaning area from among the at least two ranges according to the graphical characteristics of the room and/or the user's cleaning habits. For example, when the user wants the cleaning robot to be maintained less frequently, a range with a relatively large upper and/or lower limit, such as 14-16, can be chosen; conversely, a smaller range can be chosen when the user wants the cleaning robot to be maintained more frequently or to clean more thoroughly.
  • alternatively, the control device can determine the workload value range for dividing the preset cleaning area from among the at least two preset workload value ranges based on the graphical characteristics of the room and/or the user's cleaning habits.
  • S130: Determine the dividing line of the room according to the workload value range and the graphical characteristics, so that the dividing line and the boundary of the room form at least two preset cleaning areas, where the magnitude of the workload of each preset cleaning area is less than or equal to the upper limit of the workload value range and greater than or equal to the lower limit of the workload value range; or at most one preset cleaning area has a workload magnitude less than the lower limit, and the cleaning sequence of that preset cleaning area is after the other preset cleaning areas.
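The constraint that step S130 imposes on a proposed division can be expressed as a small validation check. This is an illustrative sketch (the function name is hypothetical); it covers only the workload bounds, not the requirement that the single under-limit area be scheduled last in the cleaning sequence.

```python
def division_is_valid(area_workloads, lower, upper):
    """Check the S130 constraint on a proposed division: every preset
    cleaning area's workload must lie within [lower, upper], except
    that at most one area may fall below the lower limit (per the
    text, that area is then cleaned after all the others)."""
    if any(w > upper for w in area_workloads):
        return False  # no area may exceed the upper limit
    below = [w for w in area_workloads if w < lower]
    return len(below) <= 1
```

With the example range of 5-7 square meters, a division into areas of 5.5 and 6.8 is valid, and one trailing area of 3 is tolerated, but two areas below the lower limit are not.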
  • the magnitude of the workload of each preset cleaning area is less than or equal to the upper limit of the workload value range, so that after the cleaning robot completes the work of each preset cleaning area, the current cleaning task can be interrupted and the robot can move to the base station for maintenance.
  • the current cleaning task includes a cleaning task for a preset cleaning area, or a cleaning task for the floor of a room, or it may also include a cleaning task for the floors of all rooms in the whole house, or it may also include a cleaning task for Cleaning tasks are performed in the rooms indicated on the task map.
  • the cleaning task includes mopping the floor with a mopping member, brushing the floor or a carpet on the floor with a brushing member, and may also include mopping and brushing the floor at the same time.
  • the workload includes at least one of the following: the amount of dirt absorbed by the cleaning robot's mopping part when mopping the floor, the power consumed by the cleaning robot when cleaning the floor, the amount of water consumed by the cleaning robot when cleaning the floor, the amount of dirt collected when the cleaning robot sweeps the floor, the area of the floor cleaned by the cleaning robot, and the length of the path traveled by the cleaning robot when cleaning the floor.
  • the upper limit of the workload value range can be determined based on the changing relationship between the cleaning effect of the mopping part and the area of the floor cleaned by the cleaning robot; for example, after the cleaning robot is maintained, the robot is controlled to clean the floor, and when the mopping effect of the mopping part becomes very poor, the upper limit of the workload value range is determined based on the area of the floor cleaned so far. Maintenance of the robot includes cleaning or replacing the mopping part, charging the cleaning robot, replenishing or draining the robot's water tank, emptying the robot's dust box, etc.
  • the upper limit of the workload value range can also be determined based on the changing relationship between the cleaning effect of the mopping part and the amount of dirt absorbed by the mopping part when mopping the floor; for example, when the floor is mopped after maintenance of the mopping part and the mopping effect becomes very poor, the dirt value d of the mopping part at that moment determines the upper limit of the workload value range. Of course, it is not limited to this: the upper limit can also be determined based on the maximum dirt value d_max of the mopping part, where d_max is an empirical value that can be measured, for example, in a laboratory.
  • the lower limit of the workload value range can be determined based on the upper limit of the workload value range, for example, 0.6 to 0.8 times the upper limit.
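The relationship above between the upper and lower limits can be written as a one-line derivation. This is a minimal sketch assuming the 0.6-0.8 factor mentioned in the text; the function name is hypothetical, and 0.7 is used only as an illustrative default.

```python
def workload_value_range(upper_limit, lower_factor=0.7):
    """Derive a workload value range (lower, upper) from its upper
    limit, taking the lower limit as a fixed fraction of the upper
    limit as the text suggests (0.6 to 0.8 times the upper limit)."""
    assert 0.6 <= lower_factor <= 0.8, "factor outside the suggested band"
    return (lower_factor * upper_limit, upper_limit)
```

For instance, an upper limit of 10 square meters with a factor of 0.7 yields the range (7, 10).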
  • determining the dividing line of the room based on the workload value range and the graphical characteristics includes: determining the workload of each unit area in the room based on the graphical characteristics and/or the room identification of the room, and dividing the room according to the workload of each unit area, so that at most one preset cleaning area has a workload magnitude less than the lower limit, and the cleaning sequence of that preset cleaning area is after the other preset cleaning areas.
  • the unit area can be a grid in the cleaning task map, but of course it is not limited to this; the unit area can be an area of any size, such as 0.5 square meters or 1 square meter, and may be a rectangle or a square, or, for example, a parallelogram.
  • the workload of each unit area can be determined according to the distance between the unit area and the boundary of the room and/or obstacles in the room; the workload of a unit area is negatively correlated with the distance between the unit area and the boundary of the room and/or obstacles in the room.
  • the environment map contains multiple grids; one grid is a unit area, or multiple grids are combined into one unit area. The distance between each grid and the obstacles in the environment map is calculated, thereby determining the workload of each unit area. When multiple grids form one unit area, the workload is determined by the maximum, minimum, or average of the distances between those grids and the obstacles; the closer a unit area is to an obstacle, the greater its workload, or in other words, the higher the cleaning cost for the cleaning robot to clean that unit area.
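The grid-based computation above can be sketched with a multi-source breadth-first distance transform. This is an illustrative implementation only: the names and the specific weighting `base + scale / (1 + d)` are assumptions, since the text requires only that the workload be negatively correlated with the distance to the nearest wall or obstacle; the sketch also assumes the grid contains at least one obstacle or wall cell.

```python
from collections import deque

def unit_area_workloads(grid, base=1.0, scale=2.0):
    """Per-cell workload for a grid map where True marks a wall or
    obstacle cell.  A multi-source BFS computes each free cell's
    distance d to its nearest obstacle; the workload base + scale/(1+d)
    then decreases as d grows (the required negative correlation)."""
    rows, cols = len(grid), len(grid[0])
    dist = [[None] * cols for _ in range(rows)]
    queue = deque()
    for r in range(rows):
        for c in range(cols):
            if grid[r][c]:            # obstacle/wall cell: distance 0
                dist[r][c] = 0
                queue.append((r, c))
    while queue:                      # BFS outward from all obstacles
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and dist[nr][nc] is None:
                dist[nr][nc] = dist[r][c] + 1
                queue.append((nr, nc))
    # obstacle cells themselves carry no workload (None)
    return [[None if grid[r][c] else base + scale / (1 + dist[r][c])
             for c in range(cols)]
            for r in range(rows)]
```

When multiple grids are combined into one unit area, the maximum, minimum, or average of these per-grid values can then be taken, as the text describes.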
  • in the figure, S1 represents an obstacle, and the thick black line represents a wall.
  • the workloads of unit areas in different rooms differ. For example, places with more human activity, such as the dining room or living room, are areas that get dirty more easily, while places with less human activity, such as the bedroom or study, are cleaner areas; accordingly, the workload of a unit area in a room identified as a dining room or living room is greater than that of a unit area in a bedroom or study.
  • the graphical characteristics of the room include the distribution of dirt in the room (such as a dirt heat map), for example the dirt level of each unit area; the workload of each unit area can be determined based on the dirt level of that unit area in the room, for example, the workload of a unit area is positively correlated with its degree of dirt.
  • the distribution of dirt in the room can be determined from the detection results of the cleaning robot's sensors, such as vision sensors; alternatively, a separate image sensor can photograph the floor in the room, and the captured images can be analyzed to determine the distribution of dirt in the room.
  • after the cleaning robot finishes cleaning different areas of the room, the dirt status of those areas can also be determined from detection of the sewage produced when cleaning the mopping parts, or from the amount of dirt in the dust box.
  • the distribution of dirt in the room can be the current distribution, or the distribution of dirt recorded in historical data.
  • Embodiments of the present application determine the workload of each unit area in the room based on the graphical characteristics and/or the room identification, and divide the room according to these workloads to obtain the preset cleaning areas. This more truly reflects the workload of the cleaning robot in each preset cleaning area, such as the amount of dirt adsorbed by the mopping parts, and ensures a better cleaning effect and higher cleaning efficiency.
  • determining the dividing line of the room based on the workload value range and the graphical characteristics, so that the dividing line and the boundary of the room form at least two preset cleaning areas, includes: determining, according to the workload value range and the graphical characteristics, the preset workload of each area formed after the room is divided; moving the dividing line along the long side of the boundary and determining the workload of the area divided off by the dividing line; and stopping the movement of the dividing line when the workload of at least one divided area equals the corresponding preset workload.
  • the dividing line of the room is determined so that the dividing line and the boundary of the room form at least two preset cleaning areas; for example, as shown in Figure 68, the living room is divided.
  • the total area of the living room is 48 square meters. According to the workload value range, the preset workload of an area formed after the room is divided is determined, for example an area of 6 square meters.
  • the room is divided according to the longest side of the room.
  • the horizontal length of the living room is 8 m and the vertical length is 6 m.
  • the longest side of the room is therefore the horizontal side, and a preset cleaning area with a workload corresponding to 6 square meters needs to be divided off, as shown on the left side of the arrow in Figure 68.
  • the embodiment of the present application moves the dividing line (indicated by a dotted line) laterally; while moving it, it determines whether the workload of the divided area is infinitely close to 6 square meters (which can be regarded as equal to 6 square meters), and stops moving the dividing line when the workload of the divided area reaches 6 square meters.
  • the room is divided along its longest side.
  • on the one hand, moving the dividing line along the long side produces a relatively small change in the workload of the divided area, making it easier to control that workload and to match it to the preset workload; on the other hand, it largely avoids long and narrow areas (as shown on the right side of the arrow in Figure 68, when the dividing line is moved longitudinally along the short side, the 6 square meter area is 8 meters long and is long and narrow), thereby reducing the difficulty of cleaning for the cleaning robot.
  • the dividing line may be a straight line, a curve, or a closed line forming a figure (such as the dividing line forming the area S in Figure 71), or a combination of different line types.
  • the dividing line may be perpendicular to the long side, so that the divided preset cleaning area is close to square and its cleaning path is easy to plan.
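The sweep described above can be sketched as follows (an illustrative Python sketch, not the patent's implementation). It assumes a per-cell workload grid with one column per unit of the long side and stops the dividing line at the first column where the carved-off region reaches the target workload; the function name is hypothetical.

```python
def find_dividing_line(workload, target):
    """Sweep a dividing line column-by-column along the room's long side
    (columns) and stop when the accumulated workload of the carved-off
    region first reaches the target. Returns (line position in columns,
    accumulated workload)."""
    cols = len(workload[0])
    acc = 0.0
    for col in range(cols):
        acc += sum(row[col] for row in workload)
        if acc >= target:
            return col + 1, acc
    return cols, acc
```

For the Figure 68 example, a uniform 6 m x 8 m room with a 1 m grid and a target of 6 square meters stops the line after the first 1 m column, since each column contributes 6 cells of workload.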
  • the preset cleaning area is divided according to the workload value range.
  • multiple division schemes can be generated, such that in any two different division schemes at least one dividing line differs, so that at least one preset cleaning area differs; a division scheme that meets the preset conditions is then determined from the multiple division schemes.
  • at least two division schemes may be determined based on the workload value range, the cleaning order of the at least two rooms, and the area of each room, where in any two division schemes at least one dividing line differs, so that at least one preset cleaning area in any two division schemes differs.
  • the areas of the three rooms that need to be cleaned are 3, 8, and 7 square meters respectively.
  • the order of room cleaning is the 3 square meter room, then the 8 square meter room, then the 7 square meter room; that is, in Figure 70, the room cleaning sequence is room 1 - room 2 - room 3.
  • Each preset cleaning area is obtained by dividing the room according to the cleaning order of the room.
  • the order of room division is based on the order of room cleaning, as shown in Figure 70
  • the cleaning order of the at least two rooms can be determined based on the distance between each room and the base station. For example, cleaning the room farther from the base station first can prevent the dirty mopping parts from contaminating the already cleaned floor a second time.
  • the cleaning order of the at least two rooms can also be determined according to the degree of dirt of the rooms. For example, cleaning the dirtier room first and the less dirty room afterwards can effectively prevent secondary pollution.
  • when dividing according to a workload value range of 5-7 square meters, in all division schemes the areas of the preset cleaning areas fall within 5-7 square meters, except for the preset cleaning area that is cleaned last in some division schemes.
  • the different division schemes include division schemes 1 to 3 as shown in the figure. Division scheme 1 merges a 2 square meter portion of room 2 with the 3 square meter room 1 as a 5 square meter preset cleaning area 1, uses the remaining 6 square meters of room 2 as preset cleaning area 2, and uses the 7 square meter room 3 as preset cleaning area 3 (division scheme 1 can be expressed as 3+2…).
  • division scheme 2 merges a 3 square meter portion of room 2 with the 3 square meter room 1 as a 6 square meter preset cleaning area 1, uses the remaining 5 square meters of room 2 as preset cleaning area 2, and uses the 7 square meter room 3 as preset cleaning area 3 (division scheme 2 can be expressed as 3+3…).
  • the division schemes also include at least one of the following: 3+2…
  • the cost value for completing each preset cleaning area may be determined based on the preset cleaning areas defined in each division scheme, and the cumulative cost value of each division scheme may be determined from the cost values of its preset cleaning areas.
  • the dividing lines of one division scheme whose cumulative cost value satisfies the preset cost condition are then selected from the at least two division schemes as the dividing lines of the at least two rooms, or the dividing lines of the division scheme with the smallest cumulative cost value are determined as the dividing lines of the at least two rooms.
  • the cost value can be reflected by the number of times the cleaning robot interrupts the cleaning task, that is, the number of times it returns to the base station for maintenance.
  • the final division scheme is determined based on the number of times the cleaning robot interrupts the cleaning task in each division scheme; for example, the division scheme with the fewest interruptions of the cleaning task is selected as the final division scheme.
  • in this case, the division scheme that meets the preset conditions is the one with the fewest interruptions of the cleaning task, i.e. the fewest interruptions represent the lowest cost value.
  • when dividing the preset cleaning areas, priority is given to a division scheme that lets the cleaning robot interrupt the cleaning task as little as possible, thereby reducing the consumption of the cleaning robot traveling to and from the base station and improving the efficiency of the entire cleaning task.
  • in division scheme 1 and division scheme 2, preset cleaning areas 1, 2, and 3 each require one interruption to complete one cleaning, so the cost value of each preset cleaning area is counted as 1 and the cumulative cost values of division schemes 1 and 2 are both 3; in division scheme 3, preset cleaning areas 1, 2, 3, and 4 each require one interruption to complete one cleaning, so the cost value of each preset cleaning area is counted as 1 and the cumulative cost value of division scheme 3 is 4.
  • division scheme 1 and division scheme 2 therefore satisfy the preset cost condition, i.e. division schemes 1 and 2 can be used as candidates.
  • alternatively, the position of the cleaning robot when it interrupts the cleaning task can be determined for each division scheme, and the final division scheme can be determined from the number of times that position is not at the door of the room where the cleaning robot is located; that is, the cost value can be reflected by the number of times the cleaning robot interrupts the cleaning task at a position that is not at the door of its room.
  • for example, the division scheme in which the cleaning robot interrupts the cleaning task away from the door of its room the fewest times is selected as the final division scheme. In this case, the division scheme that meets the preset conditions is the one with the fewest such interruptions, i.e. the fewest interruptions away from the room door represent the lowest cost value.
  • giving priority to the division scheme in which the cleaning robot least often interrupts the cleaning task away from the door of its room reduces the possibility of cross-contamination caused by the cleaning robot crossing different rooms, thereby improving the efficiency of the entire cleaning task while maintaining the cleaning effect. Please refer to Figure 70.
  • the rectangular pattern filled with hatched lines represents the room door, and the circular pattern filled with hatched lines represents the location of the base station; in division schemes 1 and 2, the position of the cleaning robot when the cleaning task is interrupted in preset cleaning area 1 is not at the door of room 2, so the cost value of preset cleaning area 1 is counted as 1.
  • when the cleaning task is interrupted in preset cleaning area 2, the position is at the door of room 2, so the cost value of preset cleaning area 2 is counted as 0.
  • after completing preset cleaning area 3, the cleaning robot can directly enter the base station; its position can then be regarded as being at the door of room 3, and the cost value of preset cleaning area 3 is counted as 0.
  • in division scheme 3, the cleaning robot can likewise directly enter the base station after the last area; its position can be regarded as being at the door of room 3, and the cost value of preset cleaning area 4 is counted as 0. In division scheme 3, however, the cleaning task is interrupted away from the room door three times, so its cumulative cost value is counted as 3. If the preset cost condition is that the cumulative cost value is less than or equal to 2, division schemes 1 and 2 meet the preset cost condition, i.e. division schemes 1 and 2 can be used as candidates.
  • when the cumulative cost values of multiple division schemes are all less than or equal to the preset cost threshold, any one of those division schemes may be selected; likewise, when the cumulative cost values of multiple division schemes are all smaller than the cumulative cost values of the other schemes, any one of those division schemes may be selected.
  • by determining the cumulative cost value of each division scheme and selecting a scheme with a smaller cumulative cost value as the final division scheme, the cleaning efficiency of the entire cleaning task is improved while the cleaning effect of the cleaning robot is maintained.
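The selection logic described above can be sketched in Python (an illustrative sketch, not the patent's implementation; the function name and the dictionary layout are hypothetical). Each scheme maps to per-area cost values, e.g. 1 when the robot interrupts cleaning away from the room door and 0 otherwise.

```python
def pick_scheme(schemes, cost_threshold=None):
    """schemes: mapping of scheme name -> list of per-area cost values.
    Returns (candidate scheme names, cumulative cost per scheme).
    With a threshold, candidates are schemes at or below it; otherwise
    candidates are the schemes with the minimum cumulative cost."""
    totals = {name: sum(costs) for name, costs in schemes.items()}
    if cost_threshold is not None:
        candidates = [n for n, t in totals.items() if t <= cost_threshold]
    else:
        best = min(totals.values())
        candidates = [n for n, t in totals.items() if t == best]
    return candidates, totals
```

With the Figure 70 door-based costs (schemes 1 and 2: one off-door interruption each; scheme 3: three), schemes 1 and 2 remain as candidates under a threshold of 2, matching the example.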
  • determining the cost value for completing each preset cleaning area includes: determining the room where the cleaning robot is located and the backwash point after completing the workload of each preset cleaning area, where the backwash point is the position at which the robot, having completed one preset cleaning area, needs to return to the base station; and determining the cost value based on the room where the cleaning robot is located and the backwash point.
  • after completing the workload of preset cleaning area 1, the cleaning robot is in room 2 and the backwash point is not at the door of room 2; after completing the workload of preset cleaning area 2, the robot is in room 2 and the backwash point is at the door of room 2; after completing the workload of preset cleaning area 3, the robot is in room 3 and can directly enter the base station, so the backwash point can be regarded as at the door of room 3.
  • in division scheme 3, after completing the workload of preset cleaning area 1, the cleaning robot is in room 2 and the backwash point is not at the door of room 2; after completing the workload of preset cleaning area 2, the robot is in room 2 and the backwash point is not at the door of room 2; after completing the workload of preset cleaning area 3, the robot is in room 3 and the backwash point is not at the door of room 3; after completing the workload of preset cleaning area 4, the robot is in room 3 and can directly enter the base station, so the backwash point can be regarded as at the door of room 3.
  • determining the cost value based on the room where the cleaning robot is located and the backwash point includes: determining cost factor magnitudes based on the room where the cleaning robot is located and the backwash point, and determining the cost value according to those cost factor magnitudes.
  • the cost factor magnitudes include at least one of the following: the distance of the cleaning robot from the backwash point to the door of the room where it is located; the number of rooms or preset areas the cleaning robot passes through when moving from the backwash point to the base station; the path length of the cleaning robot from the backwash point to the base station; the number of rooms or preset areas passed through when the cleaning robot returns from the base station to the backwash point or to the next preset cleaning area; and the path length of the cleaning robot returning from the base station to the backwash point or to the preset cleaning area to be cleaned. The cost value is positively correlated with each of the cost factor magnitudes.
  • by determining the cost value for completing each preset cleaning area from the aforementioned cost factor magnitudes, and determining the cumulative cost value of each division scheme from the cost values of its preset cleaning areas, the final division scheme is determined, which can improve the cleaning efficiency of the entire cleaning task.
  • for example, the cost value for cleaning preset cleaning area 1 is 2, and the cost values for cleaning preset cleaning area 2 and preset cleaning area 3 are both 1, in division scheme 1 and division scheme 2 alike.
  • the determination of the cost value corresponding to a preset cleaning area is not limited to this; further factors can be considered. For example, the farther the backwash point is from the door of the room where the robot is located, the greater the cost value, such as 3, 4, or 5. As shown in Figure 70, in division scheme 1 the distance between the backwash point a of preset cleaning area 1 and the doorway B of room 2 is greater than the corresponding distance in division scheme 2, so the cost value of preset cleaning area 1 in division scheme 1 is greater than that of preset cleaning area 1 in division scheme 2.
  • there are at least two rooms, and determining the dividing line of the room based on the workload value range and the graphical characteristics, so that the dividing line and the boundary of the room form at least two preset cleaning areas, includes: when the workload magnitude of the first room is less than the minimum of the workload value range, and the sum of the workload magnitudes of the first room and the second room is greater than the upper limit value, determining the dividing line of the second room according to the workload value range and the graphical characteristics of the second room, so that the dividing line and the boundary of the second room form at least two areas, where the sum of the workload magnitude of at least one of those areas and the workload magnitude of the first room is less than or equal to the upper limit; that at least one area and the area of the first room are then determined as one preset cleaning area.
  • for example, room 2 can be divided into a 3 square meter area and a 5 square meter area; by merging the 3 square meter area of room 2 with room 1 as one preset cleaning area, the number of times the cleaning robot returns to the base station for maintenance can be reduced. For example, instead of returning to the base station right after cleaning the 3 square meters of room 1, the robot returns after also cleaning 3 square meters of room 2, then returns after cleaning the remaining 5 square meters of room 2, and then cleans the 7 square meters of room 3.
  • in addition, the position of the cleaning robot when it returns (that is, the backwash point) can be placed where a room has just been fully cleaned, such as at the room door, so that the return-to-base action has less impact on cleaning path planning; compared with division scheme 3, this reduces the cost of cleaning and can improve cleaning efficiency.
  • the remaining area can continue to be divided by dividing lines. Referring to Figure 68, after a 6 square meter preset cleaning area is divided off from the 48 square meter room, the workload of the remaining 42 square meter area is greater than the upper limit, so the remaining area can continue to be divided by dividing lines.
  • the magnitude of the workload of each of the preset cleaning areas is less than or equal to the upper limit value, and greater than or equal to the lower limit value in the workload value range.
  • it can solve the problem that the cleaning robot needs to frequently return to the base station for maintenance because the area of the preset cleaning area is too small.
  • it can also solve the problem of insufficient cleaning power of the cleaning robot because the area of the preset cleaning area is too large. Therefore, problems that affect the cleaning effect of the cleaning robot can be reduced, that is, the frequency of maintenance of the cleaning robot can be reduced, the performance of the cleaning robot can be fully utilized, and the cleaning efficiency of the cleaning robot can be ensured.
  • there is only one preset cleaning area whose workload magnitude is less than the lower limit, and the cleaning sequence of that preset cleaning area is after the other preset cleaning areas.
  • when the room has been divided into one or more preset cleaning areas whose workloads fall within the workload value range, if the workload of the remaining area is less than the lower limit, the remaining area can be treated as a separate preset cleaning area, such as preset cleaning area 4 of division scheme 3 shown in Figure 70.
  • alternatively, the remaining area can be merged with an adjacent preset cleaning area; when the workload of the merged area is greater than the upper limit, the merged area can in turn be divided according to the workload value range to obtain preset cleaning areas whose workloads fall within the workload value range.
  • after the dividing line of the room is determined based on the workload value range and the graphical characteristics, so that the dividing line and the boundary of the room divide off at least two preset cleaning areas, the method further includes: when there are at least two obstacles in a preset cleaning area, marking that preset cleaning area as an obstacle-dense area, or determining the obstacle-dense area within the preset cleaning area.
  • an obstacle-dense area may be a dining room or another area where there are many scattered stool legs to be avoided, but it is of course not limited to this. It is understandable that cleaning areas with dense obstacles takes a long time: if the areas where obstacles are concentrated are not aggregated into one area, the cleaning robot circles around each obstacle as it detects it, moving back and forth without handling the obstacles in a focused way, so the path is messy and controllability is low. Referring to Figures 72 and 73, the cleaning robot cleans the preset cleaning area along an arcuate path; as shown, there are four obstacles S1 to S4 on the left side of the preset cleaning area.
  • the trajectories of the cleaning robot when cleaning the preset cleaning area are trajectories 1-8.
  • when trajectory 1 encounters obstacle S1, the robot needs to go around obstacle S1 clockwise, giving trajectory 2.
  • when trajectory 3 encounters obstacle S2, the robot needs to circle clockwise around obstacle S2, giving trajectory 4.
  • when trajectory 5 encounters obstacle S2, the robot again needs to circle clockwise around obstacle S2.
  • when trajectory 5 reaches the left side of the preset cleaning area, part of that side has not yet been cleaned; while cleaning this part, the robot encounters obstacle S3 and circumvents obstacle S3 clockwise.
  • Embodiments of the present application can aggregate areas where obstacles are concentrated into obstacle-dense areas for separate processing, allowing the cleaning robot to first clean preset cleaning areas without obstacles, or with only a few scattered obstacles, reducing movement around obstacles. This ensures that the cleaning robot has high controllability and quickly finishes cleaning most of the area, and then cleans the areas where obstacles are concentrated.
  • the obstacle-dense area is the preset cleaning area that is cleaned last among the at least two preset cleaning areas.
  • cleaning the preset cleaning areas in non-obstacle-dense areas first and then cleaning the obstacle-dense areas in a concentrated manner is more in line with the user's understanding of cleaning.
  • the cleaning of non-obstacle areas is not interrupted, most of the area can be cleaned as quickly as possible, and the user experience is better.
  • determining the dividing line of the room according to the workload value range and the graphical characteristics, so that the dividing line and the boundary of the room divide off at least two preset cleaning areas, includes: determining the obstacle-dense area in the room according to the distribution of obstacles in the room, such as area S of room 3 in Figure 71; then, for the room excluding the obstacle-dense area, determining the dividing line of the room according to the workload value range, so that the dividing line, the boundary of the room, and the boundary of the obstacle-dense area divide off at least two preset cleaning areas. That is, the obstacle-dense area in the room can be divided off first, and the remaining area can then be divided to obtain the other preset cleaning areas.
  • the obstacle-dense area is composed of unit areas; for example, the unit areas whose distance from an obstacle is less than a preset distance threshold constitute the obstacle-dense area.
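The threshold rule above can be sketched in Python (an illustrative sketch; the function name and the use of Chebyshev distance are assumptions, since the patent does not fix a distance metric): collect the free cells closer to an obstacle than the threshold, and together they form the obstacle-dense area to be cleaned last.

```python
def obstacle_dense_region(grid, threshold):
    """Return the set of free cells (value 0) whose Chebyshev distance to
    any obstacle cell (value 1) is below the threshold; together these
    cells form the obstacle-dense area."""
    rows, cols = len(grid), len(grid[0])
    obstacles = [(r, c) for r in range(rows)
                 for c in range(cols) if grid[r][c] == 1]
    dense = set()
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 0 and any(
                    max(abs(r - orr), abs(c - oc)) < threshold
                    for orr, oc in obstacles):
                dense.add((r, c))
    return dense
```

Cells next to a cluster of obstacles (e.g. chair legs) fall into the dense set, while cells farther away stay in the ordinary preset cleaning areas.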
  • the cleaning area division method of the cleaning robot includes: obtaining the graphical characteristics of the room, which include the boundaries of the room; obtaining the workload value range; and determining, according to the workload value range and the graphical characteristics, the dividing line of the room, so that the dividing line and the boundary of the room form at least two preset cleaning areas, where the workload of each preset cleaning area is less than or equal to the upper limit of the workload value range and greater than or equal to its lower limit, or there is only one preset cleaning area whose workload is less than the lower limit and whose cleaning sequence is after the other preset cleaning areas. This allows the cleaning robot to achieve a better cleaning effect and higher cleaning efficiency when cleaning the room according to the preset cleaning areas.
  • FIG. 74 is a schematic flowchart of the control method of the cleaning robot provided by the embodiment of the present application.
  • the control method of the cleaning robot can be applied in a cleaning system to control the cleaning robot in the system so that the cleaning robot performs cleaning tasks, such as cleaning the area corresponding to the cleaning task map.
  • the control method of the cleaning robot includes steps S210 to S220.
  • the cleaning robot is controlled to clean the room according to the preset cleaning areas.
  • the order in which the preset cleaning areas are divided can be determined based on the cleaning order of multiple rooms.
  • after the cleaning robot completes cleaning any of the preset cleaning areas, it moves to the base station for maintenance, such as at least one of: cleaning the cleaning parts (for example the mopping parts), removing dirt from the dust box, charging, and replenishing or draining the water tank of the cleaning robot.
  • FIG. 24 is a schematic block diagram of the control device 300 provided by the embodiment of the present application.
  • the control device 300 includes a processor 301 and a memory 302.
  • the processor 301 is used to run a computer program stored in the memory 302, and implement the following steps when executing the computer program:
  • determining the dividing line of the room so that the dividing line and the boundary of the room form at least two preset cleaning areas, where the workload magnitude of each preset cleaning area is less than or equal to the upper limit of the workload value range and greater than or equal to its lower limit, or there is only one preset cleaning area whose workload magnitude is less than the lower limit, and the cleaning sequence of that preset cleaning area is after the other preset cleaning areas.
  • the processor 301 is used to run a computer program stored in the memory 302, and implement the following steps when executing the computer program:
  • the cleaning robot is controlled to clean the room according to the preset cleaning area.
  • the control device may be a control device on the base station (such as the base station controller) or a control device on the cleaning robot (such as the robot controller), and can be used to implement the steps of the method in the embodiments of the present application.
  • embodiments of the present application also provide a base station, which is at least used to maintain the cleaning robot, for example, to clean the mopping parts of the cleaning robot.
  • the base station also includes a control device, such as a base station controller, used to implement the steps of the method of the embodiments of the present application.
  • Embodiments of the present application also provide a computer-readable storage medium.
  • the computer-readable storage medium stores a computer program.
  • the processor can implement the steps of the above method.
  • the computer-readable storage medium may be an internal storage unit of the control device described in any of the preceding embodiments, such as a hard disk or memory of the control device.
  • the computer-readable storage medium may also be an external storage device of the control device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a Flash Card, etc. equipped on the control device.
  • Figure 2 is a schematic diagram of a cleaning system provided by an embodiment of the present application.
  • Cleaning system includes:
  • Cleaning robot 100 includes a walking unit 106 and a cleaning piece 10.
  • the walking unit 106 is used to drive the cleaning robot 100 to move so that the cleaning piece 10 mops the ground;
  • the base station 200 is at least used to perform maintenance on the cleaning robot 100, such as cleaning or replacing the mopping parts of the cleaning robot 100;
  • Cleaning robots are used to replace people in cleaning home environments or large places; they can not only reduce people's workload but also improve cleaning efficiency.
  • cleaning robots are usually equipped with special sensor devices to detect the floor.
  • ultrasonic waves are used to detect the material of the floor and then identify the floor type.
  • however, the related technology only involves how to use ultrasonic waves to detect and confirm the ground medium; when the ground medium is detected to be a special medium such as carpet, it does not involve how to adjust the response mode of the cleaning robot for that special medium.
  • Embodiments of the present application provide a ground medium exploration method, cleaning robot and storage medium. Among them, this method is applied to a cleaning robot, which can be a sweeper or other intelligent robot, and is not limited here.
  • Figure 75 is a schematic flow chart of a ground medium exploration method provided by an embodiment of the present application.
  • the ground medium exploration method includes steps S10 to S11.
• Step S10: When the cleaning robot detects the preset floor medium, obtain the status information of the cleaning robot, and determine an edge exploration mode according to the status information, wherein the edge exploration mode includes an inner edge exploration mode and an outer edge exploration mode;
• the preset floor medium includes carpets, floor mats, children's play mats, and other mats laid on the ground. The preset floor medium may also be any other medium laid on the ground that requires special treatment when the cleaning robot encounters it; this is not limited here.
• the outer edge exploration mode means that the cleaning robot performs edge exploration of the preset floor medium from outside the preset floor medium; during this process, the orthographic projection of the geometric center of the cleaning robot does not fall within the orthographic projection of the preset floor medium. This constrains the range of movement of the cleaning robot during outer edge exploration and reduces the extent to which the mopping member wets or soils the preset floor medium area;
• the inner edge exploration mode means that the cleaning robot performs edge exploration of the preset floor medium along its inside; during this process, at least part of the trajectory formed by the orthographic projection of the geometric center of the cleaning robot coincides with the orthographic projection of the preset floor medium.
• when the cleaning robot is in the outer edge exploration mode and performs edge exploration outside the preset floor medium, the overlap between the orthographic projection of the robot and the orthographic projection of the preset floor medium is less than 5% of the orthographic projection of the robot, which reduces the extent to which the mopping member wets or soils the preset floor medium area. When the cleaning robot is in the inner edge exploration mode and performs edge exploration inside the preset floor medium, the overlap between the orthographic projection of the robot and the orthographic projection of the preset floor medium is greater than or equal to 50% of the orthographic projection of the robot.
• the cleaning robot may encounter a preset floor medium while moving or performing a cleaning task; without special handling, the cleaning robot or the preset floor medium may be damaged. For example, the cleaning robot may get stuck when moving on a carpet, and if it is currently performing a mopping task, it may wet the carpet during the mopping process.
  • the cleaning robot can use a variety of technical means to detect whether it encounters a preset floor medium.
• for ease of description, the following takes a carpet as an example of the preset floor medium.
  • the cleaning robot can detect the carpet through an ultrasonic sensor.
• when the cleaning robot detects the preset floor medium during movement, it means that the cleaning robot has reached or is close to the edge of the preset floor medium. At this time, the cleaning robot can select an edge exploration mode according to its status information to explore the preset floor medium. After determining the position and contour of the preset floor medium through edge exploration, the cleaning robot can, when planning its travel trajectory or cleaning trajectory, detour around or clean the floor medium based on its position and contour.
  • the method further includes:
• when the cleaning robot detects the preset floor medium during movement, if the cleaning robot is currently in the state of constructing a cleaning area map, edge exploration of the preset floor medium is not performed.
• the cleaning robot 1 determines its moving route and cleaning trajectory based on the cleaning area map. If the cleaning robot 1 is newly purchased, then before performing a cleaning task it needs to explore the surrounding environment to build an initial cleaning area map. To improve the efficiency of mapping, if the cleaning robot 1 detects the preset floor medium during this process, edge exploration of the preset floor medium is not performed.
  • the method further includes:
• when the cleaning robot detects the preset floor medium during movement, if the contour information of the preset floor medium is already recorded in the cleaning area map of the cleaning robot, edge exploration of the preset floor medium is not performed.
  • the cleaning area map is a map used by the cleaning robot to plan a moving path or a cleaning path.
  • the method further includes:
• in some embodiments, the cleaning robot is controlled to explore along the edge of the preset floor medium so as to update the contour information of the preset floor medium in the cleaning area map according to the exploration results.
• the working environment of the cleaning robot is dynamic; through the technical solution provided by this embodiment, the cleaning area map can be updated regularly.
  • obtaining status information of the cleaning robot and determining an edge exploration mode based on the status information includes:
• the status information of the cleaning robot may be configuration information, or may be environmental information about the surroundings (such as the distribution of obstacles, cliffs, and other special terrain) obtained by the cleaning robot through sensors (such as lidar, cliff sensors, or visual sensors).
• when the cleaning robot determines the edge exploration mode through configuration information, the configuration information includes the working mode and/or the state of the floor medium cleaning function of the cleaning robot.
• the cleaning robot determines the edge exploration mode according to the strategy shown in Table 1 below:
• Table 1:
•   Working mode            Floor medium cleaning function    Edge exploration mode
•   Sweeping mode           —                                 Inner edge exploration mode
•   Mopping mode            —                                 Outer edge exploration mode
•   Sweeping and mopping    On                                Inner edge exploration mode
•   Sweeping and mopping    Off                               Outer edge exploration mode
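The configuration-based strategy of Table 1 can be sketched as a small decision function. This is a minimal illustration; the string labels are placeholders, not identifiers from the original text.

```python
def select_edge_exploration_mode(working_mode: str,
                                 media_cleaning_on: bool = False) -> str:
    """Sketch of the Table 1 strategy: sweeping -> inner, mopping ->
    outer, sweep-and-mop -> inner only if the floor medium cleaning
    function is on. Labels are illustrative placeholders."""
    if working_mode == "sweeping":
        return "inner"
    if working_mode == "mopping":
        return "outer"
    if working_mode == "sweep_and_mop":
        return "inner" if media_cleaning_on else "outer"
    raise ValueError(f"unknown working mode: {working_mode}")
```

A sweep-and-mop robot with the carpet-cleaning function disabled would thus fall back to outer edge exploration, keeping the wet mop off the medium.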
• when the cleaning robot determines the edge exploration mode through environmental information, it can obtain the distribution of obstacles on the preset floor medium from the environmental information, and determine the edge exploration mode according to the number and positions of those obstacles, so as to improve the robot's exploration efficiency. For example, if the number of obstacles on the preset floor medium is greater than a preset number, the outer edge exploration mode is selected to explore the preset floor medium; if the number of obstacles is less than or equal to the preset number, the inner edge exploration mode is selected. Alternatively, if an obstacle on the preset floor medium is close to the edge of the preset floor medium and thus affects the movement of the cleaning robot, the outer edge exploration mode is selected to explore the preset floor medium; if the obstacles on the preset floor medium are far from its edge and do not affect the movement of the cleaning robot, the inner edge exploration mode is selected.
• Step S11: Perform edge exploration of the preset floor medium according to the edge exploration mode to obtain the contour of the preset floor medium.
  • the outer edge exploration or the inner edge exploration of the preset ground medium can be performed according to the edge exploration mode, and the outline of the preset ground medium can be determined during the exploration process.
• in general, the contour of the preset floor medium determined by the cleaning robot after exploring it in the inner edge exploration mode is more complete. If the cleaning robot uses the outer edge exploration mode to explore the preset floor medium, obstacles may hinder the cleaning robot's exploration behavior, making it impossible to obtain the complete contour of the preset floor medium.
• when the edge exploration mode is the outer edge exploration mode, the cleaning robot explores along the outer edge of the preset floor medium, which reduces the extent to which the mopping member wets or soils the preset floor medium area. For example, if the cleaning robot is in the mopping mode and adopts the inner edge exploration mode upon detecting the preset floor medium, it will wet a large area of, or even contaminate, the preset floor medium.
• the contour refers to the boundary of the preset floor medium obtained by fitting the contour points of its edge, or the boundary obtained by expanding that fitted boundary outward by a preset size. It can be understood that there is a certain degree of lag in the detection results of the cleaning robot's sensor; the boundary obtained by fitting the detected contour points and expanding it outward by the preset size is therefore more accurate and more consistent with the actual situation.
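The outward expansion described above can be sketched as pushing each fitted contour point away from the polygon's centroid by a preset margin. This is an illustrative helper under simplifying assumptions (roughly convex contour), not the patent's exact fitting procedure.

```python
import math

def expand_contour(points, margin):
    """Expand each fitted contour point outward from the polygon's
    centroid by `margin` map units, a simple way to compensate for
    sensor detection lag. Illustrative sketch only."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    expanded = []
    for x, y in points:
        dx, dy = x - cx, y - cy
        d = math.hypot(dx, dy) or 1.0  # guard against a point at the centroid
        expanded.append((x + margin * dx / d, y + margin * dy / d))
    return expanded
```

For a 2 m × 2 m square contour, a margin of √2 pushes each corner one unit further out along both axes.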
• after edge exploration is completed, the contour of the preset floor medium can be determined. Once the contour is updated into the cleaning area map of the cleaning robot, the cleaning robot can, when subsequently planning a path according to the cleaning area map, plan a detour path or a cleaning path for the preset floor medium based on its contour.
• when the edge exploration mode is the inner edge exploration mode, performing edge exploration of the preset floor medium according to the edge exploration mode to obtain the contour of the preset floor medium includes: when the cleaning robot detects the preset floor medium, controlling the cleaning robot to perform an inner edge exploration task to obtain the contour points of the preset floor medium until the cleaning robot reaches the first contour point position again. The first contour point position is the coordinates of the cleaning robot when it obtained the first contour point.
• when the current coordinates of the cleaning robot coincide with, or are within a predetermined distance of, the coordinates at which the cleaning robot obtained the first contour point, it can be determined that the cleaning robot has reached the first contour point position again.
• in some embodiments, the coordinates of the cleaning robot may be determined by the coordinates of the sensor that detects the floor medium.
  • FIG. 77 is a schematic diagram of a scene in which a cleaning robot performs inner edge exploration on a preset ground medium according to an embodiment of the present application.
• in the process of the cleaning robot 1 entering the preset floor medium area 2 from a non-preset floor medium area, when the cleaning robot detects the preset floor medium, the cleaning robot is controlled to perform the inner edge exploration task to obtain the contour points of the preset floor medium until the cleaning robot reaches the first contour point position again, where the first contour point position is the position at which the cleaning robot obtained the first contour point. It can be understood that when the cleaning robot reaches the first contour point again, it means that the cleaning robot has returned to the position where the first contour point was detected and the exploration path for the preset floor medium has formed a closed loop; that is, the cleaning robot has completed the exploration of the preset floor medium.
• when determining whether the cleaning robot has reached the first contour point again, the cleaning robot is not required to return to a position exactly coinciding with the first contour point; when the distance between the cleaning robot and the first contour point is less than a preset distance threshold, it can also be determined that the cleaning robot has reached the first contour point again.
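The closed-loop termination condition described above reduces to a distance check against the first contour point. A minimal sketch follows; the 0.15 default threshold is an assumed value, not one stated in the text.

```python
def reached_first_contour_point(current, first, threshold=0.15):
    """Return True when the robot is within `threshold` map units of
    the first contour point, i.e. the exploration loop has closed.
    The 0.15 default is an illustrative assumption."""
    dx = current[0] - first[0]
    dy = current[1] - first[1]
    return (dx * dx + dy * dy) ** 0.5 < threshold
```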
• in some embodiments, controlling the cleaning robot to perform an inner edge exploration task to obtain contour points of the preset floor medium includes: controlling the cleaning robot to perform a first predetermined action until the cleaning robot detects the non-preset floor medium; controlling the cleaning robot to rotate in a first direction, and when the cleaning robot detects the preset floor medium again, obtaining the current coordinates of the cleaning robot and marking them as a contour point; and controlling the cleaning robot to travel at a predetermined angular speed and a predetermined linear speed, and when the cleaning robot detects the non-preset floor medium again, repeating the step of controlling the cleaning robot to rotate in the first direction, so as to continue obtaining contour points.
• when the edge exploration mode is the inner edge exploration mode, the cleaning robot 1 is already located in the medium area corresponding to the preset floor medium 2; it must therefore first be brought to a position near the edge so that its next behavior can be determined to be an effective edge exploration behavior. Only when the cleaning robot 1 is close to the edge of the preset floor medium can the coordinates at which the cleaning robot next detects the preset floor medium be determined to be a contour point of the preset floor medium.
• specifically, the cleaning robot can be controlled to perform the first predetermined action until it detects the non-preset floor medium; at this time, it can be determined that the current position of the cleaning robot is close to the edge of the preset floor medium, which ensures that the points detected during the cleaning robot's subsequent exploration behavior are contour points. As shown in Figure 77(b), when it is determined that the current position of the cleaning robot is close to the edge of the preset floor medium, the cleaning robot is controlled to rotate in the first direction, as shown in Figure 77(c). When the cleaning robot detects the preset floor medium again, the current coordinates of the cleaning robot can be marked as the first contour point, as shown in Figure 77(d); the cleaning robot is then controlled to travel at a predetermined angular speed and a predetermined linear speed, so that it first moves away from and then approaches the edge of the preset floor medium along an arc path.
• when the edge exploration mode is the inner edge exploration mode, the cleaning robot can choose the direction of edge exploration, specifically the left edge or the right edge. Taking the traveling direction of the cleaning robot as a reference, the left edge means that the left side of the cleaning robot is kept close to the edge of the preset floor medium while performing the inner edge exploration task. When the exploration direction is the left edge, the first direction is clockwise with the cleaning robot as a reference; when the exploration direction is the right edge, the first direction is counterclockwise with the cleaning robot as a reference.
• in some embodiments, after controlling the cleaning robot to rotate in the first direction and determining that the cleaning robot is located on the preset floor medium and close to its edge, the cleaning robot can also be controlled to travel at a predetermined angular speed and a predetermined linear speed with the angular speed gradually decreasing; any motion is acceptable as long as the cleaning robot can explore in a direction approaching the preset floor medium, and this is not limited here.
• controlling the cleaning robot to travel at a predetermined angular speed and a predetermined linear speed with the angular speed gradually decreasing enables the cleaning robot's exploration range to expand gradually in a spiral centered on the cleaning robot. This ensures that the cleaning robot can explore the non-preset floor medium and approach the inner edge of the preset floor medium again, so as to smoothly carry out the next exploration action and continue exploring the preset floor medium.
• in some embodiments, controlling the cleaning robot to perform the first predetermined action until the cleaning robot detects a non-preset floor medium includes: controlling the cleaning robot to rotate, obtaining in real time the angle through which the cleaning robot has rotated and the cleaning robot's detection result for the floor medium; and when the non-preset floor medium is detected and the angle of rotation is less than 180°, controlling the cleaning robot to stop rotating.
• when the edge exploration mode is the inner edge exploration mode, in order to bring the cleaning robot to the edge of the preset floor medium quickly, the cleaning robot can be controlled to rotate so as to quickly determine whether its current position is close to the edge of the preset floor medium. As shown in Figure 76(a)-(b), when the cleaning robot detects a non-preset floor medium and the rotation angle is less than 180°, it can be determined that the current position of the cleaning robot is close to the edge of the preset floor medium, and the cleaning robot can be controlled to stop rotating. As shown in Figure 76(c)-(e), if the non-preset floor medium is still not detected when the rotation angle of the cleaning robot reaches 180°, it can be determined that the cleaning robot is currently on the preset floor medium; at this time, the cleaning robot is controlled to stop rotating and to travel straight in its current direction until it detects a non-preset floor medium, at which point it is determined that the current position of the cleaning robot is close to the edge of the preset floor medium and the cleaning robot can be controlled to stop traveling, ready for the next inner edge exploration behavior.
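The first predetermined action described above (rotate up to 180°, otherwise stop rotating and travel straight) can be sketched as a small simulation. The 5° and 0.05 m step sizes and the sensor callback are illustrative assumptions, not values from the text.

```python
def first_predetermined_action(detects_preset_at):
    """Simulate the first predetermined action. `detects_preset_at`
    is a hypothetical sensor callback taking (heading_deg,
    forward_dist_m) and returning True while the floor-medium sensor
    still sees the preset medium."""
    heading = 0.0
    # Phase 1: rotate in 5-degree steps; stop as soon as the sensor
    # leaves the preset medium before a half turn is completed.
    while heading < 180.0:
        if not detects_preset_at(heading, 0.0):
            return ("stopped_by_rotation", heading, 0.0)
        heading += 5.0
    # Phase 2: still on the medium after 180 degrees, so the robot is
    # well inside it; stop rotating and travel straight until the
    # sensor detects the non-preset medium.
    steps = 0
    while detects_preset_at(heading, steps * 0.05):
        steps += 1
    return ("stopped_by_straight", heading, steps * 0.05)
```

A robot whose sensor leaves the medium after a 30° turn stops immediately; one surrounded by the medium completes the half turn and then drives straight to the edge.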
• when the cleaning robot 1 performs an exploration task and it is determined that the current edge exploration mode is the inner edge exploration mode, then after detecting the preset floor medium, the cleaning robot 1 can be controlled to rotate to quickly determine whether its current position is close to the edge of the preset floor medium. During the rotation, when the cleaning robot detects the non-preset floor medium and the rotation angle is less than 180°, it can be determined that the current position of the cleaning robot is close to the edge of the preset floor medium, and the cleaning robot can be controlled to stop rotating. If the cleaning robot still has not detected the non-preset floor medium when the rotation angle reaches 180°, it can be determined that the cleaning robot is currently on the preset floor medium; at this time, the cleaning robot is controlled to stop rotating and to travel straight in its current direction until it detects the non-preset floor medium, whereupon it is controlled to stop traveling. After it is determined that the current position of the cleaning robot is close to the edge of the preset floor medium, the cleaning robot 1 is controlled to rotate in the first direction until it detects the preset floor medium, and its current coordinates are obtained and marked as the first contour point. The cleaning robot is then controlled to travel at a predetermined angular speed and a predetermined linear speed, gradually approaching the edge of the preset floor medium along an arc path; when the cleaning robot detects the non-preset floor medium again, this indicates that the cleaning robot is once again close to the edge of the preset floor medium.
• during travel, the predetermined angular speed can be controlled to decrease gradually. It can be understood that when the cleaning robot travels at a predetermined angular speed and a predetermined linear speed with the angular speed gradually decreasing, the resulting predetermined trajectory is a spiral, which allows the cleaning robot's exploration range to expand gradually in a spiral centered on the cleaning robot and ensures that the cleaning robot can detect the preset floor medium again. When the angular speed remains unchanged, the predetermined trajectory is an arc.
• the trajectory can also be designed as other trajectories according to the needs of the situation; this is not limited here.
• the direction of the predetermined angular velocity is determined by the first direction. Specifically, when the first direction is clockwise with the cleaning robot as a reference, the direction of the predetermined angular velocity is vertically outward; when the first direction is counterclockwise with the cleaning robot as a reference, the direction of the predetermined angular velocity is vertically inward.
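The effect of a gradually decreasing angular speed can be illustrated with a simple unicycle-model integration: with constant linear speed v and shrinking angular speed ω, the instantaneous turning radius r = v/ω grows monotonically, so the path spirals outward. All numeric values here are illustrative assumptions.

```python
import math

def spiral_radii(v=0.2, omega0=1.0, decay=0.98, dt=0.1, steps=300):
    """Integrate a unicycle model with constant linear speed `v` and
    an angular speed that decays by `decay` each step; return the
    instantaneous turning radius r = v / omega at every step to show
    the outward spiral. Parameters are illustrative only."""
    x = y = theta = 0.0
    omega = omega0
    radii = []
    for _ in range(steps):
        x += v * math.cos(theta) * dt   # advance along current heading
        y += v * math.sin(theta) * dt
        theta += omega * dt             # turn by the current angular speed
        omega *= decay                  # angular speed gradually decreases
        radii.append(v / omega)
    return radii
```

With `decay=1.0` the radius stays constant and the trajectory degenerates to the arc case mentioned above.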
  • the cleaning robot is provided with at least one of a brushing element and a mopping element, wherein the brushing element includes a side sweeping element and/or a middle sweeping element;
• the method further includes: before the cleaning robot enters the medium area corresponding to the preset floor medium, controlling the mopping member to enter an off-the-ground state.
  • Figure 78 is a schematic structural diagram of a cleaning robot provided by an embodiment of the present application.
  • the cleaning robot 1 is provided with a brushing component and a mopping component 12 .
  • the brushing component includes a side scanning component 10 and a middle scanning component 11 .
• the side sweeping member 10 is used to sweep garbage at the edge of the cleaning robot's coverage area, the middle sweeping member 11 is used to sweep garbage in the middle of the coverage area, and the mopping member 12 is used to mop the coverage area.
• before the cleaning robot 1 enters the medium area 2, the side sweeping member 10 is controlled to enter a stowed state and the middle sweeping member 11 enters an off-the-ground state; this prevents the side sweeping member 10 and the middle sweeping member 11 from becoming entangled in the carpet, which could otherwise leave the cleaning robot 1 unable to enter the carpet. In addition, by controlling the mopping member 12 to also enter the off-the-ground state, the mopping member 12 is prevented from wetting the carpet after the cleaning robot 1 enters the medium area 2.
  • the cleaning robot is also provided with a fan component, and the fan component is used to suck garbage.
• before controlling the cleaning robot to enter the medium area corresponding to the preset floor medium, the method further includes:
• controlling the cleaning robot to stay in place for a preset duration while increasing the fan speed of the fan component; after the preset duration, the fan speed of the fan component is reduced and the cleaning robot is controlled to enter the medium area corresponding to the preset floor medium.
• the applicant found through research that when the cleaning robot 1 controls the middle sweeping member 11 and the mopping member 12 to enter the off-the-ground state, garbage adhering to the middle sweeping member 11 or the mopping member 12 may fall to the ground. Controlling the cleaning robot to stay in place for a preset duration while increasing the fan speed of the fan component allows the fan to suck away the garbage that falls from the middle sweeping member 11 or the mopping member 12, avoiding secondary pollution. After that, the fan speed of the fan component can be reduced and the cleaning robot can be controlled to enter the medium area 2.
  • the preset duration can be set to 5 seconds, 10 seconds, or 15 seconds.
  • the preset duration can also be set to other durations according to the needs of the situation, and there is no limit here.
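The entry sequence described above (stow and lift the cleaning members, dwell in place with a raised fan speed, then lower the fan speed and enter) can be sketched as an ordered action list. The action names and the 10 s dwell default are illustrative placeholders, not identifiers or values mandated by the text.

```python
def carpet_entry_sequence(dwell_s=10):
    """Ordered actions for entering the preset floor medium area, as
    described above. Names and the dwell default are illustrative."""
    return [
        ("stow", "side_sweeper"),      # avoid carpet entanglement
        ("lift", "mid_sweeper"),
        ("lift", "mop"),               # avoid wetting the carpet
        ("fan", "increase"),           # suck up garbage dropped while lifting
        ("wait_in_place_s", dwell_s),  # preset duration, e.g. 5, 10 or 15 s
        ("fan", "decrease"),
        ("move", "enter_medium_area"),
    ]
```

The key ordering constraint is that the fan speed is raised during the stationary dwell and lowered again before the robot actually drives onto the medium.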
• in some embodiments, after the cleaning robot enters the medium area, the method further includes: controlling the middle sweeping member to enter a close-to-ground working state and increasing the fan speed of the fan component.
• after the cleaning robot 1 enters the medium area 2, the middle sweeping member 11 is controlled to enter the close-to-ground working state so that, during the movement of the cleaning robot 1, the middle sweeping member 11 can clean the preset floor medium; increasing the fan speed of the fan component allows the cleaning robot 1 to clean the preset floor medium more thoroughly.
  • the cleaning robot is further provided with a driving member for driving the cleaning robot. After controlling the cleaning robot to enter the media area corresponding to the preset floor medium, the method further includes:
• monitoring whether the driving member slips or becomes stuck, and, when the number of slips is greater than a first preset number of times or the number of times the driving member becomes stuck is greater than a second preset number of times, controlling the cleaning robot to issue a warning.
• the driving member is used to drive the cleaning robot 1 to move forward, turn, or retreat. After entering the medium area 2, the cleaning robot 1 starts to monitor whether the driving member slips or becomes stuck, recording the number of times the driving member slips as the slip count and the number of times the driving member becomes stuck as the stuck count.
• the first preset number of times and the second preset number of times can each be set to 2, 3, or 5 times, and can also be set to other values according to the needs of the situation; this is not limited here.
  • the method further includes:
• when the cleaning robot is provided with the brushing member, it monitors whether the side sweeping member triggers current-threshold protection, and records the number of times the middle sweeping member becomes stuck as the middle-sweep stuck count;
• when the middle-sweep stuck count is greater than a third preset number of times, the cleaning robot is controlled to issue a warning.
• after the cleaning robot 1 enters the medium area 2, it starts monitoring whether the side sweeping member triggers current-threshold protection and detecting whether the middle sweeping member is stuck, recording each stuck occurrence toward the middle-sweep stuck count.
• if the middle sweeping member 11 becomes stuck within the medium area 2, the cleaning robot 1 may be damaged. Therefore, when the cleaning robot 1 finds that the middle-sweep stuck count is greater than the third preset number of times, it issues a warning, so that a user who notices the warning can help the cleaning robot 1 out of the predicament and avoid damage to the cleaning robot 1.
  • the third preset number of times can be set to 2 times, 3 times or 5 times, and can also be set to other times according to the needs of the situation, which is not limited here.
• when the cleaning robot 1 issues a warning, it may use a sound reminder, a vibration reminder, a light reminder, or push a message through a terminal smart device such as a mobile phone. Other reminder methods may also be adopted according to the needs of the situation; this is not limited here.
• when the edge exploration mode is the outer edge exploration mode, performing edge exploration of the preset floor medium according to the edge exploration mode to obtain the contour of the preset floor medium includes: when the cleaning robot detects the preset floor medium for the first time, marking the current position of the cleaning robot as the first contour point position; and controlling the cleaning robot to perform an outer edge exploration task to obtain the contour points of the preset floor medium until the cleaning robot reaches the first contour point position again. It can be understood that when the cleaning robot reaches the first contour point again, it means that the cleaning robot has returned to the position where the first contour point was detected and the exploration path for the preset floor medium has formed a closed loop; that is, the cleaning robot has completed the exploration of the preset floor medium.
• when determining whether the cleaning robot has reached the first contour point again, the cleaning robot is not required to return to a position exactly coinciding with the first contour point; when the distance between the cleaning robot and the first contour point is less than a preset distance threshold, it can also be determined that the cleaning robot has reached the first contour point again.
• in some embodiments, controlling the cleaning robot to perform an outer edge exploration task to obtain contour points of the preset floor medium includes: controlling the cleaning robot to perform a second predetermined action, wherein when the cleaning robot completes the second predetermined action, the floor medium detected by the cleaning robot is a non-preset floor medium; controlling the cleaning robot to travel at a predetermined angular speed and a predetermined linear speed, and when the cleaning robot detects the preset floor medium again, obtaining the current coordinates of the cleaning robot and marking them as a contour point; and repeating the step of controlling the cleaning robot to perform the second predetermined action to continue obtaining contour points.
  • the cleaning robot's edge exploration mode is the outer edge exploration mode.
  • the cleaning robot 1 detects the preset ground medium for the first time, and records the cleaning robot 1's The current position is the first contour point, and then the cleaning robot is controlled to perform a second predetermined action to ensure that the cleaning robot is located outside the preset ground medium, and the current position of the cleaning robot is close to the edge of the preset ground medium to ensure that the cleaning robot 1
  • the next behavior is an effective edge-side exploration behavior.
  • the points detected during the next exploration behavior of the cleaning robot can be determined as contour points; it is determined that the current cleaning robot is located outside the preset ground medium and is located close to the preset ground medium.
  • the cleaning robot After the edge position is reached, the cleaning robot is controlled to travel at a predetermined angular speed and a predetermined linear speed, so that the cleaning robot gradually approaches the edge of the preset ground medium along a curved path.
  • When the cleaning robot detects the preset ground medium again, this indicates that another contour point has been detected. The step of controlling the cleaning robot to perform the second predetermined action is then repeated to bring it back to a position outside the preset ground medium and close to its edge (that is, the sensor with which the cleaning robot detects the ground medium is outside the preset ground medium and close to its edge), after which the cleaning robot is again controlled to travel at the predetermined angular speed and predetermined linear speed; this cycle continues until the remaining contour points are obtained.
  • In some embodiments, after determining that the cleaning robot is outside the preset ground medium and close to its edge, the cleaning robot can also be controlled to travel at a predetermined angular speed and a predetermined linear speed with the angular speed gradually decreasing; any strategy that lets the cleaning robot explore toward the preset ground medium may be used and is not limited here. Traveling at the predetermined linear speed while the angular speed gradually decreases lets the cleaning robot cover a range that expands spirally outward around itself, ensuring that it can reach the preset ground medium and detect its contour points again, so that the next exploration action can proceed smoothly and exploration of the preset ground medium can continue.
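The decreasing-angular-speed travel rule described above can be sketched with a simple unicycle (differential-drive) model. This is an illustrative sketch only; the function names, the decay factor, and the detection callback are assumptions, not values from the patent:

```python
import math

def spiral_step(x, y, heading, v, omega, dt):
    """Advance a unicycle-model pose one time step at linear speed v
    and angular speed omega."""
    heading += omega * dt
    x += v * math.cos(heading) * dt
    y += v * math.sin(heading) * dt
    return x, y, heading

def explore_until_medium(detects_medium, v=0.1, omega0=1.0, decay=0.995,
                         dt=0.05, max_steps=10000):
    """Travel at a constant linear speed while the angular speed decays,
    so the turning radius v/omega grows and the path spirals outward,
    until the ground-medium sensor fires. Returns the contour point."""
    x = y = heading = 0.0
    omega = omega0
    for _ in range(max_steps):
        x, y, heading = spiral_step(x, y, heading, v, omega, dt)
        omega *= decay  # gradually reduce the angular speed
        if detects_medium(x, y):
            return (x, y)  # current coordinates become a contour point
    return None  # medium not found within the step budget
```

Because the angular speed decays while the linear speed is constant, the trajectory widens outward, matching the "gradually expanding spiral" behavior described above.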
  • The second predetermined action is one of the following: controlling the cleaning robot to retreat until it detects the non-preset floor medium and then retreat a further predetermined distance; controlling the cleaning robot to rotate in a predetermined direction until it detects the non-preset floor medium; or controlling the cleaning robot to retreat until it detects the non-preset floor medium, retreat a further predetermined distance, and then rotate a predetermined angle in a predetermined direction.
  • Here, the edge exploration mode is the outer edge exploration mode.
  • The cleaning robot can choose the direction of edge exploration: specifically, the left edge or the right edge. The left edge is defined with respect to the cleaning robot's direction of travel, meaning the left side of the cleaning robot stays close to the edge of the preset ground medium while the outer edge exploration task is performed.
  • When the exploration direction is the left edge, the predetermined angle in the second predetermined action is rotated clockwise, the changing trend of the cleaning robot's linear velocity direction is counterclockwise (that is, the curved trajectory of the cleaning robot extends counterclockwise), and the angular velocity direction points outward, perpendicular to the cleaning surface (i.e., the ground). As shown in Figure 79(d)-(f), when the exploration direction is the right edge, the predetermined angle in the second predetermined action is rotated counterclockwise, the changing trend of the cleaning robot's linear velocity direction is clockwise (that is, the curved trajectory extends clockwise), and the angular velocity direction likewise points outward, perpendicular to the cleaning surface.
  • FIG. 79 is a schematic diagram of a scene in which a cleaning robot performs outer edge exploration on a preset ground medium according to an embodiment of the present application.
  • When the cleaning robot 1 detects the preset ground medium 2 for the first time, it records its current position as the first contour point. If there are no obstacles on the edge of the media area range 2 of the preset ground medium, the cleaning robot 1 can travel around the edge of the media area range 2 and return to the first contour point, or complete the exploration task once the distance between the cleaning robot and the first contour point is less than the preset distance threshold.
  • Since part of the robot's body may have entered the media area range 2, the cleaning robot 1 is controlled to perform the second predetermined action. When the cleaning robot 1 completes the second predetermined action, it is located outside the media area range 2 of the preset ground medium, that is, the cleaning robot 1 detects a non-preset ground medium; the cleaning robot 1 is then controlled to travel at a predetermined angular speed and a predetermined linear speed.
  • When the cleaning robot 1 detects the preset ground medium again, its current coordinates are obtained as a contour point, and the second predetermined action followed by travel at the predetermined angular and linear speeds is repeated (repeating the steps of Figure 79(d)-(f)) to find the next contour point of the preset ground medium.
  • When the cleaning robot 1 reaches the first contour point again, or when the distance between the cleaning robot 1 and the first contour point is less than the preset distance threshold, the exploration task is completed.
  • When executing the outer edge exploration mode, the second predetermined action may be to control the cleaning robot 1 to retreat until it detects a non-preset ground medium and then retreat a further predetermined distance; it may also be to control the cleaning robot 1 to rotate in a predetermined direction until it detects the non-preset floor medium; or, as shown in Figure 79(d)-(e), it may be to control the cleaning robot 1 to retreat until it detects the non-preset ground medium, retreat a further predetermined distance, and then rotate a predetermined angle in a predetermined direction. Controlling the cleaning robot to retreat the preset distance ensures that, while it travels at the predetermined angular speed and predetermined linear speed, its mopping member stays outside the preset ground medium; in other words, the mopping member is prevented from entering the media area and wetting or contaminating the preset ground medium during that travel.
  • The preset retreat distance is related to the longest distance by which the orthographic projection of the mop's rotation coverage extends beyond the edge of the cleaning robot. Rotating a predetermined angle in the predetermined direction, meanwhile, extends the trajectory along which the cleaning robot travels at the predetermined angular and linear speeds; that is, it reduces the density of the contour points explored, improving the efficiency of outer edge exploration and shortening the time for cleaning robot 1 to complete it.
  • In some embodiments, after determining that the cleaning robot is outside the preset ground medium and close to its edge, the cleaning robot can also be controlled to travel at a predetermined angular speed and a predetermined linear speed with the angular speed gradually decreasing. This lets the cleaning robot expand its exploration range spirally outward around itself, ensuring that it can reach the preset ground medium and approach its edge again, so that the next exploration action can proceed smoothly and exploration of the preset ground medium can continue.
  • The trajectory formed while the cleaning robot travels at the predetermined angular speed and predetermined linear speed until it detects the preset ground medium can be a spiral trajectory, an arc trajectory, or another trajectory designed as needed, such as a polyline trajectory; it is not limited here.
  • In some embodiments, edge exploration is performed on the preset ground medium according to the edge exploration mode to obtain the outline of the preset ground medium, including: when the cleaning robot detects the preset ground medium for the first time, marking the current position of the cleaning robot as the first contour point; controlling the cleaning robot to perform the outer edge exploration task in a second direction to obtain contour points of the preset ground medium; when the cleaning robot detects an obstacle, controlling the cleaning robot to perform a third predetermined action and then perform the outer edge exploration task in a third direction, opposite to the second direction, to obtain further contour points of the preset ground medium; and when the cleaning robot detects an obstacle again, ending the outer edge exploration task.
  • That is, when the cleaning robot detects the preset ground medium for the first time, it marks its current position as the first contour point and selects the left or right edge direction in which to begin the outer edge exploration task. When the cleaning robot detects an obstacle through lidar or a collision sensor, it performs the third predetermined action and carries out the outer edge exploration task in the third direction, opposite to the second direction; the specific process of obtaining contour points is the same as in the outer edge exploration task described above and is not repeated here. When the cleaning robot detects an obstacle again, the outer edge exploration task ends.
  • When the cleaning robot is blocked by an obstacle while performing the outer edge exploration task, it can be controlled to return to the position where the first contour point was obtained and continue the task in the opposite direction; in this way, the cleaning robot explores along the edge of the preset ground medium to the greatest extent possible.
  • The third predetermined action is a U-turn on the spot, or navigating to the first contour point. That is, the cleaning robot can turn around in place and continue the outer edge exploration task in the third direction opposite to the second direction, or it can navigate to the first contour point and then continue the task in that direction. Turning around in place and directly continuing the task re-explores part of the preset ground medium that has already been explored; navigating to the first contour point before continuing is therefore more efficient.
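The two-direction scheme above (explore one way, hit an obstacle, resume from the first contour point in the opposite direction, stop at the second obstacle) can be outlined as follows. `trace_edge` is a hypothetical stand-in for the robot's edge-following behaviour; names and signatures are illustrative:

```python
def bidirectional_edge_exploration(trace_edge, first_point):
    """Collect contour points in one direction until an obstacle blocks
    the robot, then resume from the first contour point in the opposite
    direction until an obstacle is met again.

    trace_edge(start, direction) must return (points, hit_obstacle)."""
    contour = [first_point]
    points, blocked = trace_edge(first_point, +1)   # second direction
    contour += points
    if blocked:
        # Third predetermined action: navigate back to the first contour
        # point (re-traces less of the edge than a U-turn in place would).
        points, _ = trace_edge(first_point, -1)     # opposite, third direction
        contour += points
    return contour
```

With a fake `trace_edge` that stops at an obstacle from both sides, the result is the union of the two runs around the edge.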
  • In some embodiments, the predetermined angular velocity gradually decreases. When the cleaning robot travels at a predetermined angular velocity and a predetermined linear velocity with the angular velocity gradually decreasing, the trajectory along which it detects the preset ground medium may be a spiral; this lets the exploration range expand spirally outward around the cleaning robot, ensuring that the preset ground medium signal can be detected. When the angular speed remains unchanged, the trajectory is an arc. Other trajectories can also be designed as needed and are not limited here.
  • FIG. 80 is a schematic diagram of another scene in which a cleaning robot according to an embodiment of the present application performs outer edge exploration on a preset ground medium.
  • Position 20 is the first contour point, and the cleaning robot 1 encounters an obstacle for the first time at position 21. At this point, the cleaning robot is controlled either to turn around in place and continue the outer edge exploration task along the left edge direction, opposite to the right edge direction, or to navigate to the first contour point and then continue the outer edge exploration task along the left edge direction; this is not limited here.
  • When the cleaning robot 1 encounters an obstacle for the second time at position 22, the edge exploration is completed. The obstacles encountered by the cleaning robot 1 can be the same obstacle or different obstacles; this is not limited here.
  • In some embodiments, the method further includes: determining the outline of the preset ground medium based on the contour point information.
  • That is, the contour of the preset ground medium is determined from the obtained contour points: after the cleaning robot 1 obtains the contour points of the preset ground medium, it can fit the multiple contour points to obtain the contour of the preset ground medium.
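As one concrete (assumed) example of the fitting step, a closed contour of noisy points can be smoothed with a circular moving average; the patent does not prescribe a particular fitting method:

```python
def smooth_closed_contour(points, window=3):
    """Smooth an ordered closed contour by replacing each point with the
    average of a sliding window of its neighbours (indices wrap around,
    since the contour is closed)."""
    n = len(points)
    half = window // 2
    smoothed = []
    for i in range(n):
        xs = [points[(i + k) % n][0] for k in range(-half, half + 1)]
        ys = [points[(i + k) % n][1] for k in range(-half, half + 1)]
        smoothed.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return smoothed
```

Each output point is pulled toward its neighbours, producing the "smoother contour" effect described for the fitting variant.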
  • In some embodiments, the method further includes: controlling the cleaning robot to explore the preset ground medium along its edge according to the inner edge exploration mode to obtain the outline of the preset ground medium.
  • If the cleaning robot 1 cannot explore an edge that lies close to a room wall while exploring the preset ground medium in the outer edge exploration mode, too much contour point information will be missing and the contour of the preset ground medium may not be determinable. In that case, combining the inner edge exploration method improves the completeness of the contour of the preset ground medium.
  • FIGS. 81 to 83 are schematic diagrams of scenes in which a cleaning robot according to an embodiment of the present application determines the outline of a preset ground medium by connecting or fitting processing or graphic matching processing on contour points.
  • After cleaning robot 1 performs inner edge exploration of the media area range 2 in the inner edge exploration mode, it determines multiple contour points and connects them to obtain the outline of the preset ground medium. Alternatively, the cleaning robot 1 determines multiple contour points and performs fitting processing on them, thereby obtaining a smoother contour of the preset ground medium.
  • FIG. 83 is a schematic diagram of a scene for performing graphic matching processing on the outline of a preset ground medium according to an embodiment of the present application.
  • the preset ground medium is usually a regular shape.
  • The cleaning robot 1 obtains a first shape after connecting the contour points, and calculates the matching degree between the first shape and each minimum polygon enclosing it in order to determine the contour of the preset ground medium. For example, if the cleaning robot 1 calculates that the matching degree between the smallest rectangle enclosing the first shape and the first shape is 98%, while the matching degree between the smallest trapezoid enclosing the first shape and the first shape is 60%, the smallest rectangle is determined to be the outline of the preset ground medium.
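The matching-degree comparison can be illustrated with the rectangle case, taking the matching degree as the area ratio between the contour polygon and its smallest axis-aligned enclosing rectangle (an assumed definition; the patent does not fix the formula):

```python
def polygon_area(pts):
    """Shoelace area of a simple polygon given as ordered (x, y) vertices."""
    total = 0.0
    n = len(pts)
    for i in range(n):
        x1, y1 = pts[i]
        x2, y2 = pts[(i + 1) % n]
        total += x1 * y2 - x2 * y1
    return abs(total) / 2.0

def rect_match_degree(pts):
    """Matching degree between a contour polygon and its smallest
    axis-aligned enclosing rectangle, as an area ratio in [0, 1]."""
    xs = [p[0] for p in pts]
    ys = [p[1] for p in pts]
    rect_area = (max(xs) - min(xs)) * (max(ys) - min(ys))
    return polygon_area(pts) / rect_area if rect_area else 0.0
```

A rectangular carpet contour scores near 1.0, while a triangular shape scores 0.5 against the rectangle; computing the same ratio for other enclosing polygons and keeping the best match mirrors the 98% vs. 60% comparison in the example.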
  • In some embodiments, the method further includes: determining a first media area according to the outline of the preset ground medium, and obtaining, from a pre-constructed cleaning area map, a second media area closest to the first media area, where the ground medium types corresponding to the first media area and the second media area are the same; and merging the first media area and the second media area in the cleaning area map.
  • the cleaning area map is a map pre-constructed by the cleaning robot 1 and used to plan a movement path or a cleaning path.
  • After the first media area is determined according to the outline of the preset ground medium, the cleaning area map can be used to check whether it contains a media area with the same ground medium type as the first media area; if it does, the second media area closest to the first media area is selected. The distance between two media areas can be the shortest distance, the longest distance, etc., between the contour points of the two areas, and is not limited here.
  • The ground medium type represents the kind of ground medium: for example, if the ground medium type of the first media area is a carpet type, the ground medium type corresponding to the second media area is also a carpet type.
  • Merging the areas lets the cleaning robot plan its subsequent travel trajectories or tasks better. For example, after the first media area and the second media area belonging to the same preset ground medium are identified and merged, the cleaning robot treats them as one complete media area. If the obstacle that separated the preset floor medium is removed, the cleaning robot can then finish cleaning the preset floor medium with a single cleaning plan, and the area originally occupied by the obstacle is cleaned at the same time; there is no need to plan and clean the first media area and the second media area separately, and the area originally occupied by the obstacle is not missed.
  • FIG. 84 is a schematic diagram of a scene in which a cleaning robot merges adjacent media areas according to an embodiment of the present application.
  • A is the first media area determined by the cleaning robot 1 based on the outline of the preset ground medium, and B is the second media area closest to the first media area A, obtained from the cleaning area map.
  • The preset separation distance can be set to 20 cm, 30 cm, or 40 cm; it can also be set to other distances as needed and is not limited here.
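The nearest-area selection and merge can be sketched as follows, representing each media area as a dictionary of a ground-medium type plus contour points; the data layout and the point-to-point distance rule are illustrative assumptions:

```python
from math import hypot

def min_area_distance(a, b):
    """Shortest distance between the contour points of two media areas."""
    return min(hypot(p[0] - q[0], p[1] - q[1])
               for p in a["points"] for q in b["points"])

def nearest_same_type_area(first, areas, max_gap=0.3):
    """Pick the closest area with the same ground-medium type, provided it
    lies within the preset separation distance max_gap (in metres)."""
    candidates = [a for a in areas
                  if a is not first and a["type"] == first["type"]]
    if not candidates:
        return None
    best = min(candidates, key=lambda a: min_area_distance(first, a))
    return best if min_area_distance(first, best) <= max_gap else None

def merge_areas(a, b):
    """Merge two media areas by pooling their contour points."""
    return {"type": a["type"], "points": a["points"] + b["points"]}
```

Here `max_gap=0.3` corresponds to a 30 cm preset separation distance; areas of the same type farther apart than that remain separate.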
  • In some embodiments, the method further includes: controlling the cleaning robot to clean the preset floor medium along a first arcuate path using the side scanning element and/or the middle scanning element; and controlling the cleaning robot to clean the preset floor medium along a second arcuate path using the side scanning element and/or the middle scanning element, where the second arcuate path is orthogonal to the first arcuate path.
  • After the outline of the preset ground medium is obtained, the cleaning path can be determined according to that outline. As shown in FIG. 85, while the cleaning robot 1 moves along the first arcuate path and then the second arcuate path, the preset floor medium is cleaned through the side scanning element 10; the resulting trajectory covers all areas of the preset ground medium, avoiding both repeated sweeping and missed sweeping.
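The two orthogonal arcuate (bow-shaped) passes can be sketched as waypoint generation over the medium's bounding rectangle; coordinates and spacing here are illustrative:

```python
def arcuate_path(width, height, spacing, orientation="horizontal"):
    """Generate the turning-point waypoints of a bow-shaped path covering
    a width x height rectangle; calling it once per orientation yields
    two passes that cross at right angles."""
    path = []
    if orientation == "horizontal":
        y, i = 0.0, 0
        while y <= height:
            row = [(0.0, y), (width, y)]
            path += row if i % 2 == 0 else row[::-1]  # alternate direction
            y += spacing
            i += 1
    else:  # "vertical": orthogonal to the first pass
        x, i = 0.0, 0
        while x <= width:
            col = [(x, 0.0), (x, height)]
            path += col if i % 2 == 0 else col[::-1]
            x += spacing
            i += 1
    return path
```

Running the horizontal pass followed by the vertical pass gives crossing sweep lines, matching the orthogonal first and second arcuate paths described above.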
  • In summary, when the cleaning robot detects the preset ground medium, the status information of the cleaning robot is obtained and the edge exploration mode is determined from it, where the edge exploration mode includes the inner edge exploration mode and the outer edge exploration mode; the preset ground medium is then explored along its edge according to the determined mode to obtain its outline.
  • When cleaning robots in the related art explore the ground medium, they usually follow a polyline or arc path: the cleaning robot is controlled to rotate to a predetermined direction and travel straight until it detects the target signal, then rotate in another direction and move away from the position where the target signal was detected, repeating this to form a zigzag exploration path; alternatively, the cleaning robot is controlled to travel at a constant angular speed and linear speed until it detects the target signal.
  • Figure 86 shows the trajectories of using polyline, arc, and spiral trajectories to explore the preset ground medium.
  • Embodiments of the present application provide a ground medium exploration method, a cleaning robot, and a storage medium. The method is applied to a cleaning robot, which can be a sweeper or another intelligent robot, and is not limited here.
  • FIG. 87 is a schematic flow chart of a ground medium exploration method provided by an embodiment of the present application.
  • the ground medium exploration method includes steps S1 to S4.
  • Step S1: determine a starting position, where the cleaning robot detects the target ground medium at the starting position.
  • the cleaning robot uses sensors (such as ultrasonic sensors) to detect preset floor media (such as carpets, floor mats, etc. that require special treatment) and non-preset floor media during travel.
  • The target medium can be a preset ground medium or a non-preset ground medium: when the outer edge exploration action is adopted, the target medium is the preset ground medium; when the inner edge exploration action is adopted, the target medium is a non-preset ground medium.
  • For the outer edge exploration action, the position of the cleaning robot when it first detects the preset ground medium is used as the starting position. If the inner edge exploration action is used, the position of the cleaning robot when, after first detecting the preset ground medium, it continues to explore and detects a non-preset ground medium can be used as the starting position.
  • Step S2: control the cleaning robot to perform a predetermined action.
  • When the cleaning robot performs step S1 or S2, the method also includes: obtaining contour points of the preset ground medium and taking the first contour point obtained as the first contour point, where the first contour point position is the coordinates of the cleaning robot at the moment it obtains that point. The recorded coordinates may coincide with the cleaning robot's coordinates at that moment or lie a predetermined distance away from them, and they may also be determined from the coordinates of the sensor that detects the floor medium. A contour point of the preset ground medium is the coordinate of the cleaning robot at the moment it detects the preset ground medium while performing step S1 or S2.
  • After the cleaning robot determines the starting position, it performs the predetermined action, which completes the preparation for the edge exploration action; different exploration actions correspond to different predetermined actions. When the cleaning robot takes the outer edge exploration action, it marks the contour point of the preset ground medium acquired for the first time during step S1 as the first contour point and then performs the predetermined action of step S2. When the cleaning robot takes the inner edge exploration action, since the edge of the preset ground medium must be determined and the robot's starting posture adjusted before the exploration action starts, the contour point of the preset ground medium first obtained while performing the predetermined action of step S2 is marked as the first contour point.
  • Step S3: control the cleaning robot to travel at a predetermined angular speed and a predetermined linear speed until it detects the target signal again, where the angular speed gradually decreases.
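Steps S1-S3, together with the return-to-start termination test described earlier, can be summarized in one loop. All callables here are hypothetical stand-ins for the robot's real sensing and motion behaviours:

```python
def explore_ground_medium(detect_start, predetermined_action,
                          travel_until_detect, close_enough=0.1,
                          max_points=500):
    """S1: fix the starting position / first contour point; then repeat
    S2 (predetermined action) and S3 (angular/linear-speed travel until
    the target signal), recording a contour point per detection, until
    the robot is back near the first contour point."""
    first = detect_start()
    contour = [first]
    while len(contour) < max_points:
        predetermined_action()            # S2: preparatory action
        pt = travel_until_detect()        # S3: travel until detection
        contour.append(pt)
        dx, dy = pt[0] - first[0], pt[1] - first[1]
        if len(contour) > 2 and (dx * dx + dy * dy) ** 0.5 < close_enough:
            break                         # back at the first contour point
    return contour
```

Feeding it a scripted sequence of detections traces one loop around the medium and stops as soon as a detection lands within the distance threshold of the first contour point.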


Abstract

Embodiments of the present application relate to a cleaning robot control method; processing, generation, area-division and exploration methods, devices and systems; and a storage medium. The method comprises: obtaining a cleaning task map; determining whether the cleaning task map includes a carpet area and, if it does, controlling the cleaning robot to clean the carpet in the carpet area using a brushing element; controlling the cleaning robot to clean at least part of the non-carpet areas in the cleaning task map by means of a cleaning element; and thereby automatically brushing and sweeping the carpet according to the carpet area in the cleaning task map while at least mopping the non-carpet areas. As a result, the user does not need to set different cleaning modes for different areas, which improves the cleaning intelligence of the cleaning robot.
PCT/CN2022/109209 2022-07-29 2022-07-29 Procédé de commande de robot de nettoyage, et procédé, dispositif et système de traitement, de génération, de division de zone et d'exploration WO2024021111A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/109209 WO2024021111A1 (fr) 2022-07-29 2022-07-29 Procédé de commande de robot de nettoyage, et procédé, dispositif et système de traitement, de génération, de division de zone et d'exploration

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/109209 WO2024021111A1 (fr) 2022-07-29 2022-07-29 Procédé de commande de robot de nettoyage, et procédé, dispositif et système de traitement, de génération, de division de zone et d'exploration

Publications (1)

Publication Number Publication Date
WO2024021111A1 true WO2024021111A1 (fr) 2024-02-01

Family

ID=89705118

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/109209 WO2024021111A1 (fr) 2022-07-29 2022-07-29 Procédé de commande de robot de nettoyage, et procédé, dispositif et système de traitement, de génération, de division de zone et d'exploration

Country Status (1)

Country Link
WO (1) WO2024021111A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010051423A1 (fr) * 2008-10-30 2010-05-06 Intellibot Robotics Llc Procédé de nettoyage d’une surface utilisant un dispositif de nettoyage automatique
CN110236456A (zh) * 2019-01-08 2019-09-17 云鲸智能科技(东莞)有限公司 拖地机器人的控制方法、装置、设备及存储介质
CN112741555A (zh) * 2019-10-31 2021-05-04 深圳拓邦股份有限公司 一种清扫方法、系统及清扫设备
CN113208499A (zh) * 2021-05-26 2021-08-06 深圳市普森斯科技有限公司 清洁设备及其控制方法、计算机可读存储介质
CN113693495A (zh) * 2021-02-10 2021-11-26 北京石头世纪科技股份有限公司 自动清洁设备清洁方法及装置、介质及电子设备
WO2022117107A1 (fr) * 2020-12-04 2022-06-09 苏州宝时得电动工具有限公司 Robot de nettoyage, système de nettoyage et procédé de nettoyage



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22952558

Country of ref document: EP

Kind code of ref document: A1