CN117297449A - Cleaning setting method, cleaning apparatus, computer program product, and storage medium - Google Patents


Info

Publication number
CN117297449A
CN117297449A
Authority
CN
China
Prior art keywords
cleaning
area
cleaned
target object
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311438744.7A
Other languages
Chinese (zh)
Inventor
赖志鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen 3irobotix Co Ltd
Original Assignee
Shenzhen 3irobotix Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen 3irobotix Co Ltd filed Critical Shenzhen 3irobotix Co Ltd
Priority to CN202311438744.7A
Publication of CN117297449A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011 Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24 Floor-sweeping machines, motor-driven
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/29 Floor-scrubbing machines characterised by means for taking-up dirty liquid
    • A47L11/30 Floor-scrubbing machines characterised by means for taking-up dirty liquid by suction
    • A47L11/302 Floor-scrubbing machines characterised by means for taking-up dirty liquid by suction having rotary tools
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4061 Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04 Automatic control of the travelling movement; Automatic obstacle detection
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/06 Control of the cleaning action for autonomous devices; Automatic detection of the surface condition before, during or after cleaning

Landscapes

  • Electric Vacuum Cleaner (AREA)

Abstract

The application discloses a cleaning setting method, a cleaning apparatus, a computer program product, and a non-volatile computer-readable storage medium. The method comprises: identifying a target object in a current scene and determining the scene type of the current scene according to the type of the target object; determining an area to be cleaned that covers the associated target objects of the current scene; and setting the cleaning parameters of the area to be cleaned according to the scene type. The area to be cleaned set in this way matches the actually soiled area of the current scene more closely, and the set cleaning parameters match the actual degree of soiling more closely, so the cleaning effect can be ensured and the convenience of user operation can be improved.

Description

Cleaning setting method, cleaning apparatus, computer program product, and storage medium
Technical Field
The present application relates to the field of self-moving robot technology, and more particularly, to a cleaning setting method, a cleaning apparatus, a computer program product, and a non-volatile computer-readable storage medium.
Background
In the prior art, after a self-moving robot such as a sweeping robot saves a map of a designated area, a user edits the range and position of the map at a base station or a terminal according to requirements, sets designated cleaning parameters, and the like, and the robot then executes a corresponding cleaning command on the designated area according to the edited navigation map.
Although this can meet the user's customization requirements, it requires manual editing by the user during use; the operation is inconvenient, which reduces ease of use.
Disclosure of Invention
The embodiments of the present application provide a cleaning setting method, a cleaning apparatus, a computer program product, and a non-volatile computer-readable storage medium, which can identify the type of a target object in a current scene, determine the scene type according to the type of the target object, determine an area to be cleaned covering the associated target objects, and set corresponding cleaning parameters according to the scene type. The area to be cleaned and the cleaning parameters are set without manual editing by the user, so operation is convenient.
The cleaning setting method of the present application comprises: identifying a target object in a current scene, and determining the scene type of the current scene according to the type of the target object; determining an area to be cleaned covering the associated target objects of the current scene; and setting the cleaning parameters of the area to be cleaned according to the scene type.
In certain embodiments, the determining the area to be cleaned comprises: determining a fouling coefficient of each target object in the current scene, wherein the fouling coefficient is determined according to at least one of the user's activity duration at the target object and a cleaning period corresponding to the current scene; determining that a target object whose fouling coefficient is greater than a preset threshold is a target object associated with the current scene; determining areas to be merged corresponding to the target objects associated with the current scene; and merging the areas to be merged to determine the area to be cleaned, wherein the area to be cleaned covers all the areas to be merged and the areas between them.
In some embodiments, the determining the fouling coefficient of each of the target objects in the current scene comprises: determining an initial fouling coefficient of each target object in the current scene according to the user's activity duration at the target object; and adjusting the initial fouling coefficients according to the cleaning period to determine the fouling coefficients of the target objects in the current scene.
In some embodiments, the area to be cleaned is a rectangular area covering the associated target objects of the current scene, and the method further comprises: determining a target area surrounding the edge contour of the area to be cleaned, wherein the spacing between the edge contour of the target area and the corresponding edge contour of the area to be cleaned is a preset spacing; merging the target area and the area to be cleaned to generate a target cleaning area; and re-determining the area to be cleaned according to the target cleaning area.
In certain embodiments, the determining the area to be cleaned comprises: acquiring position information of an edge contour of the target object associated with the current scene, wherein the position information comprises a first coordinate and a second coordinate; determining the smallest first coordinate, the largest first coordinate, the smallest second coordinate and the largest second coordinate in the position information of all the associated target objects; and determining the area to be cleaned according to the minimum first coordinate, the maximum first coordinate, the minimum second coordinate and the maximum second coordinate.
In certain embodiments, the method further comprises: displaying one or more pieces of cleaning recommendation information, wherein each piece of cleaning recommendation information corresponds to a scene; receiving target cleaning recommendation information determined by a user from the cleaning recommendation information; and cleaning according to the cleaning parameters, the area to be cleaned, and the cleaning time of the target cleaning recommendation information.
In certain embodiments, the method further comprises: displaying the edge contour of the area to be cleaned corresponding to the target cleaning recommendation information when a viewing instruction for the target cleaning recommendation information is received; acquiring the user's editing information for the displayed area to be cleaned so as to determine the edited area to be cleaned; displaying the edge contour of the edited area to be cleaned; and displaying the target object according to its relative position within the edited area to be cleaned.
The cleaning apparatus of the present application comprises a processor, a memory, and a computer program, wherein the computer program is stored in the memory and executed by the processor, and the computer program comprises instructions for executing the cleaning setting method of any of the embodiments described above.
A computer program product of an embodiment of the present application comprises a computer program, and the computer program comprises instructions for executing the cleaning setting method of any of the embodiments described above.
The non-volatile computer-readable storage medium of the present application stores a computer program that, when executed by a processor, causes the processor to execute the cleaning setting method of any of the above embodiments.
According to the cleaning setting method, the cleaning apparatus, the computer program product, and the non-volatile computer-readable storage medium, a target object, such as an obstacle (furniture and the like), is identified in the current scene, and the scene type of the current scene is determined according to the type of the target object. The area covering the associated target objects of the current scene is then determined as the area to be cleaned, and finally the cleaning parameters of the area to be cleaned (such as a water-output gear, a roller-brush speed gear, a suction gear, and the like) are set according to the scene type. The area to be cleaned thus matches the actually soiled area of the current scene more closely, and the set cleaning parameters match the actual degree of soiling more closely, so the cleaning effect can be ensured and the convenience of user operation can be improved.
Additional aspects and advantages of embodiments of the application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of embodiments of the application.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
fig. 1 is a schematic view of an application scenario of a cleaning setting method according to some embodiments of the present application;
FIG. 2 is a flow chart of a cleaning setup method of certain embodiments of the present application;
FIG. 3 is a flow chart of a cleaning setup method of certain embodiments of the present application;
FIG. 4 is a schematic illustration of a scenario of a cleaning setup method of certain embodiments of the present application;
FIG. 5 is a flow chart of a cleaning setup method of certain embodiments of the present application;
FIG. 6 is a schematic illustration of a scenario of a cleaning setup method of certain embodiments of the present application;
FIG. 7 is a schematic illustration of a scenario of a cleaning setup method of certain embodiments of the present application;
FIG. 8 is a flow chart of a cleaning setup method of certain embodiments of the present application;
FIG. 9 is a schematic illustration of a scenario of a cleaning setup method of certain embodiments of the present application;
FIG. 10 is a schematic illustration of a scenario of a cleaning setup method of certain embodiments of the present application;
FIG. 11 is a schematic illustration of a scenario of a cleaning setup method of certain embodiments of the present application;
FIG. 12 is a flow chart of a cleaning setup method of certain embodiments of the present application;
FIG. 13 is a schematic block diagram of a cleaning arrangement according to certain embodiments of the present application;
FIG. 14 is a schematic structural diagram of a computer device according to some embodiments of the present application;
fig. 15 is a schematic diagram of a connection state of a nonvolatile computer-readable storage medium and a processor.
Detailed Description
Embodiments of the present application are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are exemplary only for explaining the embodiments of the present application and are not to be construed as limiting the embodiments of the present application.
To facilitate an understanding of the present application, the following description of terms appearing in the present application will be provided:
Self-moving robot: a machine that performs work automatically. It can accept human commands, run pre-programmed routines, and act according to principles set with artificial intelligence technology.
The sweeping robot is one kind of self-moving robot. Also called an automatic sweeper, intelligent vacuum, or robot vacuum cleaner, it is an intelligent household appliance that can complete floor cleaning in a room automatically by means of artificial intelligence. It generally adopts brushing and vacuuming modes, first drawing floor debris into its garbage storage box, thereby completing the floor-cleaning function. Robots that perform sweeping, dust collection, and floor scrubbing are generally referred to collectively as floor-cleaning robots.
The cleaning device of the present application may be a self-moving robot or a device including a self-moving robot, which may be a sweeping robot, an autonomous mobile robot (AMR), or the like. For brevity, the present application takes the sweeping robot as an example of the self-moving robot; the principle is similar for other types of robots and will not be repeated here.
After the sweeping robot saves a map of a designated area, a user edits the range and position of the map at a base station or a terminal according to requirements and sets designated cleaning parameters, and the sweeping robot then executes corresponding cleaning parameter commands on the designated area according to the edited navigation map. Although this can meet the user's customization requirements, it requires manual editing by the user during use; the operation is inconvenient, which reduces ease of use.
In order to solve the above technical problems, an embodiment of the present application provides a cleaning setting method.
An application scenario of the technical solution of the present application is described first. As shown in fig. 1, the cleaning setting method provided in the present application may be applied to a cleaning system 1000, which includes the sweeping robot 100 and the base station 200.
The cleaning apparatus of the present application may include only the sweeping robot 100, or may include the sweeping robot 100 and the base station 200 (also referred to as a dust-collecting station); the sweeping robot 100 and the base station 200 may be connected through a network to determine the current state of the opposite end (e.g., battery level, operating state, position information, etc.).
The sweeping robot 100 includes a vision sensor 10, a radar sensor 20, a processor 30, a memory 40, and a body 50. The processor 30 communicates with the vision sensor 10 and the radar sensor 20 through a network, and the vision sensor 10, the radar sensor 20, and the processor 30 are provided on the body 50 of the sweeping robot 100.
The vision sensor 10 is used for acquiring images of a scene. The vision sensor 10 may be a visible-light (RGB) camera, a visible-light depth (RGBD) camera, an infrared camera, a thermal imaging camera, a depth camera, or the like; the RGB camera and the RGBD camera may collect a visible-light image of a scene, the infrared camera may collect an infrared image of the scene, the thermal imaging camera may collect a thermal image of the scene, and the depth camera may collect a depth image.
Optionally, there are one or more cameras 10. The camera 10 may be disposed on a side wall of the body 50; for example, the camera 10 is disposed facing directly ahead of the sweeping robot 100 to collect images of the scene in front of it, or cameras 10 are disposed on both sides of the sweeping robot 100 to collect images of the scenes on both sides during forward movement.
The radar sensor 20 is used to acquire point-cloud information of objects in a scene. The radar sensor 20 may be a laser distance sensor (LDS), such as a time-of-flight (TOF) radar based on the TOF principle, a structured-light radar based on the structured-light principle, or the like.
The radar sensor 20 is provided at the top wall of the sweeping robot 100. The radar sensor 20 may protrude from the top wall, or it may be provided within the body 50 without protruding from the body 50; that is, the height of the radar sensor 20 may be lower than that of the top wall.
The sweeping robot 100 collects a scene image of the current scene through the camera 10, recognizes the type of a target object in the current scene, determines the scene type of the current scene according to the type of the target object, determines an area to be cleaned, and sets, via the processor 30, the cleaning parameters of the area to be cleaned according to the scene type.
In one embodiment, the camera 10 of the sweeping robot is a camera with an artificial intelligence (AI) recognition function.
In one embodiment, the cleaning system 1000 further includes a server 400, and the server 400 communicates with the cleaning device over a network. The server 400 is configured to receive a scene image acquired by the camera 10 and sent by the sweeping robot 100 or the base station 200, identify the current scene according to the scene image, and set cleaning parameters according to the identification result so as to clean the current scene.
In one embodiment, the server 400 may be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDN, big data, and artificial intelligence platforms. The embodiments of the present application are not limited in this regard.
The base station 200 may include a display 201, and the base station 200 may be capable of communicating with the sweeping robot 100 to obtain data transmitted by the sweeping robot 100, and may process the data by using a processing capability of the base station 200, so as to implement functions of controlling the sweeping robot 100 (e.g., controlling the sweeping robot 100 to move to a target position for cleaning), displaying relevant contents of the sweeping robot 100, and the like.
In one embodiment, the cleaning system 1000 further comprises a terminal 300, and the terminal 300 comprises a display 301. The terminal 300 can communicate with the sweeping robot 100 to obtain data transmitted by the sweeping robot 100, and can process the data using the processing capability of the terminal 300, thereby implementing functions such as controlling the sweeping robot 100 (e.g., controlling it to move to a target position for cleaning) and displaying related contents of the sweeping robot 100.
In one embodiment, the terminal 300 may be, but is not limited to, various personal computers, notebook computers, smart phones, tablet computers, portable wearable devices, and the like.
For example, at least one of the display 201 of the base station 200 and the display 301 of the terminal 300 may display the acquired area to be cleaned and its cleaning parameters, so that the accuracy of the sweeping robot 100's recognition of the current scene and the rationality of its cleaning parameter settings can be checked. For another example, the base station 200 or the terminal 300 may display the edge contour of the area to be cleaned, and may also display the target object (position, contour, etc.) according to the relative position of the target object within the area to be cleaned.
In one embodiment, the sweeping robot 100, the base station 200, the terminal 300, and the server 400 all communicate through a network; for example, any two of them may communicate wirelessly (e.g., wireless local area network (Wi-Fi) communication, Bluetooth communication, infrared communication, etc.). It is understood that communication among the sweeping robot 100, the base station 200, the terminal 300, and the server 400 is not limited to the above manners.
In the case of Wi-Fi communication, the sweeping robot 100 and the base station 200 each communicate with the cloud server 400, and the cloud server 400 then relays communication between the sweeping robot 100 and the base station 200 (or the terminal 300). In the case of Bluetooth or infrared communication, the sweeping robot 100 and the base station 200 (or the terminal 300) are each provided with a corresponding communication module to communicate with each other directly.
In one embodiment, the cleaning setting method may be implemented by the sweeping robot 100 together with at least one of the base station 200, the terminal 300, and the server 400, such as through the cooperation of the sweeping robot 100 with the base station 200, the terminal 300, and the server 400, or of the sweeping robot 100 with the base station 200, or of the sweeping robot 100 with the terminal 300, etc.
The cleaning setting method of the present application will be described in detail below:
referring to fig. 1 and 2, the present embodiment provides a cleaning setting method, which may be applied to at least one of a cleaning device or a terminal, and the cleaning device may be a cleaning robot 100, a base station 200, a terminal 300, a server 400, or the like, as an example of application of the method to the cleaning device. The cleaning setting method comprises the following steps:
step 011: identifying a target object in the current scene, and determining the scene type of the current scene according to the type of the target object;
specifically, the sweeping robot has a scene recognition function that enables the sweeping robot to recognize position information and type information of each target object in a scene when the current scene is edgewise swept, and to determine a scene type according to the position and type of the target object.
The scene recognition function may recognize the target object based on artificial intelligence (AI). The sweeping robot collects scene images of the current scene through a camera with an AI recognition function, identifies the position and shape of each target object in the current scene, and performs edge detection and contour extraction on each target object to obtain a recognition result of the target object's type, the recognition result including the position information and type information of each target object; finally, the processor determines the scene type of the current scene according to the recognition result.
The current scene may be a scene containing dirt that needs to be cleaned by the sweeping robot, and the type of the target object corresponds to the type of the current scene. The type of the target object can be the type of an obstacle that the robot needs to avoid when working in the current scene, such as a furniture type, a clothing type, or an equipment type, and the corresponding scene type can then be a living room scene, a bedroom scene, or a factory scene.
For example, when the sweeping robot is cleaning a scene containing dirt, the camera acquires an image of the current scene, the AI identifies that the target objects in the current scene are a sofa, a tea table, and chairs, determines that these are furniture types common in living rooms, and determines, according to the furniture types and their position information, that the current scene type is a living room scene.
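The mapping from recognized object types to a scene type can be illustrated with a short sketch. The following Python snippet is a hypothetical illustration only, not the patent's implementation; the object names, scene names, and the majority-vote rule are all assumptions:

```python
# Hypothetical sketch: infer the scene type from recognized object types.
# SCENE_HINTS and the voting rule are illustrative assumptions.
SCENE_HINTS = {
    "sofa": "living_room",
    "tea_table": "living_room",
    "dining_table": "dining_room",
    "chair": "dining_room",
    "bed": "bedroom",
}

def infer_scene_type(detected_objects):
    """Return the scene type that the most recognized objects vote for."""
    votes = {}
    for obj in detected_objects:
        scene = SCENE_HINTS.get(obj)
        if scene is not None:
            votes[scene] = votes.get(scene, 0) + 1
    # Fall back to a generic scene when nothing is recognized.
    return max(votes, key=votes.get) if votes else "unknown"

# Sofa and tea table outvote the chair, so the scene is a living room.
print(infer_scene_type(["sofa", "tea_table", "chair"]))  # -> living_room
```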
Step 012: Determining an area to be cleaned that covers the associated target objects of the current scene.
Specifically, when the sweeping robot has identified the target objects in the current scene, it can determine the area containing all the target objects associated with the current scene as the area to be cleaned, according to the position information of the target objects (such as their relative positions in the scene) and their distribution in the current scene. The area to be cleaned corresponds to the range in which dirt is produced when the target objects are used or when people move near them in the current scene.
For example, the current scene is a dining room, and the target objects are of the furniture type, namely a dining table and chairs. The area covering the dining table and chairs associated with the dining room scene is determined as the area to be cleaned according to the position information and distribution of the dining table and chairs in the current scene, and the area to be cleaned corresponds to the range in which dirt is produced when a user uses the dining table and chairs in the current scene.
Step 013: and setting the cleaning parameters of the area to be cleaned according to the scene type.
Specifically, in order to improve the cleaning effect on the current scene, different cleaning parameters are set for different scene types. Different cleaning parameters can be preset for different scene types, and after the scene type of the current scene is identified, the corresponding preset cleaning parameters are applied. For example, if the preset cleaning parameter for a living room scene is a roller-brush mode, then when the current scene is identified as a living room scene, the cleaning parameter of the area to be cleaned is set to the roller-brush mode.
It will be appreciated that different scene types have different degrees of soiling, and different cleaning parameters may be set according to the soiling degree of each scene type. For example, the soiling degree can be divided into three levels: light, moderate, and heavy, corresponding to three sets of cleaning parameters: light cleaning, moderate cleaning, and heavy cleaning; the soiling levels correspond to scene types. If the kitchen scene corresponds to heavy soiling, then when the sweeping robot recognizes that the current scene is a kitchen scene, it determines the soiling level to be heavy and accordingly sets the heavy-cleaning parameters.
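As an illustration of this three-level mapping, the sketch below goes from scene type to soiling level to preset cleaning parameters (water-output, roller-brush, and suction gears). The gear values and the scene-to-level assignments are illustrative assumptions, not values fixed by the application:

```python
# Hypothetical sketch: scene type -> soiling level -> preset cleaning
# parameters. All concrete values here are illustrative assumptions.
DIRT_LEVEL = {"kitchen": "heavy", "dining_room": "moderate", "living_room": "light"}

CLEANING_PARAMS = {
    "light":    {"water_gear": 1, "brush_gear": 1, "suction_gear": 1},
    "moderate": {"water_gear": 2, "brush_gear": 2, "suction_gear": 2},
    "heavy":    {"water_gear": 3, "brush_gear": 3, "suction_gear": 3},
}

def params_for_scene(scene_type):
    level = DIRT_LEVEL.get(scene_type, "light")  # default to light cleaning
    return CLEANING_PARAMS[level]

# A kitchen scene is assumed to be heavily soiled, so the heavy-cleaning
# parameters are selected.
print(params_for_scene("kitchen"))  # -> gears all set to 3
```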
Therefore, after the target objects and their types are recognized and the current scene type is determined accordingly, the area covering the associated target objects in the current scene is determined as the area to be cleaned, and the cleaning parameters of the area to be cleaned are then set based on the scene type. The area to be cleaned thus set matches the actually soiled area of the current scene more closely, and the set cleaning parameters match the actual degree of soiling more closely, so the cleaning effect can be ensured and the convenience of user operation can be improved.
Referring to fig. 3, in certain embodiments, step 012: determining an area to be cleaned of an associated target object covering a current scene, comprising:
step 0121: Determining the fouling coefficient of each target object in the current scene, wherein the fouling coefficient is determined according to at least one of the user's activity duration at the target object and the cleaning period corresponding to the current scene;
step 0122: Determining that a target object whose fouling coefficient is greater than a preset threshold is a target object associated with the current scene;
step 0123: Determining the areas to be merged corresponding to the target objects associated with the current scene;
step 0124: Merging the areas to be merged to determine the area to be cleaned, wherein the area to be cleaned covers all the areas to be merged and the areas between them.
Specifically, the fouling coefficient reflects the degree of soiling of each target object in the current scene. The fouling coefficient is determined according to at least one of the user's activity duration at the target object and the cleaning period corresponding to the current scene: for example, it is determined according to the user's activity duration at the target object; or according to the cleaning period corresponding to the current scene; or according to both together.
The user's activity duration at the target object can be a preset empirical value. For example, when the target object is a sofa, the preset empirical value that the user spends a long time near the sofa leads to a high fouling coefficient for the sofa. Alternatively, starting from the preset empirical value, the sweeping robot dynamically corrects the fouling coefficient of the target object using historical cleaning information (such as cleaning parameters and degree of soiling) of the area to be cleaned covering that object. For example, the preset empirical value implies a high fouling coefficient because the user spends a long time near the sofa, but the historical cleaning information shows that the sofa is actually only lightly soiled, so the fouling coefficient of the sofa is dynamically corrected to light.
The fouling coefficients of the target objects in the current scene may also be determined according to the cleaning period corresponding to the current scene. For example, in a kitchen scene the user has typically left the kitchen by 13:00, so the cleaning period is set at 13:00; the fouling coefficients of the target objects before 13:00 and after 13:00 will necessarily differ.
A target object whose fouling coefficient is greater than the preset threshold is determined to be a target object associated with the current scene; the areas of all such target objects in the current scene are determined as areas to be merged, and the areas to be merged, together with the areas between them, are merged to determine the area to be cleaned.
For example, referring to fig. 4, if it is determined that the fouling coefficients of target object 1 and target object 2 are greater than the preset threshold, target object 1 corresponds to the area to be merged S1, target object 2 corresponds to the area to be merged S2, the area between S1 and S2 is S3, and the area to be cleaned S = S1 + S2 + S3.
A target object whose fouling coefficient is greater than the preset threshold is determined to be associated with the current scene, and one whose coefficient is smaller than the preset threshold is determined not to be associated. When the fouling coefficient is exactly equal to the preset threshold, treating the target object as associated or as not associated are both feasible and have little effect on the implementation of the application, so no limitation is imposed here.
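Steps 0121 to 0124 (threshold filtering followed by merging) might be sketched as follows. Representing each area to be merged as an axis-aligned rectangle (x_min, y_min, x_max, y_max), and all numeric values, are illustrative assumptions; the merged result is the smallest rectangle covering every area to be merged plus the space between them, i.e. S = S1 + S2 + S3 in the example above:

```python
# Hypothetical sketch of steps 0121-0124: keep objects whose fouling
# coefficient exceeds the threshold, then merge their rectangular areas.
def area_to_be_cleaned(objects, threshold):
    rects = [o["rect"] for o in objects if o["fouling"] > threshold]
    if not rects:
        return None  # nothing in this scene is dirty enough to clean
    # The merged area covers every kept rectangle and the space between them.
    return (
        min(r[0] for r in rects), min(r[1] for r in rects),
        max(r[2] for r in rects), max(r[3] for r in rects),
    )

objects = [
    {"rect": (0.0, 0.0, 1.0, 1.0), "fouling": 0.8},  # target object 1 -> S1
    {"rect": (3.0, 0.0, 4.0, 1.0), "fouling": 0.9},  # target object 2 -> S2
    {"rect": (9.0, 9.0, 9.5, 9.5), "fouling": 0.1},  # below threshold, skipped
]
print(area_to_be_cleaned(objects, threshold=0.5))  # -> (0.0, 0.0, 4.0, 1.0)
```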
Referring to fig. 5, in some embodiments, step 0121: determining the fouling coefficient of each target object in the current scene, comprises:
step 01211: Determining the initial fouling coefficient of each target object in the current scene according to the user's activity duration at the target object;
step 01212: and adjusting the initial fouling coefficient according to the cleaning period to determine the fouling coefficient of each target object in the current scene.
Specifically, in order to obtain fouling coefficients better matched to the target objects actually present in the current scene, the sweeping robot determines the initial fouling coefficient of each target object based on the user's activity duration at that object, and then adjusts the initial fouling coefficient according to the cleaning period to determine a fouling coefficient that better fits the actual soiling condition.
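A minimal sketch of steps 01211 and 01212 follows, assuming a simple linear relation between activity duration and the initial fouling coefficient and a single scaling factor for the cleaning period; the application does not fix concrete formulas, so both are assumptions:

```python
# Hypothetical sketch of steps 01211/01212. The linear duration scale and the
# period factor are illustrative assumptions, not the patent's formulas.
def fouling_coefficient(activity_minutes, now_hour, cleaning_period_hour=13.0):
    # Step 01211: initial coefficient grows with activity duration, capped at 1.
    initial = min(activity_minutes / 60.0, 1.0)
    # Step 01212: adjust by the cleaning period. Before the period the scene is
    # assumed to still be in use; after it, dirt has fully accumulated.
    factor = 0.6 if now_hour < cleaning_period_hour else 1.0
    return initial * factor

# 45 minutes of activity, queried after the 13:00 cleaning period.
print(fouling_coefficient(activity_minutes=45, now_hour=14))  # -> 0.75
```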
Optionally, the area to be cleaned is a rectangular area covering the associated target objects of the current scene, and the cleaning setting method of the present application further includes: determining a target area surrounding the edge contour of the area to be cleaned, wherein the spacing between the edge contour of the target area and the corresponding edge contour of the area to be cleaned is a preset spacing; merging the target area and the area to be cleaned to generate a target cleaning area; and re-determining the area to be cleaned according to the target cleaning area.
Specifically, an area surrounding (e.g., rectangular surrounding) the edge contour of the area to be cleaned is determined as a target area, the target cleaning area is determined by merging the target area with the area to be cleaned, and the area to be cleaned is updated according to the target cleaning area.
The spacing between the edge contour of the target area and the corresponding edge contour of the area to be cleaned is a preset spacing. For example, the preset spacing may be set to 0.5 m (meters); if the distance between an edge of the area to be cleaned and the wall is smaller than 0.5 m, for example 0.3 m, the preset spacing for that edge is adjusted to 0.3 m. The preset spacing may be 0.7 m, 0.6 m, 0.5 m, 0.4 m, 0.3 m, and the like.
Alternatively, the preset spacing is an empirical value; in some embodiments, the preset spacing is 0.5 m.
It can be understood that the area to be cleaned before merging is an area that inevitably becomes dirty during use of the target object and needs to be cleaned by the sweeping robot, and dirt is also produced around the target object when it is used. Therefore, rectangularly surrounding the area to be cleaned to obtain the target area allows the area around the target object to be cleaned as well, which improves the cleaning effect over the soiled range of the current scene.
The preset spacing between the area to be cleaned and the target area is set according to the sweeping robot's cleaning experience or historical parameters. Taking the area to be cleaned as the center, the region within the preset spacing is determined to be soiled when the target object is used, while the region outside the preset spacing is determined not to be soiled, so that the sweeping robot can effectively clean the dirt in the current scene.
For example, referring to fig. 6, after the target objects (a sofa and a tea table) in the current scene are identified, the rectangular area (dashed rectangular frame) that just frames the edge contours of the sofa and tea table is determined as the area to be cleaned S4, the rectangular area (solid rectangular frame) that surrounds the edge contour of the area to be cleaned at the preset spacing is determined as the target area S5, and finally the area to be cleaned S4 is merged with the target area S5 to obtain the updated area to be cleaned.
When surrounding the area to be cleaned, if the maximum surrounding distance is smaller than or equal to the distance between the wall and the target object, the preset spacing equals the maximum surrounding distance; if the maximum surrounding distance is greater than the distance between the wall and the target object, the preset spacing equals the distance between the wall and the target object.
For example, referring to fig. 7, when the preset spacing is set to 0.5 m and the distance between edge L of the area to be cleaned and the wall is determined to be 0.3 m, which is smaller than the preset spacing of 0.5 m, the preset spacing of edge L equals 0.3 m; the distances between the other edges and the wall are all larger than 0.5 m, so their preset spacing is 0.5 m. The target area can thus be determined rapidly according to the preset spacing.
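The per-edge clamping described above might be sketched as follows; treating the walls as an axis-aligned room rectangle is an assumption made for illustration:

```python
# Hypothetical sketch: grow the rectangular area to be cleaned outward by the
# preset spacing on each edge, clamping each edge to the wall when the wall is
# closer than the preset spacing. Coordinates are illustrative assumptions.
def expand_with_walls(rect, room, spacing=0.5):
    x1, y1, x2, y2 = rect
    rx1, ry1, rx2, ry2 = room  # walls of the current scene
    return (
        max(x1 - spacing, rx1),  # a wall 0.3 m away limits this edge to 0.3 m
        max(y1 - spacing, ry1),
        min(x2 + spacing, rx2),
        min(y2 + spacing, ry2),
    )

# Area to be cleaned 0.3 m from the left wall: that edge expands only 0.3 m,
# the other edges expand by the full 0.5 m preset spacing.
print(expand_with_walls((0.3, 1.0, 2.0, 3.0), room=(0.0, 0.0, 5.0, 5.0)))
# -> (0.0, 0.5, 2.5, 3.5)
```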
It can be understood that if the preset spacing is set too small, that is, the region surrounding the area to be cleaned is small, it cannot cover all the regions near the target objects where dirt may exist, so dirt may be missed when the sweeping robot executes a cleaning command according to the updated map of the area to be cleaned, resulting in incomplete cleaning and affecting the cleaning effect. If the preset spacing is set too large, regions whose degree of soiling is far smaller than that of the area to be cleaned may be assigned, and even executed with, the same cleaning parameter commands as the area to be cleaned, which tends to increase the sweeping robot's cleaning energy consumption and reduce its cleaning efficiency. Setting the preset spacing reasonably therefore maximizes cleaning efficiency while ensuring the cleaning effect.
Optionally, determining the area to be cleaned includes: acquiring position information of an edge contour of a target object associated with a current scene, wherein the position information comprises a first coordinate and a second coordinate; determining the minimum first coordinate, the maximum first coordinate, the minimum second coordinate and the maximum second coordinate in the position information of all the associated target objects; and determining the area to be cleaned according to the minimum first coordinate, the maximum first coordinate, the minimum second coordinate and the maximum second coordinate.
Specifically, in order to update the area to be cleaned conveniently, when determining the area to be cleaned, a map of the current scene may be established according to the current scene, and the position of the target object in the current scene may be determined in the map.
Optionally, the map includes a first coordinate axis and a second coordinate axis perpendicular to each other. To facilitate data processing, the scene area may be placed in the first quadrant of the map's coordinate system, so that the coordinates of every position in the scene area are positive. The first coordinate axis may be the x-axis in the horizontal direction or the y-axis in the vertical direction; the following description takes the first coordinate axis as the x-axis and the second coordinate axis as the y-axis as an example.
With continued reference to fig. 7, as described above, the sweeping robot has an artificial-intelligence-based scene recognition function. When the area to be cleaned is set, the region where each target object is located is first identified in the current scene, and the position information of the target object's edge contour is then scanned and obtained. The position information is represented by a first coordinate and a second coordinate: the edge contour of a target object contains a number of position points, and the position information of each position point includes a corresponding first coordinate and second coordinate. From the position information of all the determined target objects, the minimum first coordinate x1, the maximum first coordinate x2, the minimum second coordinate y1, and the maximum second coordinate y2 are found. Combining these extreme coordinates yields the four corner points of the rectangular area that just frames the edges of the target objects, namely (minimum first coordinate, minimum second coordinate), (minimum first coordinate, maximum second coordinate), (maximum first coordinate, minimum second coordinate), and (maximum first coordinate, maximum second coordinate); in fig. 7 these are the four corner points A(x1, y1), B(x1, y2), C(x2, y1), and D(x2, y2), and the rectangle they frame is the rectangular area that just frames the edges of the target objects. A rectangular area to be cleaned is thus determined, and the target area is then obtained by rectangularly surrounding it, so as to update the area to be cleaned.
It can be understood that the rectangular area obtained by combining the minimum first coordinate, maximum first coordinate, minimum second coordinate, and maximum second coordinate of the identified target objects completely covers all the target objects in the current scene, so the area to be cleaned updated on this basis ensures that no target object is missed and the cleaning effect can be guaranteed.
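A minimal sketch of this coordinate method, with illustrative contour points:

```python
# Sketch of the coordinate method: collect the edge-contour points of every
# associated target object, take the extreme coordinates, and form the four
# corner points A(x1, y1), B(x1, y2), C(x2, y1), D(x2, y2) of the rectangle
# that just frames all objects. The contour point values are assumptions.
def bounding_corners(contours):
    xs = [x for contour in contours for x, _ in contour]
    ys = [y for contour in contours for _, y in contour]
    x1, x2, y1, y2 = min(xs), max(xs), min(ys), max(ys)
    return (x1, y1), (x1, y2), (x2, y1), (x2, y2)  # A, B, C, D

sofa = [(1.0, 1.0), (1.0, 2.0), (3.0, 2.0), (3.0, 1.0)]
tea_table = [(3.5, 1.2), (3.5, 1.8), (4.5, 1.8), (4.5, 1.2)]
print(bounding_corners([sofa, tea_table]))
# -> ((1.0, 1.0), (1.0, 2.0), (4.5, 1.0), (4.5, 2.0))
```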
Referring to fig. 8, in some embodiments, the cleaning setting method of the present application further includes:
step 014: displaying one or more pieces of cleaning recommendation information, wherein the cleaning recommendation information corresponds to a scene;
step 015: receiving target cleaning recommendation information determined by a user from the cleaning recommendation information;
step 016: Cleaning according to the cleaning parameters, the area to be cleaned, and the cleaning time of the target cleaning recommendation information.
Specifically, the sweeping robot generally has an accompanying application program (which at least includes a computer program corresponding to the cleaning setting method). The application program is installed on a base station or terminal that can be communicatively connected to the sweeping robot, and by loading and running it, the sweeping robot can be controlled and its operating status displayed.
Optionally, the sweeping robot is provided with a display screen. After the sweeping robot (or a server connected to it) plans the area to be cleaned and the cleaning parameters, a base station or terminal communicatively connected to the sweeping robot (or the server) can acquire them, and the area to be cleaned and the corresponding cleaning parameters are displayed on at least one of the sweeping robot's display screen, the base station's display screen, or the terminal's display.
Optionally, one scene may correspond to one or more pieces of cleaning recommendation information, where each piece includes the area to be cleaned corresponding to the scene and the cleaning parameters of that area. When the display screen of the sweeping robot, the display screen of the base station, or the display of the terminal shows the area to be cleaned and the cleaning parameters, one or more pieces of cleaning recommendation information corresponding to the scene can be displayed, and when a viewing instruction is received (e.g., the user clicks the corresponding target cleaning recommendation information), the clicked target cleaning recommendation information is displayed, that is, the corresponding area to be cleaned and cleaning parameters are shown.
For example, referring to fig. 9 and 10, the display screen of the sweeping robot, the display screen of the base station, or the display of the terminal may show a number of scenes as in fig. 9, so the user can choose which scene's cleaning recommendation information to view and set. If the user wants to view the living room scene, then after clicking it, the map of the living room's area to be cleaned as in fig. 6 and a number of recommended cleaning parameters as in fig. 10 can be displayed, and the user can choose whether to clean according to the current area to be cleaned and the recommended cleaning parameters.
Optionally, the cleaning recommendation information includes the area to be cleaned corresponding to the scene, the cleaning parameters of that area, and a cleaning time. When the display screen of the sweeping robot, the display screen of the base station, or the display of the terminal shows cleaning recommendation information, the recommendation can accordingly also be based on the cleaning time and include, for the scene, the area to be cleaned, its cleaning parameters, and the cleaning time.
For example, referring to fig. 11, if the sweeping robot learns that the user typically finishes lunch and leaves the dining room by 13:00, then cleaning recommendation information (named "post-meal cleaning recommendation" in fig. 11) including the dining room's area to be cleaned, the cleaning parameters of that area, and a cleaning time after 13:00 can be recommended to the user at 13:00. When a viewing instruction is received (e.g., the user clicks the corresponding target cleaning recommendation information), the clicked post-meal cleaning recommendation is displayed, and after the user confirms it as the target cleaning recommendation information, the dining room is cleaned accordingly.
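As an illustrative sketch only (the application does not prescribe a data structure), a piece of cleaning recommendation information bundling the area to be cleaned, its cleaning parameters, and a cleaning time might look like this, surfaced once the cleaning time has been reached:

```python
# Hypothetical sketch of the time-based recommendation in the example above.
# The field names, area coordinates, and gear values are assumptions.
from dataclasses import dataclass
import datetime

@dataclass
class CleaningRecommendation:
    name: str
    area: tuple                       # (x_min, y_min, x_max, y_max)
    params: dict                      # e.g. water/brush/suction gears
    cleaning_time: datetime.time

post_meal = CleaningRecommendation(
    name="post-meal cleaning",
    area=(0.0, 0.0, 4.0, 3.0),
    params={"water_gear": 2, "brush_gear": 2, "suction_gear": 2},
    cleaning_time=datetime.time(13, 0),  # user has usually left by 13:00
)

def due(rec, now):
    # Recommend the cleaning at or after its cleaning time.
    return now >= rec.cleaning_time

print(due(post_meal, datetime.time(13, 5)))  # -> True
```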
Optionally, the target object may be displayed in the overall map or displayed separately. The information about the target objects acquired by the sweeping robot can be used to generate a map of the area to be cleaned. After the area to be cleaned is rectangularly surrounded based on the preset spacing to obtain the updated area to be cleaned, the display of the cleaning recommendation information may include the edge contour of the corresponding area to be cleaned and the relative positions of the target objects within it. The target object may be a static obstacle such as furniture or a dynamic object present in the current scene, and its type information and position information may also be displayed separately, so that the user can intuitively view the relative position of the target object and the area to be cleaned and judge whether the area or scene type contains errors.
For example, when the current scene is determined to be a bedroom scene and a bed and a dynamic target object, a cat, are identified in the current scene, the relative position of the bed in the whole scene and the information that a cat is present in the current scene are displayed in the updated map of the area to be cleaned.
Referring to fig. 12, in some embodiments, the cleaning setting method further includes:
step 017: Displaying the edge contour of the area to be cleaned corresponding to the target cleaning recommendation information when a viewing instruction for the target cleaning recommendation information is received;
step 018: acquiring editing information of a user on a displayed area to be cleaned so as to determine the edited area to be cleaned;
step 019: displaying the edge profile of the edited region to be cleaned;
step 020: and displaying the target object according to the relative position of the target object in the edited region to be cleaned.
Specifically, when a viewing instruction for the target cleaning recommendation information is received, at least one of the sweeping robot's display screen, the base station's display screen, or the terminal's display shows the edge contour of the area to be cleaned together with the cleaning recommendation information corresponding to the scene, and the user can decide, from what is displayed, whether and how to modify the edge contour of the area to be cleaned. The edited area to be cleaned is determined according to the acquired editing information, and the edge contour of the edited area and the relative positions of the target objects within it are displayed so that the user can confirm again.
Referring to fig. 13, to facilitate better implementation of the cleaning setting method of the embodiments of the present application, an embodiment of the present application further provides a cleaning setting device 10. The cleaning setting device 10 may include an identification module 11, a determination module 12, and a setting module 13. The identification module 11 is configured to identify a target object in the current scene and determine the scene type of the current scene according to the type of the target object; the determination module 12 is configured to determine an area to be cleaned covering the associated target objects of the current scene; and the setting module 13 is configured to set the cleaning parameters of the area to be cleaned according to the scene type.
In some embodiments, the determination module 12 is specifically configured to determine the fouling coefficient of each target object in the current scene, where the fouling coefficient is determined according to at least one of the user's activity duration at the target object and the cleaning period corresponding to the current scene; determine that a target object whose fouling coefficient is greater than a preset threshold is a target object associated with the current scene; determine the areas to be merged corresponding to the target objects associated with the current scene; and merge the areas to be merged to determine the area to be cleaned, wherein the area to be cleaned covers all the areas to be merged and the areas between them.
In some embodiments, the determination module 12 is further specifically configured to determine the initial fouling coefficient of each target object in the current scene according to the user's activity duration at the target object, and to adjust the initial fouling coefficient according to the cleaning period to determine the fouling coefficient of each target object in the current scene.
In some embodiments, the determination module 12 is further specifically configured to determine a target area surrounding the edge contour of the area to be cleaned, where the spacing between the edge contour of the target area and the corresponding edge contour of the area to be cleaned is a preset spacing; merge the target area and the area to be cleaned to generate a target cleaning area; and re-determine the area to be cleaned according to the target cleaning area.
In some embodiments, the determination module 12 is further specifically configured to obtain the position information of the edge contour of each target object associated with the current scene, where the position information includes a first coordinate and a second coordinate; determine the minimum first coordinate, maximum first coordinate, minimum second coordinate, and maximum second coordinate among the position information of all the associated target objects; and determine the area to be cleaned according to these four coordinates.
In some embodiments, the cleaning setting device 10 further includes a display module 14. The display module 14 is configured to display one or more pieces of cleaning recommendation information, each corresponding to a scene; receive target cleaning recommendation information determined by the user from the cleaning recommendation information; and clean according to the cleaning parameters, area to be cleaned, and cleaning time of the target cleaning recommendation information.
In some embodiments, the display module 14 is further specifically configured to display the edge contour of the area to be cleaned corresponding to the target cleaning recommendation information when a viewing instruction for the target cleaning recommendation information is received; acquire the user's editing information for the displayed area to be cleaned so as to determine the edited area to be cleaned; display the edge contour of the edited area to be cleaned; and display the target object according to its relative position within the edited area to be cleaned.
The cleaning setting device 10 has been described above with reference to the accompanying drawings from the perspective of functional modules. These functional modules may be implemented in hardware, in software instructions, or in a combination of hardware and software modules. Specifically, the steps of the method embodiments of the present application may be completed by integrated logic circuits of hardware in the processor 30 and/or by instructions in software form; the steps of the methods disclosed in the embodiments of the present application may be executed directly by a hardware processor, or performed by a combination of hardware and software modules within the processor. Alternatively, the software modules may be located in a storage medium well known in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register. The storage medium is located in the memory 40; the processor 30 reads the information in the memory 40 and, in combination with its hardware, performs the steps of the above method embodiments.
Referring again to fig. 1, the robot cleaner 100 of the present embodiment includes a processor 30, a memory 40, and a computer program, where the computer program is stored in the memory 40 and executed by the processor 30, and the computer program includes instructions for performing the cleaning setting method of any of the above embodiments.
In one embodiment, the terminal 300 or the base station 200 may be a computer device, whose internal structure may be as shown in fig. 12. The computer device includes a processor 502, a memory 503, a network interface 504, a display screen 501, and an input device 505, which are connected by a system bus.
The processor 502 of the computer device is configured to provide computing and control capabilities. The memory 503 of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program, and the internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The network interface 504 of the computer device is used to communicate with external devices via a network connection. The computer program, when executed by the processor, implements the cleaning setting method of any of the above embodiments. The display screen 501 of the computer device may be a liquid crystal display screen or an electronic ink display screen, and the input device 505 may be a touch layer covering the display screen 501, a key, a trackball, or a touchpad arranged on the casing of the computer device, or an external keyboard, touchpad, or mouse.
It will be appreciated by those skilled in the art that the structure shown in fig. 12 is merely a block diagram of a portion of the structure relevant to the present application and does not limit the computer device to which the present application is applied; a particular computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
Referring to fig. 13, an embodiment of the present application further provides a computer-readable storage medium 600 on which a computer program 610 is stored. The computer program 610, when executed by the processor 620, implements the steps of the cleaning setting method of any of the above embodiments, which are not repeated here for brevity.
In the description of this specification, reference to the terms "certain embodiments", "in one example", "illustratively", and the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of these terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples, and those skilled in the art may combine the different embodiments or examples described in this specification, and the features thereof, provided they do not contradict each other.
Any process or method description in a flowchart, or otherwise described herein, may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of the present application also includes implementations in which functions are executed out of the order shown or discussed, including substantially concurrently or in the reverse order, depending on the functions involved, as will be understood by those skilled in the art of the embodiments of the present application.
While embodiments of the present application have been shown and described above, it will be understood that the above embodiments are illustrative and are not to be construed as limiting the present application; those of ordinary skill in the art may make changes, modifications, substitutions, and variations to the above embodiments within the scope of the present application.

Claims (10)

1. A cleaning setting method, characterized by comprising:
identifying a target object in a current scene, and determining the scene type of the current scene according to the type of the target object;
determining an area to be cleaned covering the target object associated with the current scene;
and setting the cleaning parameters of the area to be cleaned according to the scene type.
2. The cleaning setting method according to claim 1, wherein the determining an area to be cleaned covering the target object associated with the current scene comprises:
determining a fouling coefficient of each target object in the current scene, wherein the fouling coefficient is determined according to at least one of the duration of a user's activity at the target object and a cleaning period corresponding to the current scene;
determining the target object whose fouling coefficient is greater than a preset threshold as the target object associated with the current scene;
determining areas to be merged corresponding to the target objects associated with the current scene;
merging the areas to be merged to determine the area to be cleaned, wherein the area to be cleaned covers all the areas to be merged and the areas between the areas to be merged.
3. The cleaning setting method according to claim 2, wherein the determining of the fouling coefficients of the respective target objects in the current scene includes:
determining an initial fouling coefficient of each target object in the current scene according to the duration of the user's activity at the target object;
and adjusting the initial fouling coefficients according to the cleaning period to determine the fouling coefficients of the target objects in the current scene.
4. The cleaning setting method according to claim 1, wherein the area to be cleaned is a rectangular area covering the target object associated with the current scene; the method further comprises:
determining a target area surrounding the edge contour of the area to be cleaned, wherein the distance between two opposite edge contours of the target area is a preset distance;
merging the target area and the area to be cleaned to generate a target cleaning area;
and re-determining the area to be cleaned according to the target cleaning area.
5. The cleaning setting method according to claim 1, wherein determining the area to be cleaned comprises:
acquiring position information of an edge contour of the target object associated with the current scene, wherein the position information comprises a first coordinate and a second coordinate;
determining the minimum first coordinate, the maximum first coordinate, the minimum second coordinate, and the maximum second coordinate in the position information of all the associated target objects;
and determining the area to be cleaned according to the minimum first coordinate, the maximum first coordinate, the minimum second coordinate, and the maximum second coordinate.
6. The cleaning setting method according to claim 1, characterized by further comprising:
displaying one or more pieces of cleaning recommendation information, wherein the cleaning recommendation information corresponds to a scene;
receiving target cleaning recommendation information selected by a user from the cleaning recommendation information;
and cleaning according to the cleaning parameters, the area to be cleaned, and the cleaning time of the target cleaning recommendation information.
7. The cleaning setting method according to claim 6, characterized in that the method further comprises:
displaying the edge contour of the area to be cleaned corresponding to the target cleaning recommendation information when a viewing instruction for the target cleaning recommendation information is received;
acquiring editing information of a user on the displayed area to be cleaned so as to determine the edited area to be cleaned;
displaying the edge contour of the edited area to be cleaned;
and displaying the target object according to the relative position of the target object in the edited area to be cleaned.
8. A cleaning apparatus, comprising:
A processor, a memory; and
A computer program, wherein the computer program is stored in the memory and executed by the processor, the computer program comprising instructions for performing the cleaning setting method of any one of claims 1 to 7.
9. A computer program product comprising a computer program comprising instructions for performing the cleaning setup method of any one of claims 1 to 7.
10. A non-transitory computer-readable storage medium containing a computer program which, when executed by a processor, causes the processor to perform the cleaning setting method of any one of claims 1 to 7.
CN202311438744.7A 2023-10-31 2023-10-31 Cleaning setting method, cleaning apparatus, computer program product, and storage medium Pending CN117297449A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311438744.7A CN117297449A (en) 2023-10-31 2023-10-31 Cleaning setting method, cleaning apparatus, computer program product, and storage medium

Publications (1)

Publication Number Publication Date
CN117297449A true CN117297449A (en) 2023-12-29

Family

ID=89249924

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311438744.7A Pending CN117297449A (en) 2023-10-31 2023-10-31 Cleaning setting method, cleaning apparatus, computer program product, and storage medium

Country Status (1)

Country Link
CN (1) CN117297449A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118497875A (en) * 2024-07-19 2024-08-16 南通安耐华科技有限公司 Control method, equipment and system for battery plate hanging silver plating production line
CN118497875B (en) * 2024-07-19 2024-10-11 南通安耐华科技有限公司 Control method, equipment and system for battery plate hanging silver plating production line

Similar Documents

Publication Publication Date Title
US11709497B2 (en) Method for controlling an autonomous mobile robot
CN111839371B (en) Ground sweeping method and device, sweeper and computer storage medium
KR101637906B1 (en) Methods, devices, program and recording medium for clearing garbage
CN111096714B (en) Control system and method of sweeping robot and sweeping robot
WO2022111539A1 (en) Floor sweeping control method, apparatus, floor sweeping robot, and computer-readable medium
CN111973075B (en) Floor sweeping method and device based on house type graph, sweeper and computer medium
CN111248818B (en) State control method, sweeping robot and computer storage medium
CN117297449A (en) Cleaning setting method, cleaning apparatus, computer program product, and storage medium
US10293489B1 (en) Control method and system, and cleaning robot using the same
CN112401763A (en) Control method of sweeping robot, sweeping robot and computer readable storage medium
CN108803586B (en) Working method of sweeping robot
US11269350B2 (en) Method for creating an environment map for a processing unit
CN112826373B (en) Cleaning method, device, equipment and storage medium of cleaning robot
CN113749562B (en) Sweeping robot and control method, device, equipment and storage medium thereof
CN113703439A (en) Autonomous mobile device control method, device, equipment and readable storage medium
US11819174B2 (en) Cleaning control method and device, cleaning robot and storage medium
WO2020244121A1 (en) Map information processing method and apparatus, and mobile device
WO2023125698A1 (en) Cleaning device, and control method and control apparatus therefor
CN111142531A (en) Household appliance linkage-based cleaning robot control method and cleaning robot
CN111487980B (en) Control method of intelligent device, storage medium and electronic device
JP7173846B2 (en) Vacuum cleaner control system, autonomous vacuum cleaner, cleaning system, and vacuum cleaner control method
CN113313089B (en) Data processing method, device and computer readable storage medium
CN114680740B (en) Cleaning control method and device, intelligent equipment, mobile equipment and server
CN115607052A (en) Cleaning method, device and equipment of robot and cleaning robot
CN114521841A (en) Cleaning area management method, system, intelligent terminal, robot and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination