CN117530620A - Cleaning method, cleaning device, cleaning apparatus, and storage medium

Info

Publication number
CN117530620A
Authority
CN
China
Prior art keywords
scene
image
cleaning
algorithm
processing
Prior art date
Legal status
Pending
Application number
CN202311362166.3A
Other languages
Chinese (zh)
Inventor
黄杰文
Current Assignee
Shenzhen 3irobotix Co Ltd
Original Assignee
Shenzhen 3irobotix Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen 3irobotix Co Ltd
Priority to CN202311362166.3A
Publication of CN117530620A

Classifications

    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00: Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24: Floor-sweeping machines, motor-driven
    • A47L11/40: Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4002: Installations of electric equipment
    • A47L11/4008: Arrangements of switches, indicators or the like
    • A47L11/4061: Steering means; means for avoiding obstacles; details related to the place where the driver is accommodated
    • A47L2201/00: Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation

Abstract

The application discloses a cleaning method, a cleaning device, a cleaning apparatus, and a non-volatile computer-readable storage medium. The method comprises: detecting the ambient light intensity to determine whether the current scene is a dim-light scene; when the current scene is a dim-light scene, performing dim-light processing on a scene image acquired by the cleaning apparatus, so as to improve the definition of the scene image in the dim-light scene; and performing scene recognition on the current scene according to the dim-light-processed scene image, and cleaning the current scene according to the recognition result. This enhances the cleaning apparatus's perception of the current scene in a dim environment, improves the definition of scene images in dim-light scenes, and improves the cleaning effect and cleaning efficiency of the cleaning apparatus in dim-light scenes.

Description

Cleaning method, cleaning device, cleaning apparatus, and storage medium
Technical Field
The present application relates to the field of cleaning robot technology, and more particularly, to a cleaning method, a cleaning apparatus, a cleaning device, and a non-volatile computer-readable storage medium.
Background
Images collected under dim-light conditions by a self-moving robot such as a sweeping robot are relatively blurry and of low definition, so the sweeping robot cannot make accurate judgments about its surroundings. This impairs the sweeping robot's perception of the surrounding environment, giving it a poor cleaning effect in dark environments and reducing its cleaning efficiency.
Disclosure of Invention
The embodiments of the present application provide a cleaning method, a cleaning device, a cleaning apparatus, and a non-volatile computer-readable storage medium, which can perform dim-light processing on the scene image collected by the cleaning apparatus in a dim-light scene, so as to improve the definition of the scene image in the dim-light scene and improve the cleaning effect and cleaning efficiency of the cleaning apparatus in the dim-light scene.
The cleaning method of the embodiments of the present application is applied to a cleaning apparatus that includes a vision sensor for acquiring scene images. The method comprises: detecting the ambient light intensity, and determining whether the current scene is a dim-light scene according to the ambient light intensity; when the current scene is a dim-light scene, performing dim-light processing on the scene image acquired by the cleaning apparatus, so as to improve the definition of the scene image in the dim-light scene; and performing scene recognition on the current scene according to the dim-light-processed scene image, and cleaning the current scene according to the recognition result.
In some embodiments, determining whether the current scene is a dim-light scene according to the ambient light intensity includes: determining that the current scene is a dim-light scene when the ambient light intensity is less than a preset intensity threshold.
In certain embodiments, detecting the ambient light intensity comprises: determining the ambient light intensity from the scene image before dim-light processing; or determining the ambient light intensity from the light signal collected by an ambient light sensor of the cleaning apparatus.
In some embodiments, the dim light process includes at least one of an anti-shake process, an image enhancement process, an image noise reduction process, an image deblurring process, and an image de-exposure process.
In some embodiments, performing dim-light processing on the scene image acquired by the cleaning apparatus includes: performing dim-light processing on the scene image based on a preset image processing algorithm; or performing dim-light processing on the scene image based on a preset image processing model.
In some embodiments, performing dim-light processing on the scene image based on a preset image processing algorithm includes at least one of: performing anti-shake processing on the scene image based on at least one of a template matching algorithm, a mean filtering algorithm, and an optical flow analysis algorithm; performing image enhancement processing on the scene image based on at least one of a histogram equalization algorithm, a Laplacian algorithm, a log transformation algorithm, and a gamma transformation algorithm; performing image noise reduction processing on the scene image based on at least one of a spatial-domain filtering algorithm, a transform-domain filtering algorithm, and a partial differential equation algorithm; performing image deblurring processing on the scene image based on at least one of a Wiener filtering algorithm, a Richardson-Lucy (RL) filtering algorithm, and a total variation algorithm; and performing image de-exposure processing on the scene image based on at least one of a template detection algorithm, a wavelet decomposition algorithm, and a color-and-gray feature extraction algorithm.
In some embodiments, performing scene recognition on the current scene according to the dim-light-processed scene image and cleaning the current scene according to the recognition result includes: performing scene recognition on the current scene according to the dim-light-processed scene image to acquire the floor material and obstacle information in the current scene; determining the scene type and the area to be cleaned of the current scene according to the obstacle information; performing dirt and object recognition on the current scene to obtain corresponding dirt parameters and target objects; and determining the cleaning parameters of the area to be cleaned based on the scene type together with one or more of the floor material, the dirt parameters, and the target objects.
The cleaning device of the embodiment of the application comprises a detection module, a dim light processing module and an identification module. The detection module is used for detecting the intensity of ambient light so as to determine whether the current scene is a dim light scene or not according to the intensity of the ambient light; the dim light processing module is used for performing dim light processing on a scene image acquired by the cleaning equipment when the current scene is the dim light scene so as to improve the definition of the scene image in the dim light scene; the identification module is used for carrying out scene identification on the current scene according to the scene image processed by the dim light, and cleaning the current scene according to the identification result.
The cleaning apparatus of the embodiments of the present application includes a processor, a memory, and a computer program, wherein the computer program is stored in the memory and executed by the processor, and includes instructions for performing the cleaning method of any of the above embodiments.
The non-transitory computer readable storage medium of the present embodiments includes a computer program that, when executed by a processor, causes the processor to perform the cleaning method of any of the above embodiments.
According to the cleaning method, the cleaning device, the cleaning apparatus, and the computer-readable storage medium of the embodiments of the present application, the ambient light intensity is detected to determine whether the current scene is a dim-light scene; when it is, dim-light processing is performed on the scene image collected by the cleaning apparatus, scene recognition is performed according to the dim-light-processed scene image, and the current scene is finally cleaned according to the recognition result of the scene recognition.
Performing dim-light processing on scene images collected in a dim-light scene improves their definition and enhances the cleaning apparatus's perception of the current scene in a dim environment. This improves the accuracy with which the cleaning apparatus judges the scene type and the obstacles (position and type) of the current scene and guarantees its cleaning effect, so the cleaning apparatus can still work normally in dim or even dark conditions, improving its practicality and cleaning efficiency.
Additional aspects and advantages of embodiments of the application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of embodiments of the application.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a schematic illustration of an application scenario of a cleaning method according to certain embodiments of the present application;
FIG. 2 is a schematic flow chart of a cleaning method according to certain embodiments of the present application;
FIG. 3 is a schematic flow chart of a cleaning method according to certain embodiments of the present application;
FIG. 4 is a schematic flow chart of a cleaning method according to certain embodiments of the present application;
FIG. 5 is a schematic flow chart of a cleaning method according to certain embodiments of the present application;
FIG. 6 is a schematic flow chart of a cleaning method according to certain embodiments of the present application;
FIG. 7 is a schematic block diagram of a cleaning device according to certain embodiments of the present application;
FIG. 8 is a schematic structural diagram of a computer device according to some embodiments of the present application;
FIG. 9 is a schematic diagram of the connection state of a non-volatile computer-readable storage medium and a processor according to some embodiments of the present application.
Detailed Description
Embodiments of the present application are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are exemplary only for explaining the embodiments of the present application and are not to be construed as limiting the embodiments of the present application.
To facilitate an understanding of the present application, the following description of terms appearing in the present application will be provided:
self-moving robot: a machine device for automatically performing work. It can accept human command, run pre-programmed program and act according to the principle set by artificial intelligence technology.
The sweeping robot is one of self-moving robots. The floor sweeping robot, also called automatic sweeping machine, intelligent dust collector, robot dust collector, etc., is one kind of intelligent household appliance and can complete floor cleaning automatically inside room via artificial intelligence. Generally, the brushing and vacuum modes are adopted, and the ground sundries are firstly absorbed into the garbage storage box of the ground, so that the function of cleaning the ground is completed. Generally, robots for performing cleaning, dust collection, and floor scrubbing are collectively referred to as floor cleaning robots.
The cleaning device of the present application may be a self-moving robot or a device including a self-moving robot, which may be a sweeping robot, or may be an autonomous moving robot (Automatic Mobile Robot, AMR), etc., and for brevity, the present application describes the self-moving robot as a sweeping robot, and the principle of the self-moving robot is similar to that of other types of robots, and will not be described herein again.
When a sweeping robot works in a low-light or dark environment (such as at night), there are generally three working modes. In the first, the robot carries no visible-light (RGB) camera and explores by means of hardware such as radar and collision sensors. With this approach the robot cannot fully perceive semantic information about its surroundings and can only use a fixed cleaning strategy; when facing a more complex environment (such as a living room with a lot of furniture), its adaptability is poor and the fixed strategy yields a poor cleaning effect.
The second is to carry an RGB camera together with an active supplemental light source (e.g., an LED fill light). Actively supplementing light raises the illumination intensity of the surroundings and reduces the limitations of working in dim environments, but overexposure occurs easily, so the quality of the collected image information is poor. This impairs the robot's perception of its surroundings and severely degrades the performance of RGB-based vision algorithms, giving the sweeping robot a poor cleaning effect, while the added RGB hardware increases production cost.
The third is to carry a dedicated night-vision camera, which can clearly identify obstacles at night. However, day-and-night night-vision cameras are expensive; on the market their price can even exceed that of the whole sweeping robot, so their cost makes them unsuitable for conventional sweeping robots.
It can be understood that under dim-light conditions the images collected by the sweeping robot are generally relatively blurry and of low definition, so the sweeping robot cannot accurately judge its surroundings. This impairs the sweeping robot's perception of the surrounding environment, so the sweeping robot suffers from a poor cleaning effect and low cleaning efficiency.
In order to solve the above technical problems, an embodiment of the present application provides a cleaning method.
An application scenario of the technical solution of the present application is described first. As shown in fig. 1, the cleaning method provided in the present application may be applied to the application scenario shown in fig. 1: the cleaning method is applied to a cleaning system 1000 that includes the sweeping robot 100 and the base station 200.
The cleaning apparatus of the present application may include only the robot cleaner 100, or the cleaning apparatus includes the robot cleaner 100 and the base station 200 (or referred to as a dust collecting station), and the robot cleaner 100 and the base station 200 may be connected through a network to determine a current state (e.g., an electric quantity state, an operating state, position information, etc.) of the opposite terminal.
Wherein the sweeping robot 100 includes a vision sensor 10, a radar sensor 20, a processor 30, a memory 40, and a body 50. The processor 30 communicates with the vision sensor 10 and the radar sensor 20 respectively through a network, and the vision sensor 10, the radar sensor 20, and the processor 30 are provided on the body 50 of the sweeping robot 100.
The vision sensor 10 is used to acquire images of a scene. The vision sensor 10 may be a visible-light (RGB) camera, a visible-light depth (RGBD) camera, an infrared camera, a thermal imaging camera, a depth camera, etc. The RGB and RGBD cameras can collect visible-light images of a scene, the infrared camera collects infrared images, the thermal imaging camera collects thermal images, and the depth camera collects depth images.
Optionally, one or more cameras 10 are included. The camera 10 may be disposed on a side wall of the body 50; for example, a camera 10 facing straight ahead of the sweeping robot 100 collects images of the scene directly in front of it, or cameras 10 on both sides of the sweeping robot 100 collect images of the scenes on both sides as the robot moves forward.
The radar sensor 20 is used to acquire point cloud information of objects in a scene. The radar sensor 20 may be a laser distance sensor (LDS) lidar, such as a time-of-flight (TOF) radar based on the TOF principle, a structured-light radar based on the structured-light principle, or the like.
The radar sensor 20 is provided at the top wall of the sweeping robot 100. The radar sensor 20 may protrude from the top wall, or it may be housed within the body 50 without protruding from it; that is, the radar sensor 20 may sit lower than the top wall.
The robot cleaner 100 collects a scene image of a current scene through the camera 10, performs a darkness process on the collected scene image in case that it is determined that the current scene is a darkness scene, and recognizes the current scene through the processor 30 according to the darkness processed scene image, thereby cleaning the current scene according to the recognition result.
In one embodiment, the robot cleaner 100 further includes a memory 40, where the memory 40 is configured to store the scene image captured by the camera 10 and the scene image after the dim light process.
In one embodiment, the robot 100 further comprises an ambient light sensor 60, the ambient light sensor 60 being configured to collect light signals of the current scene.
Optionally, one or more ambient light sensors 60 are included, which may be disposed on the top wall of the body 50 or on a side wall of the body 50.
In one embodiment, the cleaning system 1000 further includes a server 400, the server 400 and the cleaning device communicating over a network; the server 400 is configured to receive a scene image collected by the camera 10 sent by the sweeping robot 100 or the base station 200, obtain a scene image after the dim light processing, identify a current scene according to the scene image after the dim light processing, and control the sweeping robot 100 to clean the current scene according to the identification result.
In one embodiment, the server 400 may be an independent physical server, a cluster or distributed system of multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDN, and big-data and artificial-intelligence platforms. The embodiments of the present application are not limited in this regard.
The base station 200 may include a display 201, and the base station 200 may be capable of communicating with the sweeping robot 100 to obtain data transmitted by the sweeping robot 100, and may process the data by using a processing capability of the base station 200, so as to implement functions of controlling the sweeping robot 100 (e.g., controlling the sweeping robot 100 to move to a target position for cleaning), displaying relevant contents of the sweeping robot 100, and the like.
In one embodiment, the cleaning device further comprises a terminal 300, the terminal 300 comprising a display 301. The terminal 300 can communicate with the sweeping robot 100 to obtain data transmitted by the sweeping robot 100, and can process the data by using the processing capability of the terminal 300, so as to realize functions of controlling the sweeping robot 100 (for example, controlling the sweeping robot 100 to move to a target position for cleaning), displaying related contents of the sweeping robot 100, and the like.
In one embodiment, the terminal 300 may be, but is not limited to, various personal computers, notebook computers, smart phones, tablet computers, portable wearable devices, and the like.
For example, at least one of the display 201 of the base station 200 and the display 301 of the terminal 300 may display the recognition result of the dim-light-processed scene image, such as the position and contour of obstacles, so that the accuracy of the sweeping robot 100's perception of the current scene can be checked. For another example, the base station 200 or the terminal 300 may process the position information transmitted by the sweeping robot 100 to determine and display the position of the sweeping robot 100 in the current scene, its cleaning path, and the like.
In one embodiment, the sweeping robot 100, the base station 200, the terminal 300, and the server 400 all communicate through a network; for example, any two of them may communicate wirelessly (e.g., wireless local-area-network (Wi-Fi) communication, Bluetooth communication, infrared communication, etc.). It is understood that the communication among the sweeping robot 100, the base station 200, the terminal 300, and the server 400 is not limited to the above-described manners.
When communicating over Wi-Fi, the sweeping robot 100 and the base station 200 each communicate with the cloud server 400, which then relays communication between the sweeping robot 100 and the base station 200 (or the terminal 300). When communicating through Bluetooth or infrared, the sweeping robot 100 and the base station 200 (or the terminal 300) are each provided with a corresponding communication module to communicate with each other directly.
In one embodiment, the cleaning method may be implemented by the robot cleaner 100 and at least one of the base station 200, the terminal 300, and the server 400. Such as through the cooperation of the sweeping robot 100 with the base station 200, the terminal 300 and the server 400, or through the cooperation of the sweeping robot 100 with the base station 200, or through the cooperation of the sweeping robot 100 with the terminal 300, etc.
The cleaning method of the present application will be described in detail below:
referring to fig. 1 and 2, an embodiment of the present application provides a cleaning method applied to a cleaning device, the cleaning device including a vision sensor for acquiring an image of a scene, the cleaning method including:
step 011: detecting the ambient light intensity to determine whether the current scene is a dim light scene according to the ambient light intensity;
specifically, when the robot 100 performs a task on a current scene, a scene image of the current scene is generally acquired by the camera 10. The sharpness of the scene image collected by the camera 10 has a great relationship with the illumination intensity of the current scene, for example, the scene image collected by the camera 10 may be blurred or even difficult to distinguish under dim environmental conditions.
The sweeping robot 100 may detect the intensity of ambient light in a variety of ways while in operation.
For example, the robot 100 may be equipped with the ambient light sensor 60, and a preset intensity threshold of the illumination intensity is set by an empirical value, and whether the current scene is a dim light scene is determined by determining whether the ambient light intensity of the current scene collected by the ambient light sensor 60 is less than the preset intensity threshold. For example, when the preset intensity threshold is 30 Lux (Lux), the ambient light intensity of the current scene collected by the ambient light sensor 60 is 25 Lux (Lux), and is smaller than the preset intensity threshold, the current scene is determined to be a dark scene.
Referring to fig. 3, in some embodiments, step 011: determining whether the current scene is a dim light scene according to the ambient light intensity, including:
step 0111: determining that the current scene is a dim-light scene when the ambient light intensity is less than a preset intensity threshold.
Specifically, in order to determine more accurately and rapidly whether the current scene is a dim-light scene, a minimum illumination intensity value may be set, and whether the current scene is a dim-light scene is determined by judging whether the ambient light intensity is less than this minimum illumination intensity value; the minimum illumination intensity value is the preset intensity threshold.
Alternatively, the detection of the ambient light intensity may be by any one of the following means:
(1) The ambient light intensity is determined from the scene image acquired before dim-light processing.
It will be appreciated that the gray value of the current scene image and the ambient light intensity of the current scene are positively correlated: the greater the gray value, the greater the ambient light intensity. A preset gray value can therefore be made to correspond to the preset intensity threshold of ambient light intensity; when the gray value of the current scene is smaller than the preset gray value, the ambient light intensity of the current scene can be judged smaller than the preset intensity threshold.
Specifically, a gray histogram of the current scene can be obtained from the current scene image and the average gray value of the current scene computed; when the average gray value is below the preset gray value, the ambient light intensity of the current environment is determined to be smaller than the preset intensity threshold, and the current scene is determined to be a dim-light scene.
For example, suppose the preset intensity threshold is 30 Lux and the corresponding preset gray value is 77; that is, when the gray value of the current scene is less than 77, the ambient light intensity is judged to be less than 30 Lux. If the average gray value computed from the gray histogram acquired for the current scene is 72, which is below the preset gray value, then the ambient light intensity of the current environment is considered smaller than the preset intensity threshold and the current scene is determined to be a dim-light scene.
Specifically, the current scene can also be partitioned according to the current scene image to obtain a plurality of regions; the average gray value of each region is then obtained, and the median of these average gray values is taken as the gray value of the current scene. When the median is below the preset gray value, the ambient light intensity of the current environment is considered below the preset intensity threshold, and the current scene is determined to be a dim-light scene.
For example, continuing with the previous example, the current scene is divided into 5 regions whose average gray values are 66, 68, 72, 76, and 78 respectively. The median of these average gray values is 72, which is below the preset gray value of 77, so the ambient light intensity of the current environment is considered below the preset intensity threshold and the current scene is determined to be a dim-light scene.
Since the ambient light around the sweeping robot 100 is usually fairly uniform, an abrupt change of ambient light intensity, i.e., of gray value, in a single region of the current scene is uncommon; as long as most regions of the current scene are dim, the scene is regarded as dim. The ambient light intensity can therefore be judged both from the average gray value of the whole current scene and from the median of the average gray values of the divided regions.
(2) The ambient light intensity is determined from the light signal collected by the ambient light sensor 60 of the cleaning device.
Specifically, the sweeping robot 100 may be equipped with an ambient light sensor 60, which measures the intensity (brightness) of the surrounding light. The ambient light sensor 60 collects the light signal to determine the ambient light intensity of the current scene; when this intensity is less than the preset intensity threshold, the current scene is determined to be a dim-light scene.
In this way, detection of the ambient light intensity can be realized either by computing gray values from the scene image before dim-light processing, or by mounting an ambient light sensor 60 on the sweeping robot 100 to collect light signals, and it can thereby be determined whether the current scene is a dim-light scene. A minimal sketch of the gray-value check is given below.
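The following Python sketch illustrates the gray-value-based check described above, assuming OpenCV and NumPy are available. The gray threshold of 77 and the five-region split are taken from the worked example; the vertical-strip partition and all function names are illustrative assumptions, not the patent's actual implementation.

```python
import cv2
import numpy as np

PRESET_GRAY_VALUE = 77  # corresponds to the 30 Lux threshold in the example

def is_dim_scene(bgr_image: np.ndarray, n_regions: int = 5) -> bool:
    """Judge whether the scene is dim from a frame taken before dim-light
    processing, using both the global mean gray value and the median of
    per-region mean gray values (so one bright patch cannot mask an
    otherwise dim scene)."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)

    # Global average gray value of the whole frame.
    global_mean = float(gray.mean())

    # Partition the frame into vertical strips and average each one.
    strips = np.array_split(gray, n_regions, axis=1)
    region_means = [float(s.mean()) for s in strips]
    region_median = float(np.median(region_means))

    return global_mean < PRESET_GRAY_VALUE or region_median < PRESET_GRAY_VALUE
```

With the example values (region means 66, 68, 72, 76, 78), the median 72 is below 77, so the function would report a dim-light scene.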
Step 012: when the current scene is a dim-light scene, performing dim-light processing on the scene image acquired by the cleaning apparatus, so as to improve the definition of the scene image in the dim-light scene.
Optionally, the dim-light processing includes at least one of anti-shake processing, image enhancement processing, image noise reduction processing, image deblurring processing, and image de-exposure processing. That is, the dim-light processing may be realized by any one or combination of these operations: for example, by applying all five to the acquired scene image, or only anti-shake plus image enhancement, or only image enhancement plus image deblurring, and so on.
Referring to fig. 4, optionally, step 012: performing darkness processing on a scene image acquired by the cleaning device, including:
step 0121: performing dim-light processing on the scene image based on a preset image processing algorithm.
Specifically, performing dim-light processing on the scene image based on a preset image processing algorithm includes at least one of the following modes:
(1) Performing anti-shake processing on the scene image based on at least one of a template matching algorithm, a mean filtering algorithm, and an optical flow analysis algorithm;
(2) Performing image enhancement processing on the scene image based on at least one of a histogram equalization algorithm, a Laplacian algorithm, a log transformation algorithm, and a gamma transformation algorithm;
(3) Performing image noise reduction processing on the scene image based on at least one of a spatial-domain filtering algorithm, a transform-domain filtering algorithm, and a partial differential equation algorithm;
(4) Performing image deblurring processing on the scene image based on at least one of a Wiener filtering algorithm, a Richardson-Lucy (RL) filtering algorithm, and a total variation algorithm;
(5) Performing image de-exposure processing on the scene image based on at least one of a template detection algorithm, a wavelet decomposition algorithm, and a color-and-gray feature extraction algorithm.
Based on a preset image processing algorithm, the dim-light processing of the scene image is realized by at least one of the above modes.
For example, the anti-shake processing of the scene image can be realized by at least one of a template matching algorithm, a mean filtering algorithm, and an optical flow analysis algorithm: by all three together, or by template matching plus mean filtering, or by mean filtering plus optical flow analysis, and so on. The selection and combination of the algorithms used for image enhancement, image noise reduction, image deblurring, and image de-exposure are similar to those for anti-shake processing and, for brevity, are not repeated here.
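As an illustration, the following Python sketch chains one concrete choice from the noise-reduction and image-enhancement categories above using OpenCV: non-local-means denoising, histogram equalization on the luminance channel, and a gamma transformation. The specific algorithms, their order, and the parameter values are illustrative assumptions, not a combination prescribed by the patent.

```python
import cv2
import numpy as np

def dim_light_process(bgr: np.ndarray, gamma: float = 0.6) -> np.ndarray:
    # Image noise reduction: non-local-means denoising of the color frame.
    denoised = cv2.fastNlMeansDenoisingColored(bgr, None, 10, 10, 7, 21)

    # Image enhancement: histogram equalization on the luminance (Y) channel
    # only, so colors are not distorted.
    ycrcb = cv2.cvtColor(denoised, cv2.COLOR_BGR2YCrCb)
    ycrcb[:, :, 0] = cv2.equalizeHist(ycrcb[:, :, 0])
    equalized = cv2.cvtColor(ycrcb, cv2.COLOR_YCrCb2BGR)

    # Gamma transformation via a lookup table; gamma < 1 lifts dark regions.
    lut = np.array([((i / 255.0) ** gamma) * 255 for i in range(256)],
                   dtype=np.uint8)
    return cv2.LUT(equalized, lut)
```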
For example, suppose the sweeping robot 100 is working in a bedroom that has been determined to be a dim-light scene, and in the collected scene image it cannot accurately identify that an obstacle (a shoe) lies 20 centimeters (cm) ahead. During dim-light processing, the sweeping robot 100 performs anti-shake processing on the scene image with a template matching algorithm and a mean filtering algorithm, image enhancement with a histogram equalization algorithm, a Laplacian algorithm, a log transformation algorithm, and a gamma transformation algorithm, image noise reduction with a spatial-domain filtering algorithm, a transform-domain filtering algorithm, and a partial differential equation algorithm, and image deblurring with a Wiener filtering algorithm and a Richardson-Lucy (RL) filtering algorithm. The resulting dim-light-processed scene image clearly shows that an obstacle lies 20 cm ahead of the sweeping robot 100, and that the obstacle is a shoe.
Referring to fig. 5, optionally, step 012: performing darkness processing on a scene image acquired by the cleaning device, including:
step 0122: performing dim-light processing on the scene image based on a preset image processing model.
Specifically, the preset image processing model may be deployed in the sweeping robot 100 and executed by its processor 30, or it may be deployed in at least one of the base station 200, the terminal 300, and the server 400 that are communicatively connected to the sweeping robot 100 and executed by at least one of them.
The processor 30 inputs the scene image acquired in the dim-light scene into the preset image processing model, and the image processing model outputs the dim-light-processed scene image, thereby improving the definition of the scene image in the dim-light scene.
The image processing model may be a neural network model. The processor 30 may use a deep-learning-based convolutional neural network (CNN), or the feature extraction algorithm of another deep learning model, to extract features from the scene image, then process the extracted features to realize the dim-light processing, and finally output a scene image of higher definition.
The image processing model can be obtained by training on a pre-collected training set that contains first training images captured under dim light and corresponding second training images of higher definition. The first training images are input into an initial model, a loss value is computed from the model's output images and the second training images, and the model parameters of the initial model are then adjusted continually according to the loss value until the model converges, yielding the image processing model.
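The following PyTorch sketch illustrates this training procedure under stated assumptions: a small convolutional enhancer and an L1 loss between the model output and the clearer second training image. The architecture, loss choice, and hyperparameters are hypothetical, not the patent's actual model.

```python
import torch
import torch.nn as nn

class DimLightEnhancer(nn.Module):
    """A small convolutional model: dim scene image in, clearer image out."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.net(x)

def train_step(model, optimizer, dim_batch, clear_batch):
    # dim_batch: first training images (dim light); clear_batch: the
    # corresponding higher-definition second training images.
    optimizer.zero_grad()
    output = model(dim_batch)
    loss = nn.functional.l1_loss(output, clear_batch)
    loss.backward()        # compute gradients from the loss value
    optimizer.step()       # adjust the model parameters accordingly
    return loss.item()

model = DimLightEnhancer()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
# Training iterates over (dim_batch, clear_batch) pairs until the loss converges.
```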
In this way, in a dim environment the sweeping robot 100 can perceive the current scene more accurately from the dim-light-processed scene image, so that it can judge the scene type and the obstacles (position and type) of the current scene more accurately.
Step 013: performing scene recognition on the current scene according to the dim-light-processed scene image, and cleaning the current scene according to the recognition result.
Specifically, after the sweeping robot 100 obtains the dim-light-processed scene image, it processes and analyzes the image to obtain a recognition result, which includes the type of the current scene and the positions and types of obstacles, and then cleans the current scene according to this recognition result.
The scene type may be a home scene such as a bedroom, living room, or kitchen, with furniture such as sofas and tea tables as the corresponding obstacle types; the scene type may also be a factory scene, such as a production workshop or warehouse, with factory equipment such as forklifts and shelf racks as the corresponding obstacle types.
Referring to fig. 6, in certain embodiments, step 013: scene identification is carried out on the current scene according to the scene image processed by the dim light, and the current scene is cleaned according to the identification result, which comprises the following steps:
step 0131: according to the scene image processed by the dim light, performing scene recognition on the current scene to acquire floor materials and barrier information in the current scene;
step 0132: determining a scene type and a region to be cleaned of a current scene according to the obstacle information;
step 0133: recognizing dirt and objects of the current scene to obtain corresponding dirt parameters and target objects;
step 0134: cleaning parameters of the area to be cleaned are determined based on one or more combinations of floor material, soil parameters, and target objects, and scene type.
Wherein the soil parameters include soil type and soil size.
Specifically, the sweeping robot 100 performs scene recognition on the current scene according to the dim-light-processed scene image and obtains the floor material (such as tile, wood floor, carpet, cement, or stone) and obstacle information at its location. The obstacle information includes the specific position of each obstacle in the current scene (such as coordinates relative to the current scene), its specific category (such as furniture, household daily necessities, or factory equipment), and its specific attributes (such as dynamic or static).
The sweeping robot 100 may determine the scene type of the current scene according to the obstacle information, construct a map of the current scene in combination with the scene image and the point cloud information collected by the radar sensor 20, and determine the area to be cleaned of the current scene (e.g., the area inside the scene but outside the regions occupied by obstacles). For example, the sweeping robot 100 recognizes that the current scene contains obstacles such as a bed and bedside cabinets; these belong to the furniture category and are generally placed in bedrooms, so the sweeping robot 100 determines that the current scene is a bedroom scene within the home scene type, and constructs a map of the bedroom while cleaning along its edges to determine the bedroom's area to be cleaned.
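A minimal sketch of how recognized obstacle categories might be mapped to a scene type, in the spirit of the bed/bedside-cabinet example above; the label names and the majority-vote rule are hypothetical illustrations, not the patent's actual taxonomy.

```python
# Hypothetical obstacle-label -> scene-type hints.
SCENE_HINTS = {
    "bed": "bedroom", "bedside_cabinet": "bedroom",
    "sofa": "living_room", "tea_table": "living_room",
    "forklift": "warehouse", "shelf_rack": "warehouse",
}

def infer_scene_type(detected_obstacles: list[str]) -> str:
    """Majority vote over the scene hints of all recognized obstacles."""
    votes: dict[str, int] = {}
    for label in detected_obstacles:
        scene = SCENE_HINTS.get(label)
        if scene is not None:
            votes[scene] = votes.get(scene, 0) + 1
    return max(votes, key=votes.get) if votes else "unknown"

# Example: the obstacles from the text above vote for a bedroom scene.
print(infer_scene_type(["bed", "bedside_cabinet"]))  # -> "bedroom"
```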
It will be appreciated that the cleaning parameters may differ for different scene types. For home scenes (bedroom, living room, kitchen, etc.) and factory scenes (workshop, warehouse, etc.), the degree of soiling encountered during cleaning can differ substantially, as can the required suction, brush rotation speed, and cleaning duration. Different scene types therefore require separately set cleaning parameters.
The cleaning parameters may also differ for different floor materials. For a wood floor, a dry-cleaning mode is used: the side-brush rotation speed must be slower to avoid scratching and damaging the floor surface, and to meet the wood floor's requirement for dryness. For a tile floor, a different suction gear is needed to clean the seams between tiles. The dirt on each floor material therefore requires its own cleaning parameters.
The cleaning parameters may also differ for different dirt types. For solids (e.g., powder, granules, fruit shells, paper scraps, hair) and liquids (e.g., water, drinks, oils, and dried liquid stains such as soy sauce or urine), the required suction and water output during cleaning differ, as do the rotation speeds of the side brush and the roller mop. Each dirt type therefore requires its own cleaning parameters.
The cleaning parameters may also differ for different dirt sizes (e.g., the area the dirt occupies). For a liquid of large dirt size, the side brush and roller mop rotate slowly to keep the dirt from splashing; when the dirt size is small, they rotate quickly to ensure the cleaning effect. Likewise, larger dirt sizes require more water during cleaning, so the water output gear can be set higher for large dirt sizes and lower for small ones.
The cleaning parameters may also differ depending on the target objects present. For example, when target objects such as cats and dogs exist in the current scene, they tend to shed, so hair is likely present over most of the scene; the suction gear used when cleaning dirt (or areas other than dirt) can then be set higher than when no such target objects exist, and the side-brush rotation speed can be set lower than when no such target objects exist.
Accordingly, the cleaning parameters used by the sweeping robot 100 to clean the area to be cleaned may be determined from the scene type together with at least one of the floor material, the dirt type, the dirt size, and the target object: for example, from the scene type and floor material; or from the scene type, floor material, dirt type, and target object; or from the scene type, dirt type, and dirt size; and so on.
For example, determining the cleaning parameters from the scene type, dirt type, and dirt size: when the current scene is recognized as a kitchen scene within the home scene type and a liquid of small dirt size is present (recognized as kitchen sauce), the preset cleaning parameters for the kitchen include both sweeping and mopping, but because only a small liquid stain is recognized, the target cleaning parameters are set to mopping only, without sweeping.
For example, the following describes how the cleaning parameters used by the sweeping robot 100 to clean the area to be cleaned are determined from the floor material, dirt type, dirt size, target object, and scene type:
A mapping between scene types and cleaning parameters is first established, with scene types and cleaning parameters placed in one-to-one correspondence from empirical values or the user's historical cleaning parameters, generating a cleaning parameter table that maps scene types to cleaning parameters. The cleaning parameters may include the cleaning mode, suction gear, water output, etc., and the table may be stored in at least one of the memory 40 of the sweeping robot 100, the base station 200, the terminal 300, and the server 400. After the scene type of the current scene is determined, the preset cleaning parameters can be found through this mapping, thereby determining the target cleaning parameters of the current scene. For example, the sweeping robot 100 determines that the current scene is a living-room scene within the home scene type and, from the mapping between scene types and preset cleaning parameters, determines the target cleaning parameters to be sweeping and mopping with a corresponding water output gear.
The target cleaning parameters are then adjusted according to the recognized floor material. Continuing the above example, the sweeping robot 100 determines that the current scene is a living-room scene within the home scene type, and the preset cleaning parameters for a living-room scene include sweeping and wet mopping. But because the floor is recognized as a wood floor, which is unsuited to a mopping command with too much water, the target cleaning parameters are adjusted to sweeping without mopping.
The target cleaning parameters are also readjusted according to the recognized dirt type and dirt size. Continuing the above example, when a liquid of small dirt size remains on the floor and the dirt type is tea stain, then in order to clean the stain without damaging the wood floor, the target cleaning parameters are adjusted to sweeping and mopping with a low water output.
The target cleaning parameters are likewise readjusted according to the recognized target object. Continuing the above example, a target object is recognized in the current scene and identified as a woman with long hair, so the current scene is judged likely to contain shed long hair to be cleaned; the target cleaning parameters are therefore adjusted to sweeping and mopping with a low water output and the suction gear set to medium. The final target cleaning parameters, determined from the floor material, dirt type, dirt size, target object, and scene type, are thus: sweep and mop with a low water output, with the suction gear set to medium.
In this way, the cleaning parameters for cleaning the area to be cleaned can be determined from the scene type together with at least one of the floor material, dirt type, dirt size, and target object, and the dirt present in the current scene is cleaned with the determined cleaning parameters. Determining the cleaning parameters from the relevant information about the dirt and cleaning with those parameters guarantees the cleaning effect. A sketch of this look-up-then-adjust logic follows.
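The following Python sketch illustrates the logic of the preceding paragraphs: base cleaning parameters are retrieved by scene type, then adjusted for floor material, dirt type and size, and target object. All names, gear values, and adjustment rules here are hypothetical illustrations of the mapping-table idea, not values from the patent.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class CleaningParams:
    sweep: bool = True
    mop: bool = True
    suction: str = "low"          # low / medium / high
    water_output: str = "medium"  # low / medium / high

# Hypothetical scene-type -> preset-parameter table (the "mapping relation").
BASE_PARAMS = {
    "living_room": CleaningParams(sweep=True, mop=True),
    "kitchen":     CleaningParams(sweep=True, mop=True, water_output="high"),
}

def determine_params(scene_type: str, floor: str, dirt_type: str | None = None,
                     dirt_size: str | None = None,
                     hair_shedding_target: bool = False) -> CleaningParams:
    params = BASE_PARAMS.get(scene_type, CleaningParams())
    if floor == "wood":
        # Wood floors are unsuited to heavy water: default to sweep-only.
        params = replace(params, mop=False, water_output="low")
    if dirt_type == "liquid":
        # A liquid stain still needs a mop pass; keep water low unless
        # the dirt size is large.
        params = replace(params, mop=True,
                         water_output="high" if dirt_size == "large" else "low")
    if hair_shedding_target:
        # Shed hair is likely: raise the suction gear.
        params = replace(params, suction="medium")
    return params

# The worked example: living room, wood floor, small tea stain, long-haired
# person present -> sweep and mop, low water output, medium suction.
print(determine_params("living_room", "wood", "liquid", "small", True))
```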
Referring to fig. 7, to facilitate better implementation of the cleaning method according to the embodiments of the present application, the present application further provides a cleaning device 10. The cleaning device 10 may include a detection module 11, a dim light processing module 12, and an identification module 13. The detection module 11 is configured to detect an ambient light intensity, so as to determine whether the current scene is a dim light scene according to the ambient light intensity; the darkness processing module 12 is configured to perform darkness processing on the scene image acquired by the cleaning device when the current scene is a darkness scene, so as to improve the definition of the scene image in the darkness scene; and the recognition module 13 is used for recognizing the current scene according to the scene image subjected to the dim light treatment and cleaning the current scene according to the recognition result.
In some embodiments, the detection module 11 is further configured to determine that the current scene is a dim light scene if the intensity of the ambient light is less than a preset intensity threshold.
In some embodiments, the detection module 11 is further configured to determine the intensity of ambient light from the image of the scene prior to the darkness process; or determining the intensity of the ambient light based on the light signal collected by the ambient light sensor of the cleaning device.
In some embodiments, the darkness processing module 12 is further configured to darkness process the scene image based on a preset image processing algorithm; or performing dim light processing on the scene image based on a preset image processing model.
In some embodiments, the dim-light processing module 12 is specifically further configured to perform anti-shake processing on the scene image based on at least one of a template matching algorithm, a mean filtering algorithm, and an optical flow analysis algorithm; perform image enhancement processing on the scene image based on at least one of a histogram equalization algorithm, a Laplacian algorithm, a log transformation algorithm, and a gamma transformation algorithm; perform image noise reduction processing on the scene image based on at least one of a spatial-domain filtering algorithm, a transform-domain filtering algorithm, and a partial differential equation algorithm; perform image deblurring processing on the scene image based on at least one of a Wiener filtering algorithm, a Richardson-Lucy (RL) filtering algorithm, and a total variation algorithm; and perform image de-exposure processing on the scene image based on at least one of a template detection algorithm, a wavelet decomposition algorithm, and a color-and-gray feature extraction algorithm.
In some embodiments, the identification module 13 is further configured to perform scene identification on the current scene according to the scene image after the dim light processing, so as to obtain the floor material and the obstacle information in the current scene; determining the scene type and the area to be cleaned of the current scene according to the obstacle information; identifying the dirt type, the dirt size and the target object of the current scene; and determining cleaning parameters of the cleaning equipment when cleaning the area to be cleaned according to the scene type, the floor material, the dirt type, the dirt size and the target object.
The cleaning device 10 has been described above from the perspective of functional modules in conjunction with the accompanying drawings; these modules may be implemented in hardware, in software instructions, or in a combination of hardware and software modules. Specifically, each step of the method embodiments of the present application may be completed by an integrated logic circuit of the hardware in the processor 30 and/or by instructions in software form, and the steps of the methods disclosed in connection with the embodiments of the present application may be executed directly by a hardware encoding processor or by a combination of hardware and software modules within an encoding processor. Optionally, the software modules may be located in a storage medium well established in the art, such as a random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, or registers. The storage medium is located in the memory 40; the processor 30 reads the information in the memory 40 and, in combination with its hardware, performs the steps of the above method embodiments.
Referring again to fig. 1, the robot cleaner 100 of the present embodiment includes a processor 30, a memory 40, and a computer program, wherein the computer program is stored in the memory 40 and executed by the processor 30, and the computer program includes instructions for executing the cleaning method of any of the above embodiments.
In one embodiment, the terminal 300 or the base station 200 may be a computer device, whose internal structure may be as shown in fig. 8. The computer device comprises a processor 502, a memory 503, a network interface 504, a display 501, and an input device 505, which are connected by a system bus.
The processor 502 of the computer device is configured to provide computing and control capabilities. The memory 503 of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program; the internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The network interface 504 of the computer device is used to communicate with external devices via a network connection. The computer program, when executed by the processor, implements the cleaning method of any of the above embodiments. The display screen 501 of the computer device may be a liquid crystal display or an electronic ink display, and the input device 505 of the computer device may be a touch layer covering the display screen 501, a key, a trackball, or a touchpad arranged on the casing of the computer device, or an external keyboard, touchpad, or mouse.
It will be appreciated by those skilled in the art that the structure shown in fig. 8 is merely a block diagram of part of the structure relevant to the present application and does not limit the computer device to which the present application is applied; a particular computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
Referring to fig. 9, the embodiment of the present application further provides a computer readable storage medium 600 on which a computer program 610 is stored. The computer program 610, when executed by the processor 620, implements the steps of the cleaning method of any one of the above embodiments, which are not repeated here for brevity.
In the description of the present specification, reference to the terms "certain embodiments," "in one example," "illustratively," and the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples, and those skilled in the art may combine the different embodiments or examples described in this specification, and the features thereof, provided they do not contradict each other.
Any process or method description in a flowchart, or otherwise described herein, may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of the present application also includes implementations in which functions are executed out of the order shown or discussed, including substantially concurrently or in the reverse order, depending on the functionality involved, as would be understood by those skilled in the art of the embodiments of the present application.
While embodiments of the present application have been shown and described above, it will be understood that the above embodiments are illustrative and are not to be construed as limiting the present application, and that changes, modifications, substitutions, and variants may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (10)

1. A cleaning method, applied to a cleaning device, the cleaning device comprising a vision sensor for acquiring a scene image, the method comprising:
detecting the ambient light intensity, so as to determine whether the current scene is a dim light scene according to the ambient light intensity;
performing dim light processing on the scene image acquired by the cleaning device when the current scene is the dim light scene, so as to improve the definition of the scene image in the dim light scene; and
performing scene recognition on the current scene according to the dim-light-processed scene image, and cleaning the current scene according to a recognition result.
2. The cleaning method of claim 1, wherein determining whether the current scene is a dim light scene according to the ambient light intensity comprises:
determining that the current scene is the dim light scene when the ambient light intensity is less than a preset intensity threshold.
3. The cleaning method according to claim 1 or 2, wherein detecting the ambient light intensity comprises:
determining the ambient light intensity from the scene image prior to dim light processing; or
determining the ambient light intensity according to the light signal collected by an ambient light sensor of the cleaning device.
4. The cleaning method according to claim 1, wherein the dim light processing includes at least one of anti-shake processing, image enhancement processing, image noise reduction processing, image deblurring processing, and image de-exposure processing.
5. The cleaning method of claim 4, wherein performing dim light processing on the scene image acquired by the cleaning device comprises:
performing dim light processing on the scene image based on a preset image processing algorithm; or
performing dim light processing on the scene image based on a preset image processing model.
6. The cleaning method of claim 5, wherein performing dim light processing on the scene image based on a preset image processing algorithm comprises at least one of:
performing anti-shake processing on the scene image based on at least one of a template matching algorithm, a mean filtering algorithm and an optical flow analysis algorithm;
performing image enhancement processing on the scene image based on at least one of a histogram equalization algorithm, a Laplace algorithm, a logarithmic transformation algorithm, and a gamma transformation algorithm;
performing image noise reduction processing on the scene image based on at least one of a spatial domain filtering algorithm, a transform domain filtering algorithm and a partial differential equation algorithm;
performing image deblurring processing on the scene image based on at least one of a Wiener filtering algorithm, an RL (Richardson-Lucy) filtering algorithm, and a total variation algorithm; and
and performing image de-exposure processing on the scene image based on at least one of a template detection algorithm, a wavelet decomposition algorithm and a color and gray feature extraction algorithm.
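As an editorial aside, the template-matching anti-shake option in the first item of this claim could be realized roughly as follows: locate a patch of the previous frame within the current frame and translate the current frame back by the measured displacement. The patch size, matching method, and translation-only correction are assumptions for the sketch, not part of the claims.

```python
import cv2
import numpy as np

def stabilize(prev_gray: np.ndarray, curr_gray: np.ndarray) -> np.ndarray:
    h, w = prev_gray.shape
    # The central patch of the previous frame serves as the template.
    template = prev_gray[h // 4: 3 * h // 4, w // 4: 3 * w // 4]
    scores = cv2.matchTemplate(curr_gray, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, best = cv2.minMaxLoc(scores)
    dx, dy = best[0] - w // 4, best[1] - h // 4  # displacement vs. template origin
    # Translate the current frame to cancel the measured shake.
    m = np.float32([[1, 0, -dx], [0, 1, -dy]])
    return cv2.warpAffine(curr_gray, m, (w, h))
```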
7. The cleaning method according to claim 1, wherein performing scene recognition on the current scene according to the dim-light-processed scene image and cleaning the current scene according to the recognition result comprises:
performing scene recognition on the current scene according to the dim-light-processed scene image, so as to obtain the floor material and obstacle information in the current scene;
determining the scene type and the area to be cleaned of the current scene according to the obstacle information;
identifying dirt and objects in the current scene to obtain corresponding dirt parameters and target objects; and
determining the cleaning parameters of the area to be cleaned according to the scene type, the floor material, and one or a combination of the dirt parameters and the target objects.
8. A cleaning device, comprising:
the detection module is used for detecting the intensity of ambient light so as to determine whether the current scene is a dim light scene or not according to the intensity of the ambient light;
the dim light processing module is used for performing dim light processing on the scene image acquired by the cleaning device when the current scene is the dim light scene, so as to improve the definition of the scene image in the dim light scene; and
the identification module is used for performing scene recognition on the current scene according to the dim-light-processed scene image and cleaning the current scene according to a recognition result.
9. A cleaning apparatus, comprising:
a processor; a memory; and
a computer program, wherein the computer program is stored in the memory and executed by the processor, and the computer program comprises instructions for performing the cleaning method of any one of claims 1 to 7.
10. A non-transitory computer readable storage medium containing a computer program which, when executed by a processor, causes the processor to perform the cleaning method of any one of claims 1-7.
CN202311362166.3A 2023-10-19 2023-10-19 Cleaning method, cleaning device, cleaning apparatus, and storage medium Pending CN117530620A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311362166.3A CN117530620A (en) 2023-10-19 2023-10-19 Cleaning method, cleaning device, cleaning apparatus, and storage medium

Publications (1)

Publication Number Publication Date
CN117530620A 2024-02-09

Family

ID=89787006

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311362166.3A Pending CN117530620A (en) 2023-10-19 2023-10-19 Cleaning method, cleaning device, cleaning apparatus, and storage medium

Country Status (1)

Country Link
CN (1) CN117530620A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination