CN111012261A - Sweeping method and system based on scene recognition, sweeping equipment and storage medium - Google Patents

Sweeping method and system based on scene recognition, sweeping equipment and storage medium

Info

Publication number
CN111012261A
CN111012261A (application number CN201911138059.6A)
Authority
CN
China
Prior art keywords
scene
cleaning
preset
recognition
training
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911138059.6A
Other languages
Chinese (zh)
Inventor
杨勇
吴泽晓
张康健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen 3irobotix Co Ltd
Original Assignee
Shenzhen 3irobotix Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen 3irobotix Co Ltd filed Critical Shenzhen 3irobotix Co Ltd
Priority to CN201911138059.6A priority Critical patent/CN111012261A/en
Publication of CN111012261A publication Critical patent/CN111012261A/en
Priority to PCT/CN2020/127232 priority patent/WO2021098536A1/en
Pending legal-status Critical Current


Classifications

    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00: Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40: Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011: Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/40: Scenes; Scene-specific elements in video content
    • G06V20/41: Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00: Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/06: Control of the cleaning action for autonomous devices; Automatic detection of the surface condition before, during or after cleaning

Landscapes

  • Engineering & Computer Science (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a cleaning method and system, a sweeping device and a computer-readable storage medium based on scene recognition. The method acquires a real-time image of the surrounding environment, inputs the real-time image into a preset scene classification model for scene recognition, and adjusts the cleaning strategy in real time for cleaning based on the recognition result. In this way, the sweeping robot cleans intelligently according to the different scenes in which it is located, and its cleaning efficiency is improved.

Description

Sweeping method and system based on scene recognition, sweeping equipment and storage medium
Technical Field
The invention relates to the technical field of sweeping robots, in particular to a sweeping method and system based on scene recognition, sweeping equipment and a computer readable storage medium.
Background
With the rapid development of science and technology, sweeping robots are used more and more in daily household cleaning. However, under current sweeping-robot operation algorithms, a sweeping robot can only execute a preset, fixed cleaning strategy on its surroundings. Such a fixed cleaning strategy or cleaning mode is difficult to adapt effectively to different cleaning environments, so cleaning efficiency is low.
Disclosure of Invention
The main object of the present invention is to provide a sweeping method and system based on scene recognition, a sweeping device, and a computer-readable storage medium, aiming to solve the technical problem that an existing sweeping robot, whose operation algorithm uses a fixed cleaning strategy or cleaning mode, is difficult to clean different environments effectively and has low cleaning efficiency.
In order to achieve the above object, the present invention provides a cleaning method based on scene recognition, including:
acquiring a real-time image of the surrounding environment;
inputting the real-time image into a preset scene classification model for scene recognition;
and adjusting a cleaning strategy in real time for cleaning based on the identification result obtained by scene identification.
Further, before the step of inputting the real-time image into a preset scene classification model for scene recognition, the method further includes:
and generating picture data sets of all preset cleaning scenes based on scene definition, and training according to all the picture data sets to obtain the preset scene classification model.
Further, the step of obtaining the preset scene classification model according to the picture data set training includes:
performing data format conversion on each picture data set to obtain each training sample data;
inputting each training sample data into a deep learning network, and training by adopting a preset mode until the deep learning network converges;
and converting the model type of the deep learning network with the converged training into the preset scene classification model.
Further, the step of training in a preset mode until the deep learning network converges includes:
adjusting the application scene category of the deep learning network classification layer to the category of each preset cleaning scene;
and performing cyclic training on the weight of the classification layer after the application scene category is adjusted according to preset times until the training of the classification layer is converged.
Further, the step of inputting the real-time image into a preset scene classification model for scene recognition includes:
inputting the real-time image into a preset scene classification model for calculation of the preset scene classification model and outputting a label result;
and identifying and determining a target scene according to the corresponding relation between the label result and each preset cleaning scene.
Further, the step of adjusting the cleaning strategy in real time for cleaning based on the recognition result obtained by the scene recognition includes:
detecting and identifying a target strategy mapped by the target scene according to the mapping relation between each preset cleaning scene and the cleaning strategy;
and reading each cleaning parameter of the target strategy, correspondingly adjusting each current cleaning parameter to each cleaning parameter of the target strategy, and cleaning.
In addition, to achieve the above object, the present invention further provides a cleaning system based on scene recognition, including:
the acquisition module is used for acquiring a real-time image of the surrounding environment;
the recognition module is used for inputting the real-time image to a preset scene classification model for scene recognition;
and the cleaning module is used for adjusting a cleaning strategy in real time for cleaning based on the identification result obtained by scene identification.
Further, the sweeping system based on scene recognition further comprises:
and the training module is used for generating picture data sets of all preset cleaning scenes based on scene definition and training according to all the picture data sets to obtain the preset scene classification model.
The invention also provides a sweeping device, comprising a memory, a processor, and a scene-recognition-based cleaning program stored on the memory and executable on the processor, wherein the cleaning program, when executed by the processor, implements the steps of the cleaning method based on scene recognition described above.
The present invention also provides a computer-readable storage medium, wherein the computer-readable storage medium stores thereon a computer program, and the computer program, when executed by a processor, implements the steps of the cleaning method based on scene recognition as described above.
The sweeping method and system, sweeping device and computer-readable storage medium based on scene recognition acquire a real-time image of the surrounding environment, input the real-time image into a preset scene classification model for scene recognition, and adjust the cleaning strategy in real time for cleaning based on the recognition result. While walking and cleaning in real time, the sweeping robot photographs its surroundings to acquire real-time images and transmits them to the preset scene classification model, which computes a recognition result for the robot's current environment, namely whether it is a kitchen, bedroom, bathroom, living room, dining room, corridor or balcony. The robot then adjusts its cleaning strategy according to that scene information, realizing intelligent cleaning for different scenes and improving the cleaning efficiency of the sweeping robot.
Drawings
FIG. 1 is a schematic diagram of the hardware operation involved in an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a cleaning method based on scene recognition according to a first embodiment of the present invention;
FIG. 3 is a schematic view of the operation flow of the sweeping device in an embodiment of the sweeping method based on scene recognition according to the present invention;
FIG. 4 is a schematic structural diagram of a cleaning system based on scene recognition according to the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
As shown in fig. 1, fig. 1 is a schematic structural diagram of a hardware operating environment according to an embodiment of the present invention.
It should be noted that fig. 1 is a schematic structural diagram of the hardware operating environment of the sweeping device. The sweeping device in the embodiment of the present invention may be a PC, a portable computer, or the like.
As shown in fig. 1, the sweeping device may include: a processor 1001 (e.g., a CPU), a network interface 1004, a user interface 1003, a memory 1005, and a communication bus 1002, where the communication bus 1002 enables connection and communication between these components. The user interface 1003 may include a display (Display) and an input unit such as a keyboard (Keyboard), and may optionally also include a standard wired interface and a wireless interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a Wi-Fi interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory (e.g., a magnetic disk memory), and may alternatively be a storage device separate from the processor 1001.
It will be appreciated by those skilled in the art that the construction of the sweeping apparatus shown in figure 1 is not intended to be limiting and may include more or fewer components than shown, or some components in combination, or a different arrangement of components.
As shown in fig. 1, the memory 1005, as a kind of computer storage medium, may include an operating system, a network communication module, a user interface module, and a cleaning program based on scene recognition. The operating system is a program that manages and controls the hardware and software resources of the sweeping device and supports the running of the cleaning program and other software or programs.
In the sweeping apparatus shown in fig. 1, the user interface 1003 is mainly used for data communication with each terminal; the network interface 1004 is mainly used for connecting a background server and performing data communication with the background server; and the processor 1001 may be configured to call the scene recognition based cleaning program stored in the memory 1005, and perform the following operations:
acquiring a real-time image of the surrounding environment;
inputting the real-time image into a preset scene classification model for scene recognition;
and adjusting a cleaning strategy in real time for cleaning based on the identification result obtained by scene identification.
Further, the processor 1001 may call a cleaning program based on scene recognition stored in the memory 1005, and before performing the inputting of the real-time image to the preset scene classification model for scene recognition, further perform the following operations:
and generating picture data sets of all preset cleaning scenes based on scene definition, and training according to all the picture data sets to obtain the preset scene classification model.
Further, the processor 1001 may call the scene recognition based cleaning program stored in the memory 1005, and further perform the following operations:
performing data format conversion on each picture data set to obtain each training sample data;
inputting each training sample data into a deep learning network, and training by adopting a preset mode until the deep learning network converges;
and converting the model type of the deep learning network with the converged training into the preset scene classification model.
Further, the processor 1001 may call the scene recognition based cleaning program stored in the memory 1005, and further perform the following operations:
adjusting the application scene category of the deep learning network classification layer to the category of each preset cleaning scene;
and performing cyclic training on the weight of the classification layer after the application scene category is adjusted according to preset times until the training of the classification layer is converged.
Further, the processor 1001 may call the scene recognition based cleaning program stored in the memory 1005, and further perform the following operations:
inputting the real-time image into a preset scene classification model for calculation of the preset scene classification model and outputting a label result;
and identifying and determining a target scene according to the corresponding relation between the label result and each preset cleaning scene.
Further, the processor 1001 may call the scene recognition based cleaning program stored in the memory 1005, and further perform the following operations:
detecting and identifying a target strategy mapped by the target scene according to the mapping relation between each preset cleaning scene and the cleaning strategy;
and reading each cleaning parameter of the target strategy, and correspondingly adjusting each current cleaning parameter to each cleaning parameter of the target strategy.
Based on the above structure, the present invention provides various embodiments of the cleaning method based on scene recognition.
Referring to fig. 2, fig. 2 is a flowchart illustrating a cleaning method based on scene recognition according to a first embodiment of the present invention.
While a logical order is shown in the flow chart, in some cases, the steps shown or described may be performed in an order different than that shown.
The sweeping method based on scene recognition in the embodiment of the present invention is applied to the sweeping device. The sweeping device may be a PC, a portable computer, or the like, and is not specifically limited herein.
The cleaning method based on scene recognition in the embodiment comprises the following steps:
step S100, acquiring a real-time image of the surrounding environment.
In this embodiment, the sweeping robot takes a picture of the surrounding environment based on the camera in the real-time running and walking process, so as to acquire a real-time image of the surrounding environment where the sweeping robot is currently located.
It should be noted that, in this embodiment, the sweeping device may specifically be a sweeping robot equipped with a camera for photographing the environment around it. For ease of understanding, the sweeping device is hereinafter referred to as the sweeping robot; it should be understood that the sweeping method based on scene recognition of the present invention does not limit the specific model or other functional configuration of the sweeping robot.
And S200, inputting the real-time image into a preset scene classification model for scene recognition.
The sweeping robot collects a real-time image of its current surroundings and transmits it to the preset scene classification model. The model receives the real-time image and immediately computes a label result identifying the scene category; the robot then identifies and determines the scene category of its current surroundings from the label result computed by the trained preset scene classification model.
It should be noted that, in this embodiment, the preset scene classification model may specifically be a TFLite (TensorFlow Lite, an open-source deep learning framework for on-device inference) type model based on the deep learning network MobileNetV1 (an efficient convolutional neural network for mobile vision applications), where MobileNetV1 may also be replaced by other deep learning networks, such as VGG (the Visual Geometry Group convolutional neural network), Inception V3 (one of the classic GoogLeNet family of networks), or ResNet (a residual neural network).
And step S300, adjusting a cleaning strategy in real time for cleaning based on the recognition result obtained by scene recognition.
After identifying from the label result computed by the preset scene classification model that the scene category of the current surroundings is a kitchen, bedroom, bathroom, living room, dining room, corridor or balcony, the sweeping robot correspondingly adjusts its current cleaning strategy in real time according to that scene information and cleans the current surroundings with the adjusted strategy.
In this embodiment, the sweeping robot photographs the surrounding environment in real time with its camera while running and walking, so as to acquire a real-time image of its current surroundings. It transmits the acquired real-time image to the preset scene classification model, which computes a label result identifying the scene category in real time. From this label result, the robot identifies whether the scene category of its current surroundings is a kitchen, bedroom, bathroom, living room, dining room, corridor or balcony, correspondingly adjusts its current cleaning strategy in real time according to that scene information, and cleans the current surroundings with the adjusted cleaning strategy.
The sweeping robot realizes real-time scene recognition of the current environment, and adjusts the sweeping strategy of the sweeping robot in real time based on the scene recognition result, so that the sweeping robot can intelligently sweep according to different scene categories, and the sweeping efficiency of the sweeping robot is improved.
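The capture, classify and adjust loop of steps S100 to S300 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the classifier is a stub standing in for the TFLite model, and all function names, scene labels and strategy names are hypothetical.

```python
# Illustrative sketch of steps S100-S300. The classifier is a stand-in
# (the patent's model is a TFLite MobileNetV1); names are hypothetical.

SCENES = ["kitchen", "bedroom", "bathroom", "living room",
          "dining room", "corridor", "balcony"]

def classify(image):
    """Stand-in for the preset scene classification model: returns a label 0-9."""
    # A real implementation would run TFLite inference on `image`;
    # this stub just echoes a precomputed label.
    return image["label"]

def recognize_scene(image):
    """Step S200: scene recognition from the model's label result."""
    label = classify(image)
    return SCENES[label] if label < len(SCENES) else "background"

def sweep_step(image, current_strategy, strategy_table):
    """Step S300: adjust the cleaning strategy based on the recognized scene."""
    scene = recognize_scene(image)
    return strategy_table.get(scene, current_strategy)

strategy_table = {"kitchen": "strong-suction", "corridor": "fast-pass"}
print(sweep_step({"label": 0}, "default", strategy_table))  # prints "strong-suction"
```

The lookup table is a placeholder for whatever mapping from scenes to strategies the robot's firmware actually stores.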
Further, based on the first embodiment of the cleaning method based on scene recognition, a second embodiment of the cleaning method based on scene recognition is provided.
In a second embodiment of the cleaning method based on scene recognition of the present invention, before the step S200 inputs the real-time image to a preset scene classification model for scene recognition, the cleaning method based on scene recognition of the present invention further includes:
and S400, generating picture data sets of all preset cleaning scenes based on scene definition, and training according to all the picture data sets to obtain the preset scene classification model.
Scene definitions are formulated in advance by algorithm researchers. Photos collected in different cleaning scene categories of the surrounding environment contain different characteristic objects, so the photos of each scene are collected and classified to generate picture data sets for the preset number of preset cleaning scenes. A deep learning network is then trained in a preset way on the picture data set of each preset cleaning scene, yielding the preset scene classification model used by the sweeping robot for scene recognition.
Specifically, for example, an algorithm researcher of the sweeping robot running algorithm formulates scene definitions in advance, that is, defines the surrounding environment where the sweeping robot performs sweeping as seven preset sweeping scenes, namely, a kitchen, a bedroom, a bathroom, a living room, a dining room, a corridor and a balcony, and formulates image definitions of the preset sweeping scenes according to whether characteristic objects in the scenes (such as sofas in the living room, beds in the bedroom and the like) are included in photo images acquired in different sweeping scenes. Based on the preset scene definitions of the preset cleaning scenes, a data collector collects seven photo images of the preset cleaning scenes of a kitchen, a bedroom, a bathroom, a living room, a dining room, a corridor and a balcony according to different scene definitions in a large number of residential families, and then arranges the collected photo images of the preset cleaning scenes to obtain respective picture data sets of the seven preset cleaning scenes.
Further, in another embodiment, unqualified photos obtained while the data collectors acquire the photo images may be screened out and deleted, and the sorted picture data sets of the preset cleaning scenes may be randomly cropped and screened to increase the number of pictures in each cleaning scene's data set. Randomly cropped photos that do not match any scene definition preset by the algorithm researchers are unified into another classification scene, a background class, so that training sample data of a background class is added when the deep learning network is trained on the picture data sets.
It should be noted that, in this embodiment, given the visual limitations of the sweeping robot when acquiring photo images, most photos taken by the robot may be difficult to assign accurately to a cleaning scene. Training sample data of a background class is therefore introduced into the picture data set, adding a background class on top of the 7 household cleaning scenes. Since a large number of non-cleaning-scene photos are taken from the robot's viewing angle, the background class is subdivided into three sub-classes: floor, wall corner and debris. During scene recognition, the scene classification model maps the label results of all three background sub-classes to a single background label. In this way, the number of training pictures is balanced between the cleaning scene classes and the background class, so that the classification performance of the final scene classification model is not affected and its recognition accuracy is guaranteed.
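The class balancing described above can be illustrated with a small sketch. The helper name and data are hypothetical, and the real pipeline also performs random cropping and screening; this only shows the balancing idea of truncating every class to the size of the smallest one.

```python
# Hypothetical sketch of balancing the number of training pictures between
# the cleaning scene classes and the background class, as described above.

def balance_dataset(datasets):
    """datasets maps a class name to its list of picture file names;
    each list is truncated to the smallest class size so no class
    dominates training."""
    smallest = min(len(pictures) for pictures in datasets.values())
    return {name: pictures[:smallest] for name, pictures in datasets.items()}

sets = {"kitchen": ["k1.jpg", "k2.jpg", "k3.jpg"],
        "background": ["b1.jpg", "b2.jpg"]}
balanced = balance_dataset(sets)
```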
After the final picture data set for deep learning network training is sorted out, the selected deep learning network (any of MobileNetV1, VGG, Inception V3, ResNet, etc.) is trained in the preset way until training converges, and the converged deep learning network model is then converted, via model type conversion, into a scene classification model that can be applied on a sweeping robot for scene recognition.
Further, in step S400, the obtaining of the preset scene classification model according to the picture data set training includes:
step S401, performing data format conversion on each picture data set to obtain each training sample data.
An existing data-set conversion algorithm is called to convert the sorted picture data set of each preset cleaning scene into training sample data that can be used directly for training the deep learning network.
Specifically, for example, using official TensorFlow code (TensorFlow is an open-source software library for high-performance numerical computation), the picture data sets of the seven cleaning scenes (kitchen, bedroom, bathroom, living room, dining room, corridor and balcony) are identically converted into the TFRecord format (a binary data format), producing training sample data that can be input directly for training by the deep learning network MobileNetV1.
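As a dependency-free illustration of the data-format-conversion idea in step S401, pairs of (image bytes, label) can be packed into a length-prefixed binary stream. Note that this toy format is not TFRecord: the patent's conversion uses TensorFlow's official tooling, and this sketch only shows the general shape of the step.

```python
import io
import struct

# Toy serialization of (image_bytes, label) pairs; NOT the TFRecord
# format, just an illustration of converting a picture data set into
# trainable binary sample records.

def write_records(samples, stream):
    """Serialize (image_bytes, label) pairs into the binary stream."""
    for image_bytes, label in samples:
        stream.write(struct.pack("<IB", len(image_bytes), label))
        stream.write(image_bytes)

def read_records(stream):
    """Deserialize the stream back into (image_bytes, label) pairs."""
    samples = []
    while True:
        header = stream.read(5)
        if len(header) < 5:
            break
        length, label = struct.unpack("<IB", header)
        samples.append((stream.read(length), label))
    return samples

buf = io.BytesIO()
write_records([(b"\x89PNG...", 0), (b"\xff\xd8...", 6)], buf)
buf.seek(0)
restored = read_records(buf)
```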
Step S402, inputting each training sample data into a deep learning network, and training by adopting a preset mode until the deep learning network converges.
The training sample data generated by the conversion are input into the selected deep learning network, and the preset training mode is then invoked to train the network until its training converges.
Further, in step S402, training in a preset mode until the deep learning network converges includes:
step S4021, adjusting the application scene category of the deep learning network classification layer to the category of each preset cleaning scene.
Step S4022, performing cyclic training on the weight of the classification layer after the application scene category is adjusted according to preset times until the training of the classification layer is converged.
Specifically, for example, since pictures of the household cleaning scenes can be similar across different scene categories yet differ within the same category (among the kitchen, bedroom, bathroom, living room, dining room, corridor and balcony scenes, picture data of the living room and dining room may be similar, while pictures of corridors may differ from one another), MobileNetV1 is trained by fine-tuning the official model: the feature extraction layers of MobileNetV1 are not trained, the official 1000-class recognition setup is re-applied to the recognition and classification of the 8 cleaning scenes of this scheme, and only the weights of the classification layers MobilenetV1/Logits and MobilenetV1/Predictions are trained. Training runs for 30,000 epochs (1 epoch means training once with all samples in the training sample data) until the loss value drops to about 0.2, at which point training is judged to have converged.
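The fine-tuning idea of steps S4021 and S4022 (freeze the feature extractor, cyclically train only the classification-layer weights until the loss converges) can be shown framework-free. In this sketch the "frozen features" are synthetic vectors, the head is a plain softmax layer trained by gradient descent, and all sizes, data and hyperparameters are illustrative, not the patent's actual settings.

```python
import math
import random

# Framework-free illustration of training only a softmax classification
# head on frozen features, by cyclic gradient descent over epochs.

random.seed(0)
n_classes = n_features = 8                      # 7 cleaning scenes + background
X = [[random.gauss(0, 1) for _ in range(n_features)] for _ in range(40)]
y = [row.index(max(row)) for row in X]          # synthetic, learnable labels

W = [[0.0] * n_classes for _ in range(n_features)]  # head weights only

def softmax(v):
    m = max(v)
    exps = [math.exp(x - m) for x in v]
    total = sum(exps)
    return [e / total for e in exps]

def predict(row):
    return softmax([sum(row[f] * W[f][c] for f in range(n_features))
                    for c in range(n_classes)])

def mean_loss():
    """Average cross-entropy; training is 'converged' when this is low."""
    return -sum(math.log(predict(row)[label])
                for row, label in zip(X, y)) / len(X)

for epoch in range(200):                        # "cyclic training" epochs
    for row, label in zip(X, y):
        p = predict(row)
        for f in range(n_features):
            for c in range(n_classes):
                # Gradient step on the classification layer only; the
                # (frozen) feature extractor contributes no updates.
                W[f][c] -= 0.1 * (p[c] - (1.0 if c == label else 0.0)) * row[f]
```

After the loop, `mean_loss()` has dropped well below its initial value of ln(8), mirroring the "train until the loss falls and converges" criterion.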
And step S403, converting the model type of the deep learning network with the converged training into the preset scene classification model.
Specifically, for example, by calling any existing model conversion algorithm, the converged deep learning network model is converted into a TFLite-type scene classification model (i.e., a model file with the suffix ".tflite") that can be applied on a sweeping-robot chip for scene recognition.
In this embodiment, scene definitions are preset by algorithm researchers; photos collected in different cleaning scene categories contain different characteristic objects; the photos of each scene are collected and classified to generate picture data sets for the preset number of preset cleaning scenes; and a deep learning network is then trained in a preset way on the picture data set of each preset cleaning scene, yielding the preset scene classification model used by the sweeping robot for scene recognition.
By training and generating, in combination with a deep learning network, the scene classification model used for scene recognition while the sweeping robot runs, the sweeping robot can perform scene recognition on its current environment in real time and clean intelligently according to the different scene categories, which improves the cleaning efficiency of the sweeping robot.
Further, a third embodiment of the cleaning method based on scene recognition according to the present invention is proposed based on the first and second embodiments of the cleaning method based on scene recognition.
In a third embodiment of the cleaning method based on scene recognition of the present invention, the step S200 of inputting the real-time image into a preset scene classification model for scene recognition includes:
step S201, inputting the real-time image into a preset scene classification model for calculation of the preset scene classification model and outputting a label result;
step S202, identifying and determining a target scene according to the corresponding relation between the label result and each preset cleaning scene.
Specifically, for example, while the sweeping robot runs and walks in real time, it photographs its surrounding environment with a camera. After a real-time image of the current surrounding environment is acquired, the image is input into the TFLite-format scene classification model for scene recognition, and the model computes and outputs one of ten digital labels, 0 to 9 (7 home-scene labels plus 3 background labels). When the output label is in the range 0 to 6, the sweeping robot maps it to one of 7 different cleaning scenes: kitchen, bedroom, bathroom, living room, dining room, corridor, or balcony, and determines that the target scene of the current environment is the corresponding room. When the output label is in the range 7 to 9, the robot maps it to one of 3 different background-class scenes: floor, wall corner, or sundries, and determines that the target background of the current environment is the floor, a wall corner, or sundries.
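The label-to-scene correspondence described above can be sketched as a small lookup, where the exact ordering of labels within each group is an assumption for illustration (the patent only fixes that labels 0-6 are the 7 home scenes and 7-9 the 3 background classes):

```python
# Hypothetical label table for the 10 digital labels (0-9).
HOME_SCENES = ["kitchen", "bedroom", "bathroom", "living room",
               "dining room", "corridor", "balcony"]   # labels 0-6
BACKGROUND_SCENES = ["floor", "wall corner", "sundries"]  # labels 7-9

def resolve_label(label: int) -> tuple:
    """Map a digital label output by the classifier to (kind, scene)."""
    if 0 <= label <= 6:
        return ("home", HOME_SCENES[label])
    if 7 <= label <= 9:
        return ("background", BACKGROUND_SCENES[label - 7])
    raise ValueError(f"unexpected label: {label}")
```

For example, `resolve_label(0)` returns the home scene at label 0 and `resolve_label(7)` the first background class.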
Further, in step S300, the adjusting the cleaning strategy in real time for cleaning based on the recognition result obtained by performing the scene recognition includes:
step S301, detecting and identifying a target strategy mapped by the target scene according to the mapping relation between each preset cleaning scene and the cleaning strategy;
step S302, reading each cleaning parameter of the target strategy, correspondingly adjusting each current cleaning parameter to each cleaning parameter of the target strategy, and cleaning.
After the sweeping robot identifies the scene category of the current surrounding environment from the label result computed and output by the preset scene classification model, it determines, from the cleaning strategy mapped to each scene category, the target cleaning strategy to use for the current environment. It then reads each cleaning parameter of the determined target strategy (such as the suction force of the sweeping robot), adjusts the currently set cleaning parameters to the read target parameters accordingly, and cleans the current environment with the adjusted cleaning parameters.
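Steps S301-S302 can be sketched as follows. The parameter names and values below are assumptions for illustration; the patent only names suction force as an example cleaning parameter:

```python
# Illustrative mapping from recognized scene to a cleaning strategy.
STRATEGY_MAP = {
    "living room": {"suction": "high", "passes": 2},
    "kitchen":     {"suction": "high", "passes": 2},
    "bedroom":     {"suction": "low",  "passes": 1},
    "corridor":    {"suction": "low",  "passes": 1},
}
DEFAULT_STRATEGY = {"suction": "medium", "passes": 1}

class Sweeper:
    def __init__(self):
        # Currently set cleaning parameters.
        self.params = dict(DEFAULT_STRATEGY)

    def apply_scene(self, scene: str) -> dict:
        """Detect the target strategy mapped to the scene, then adjust
        each current cleaning parameter to the target value."""
        target = STRATEGY_MAP.get(scene, DEFAULT_STRATEGY)
        self.params.update(target)
        return self.params
```

A call such as `Sweeper().apply_scene("living room")` yields the high-suction parameter set before cleaning begins.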
Specifically, for example, in the operation flow of the sweeping robot shown in fig. 3, the robot captures images in real time through its camera while running and cleaning. When it determines (on its own or according to user settings) that scene recognition is needed, the captured image is input into the TFLite-format scene classification model based on the MobileNetV1 deep network. After the model computes and outputs a digital label result that determines the current scene (i.e., the returned scene category), the upper-layer processing logic of the sweeping robot receives the recognition result and adapts the corresponding cleaning strategy to the home scene (for example, living rooms and bedrooms tend to accumulate sundries such as highly adhesive particles, so a larger suction force is needed for cleaning, whereas bedrooms, corridors, and balconies mainly require dust removal, so relatively less suction may be used), thereby achieving intelligent cleaning.
In this embodiment, after the sweeping robot identifies, from the label result computed and output by the preset scene classification model, that the scene category of the current surrounding environment is a kitchen, bedroom, bathroom, living room, dining room, corridor, or balcony, it adjusts its current cleaning strategy in real time according to that scene information and cleans the current surrounding environment according to the adjusted cleaning strategy.
In this way, the sweeping robot adjusts its cleaning strategy in real time based on the scene recognition result, enabling intelligent cleaning according to different scene categories and improving the robot's cleaning efficiency.
In addition, referring to fig. 4, an embodiment of the present invention further provides a cleaning system based on scene recognition, where the cleaning system based on scene recognition includes:
the acquisition module is used for acquiring a real-time image of the surrounding environment;
the recognition module is used for inputting the real-time image to a preset scene classification model for scene recognition;
and the cleaning module is used for adjusting a cleaning strategy in real time for cleaning based on the identification result obtained by scene identification.
Preferably, the cleaning system based on scene recognition of the present invention further comprises:
and the training module is used for generating picture data sets of all preset cleaning scenes based on scene definition and training according to all the picture data sets to obtain the preset scene classification model.
Preferably, the training module comprises:
the first conversion unit is used for performing data format conversion on each picture data set to obtain each training sample data;
the training unit is used for inputting the training sample data into a deep learning network and training the deep learning network in a preset mode until the deep learning network converges;
and the second conversion unit is used for converting the model type of the deep learning network subjected to training convergence into the preset scene classification model.
Preferably, the training unit further comprises:
the adjusting unit is used for adjusting the application scene category of the deep learning network classification layer into the category of each preset cleaning scene;
and the training subunit is used for carrying out cyclic training on the weight of the classification layer after the application scene category is adjusted according to preset times until the training of the classification layer is converged.
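The adjusting unit and training subunit amount to replacing a backbone network's classification layer with one sized to the preset cleaning scenes and training only that layer's weights. A hedged sketch follows, using a MobileNetV1 backbone as mentioned elsewhere in the description; the input size, optimizer, and frozen-backbone choice are assumptions, not fixed by the patent:

```python
def build_finetune_model(num_scenes: int = 10):
    """Sketch: adjust the classification layer's categories to the
    preset cleaning scenes, keeping the MobileNetV1 backbone fixed so
    that cyclic training updates only the classification-layer weights.
    Hyperparameters here are illustrative assumptions."""
    import tensorflow as tf  # lazy import; TF may not be installed

    base = tf.keras.applications.MobileNet(
        input_shape=(224, 224, 3), include_top=False, pooling="avg",
        weights="imagenet")
    base.trainable = False  # freeze backbone; train only the new layer
    model = tf.keras.Sequential([
        base,
        tf.keras.layers.Dense(num_scenes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

Cyclic training for a preset number of passes would then be a `model.fit(..., epochs=preset_times)` call over the training sample data, stopping once the classification layer converges.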
Preferably, the identification module comprises:
the transmission unit is used for inputting the real-time image into a preset scene classification model so as to calculate the preset scene classification model and output a label result;
and the identification determining unit is used for identifying and determining a target scene according to the corresponding relation between the label result and each preset cleaning scene.
Preferably, the cleaning module further comprises:
the detection unit is used for detecting and identifying a target strategy mapped by the target scene according to the mapping relation between each preset cleaning scene and the cleaning strategy;
and the adjusting and cleaning unit is used for reading each cleaning parameter of the target strategy, correspondingly adjusting each current cleaning parameter to each cleaning parameter of the target strategy and cleaning.
The steps of the cleaning method based on scene recognition described above are implemented when each functional module of the cleaning system based on scene recognition provided in this embodiment runs, and are not described herein again.
In addition, an embodiment of the present invention further provides a computer-readable storage medium, which is applied to a computer, and the computer-readable storage medium may be a non-volatile computer-readable storage medium, on which a cleaning program based on scene recognition is stored, and when being executed by a processor, the cleaning program based on scene recognition implements the steps of the cleaning method based on scene recognition as described above.
The steps implemented when the cleaning program based on scene recognition running on the processor is executed may refer to various embodiments of the cleaning method based on scene recognition of the present invention, and are not described herein again.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention or portions thereof that contribute to the prior art may be embodied in the form of a software product, where the computer software product is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk), and includes several instructions for enabling a sweeping device (e.g., a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. A cleaning method based on scene recognition is characterized in that the cleaning method based on scene recognition comprises the following steps:
acquiring a real-time image of the surrounding environment;
inputting the real-time image into a preset scene classification model for scene recognition;
and adjusting a cleaning strategy in real time for cleaning based on the identification result obtained by scene identification.
2. The cleaning method based on scene recognition according to claim 1, wherein before the step of inputting the real-time image to a preset scene classification model for scene recognition, the method further comprises:
and generating picture data sets of all preset cleaning scenes based on scene definition, and training according to all the picture data sets to obtain the preset scene classification model.
3. The cleaning method based on scene recognition according to claim 2, wherein the step of training the preset scene classification model according to the picture data set includes:
performing data format conversion on each picture data set to obtain each training sample data;
inputting each training sample data into a deep learning network, and training by adopting a preset mode until the deep learning network converges;
and converting the model type of the deep learning network with the converged training into the preset scene classification model.
4. The cleaning method based on scene recognition according to claim 3, wherein the step of training with a preset pattern until the deep learning network converges comprises:
adjusting the application scene category of the deep learning network classification layer to the category of each preset cleaning scene;
and performing cyclic training on the weight of the classification layer after the application scene category is adjusted according to preset times until the training of the classification layer is converged.
5. The cleaning method based on scene recognition according to any one of claims 1 to 4, wherein the step of inputting the real-time image to a preset scene classification model for scene recognition comprises:
inputting the real-time image into a preset scene classification model for calculation of the preset scene classification model and outputting a label result;
and identifying and determining a target scene according to the corresponding relation between the label result and each preset cleaning scene.
6. The cleaning method based on scene recognition according to claim 5, wherein the step of adjusting the cleaning strategy in real time for cleaning based on the recognition result obtained by the scene recognition comprises:
detecting and identifying a target strategy mapped by the target scene according to the mapping relation between each preset cleaning scene and the cleaning strategy;
and reading each cleaning parameter of the target strategy, correspondingly adjusting each current cleaning parameter to each cleaning parameter of the target strategy, and cleaning.
7. A cleaning system based on scene recognition is characterized in that the cleaning system based on scene recognition comprises:
the acquisition module is used for acquiring a real-time image of the surrounding environment;
the recognition module is used for inputting the real-time image to a preset scene classification model for scene recognition;
and the cleaning module is used for adjusting a cleaning strategy in real time for cleaning based on the identification result obtained by scene identification.
8. The cleaning system based on scene recognition according to claim 7, further comprising:
and the training module is used for generating picture data sets of all preset cleaning scenes based on scene definition and training according to all the picture data sets to obtain the preset scene classification model.
9. A floor sweeping apparatus, comprising: a memory, a processor and a scene recognition based cleaning program stored on the memory and executable on the processor, the scene recognition based cleaning program when executed by the processor implementing the steps of the scene recognition based cleaning method according to any one of claims 1 to 6.
10. A storage medium, characterized in that the storage medium has stored thereon a computer program which, when being executed by a processor, implements the steps of the scene recognition based cleaning method according to any one of claims 1 to 6.
CN201911138059.6A 2019-11-18 2019-11-18 Sweeping method and system based on scene recognition, sweeping equipment and storage medium Pending CN111012261A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201911138059.6A CN111012261A (en) 2019-11-18 2019-11-18 Sweeping method and system based on scene recognition, sweeping equipment and storage medium
PCT/CN2020/127232 WO2021098536A1 (en) 2019-11-18 2020-11-06 Cleaning method and system employing scene recognition, sweeping device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911138059.6A CN111012261A (en) 2019-11-18 2019-11-18 Sweeping method and system based on scene recognition, sweeping equipment and storage medium

Publications (1)

Publication Number Publication Date
CN111012261A true CN111012261A (en) 2020-04-17

Family

ID=70200582

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911138059.6A Pending CN111012261A (en) 2019-11-18 2019-11-18 Sweeping method and system based on scene recognition, sweeping equipment and storage medium

Country Status (2)

Country Link
CN (1) CN111012261A (en)
WO (1) WO2021098536A1 (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111539398A (en) * 2020-07-13 2020-08-14 追创科技(苏州)有限公司 Control method and device of self-moving equipment and storage medium
CN111528739A (en) * 2020-05-09 2020-08-14 小狗电器互联网科技(北京)股份有限公司 Sweeping mode switching method and system, electronic equipment, storage medium and sweeper
CN111539400A (en) * 2020-07-13 2020-08-14 追创科技(苏州)有限公司 Control method and device of self-moving equipment, storage medium and self-moving equipment
CN111539399A (en) * 2020-07-13 2020-08-14 追创科技(苏州)有限公司 Control method and device of self-moving equipment, storage medium and self-moving equipment
CN111568314A (en) * 2020-05-26 2020-08-25 深圳市杉川机器人有限公司 Cleaning method and device based on scene recognition, cleaning robot and storage medium
CN111643010A (en) * 2020-05-26 2020-09-11 深圳市杉川机器人有限公司 Cleaning robot control method and device, cleaning robot and storage medium
CN112515536A (en) * 2020-10-20 2021-03-19 深圳市银星智能科技股份有限公司 Control method and device of dust collection robot and dust collection robot
WO2021098536A1 (en) * 2019-11-18 2021-05-27 深圳市杉川机器人有限公司 Cleaning method and system employing scene recognition, sweeping device, and storage medium
CN112926512A (en) * 2021-03-25 2021-06-08 深圳市无限动力发展有限公司 Environment type identification method and device and computer equipment
CN113111735A (en) * 2021-03-25 2021-07-13 西安电子科技大学 Rapid scene recognition method and device under complex environment
WO2021174889A1 (en) * 2020-03-06 2021-09-10 珠海格力电器股份有限公司 Method and apparatus for recognizing target region, terminal, and computer readable medium
WO2022012471A1 (en) * 2020-07-13 2022-01-20 追觅创新科技(苏州)有限公司 Control method for self-moving device, apparatus, storage medium, and self-moving device
CN114343478A (en) * 2020-09-30 2022-04-15 宁波方太厨具有限公司 Scene recognition method of cleaning robot and cleaning robot
CN116098536A (en) * 2021-11-08 2023-05-12 青岛海尔科技有限公司 Robot control method and device

Families Citing this family (4)

Publication number Priority date Publication date Assignee Title
CN113925396B (en) * 2021-10-29 2024-05-31 青岛海尔科技有限公司 Method and device for floor cleaning and storage medium
CN114415657A (en) * 2021-12-09 2022-04-29 安克创新科技股份有限公司 Cleaning robot wall-following method based on deep reinforcement learning and cleaning robot
CN114451841B (en) * 2022-03-11 2023-04-18 深圳市无限动力发展有限公司 Sweeping method and device of sweeping robot, storage medium and sweeping robot
CN114747980B (en) * 2022-03-31 2024-06-18 苏州三六零机器人科技有限公司 Method, device, equipment and storage medium for determining water yield of sweeping robot

Citations (4)

Publication number Priority date Publication date Assignee Title
CN106845549A (en) * 2017-01-22 2017-06-13 珠海习悦信息技术有限公司 A kind of method and device of the scene based on multi-task learning and target identification
CN108742360A (en) * 2018-09-03 2018-11-06 信利光电股份有限公司 A kind of cleaning method of sweeping robot, device, equipment and storage medium
CN109452914A (en) * 2018-11-01 2019-03-12 北京石头世纪科技有限公司 Intelligent cleaning equipment, cleaning mode selection method, computer storage medium
CN110174888A (en) * 2018-08-09 2019-08-27 深圳瑞科时尚电子有限公司 Self-movement robot control method, device, equipment and storage medium

Family Cites Families (12)

Publication number Priority date Publication date Assignee Title
KR102015325B1 (en) * 2013-01-29 2019-08-28 삼성전자주식회사 Robot cleaner and method for controlling the same
CN107684401A (en) * 2017-09-25 2018-02-13 北京石头世纪科技有限公司 The control method and control device of intelligent cleaning equipment
CN108125622A (en) * 2017-12-15 2018-06-08 珊口(上海)智能科技有限公司 Control method, system and the clean robot being applicable in
CN108594692A (en) * 2017-12-18 2018-09-28 深圳市奇虎智能科技有限公司 A kind of cleaning equipment control method, device, computer equipment and storage medium
CN108416271A (en) * 2018-02-06 2018-08-17 宁夏宁信信息科技有限公司 Cleaning method and purging system
CN108968811A (en) * 2018-06-20 2018-12-11 四川斐讯信息技术有限公司 A kind of object identification method and system of sweeping robot
CN111035328B (en) * 2018-10-12 2022-12-16 科沃斯机器人股份有限公司 Robot cleaning method and robot
CN109753890B (en) * 2018-12-18 2022-09-23 吉林大学 Intelligent recognition and sensing method for road surface garbage and implementation device thereof
CN110200549A (en) * 2019-04-22 2019-09-06 深圳飞科机器人有限公司 Clean robot control method and Related product
KR20190106891A (en) * 2019-08-28 2019-09-18 엘지전자 주식회사 Artificial intelligence monitoring device and operating method thereof
CN111012261A (en) * 2019-11-18 2020-04-17 深圳市杉川机器人有限公司 Sweeping method and system based on scene recognition, sweeping equipment and storage medium
CN111568314B (en) * 2020-05-26 2022-04-26 深圳市杉川机器人有限公司 Cleaning method and device based on scene recognition, cleaning robot and storage medium

Patent Citations (4)

Publication number Priority date Publication date Assignee Title
CN106845549A (en) * 2017-01-22 2017-06-13 珠海习悦信息技术有限公司 A kind of method and device of the scene based on multi-task learning and target identification
CN110174888A (en) * 2018-08-09 2019-08-27 深圳瑞科时尚电子有限公司 Self-movement robot control method, device, equipment and storage medium
CN108742360A (en) * 2018-09-03 2018-11-06 信利光电股份有限公司 A kind of cleaning method of sweeping robot, device, equipment and storage medium
CN109452914A (en) * 2018-11-01 2019-03-12 北京石头世纪科技有限公司 Intelligent cleaning equipment, cleaning mode selection method, computer storage medium

Non-Patent Citations (1)

Title
Cheng Guang et al.: "Encrypted Traffic Measurement and Analysis" (《加密流量测量和分析》), 31 December 2018 *

Cited By (18)

Publication number Priority date Publication date Assignee Title
WO2021098536A1 (en) * 2019-11-18 2021-05-27 深圳市杉川机器人有限公司 Cleaning method and system employing scene recognition, sweeping device, and storage medium
WO2021174889A1 (en) * 2020-03-06 2021-09-10 珠海格力电器股份有限公司 Method and apparatus for recognizing target region, terminal, and computer readable medium
CN111528739A (en) * 2020-05-09 2020-08-14 小狗电器互联网科技(北京)股份有限公司 Sweeping mode switching method and system, electronic equipment, storage medium and sweeper
CN111643010B (en) * 2020-05-26 2022-03-11 深圳市杉川机器人有限公司 Cleaning robot control method and device, cleaning robot and storage medium
CN111568314A (en) * 2020-05-26 2020-08-25 深圳市杉川机器人有限公司 Cleaning method and device based on scene recognition, cleaning robot and storage medium
CN111643010A (en) * 2020-05-26 2020-09-11 深圳市杉川机器人有限公司 Cleaning robot control method and device, cleaning robot and storage medium
WO2022012471A1 (en) * 2020-07-13 2022-01-20 追觅创新科技(苏州)有限公司 Control method for self-moving device, apparatus, storage medium, and self-moving device
CN111539399A (en) * 2020-07-13 2020-08-14 追创科技(苏州)有限公司 Control method and device of self-moving equipment, storage medium and self-moving equipment
CN111539398B (en) * 2020-07-13 2021-10-01 追觅创新科技(苏州)有限公司 Control method and device of self-moving equipment and storage medium
CN111539398A (en) * 2020-07-13 2020-08-14 追创科技(苏州)有限公司 Control method and device of self-moving equipment and storage medium
CN111539400A (en) * 2020-07-13 2020-08-14 追创科技(苏州)有限公司 Control method and device of self-moving equipment, storage medium and self-moving equipment
CN114343478A (en) * 2020-09-30 2022-04-15 宁波方太厨具有限公司 Scene recognition method of cleaning robot and cleaning robot
CN112515536A (en) * 2020-10-20 2021-03-19 深圳市银星智能科技股份有限公司 Control method and device of dust collection robot and dust collection robot
CN112515536B (en) * 2020-10-20 2022-05-03 深圳市银星智能科技股份有限公司 Control method and device of dust collection robot and dust collection robot
CN112926512A (en) * 2021-03-25 2021-06-08 深圳市无限动力发展有限公司 Environment type identification method and device and computer equipment
CN113111735A (en) * 2021-03-25 2021-07-13 西安电子科技大学 Rapid scene recognition method and device under complex environment
CN112926512B (en) * 2021-03-25 2024-03-15 深圳市无限动力发展有限公司 Environment type identification method and device and computer equipment
CN116098536A (en) * 2021-11-08 2023-05-12 青岛海尔科技有限公司 Robot control method and device

Also Published As

Publication number Publication date
WO2021098536A1 (en) 2021-05-27

Similar Documents

Publication Publication Date Title
CN111012261A (en) Sweeping method and system based on scene recognition, sweeping equipment and storage medium
CN107481327B (en) About the processing method of augmented reality scene, device, terminal device and system
JP2022504704A (en) Target detection methods, model training methods, equipment, equipment and computer programs
CN104573706A (en) Object identification method and system thereof
JP2022553252A (en) IMAGE PROCESSING METHOD, IMAGE PROCESSING APPARATUS, SERVER, AND COMPUTER PROGRAM
US20200074175A1 (en) Object cognitive identification solution
CN115129848A (en) Method, device, equipment and medium for processing visual question-answering task
CN113312957A (en) off-Shift identification method, device, equipment and storage medium based on video image
CN111124863B (en) Intelligent device performance testing method and device and intelligent device
CN112613548A (en) User customized target detection method, system and storage medium based on weak supervised learning
CN112820071A (en) Behavior identification method and device
CN112529149A (en) Data processing method and related device
CN109741108A (en) Streaming application recommended method, device and electronic equipment based on context aware
CN111813910A (en) Method, system, terminal device and computer storage medium for updating customer service problem
CN106777066B (en) Method and device for image recognition and media file matching
CN113378768A (en) Garbage can state identification method, device, equipment and storage medium
CN112418159A (en) Attention mask based diner monitoring method and device and electronic equipment
CN112183460A (en) Method and device for intelligently identifying environmental sanitation
CN111797874B (en) Behavior prediction method and device, storage medium and electronic equipment
CN116977256A (en) Training method, device, equipment and storage medium for defect detection model
Sun et al. Automatic building age prediction from street view images
CN111798019A (en) Intention prediction method, intention prediction device, storage medium and electronic equipment
CN115937662A (en) Intelligent household system control method and device, server and storage medium
CN111797867A (en) System resource optimization method and device, storage medium and electronic equipment
CN115424346A (en) Human body sitting posture detection method and device, computer equipment and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200417)