CN114343504A - Sweeping strategy generation method, device, equipment and storage medium of sweeping robot - Google Patents


Info

Publication number
CN114343504A
Authority
CN
China
Prior art keywords
ground, cleaned, characteristic information, pixel characteristic, pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210095116.2A
Other languages
Chinese (zh)
Inventor
何围
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ubtech Robotics Corp
Original Assignee
Ubtech Robotics Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ubtech Robotics Corp filed Critical Ubtech Robotics Corp
Priority to CN202210095116.2A priority Critical patent/CN114343504A/en
Publication of CN114343504A publication Critical patent/CN114343504A/en
Pending legal-status Critical Current

Abstract

The application is applicable to the technical field of artificial intelligence and provides a sweeping strategy generation method, device, equipment and storage medium for a sweeping robot, wherein the method comprises the following steps: acquiring an image of the ground to be cleaned, and performing image pixel feature extraction on the image to obtain first pixel feature information of the image; performing ground type analysis and dirt degree analysis on the ground to be cleaned shown in the image according to the first pixel feature information, and determining the ground type and dirt degree of the ground to be cleaned; and inputting the ground type and dirt degree of the ground to be cleaned into a pre-trained cleaning strategy generation model for strategy generation processing, generating the cleaning strategy for the ground to be cleaned. Based on this method, an accurate and effective cleaning strategy can be formulated according to the ground type and dirt degree of the floor, situations of unclean or excessive cleaning are reduced, and cleaning efficiency is improved.

Description

Sweeping strategy generation method, device, equipment and storage medium of sweeping robot
Technical Field
The application belongs to the technical field of artificial intelligence equipment, and particularly relates to a sweeping strategy generation method, a sweeping strategy generation device, sweeping equipment and a storage medium for a sweeping robot.
Background
With the development of artificial intelligence technology, intelligent robots are widely used in daily life and work. The sweeping robot, also called an automatic floor sweeper, intelligent vacuum robot, or robot vacuum cleaner, automatically completes cleaning tasks such as sweeping, vacuuming, and mopping under intelligent control. Sweeping robots currently on the market generally fall into two types: those whose cleaning mode is set manually, and those with a fixed cleaning mode built into the robot. However, manually setting the cleaning mode involves a cumbersome operation flow, and it is difficult for the user to formulate an accurate and effective cleaning strategy for the floor's type and degree of soiling; a sweeping robot with a fixed cleaning mode is single-mode and inflexible, likewise cannot adapt its strategy to the ground type and dirt degree, and easily leaves the floor insufficiently cleaned or cleans it excessively.
Disclosure of Invention
In view of this, embodiments of the present application provide a cleaning strategy generation method, apparatus, device and storage medium for a sweeping robot that can formulate an accurate and effective cleaning strategy according to the ground type and dirt degree of the floor. The cleaning mode is flexible, occurrences of unclean or excessive cleaning are reduced, and cleaning efficiency is improved.
A first aspect of an embodiment of the present application provides a cleaning strategy generation method for a cleaning robot, where the cleaning strategy generation method for the cleaning robot includes:
acquiring a ground image to be cleaned, and performing image pixel characteristic extraction processing on the ground image to be cleaned to obtain first pixel characteristic information of the ground image to be cleaned;
according to the first pixel characteristic information, performing ground type analysis and dirt degree analysis on the ground to be cleaned displayed in the ground image to be cleaned, and determining the ground type and the dirt degree of the ground to be cleaned;
and inputting the ground type and the dirt degree of the ground to be cleaned into a pre-trained cleaning strategy generation model for strategy generation processing, and generating the cleaning strategy of the ground to be cleaned.
With reference to the first aspect, in a first possible implementation manner of the first aspect, the sweeping strategy includes at least one of cleaning-frequency information, detergent-addition information, and cleaning-mode information, where the detergent-addition information indicates the amount of detergent to add, and the cleaning-mode information indicates at least one of a pure-sweep mode, a pure-suction mode, a suction-sweep mode, a linear scrape-sweep mode, and a rotary-sweep mode.
With reference to the first aspect, in a second possible implementation manner of the first aspect, the step of performing ground type analysis and dirt degree analysis on the ground to be cleaned displayed in the ground image to be cleaned according to the first pixel feature information to obtain the ground type and the dirt degree of the ground to be cleaned includes:
acquiring second pixel characteristic information from a preset ground type-pixel characteristic information corresponding relation table, performing ground type characteristic comparison analysis on the first pixel characteristic information and the second pixel characteristic information, and calculating ground type characteristic similarity between the first pixel characteristic information and the second pixel characteristic information, wherein the second pixel characteristic information is pixel characteristic information corresponding to any one ground type in the ground type-pixel characteristic information corresponding relation table;
and comparing the ground type feature similarity with a preset first threshold, and if the ground type feature similarity is greater than the preset first threshold, determining the ground type corresponding to the second pixel feature information as the ground type of the ground to be cleaned.
With reference to the first aspect, in a third possible implementation manner of the first aspect, the step of performing ground type analysis and dirt degree analysis on the ground to be cleaned displayed in the ground image to be cleaned according to the first pixel feature information to obtain the ground type and the dirt degree of the ground to be cleaned includes:
acquiring third pixel characteristic information from a preset dirty degree-pixel characteristic information corresponding relation table, performing dirty characteristic comparison analysis on the first pixel characteristic information and the third pixel characteristic information, and calculating dirty characteristic similarity between the first pixel characteristic information and the third pixel characteristic information, wherein the third pixel characteristic information is pixel characteristic information corresponding to any dirty degree in the dirty degree-pixel characteristic information corresponding relation table;
and comparing the dirty characteristic similarity with a preset second threshold, and if the dirty characteristic similarity is greater than the preset second threshold, determining the dirty degree corresponding to the third pixel characteristic information as the dirty degree of the ground to be cleaned.
With reference to the first aspect or the first, second, or third possible implementation manner of the first aspect, in a fourth possible implementation manner of the first aspect, after the step of performing, according to the first pixel feature information, ground type analysis and dirt degree analysis on the ground to be cleaned displayed in the ground image to be cleaned, the method further includes:
if the ground type and the dirt degree of the ground to be cleaned cannot be determined, the ground image to be cleaned is fed back to a user operation end, so that a user is instructed to manually set a cleaning strategy of the ground to be cleaned at the user operation end according to the ground image to be cleaned.
With reference to the fourth possible implementation manner of the first aspect, in a fifth possible implementation manner of the first aspect, if the ground type and the degree of soiling of the ground to be cleaned are not determined, the feeding back the image of the ground to be cleaned to a user operation end to instruct a user to manually set a cleaning strategy for the ground to be cleaned according to the image of the ground to be cleaned at the user operation end further includes:
constructing a training sample for optimizing the cleaning strategy generation model according to the first pixel characteristic information of the ground image to be cleaned and the cleaning strategy of the ground to be cleaned, wherein the training sample comprises the pixel characteristic information and the cleaning strategy;
and taking the pixel characteristic information in the training sample as the input of the cleaning strategy generation model and the cleaning strategy in the training sample as the output of the cleaning strategy generation model, and carrying out neural network training on the cleaning strategy generation model so as to optimize the cleaning strategy generation model.
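The optimization loop above (an unclassified image plus the user's manually set strategy becomes a new training sample) can be sketched as follows. This is a hedged, dependency-free stand-in: the patent trains a neural network, whereas this illustration uses a nearest-neighbour sample store; the feature triples and strategy strings are invented for the example.

```python
def euclid(a, b):
    """Euclidean distance between two feature vectors of equal length."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

class FeedbackModel:
    """Nearest-neighbour stand-in for the patent's cleaning strategy
    generation model: each (pixel features, user strategy) feedback pair
    extends the model, mirroring the optimization step above."""

    def __init__(self):
        self.samples = []

    def add_feedback(self, features, strategy):
        # Pair the unclassified image's features with the user-set strategy.
        self.samples.append((features, strategy))

    def predict(self, features):
        if not self.samples:
            return None
        # Return the strategy of the closest stored sample.
        return min(self.samples, key=lambda s: euclid(s[0], features))[1]

m = FeedbackModel()
m.add_feedback((200, 200, 200), "light sweep, no detergent")
m.add_feedback((40, 30, 30), "suction sweeping x2, 30 ml detergent")
print(m.predict((190, 205, 210)))  # closest to the light-floor sample
```

A real implementation would periodically retrain the neural network on the accumulated samples rather than querying them directly.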
A second aspect of the embodiments of the present application provides a cleaning strategy generating device for a cleaning robot, where the cleaning strategy generating device for the cleaning robot includes:
the characteristic acquisition module is used for acquiring a ground image to be cleaned and carrying out image pixel characteristic extraction processing on the ground image to be cleaned to acquire first pixel characteristic information of the ground image to be cleaned;
the characteristic analysis module is used for carrying out ground type analysis and dirt degree analysis on the ground to be cleaned displayed in the ground image to be cleaned according to the first pixel characteristic information, and determining the ground type and the dirt degree of the ground to be cleaned;
and the strategy generation module is used for inputting the ground type and the dirt degree of the ground to be cleaned into a pre-trained cleaning strategy generation model to perform strategy generation processing so as to generate the cleaning strategy of the ground to be cleaned.
With reference to the second aspect, in a first possible implementation manner of the second aspect, the cleaning strategy generating device of the cleaning robot further includes:
the first analysis submodule is used for acquiring second pixel characteristic information from a preset ground type-pixel characteristic information corresponding relation table, performing ground type characteristic comparison analysis on the first pixel characteristic information and the second pixel characteristic information, and calculating the ground type characteristic similarity between the first pixel characteristic information and the second pixel characteristic information, wherein the second pixel characteristic information is pixel characteristic information corresponding to any ground type in the ground type-pixel characteristic information corresponding relation table;
and the first determining submodule is used for comparing the ground type feature similarity with a preset first threshold value, and if the ground type feature similarity is greater than the preset first threshold value, determining the ground type corresponding to the second pixel feature information as the ground type of the ground to be cleaned.
A third aspect of the embodiments of the present application provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the cleaning strategy generation method for the cleaning robot according to any one of the first aspect when executing the computer program.
A fourth aspect of the embodiments of the present application provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the computer program implements the steps of the cleaning strategy generating method of the cleaning robot according to any one of the first aspects.
Compared with the prior art, the embodiment of the application has the advantages that:
the application acquires the pixel characteristic information of the ground image to be cleaned through machine vision identification, then based on the pixel characteristic information, the ground to be cleaned displayed in the ground image to be cleaned is subjected to ground type analysis and dirt degree analysis, the ground type and the dirt degree reflecting the current situation of the ground to be cleaned can be acquired, and then the ground type and the dirt degree reflecting the current situation of the ground to be cleaned are input into a pre-trained cleaning strategy generation model to be subjected to strategy generation processing.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings can be obtained from them by those skilled in the art without inventive effort.
Fig. 1 is a basic method flow diagram of a cleaning strategy generation method of a cleaning robot provided in an embodiment of the present application;
fig. 2 is a schematic flow chart of a method for analyzing a ground type in the sweeping strategy generating method of the sweeping robot according to the embodiment of the present application;
fig. 3 is a schematic flow chart of a method for analyzing a degree of soiling in the sweeping strategy generation method of the sweeping robot according to the embodiment of the present application;
fig. 4 is a schematic flow chart of a method for optimizing a cleaning strategy generation model in the cleaning strategy generation method of the cleaning robot provided in the embodiment of the present application;
fig. 5 is a schematic structural diagram of a cleaning strategy generating device of a cleaning robot according to an embodiment of the present application;
fig. 6 is a schematic diagram of a first detailed structure of a cleaning strategy generation device of a cleaning robot provided in the embodiment of the present application;
fig. 7 is a schematic view of an electronic device for implementing a cleaning strategy generation method of a cleaning robot according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
In some embodiments of the present application, please refer to fig. 1, and fig. 1 is a basic method flowchart of a cleaning strategy generation method of a cleaning robot provided in the embodiments of the present application. The details are as follows:
step S11: the method comprises the steps of collecting a ground image to be cleaned, carrying out image pixel feature extraction processing on the ground image to be cleaned, and obtaining first pixel feature information corresponding to the ground image to be cleaned.
In this embodiment, a camera is installed in the sweeping robot. During operation, the robot photographs the ground to be cleaned along its motion trajectory through the camera, thereby capturing the corresponding image of the ground to be cleaned. After the image is obtained, image pixel feature extraction is performed on it to obtain the first pixel feature information corresponding to the image. The first pixel feature information represents the color values of the B, G and R channels of the pixels in the image; these three channel values can be identified and read with OpenCV (Open Source Computer Vision Library). Exemplarily, in this embodiment, the image of the ground to be cleaned can be divided into a number of small squares by pixel segmentation, each square being treated as one pixel of the image, and the color data of each pixel, i.e. its B, G and R channel values, is read. The color data of all pixels are then collected into a color data set, which is taken as the first pixel feature information corresponding to the image of the ground to be cleaned.
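The segmentation-and-read step above can be sketched as follows. In practice the B, G, R values would come from OpenCV (`cv2.imread` returns BGR arrays); to keep the sketch dependency-free, the image here is a plain nested list of (B, G, R) tuples, and the grid size is illustrative rather than specified by the patent.

```python
def extract_pixel_features(image, grid=(2, 2)):
    """Split the floor image into grid cells ("small squares") and collect
    each cell's mean (B, G, R) colour as the pixel feature information."""
    h, w = len(image), len(image[0])
    gh, gw = grid
    features = []
    for i in range(gh):
        for j in range(gw):
            # Gather all pixels belonging to cell (i, j).
            rows = image[i * h // gh:(i + 1) * h // gh]
            cells = [px for row in rows
                     for px in row[j * w // gw:(j + 1) * w // gw]]
            n = len(cells)
            # Mean value per colour channel (B, G, R).
            features.append(tuple(sum(c[k] for c in cells) / n
                                  for k in range(3)))
    return features  # one (B, G, R) triple per grid cell

# Example: a 4x4 "floor image" that is uniformly mid-grey.
img = [[(128, 128, 128)] * 4 for _ in range(4)]
print(extract_pixel_features(img))  # four identical (128.0, 128.0, 128.0) cells
```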
Step S12: and according to the first pixel characteristic information, performing ground type analysis and dirt degree analysis on the ground to be cleaned displayed in the ground image to be cleaned, and determining the ground type and the dirt degree of the ground to be cleaned.
In this embodiment, the ground type analysis specifically compares the first pixel feature information with each piece of pixel feature information representing a different ground type, calculates the similarity between them, and determines from these similarities which ground type the ground to be cleaned belongs to. The dirt degree analysis likewise compares the first pixel feature information with each piece of pixel feature information representing a different dirt degree, calculates the corresponding similarities, and judges from them how dirty the ground to be cleaned is. The pixel feature information representing the different ground types and the different dirt degrees may be stored in the sweeping robot in advance, so that when the robot performs ground type analysis and dirt degree analysis on the ground shown in the image according to the first pixel feature information, it can obtain this reference pixel feature information directly from its own storage.
Step S13: and inputting the ground type and the dirt degree of the ground to be cleaned into a pre-trained cleaning strategy generation model for strategy generation processing, and generating the cleaning strategy of the ground to be cleaned.
In this embodiment, the cleaning strategy generation model is a neural network obtained by pre-training on a large amount of sample data (cleaning-mode data used by users for different ground types and different dirt degrees). The model formulates a personalized cleaning strategy for a floor according to its ground type and dirt degree. After the ground type and dirt degree of the ground to be cleaned have been determined by the analyses above, they are input into the pre-trained cleaning strategy generation model for strategy generation processing, and the model formulates an accurate and effective cleaning strategy for the current condition of the ground to be cleaned. The generated cleaning strategy includes, but is not limited to, cleaning-frequency information, detergent-addition information and cleaning-mode information; the robot's cleaning modes include, but are not limited to, pure sweeping, pure suction, suction sweeping, linear scrape-sweeping and rotary sweeping.
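The patent's strategy generation model is a pre-trained neural network; as a hedged, dependency-free stand-in, the sketch below "trains" a majority-vote table from hypothetical user samples of the kind the passage describes. The ground types, dirt degrees and strategy strings are invented for illustration, not taken from the patent.

```python
from collections import Counter

def train_strategy_model(samples):
    """samples: iterable of ((ground_type, dirt_degree), strategy) pairs
    recording which cleaning strategy users chose under which conditions."""
    votes = {}
    for key, strategy in samples:
        votes.setdefault(key, Counter())[strategy] += 1
    # Keep the most frequently chosen strategy for each condition.
    return {key: c.most_common(1)[0][0] for key, c in votes.items()}

samples = [
    (("tile", "heavy"), "suction sweeping x2 + 30 ml detergent"),
    (("tile", "heavy"), "suction sweeping x2 + 30 ml detergent"),
    (("tile", "heavy"), "pure suction x1"),
    (("carpet", "light"), "pure suction x1"),
]
model = train_strategy_model(samples)
print(model[("tile", "heavy")])  # "suction sweeping x2 + 30 ml detergent"
```

A neural network generalizes to unseen (type, degree) combinations, which a plain lookup table cannot; the table only illustrates the input/output contract of the model.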
With the sweeping strategy generation method provided by this embodiment, the pixel feature information of the image of the ground to be cleaned is obtained through machine vision recognition; the ground type and dirt degree of the ground shown in the image are then analyzed on the basis of this pixel feature information, yielding a ground type and dirt degree that reflect the current condition of the ground; and these are input into a pre-trained cleaning strategy generation model, which formulates an accurate and effective cleaning strategy for that condition. The cleaning mode is thus flexible, situations of unclean or excessive cleaning are reduced, and cleaning efficiency is effectively improved. Moreover, this mode of intelligent recognition and decision-making reduces the robot's wasted energy consumption and gives the sweeping robot a longer battery runtime.
In some embodiments of the present application, please refer to fig. 2, and fig. 2 is a schematic flow chart of a method for analyzing a ground type in a cleaning strategy generation method of a cleaning robot provided in the embodiments of the present application. The details are as follows:
s21: acquiring second pixel characteristic information from a preset ground type-pixel characteristic information corresponding relation table, performing ground type characteristic comparison analysis on the first pixel characteristic information and the second pixel characteristic information, and calculating ground type characteristic similarity between the first pixel characteristic information and the second pixel characteristic information, wherein the second pixel characteristic information is pixel characteristic information corresponding to any one ground type in the ground type-pixel characteristic information corresponding relation table;
s22: and comparing the ground type feature similarity with a preset first threshold, and if the ground type feature similarity is greater than the preset first threshold, determining the ground type corresponding to the second pixel feature information as the ground type of the ground to be cleaned.
In this embodiment, a ground type-pixel feature information correspondence table is generated in advance in the sweeping robot. The table records a number of different ground types together with the pixel feature information of each, and establishes a correspondence between each ground type and its pixel feature information. Specifically, the different ground types may be obtained in advance by classifying common floors, for example by material or by color. For each ground type, a ground image of that type is collected and subjected to image pixel feature extraction, yielding the pixel feature information that represents the type. The ground type and its pixel feature information are then mapped to each other and stored as one entry of the correspondence table; once every classified ground type has been associated with its pixel feature information and stored, the ground type-pixel feature information correspondence table is complete.
In this embodiment, after the first pixel feature information is obtained, the ground type-pixel feature information correspondence table may be traversed, taking the pixel feature information of each ground type in turn as the second pixel feature information, comparing it with the first pixel feature information, and calculating the ground type feature similarity between the two. In the second pixel feature information, the color values of the pixels form a fixed feature sequence that represents the ground type. When calculating the similarity, the first pixel feature information may be traversed to search for a complete or partial feature sequence consistent with this fixed feature sequence: if a complete matching sequence is found, the ground type feature similarity is 100%; if only a partial match is found, the percentage of the fixed feature sequence that the partial sequence recovers is taken as the ground type feature similarity.
After the ground type feature similarity is calculated, it is compared with a preset first threshold. If the similarity exceeds the threshold, the ground type corresponding to the second pixel feature information is determined as the ground type of the ground to be cleaned; otherwise, the pixel feature information of another ground type is taken from the correspondence table as the second pixel feature information and the comparison is repeated, until the ground type is determined or the pixel feature information of every ground type in the table has been compared with the first pixel feature information. Based on this embodiment, the sweeping robot can intelligently identify the floor type and adopt different cleaning modes for different floor types.
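The "fraction of the fixed feature sequence recovered" similarity described above can be sketched as a longest-common-subsequence score. This formulation is one reasonable reading of the passage, not the patent's exact algorithm; short strings stand in for the colour-value feature sequences.

```python
def feature_similarity(candidate, reference):
    """Percent of the fixed `reference` sequence recoverable, in order,
    from `candidate`: len(LCS(candidate, reference)) / len(reference)."""
    m, n = len(candidate), len(reference)
    # Standard longest-common-subsequence dynamic programme.
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            dp[i + 1][j + 1] = (dp[i][j] + 1 if candidate[i] == reference[j]
                                else max(dp[i][j + 1], dp[i + 1][j]))
    return 100.0 * dp[m][n] / n

print(feature_similarity("abcde", "abcde"))  # 100.0: complete match
print(feature_similarity("axcye", "abcde"))  # 60.0: partial match (a, c, e)
```

The threshold test of the passage then reduces to `feature_similarity(first, second) > first_threshold`, repeated for each ground type in the table.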
Exemplarily, in this embodiment, the ground type analysis may also be implemented by constructing an image analysis model based on a neural network in advance and training it to convergence with a large number of ground images representing different ground types, so that the model can determine, from the pixel features of a ground image, the ground type of the floor it shows. Specifically, a large number of ground images of different ground types are collected and labeled with their ground types; each labeled image serves as a training sample, with the image as the model input and its ground type label as the model output, so that the neural network layers of the image analysis model learn the pixel features corresponding to each ground type label. Once the model is trained to convergence, its neural network layers effectively encode the ground type-pixel feature information correspondence, and inputting a captured image of the ground to be cleaned into the trained model yields its ground type directly.
In some embodiments of the present application, please refer to fig. 3, and fig. 3 is a schematic flow chart of a method for analyzing a contamination level in a cleaning strategy generation method of a cleaning robot provided in the embodiments of the present application. The details are as follows:
S31: acquiring third pixel characteristic information from a preset dirty degree-pixel characteristic information corresponding relation table, performing dirty characteristic comparison analysis on the first pixel characteristic information and the third pixel characteristic information, and calculating dirty characteristic similarity between the first pixel characteristic information and the third pixel characteristic information, wherein the third pixel characteristic information is pixel characteristic information corresponding to any dirty degree in the dirty degree-pixel characteristic information corresponding relation table;
S32: and comparing the dirty characteristic similarity with a preset second threshold, and if the dirty characteristic similarity is greater than the preset second threshold, determining the dirty degree corresponding to the third pixel characteristic information as the dirty degree of the ground to be cleaned.
In this embodiment, a contamination degree-pixel characteristic information correspondence table is generated in advance in the sweeping robot. The table records a plurality of different contamination degrees and the pixel characteristic information corresponding to each contamination degree, and establishes a correspondence between each contamination degree and its pixel characteristic information. Specifically, a plurality of different contamination degrees may be set in advance, using the number of contamination points, the contaminated area, and the nature or type of the contamination as the basis for dividing them. For each contamination degree, ground images at that contamination degree are collected and subjected to image pixel feature extraction processing, yielding the pixel characteristic information that represents the contamination degree. The contamination degree and its pixel characteristic information are then mapped and associated so that a correspondence is established between them, and each such pair is stored in the table. When all of the preset contamination degrees have been associated with their corresponding pixel characteristic information and stored, the contamination degree-pixel characteristic information correspondence table is complete.
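A minimal Python sketch of building that correspondence table follows. The degree labels ("light", "heavy"), the three feature dimensions (spot count, dirty area, dirt type), and the sample records are assumptions for illustration; the patent does not fix a concrete feature representation.

```python
def extract_dirty_features(ground_images):
    """Aggregate sample images of one contamination degree into one feature record
    (a stand-in for the image pixel feature extraction processing)."""
    spot_counts = [img["spots"] for img in ground_images]
    areas = [img["area"] for img in ground_images]
    return {
        "spot_count": sum(spot_counts) / len(spot_counts),
        "dirty_area": sum(areas) / len(areas),
        "dirt_types": {img["type"] for img in ground_images},
    }

def build_correspondence_table(samples_by_degree):
    """Map each preset contamination degree to its representative pixel features."""
    return {degree: extract_dirty_features(images)
            for degree, images in samples_by_degree.items()}

table = build_correspondence_table({
    "light": [{"spots": 2, "area": 0.01, "type": "dust"},
              {"spots": 4, "area": 0.02, "type": "dust"}],
    "heavy": [{"spots": 30, "area": 0.25, "type": "stain"},
              {"spots": 26, "area": 0.31, "type": "stain"}],
})
print(table["light"]["spot_count"])  # → 3.0
```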
In this embodiment, after the first pixel feature information is obtained, the dirty degree-pixel feature information correspondence table may be traversed, and the pixel feature information corresponding to each dirty degree obtained one by one from the table as the third pixel feature information. A dirty feature comparison analysis is then performed on the first pixel feature information and the third pixel feature information to calculate the dirty feature similarity between them. When this similarity is calculated, the first pixel feature information is traversed based on the dirty point number, dirty area, and dirty nature or type information contained in the third pixel feature information; the corresponding information contained in the first pixel feature information is analyzed, and the dirty feature similarity is calculated by integrating these three dimensions. The similarity is then compared with a preset second threshold. If the dirty feature similarity is greater than the preset second threshold, the dirty degree corresponding to the third pixel feature information is determined as the dirty degree of the ground to be cleaned; otherwise, the pixel feature information corresponding to another dirty degree is obtained from the correspondence table as the third pixel feature information for a further dirty feature comparison analysis, until either the dirty degree of the ground to be cleaned is determined or the pixel feature information corresponding to every dirty degree in the table has been compared with the first pixel feature information. Based on this embodiment, the sweeping robot can intelligently recognize the dirty degree of the ground through the dirty degree analysis, so that different cleaning modes can be adopted for grounds with different dirty degrees.
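Steps S31-S32 can be sketched as the following Python loop. The concrete similarity formula (an average of per-dimension similarities over spot count, dirty area, and dirt type) is an illustrative assumption; the patent only requires that the three dimensions be integrated into one score compared against the second threshold.

```python
def dirty_similarity(first, third):
    """Combine the three dimensions into one similarity score in [0, 1]."""
    count_sim = 1 - abs(first["spot_count"] - third["spot_count"]) / max(
        first["spot_count"], third["spot_count"], 1)
    area_sim = 1 - abs(first["dirty_area"] - third["dirty_area"]) / max(
        first["dirty_area"], third["dirty_area"], 1e-9)
    type_sim = 1.0 if first["dirt_type"] in third["dirt_types"] else 0.0
    return (count_sim + area_sim + type_sim) / 3

def match_dirty_degree(first, table, second_threshold=0.8):
    """Traverse the correspondence table (S31) and accept the first dirty degree
    whose similarity exceeds the second threshold (S32); None if no entry passes."""
    for degree, third in table.items():
        if dirty_similarity(first, third) > second_threshold:
            return degree
    return None  # dirty degree undetermined; fall back to the user operation end

table = {
    "light": {"spot_count": 3, "dirty_area": 0.02, "dirt_types": {"dust"}},
    "heavy": {"spot_count": 28, "dirty_area": 0.28, "dirt_types": {"stain"}},
}
observed = {"spot_count": 27, "dirty_area": 0.30, "dirt_type": "stain"}
print(match_dirty_degree(observed, table))  # → heavy
```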
For example, in this embodiment, the contamination degree analysis may also be implemented by constructing an image analysis model based on a neural network in advance and training it to a convergence state with a large number of ground images representing different contamination degrees, so that the model can determine the contamination degree of the ground displayed in a ground image from the pixel features in that image. The training method for this image analysis model is substantially identical to that of the image analysis model used for the ground type analysis, and the detailed description is omitted here.
In some embodiments of the present application, when the ground type feature similarities calculated by the ground type analysis between the first pixel feature information of the ground image to be cleaned and the pixel feature information of all ground types are relatively low, and/or when the dirty feature similarities calculated by the dirty degree analysis between the first pixel feature information and the pixel feature information of all dirty degrees are relatively low, the ground type and the dirty degree of the ground to be cleaned cannot be determined. In this embodiment, if the ground type and the dirty degree of the ground to be cleaned cannot be determined, the image of the ground to be cleaned may be fed back to the user operation end to instruct the user to manually specify a cleaning strategy. Specifically, the user operation end receives the image of the ground to be cleaned fed back by the sweeping robot, and the user manually sets the cleaning strategy of the ground to be cleaned at the user operation end according to that image, where the cleaning strategy includes the number of times of cleaning, the amount of detergent, the cleaning manner, and the like that the sweeping robot is to execute on the ground to be cleaned. Based on this embodiment, for ground types and dirt degrees that cannot be determined in advance or are abnormal, the sweeping robot can actively push the corresponding information to the user, so that the user can know the ground condition in time and formulate a cleaning strategy for that ground in time.
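The fallback just described can be sketched as a small decision function. The `push_to_user` and `receive_strategy` callables are hypothetical placeholders for the robot-to-app communication channel, which the patent does not specify.

```python
def decide_strategy(ground_type, dirty_degree, image, push_to_user, receive_strategy):
    """If either analysis failed, feed the image back to the user operation end
    and use the manually specified strategy; otherwise use the model's result."""
    if ground_type is None or dirty_degree is None:
        push_to_user(image)          # feed the ground image back to the user end
        return receive_strategy()    # sweep count, detergent amount, manner, ...
    return {"source": "model", "type": ground_type, "degree": dirty_degree}

sent = []
strategy = decide_strategy(
    None, None, b"<jpeg bytes>",
    push_to_user=sent.append,
    receive_strategy=lambda: {"source": "user", "sweeps": 2, "mode": "mop"},
)
print(strategy["source"], len(sent))  # → user 1
```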
In some embodiments of the present application, please refer to fig. 4, and fig. 4 is a schematic flow chart of a method for optimizing a cleaning strategy generation model in a cleaning strategy generation method of a cleaning robot provided in the embodiments of the present application. The details are as follows:
s41: constructing a training sample for optimizing the cleaning strategy generation model according to the first pixel characteristic information of the ground image to be cleaned and the cleaning strategy of the ground to be cleaned, wherein the training sample comprises the pixel characteristic information and the cleaning strategy;
s42: and taking the pixel characteristic information in the training sample as the input of the cleaning strategy generation model and the cleaning strategy in the training sample as the output of the cleaning strategy generation model, and carrying out neural network training on the cleaning strategy generation model so as to optimize the cleaning strategy generation model.
In this embodiment, when the ground type and the dirt degree of the ground to be cleaned cannot be determined, a ground type and dirt degree may be involved that the cleaning strategy generation model has not yet covered. In this case, a new ground type and/or a new dirt degree is added to the cleaning strategy generation model based on the ground to be cleaned. According to the first pixel feature information of the ground image to be cleaned and the cleaning strategy of the ground to be cleaned, the first pixel feature information is taken as the pixel feature information of the training sample, and the cleaning strategy of the ground to be cleaned as the cleaning strategy of the training sample; the pixel feature information and the cleaning strategy are mapped and associated in the training sample so that a correspondence is established between them, thereby constructing a training sample for optimizing the cleaning strategy generation model, that is, a training sample containing pixel feature information and a cleaning strategy. After the training sample is constructed, neural network training is performed on the cleaning strategy generation model, with the pixel feature information in the training sample as the model's input and the cleaning strategy in the training sample as its output. Ground images similar to the ground image to be cleaned may also be collected and used to construct further training samples for this neural network training. In this way, the ground types and dirt degrees covered by the cleaning strategy generation model are enriched, the model is optimized, and the cleaning capability of the sweeping robot is continuously improved.
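Steps S41-S42 can be sketched as follows. The table-backed `StrategyModelStub` is an assumption that replaces the neural network here, so that the sample-construction and training flow is visible without an ML framework; a real implementation would run gradient updates instead of appending to a list.

```python
def build_training_sample(first_pixel_features, cleaning_strategy):
    """S41: map-associate the pixel feature information with the cleaning strategy."""
    return {"features": first_pixel_features, "strategy": cleaning_strategy}

class StrategyModelStub:
    """Hypothetical stand-in for the cleaning strategy generation model."""
    def __init__(self):
        self.samples = []

    def train(self, sample):
        # S42: a real model would use sample["features"] as input and
        # sample["strategy"] as the expected output of a training step.
        self.samples.append(sample)

    def generate(self, features):
        for s in self.samples:   # exact-match lookup, for brevity only
            if s["features"] == features:
                return s["strategy"]
        return None

model = StrategyModelStub()
sample = build_training_sample(("wood", "heavy"),
                               {"sweeps": 3, "detergent_ml": 20, "mode": "mop"})
model.train(sample)
print(model.generate(("wood", "heavy"))["sweeps"])  # → 3
```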
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
In some embodiments of the present application, please refer to fig. 5, and fig. 5 is a schematic structural diagram of a cleaning strategy generating device of a cleaning robot provided in the embodiments of the present application. As shown in fig. 5, the cleaning strategy generating device of the cleaning robot includes: a feature acquisition module 51, a feature analysis module 52 and a policy generation module 53. The feature acquisition module 51 is configured to acquire a ground image to be cleaned and perform image pixel feature extraction processing on the ground image to be cleaned, so as to acquire first pixel feature information of the ground image to be cleaned. The feature analysis module 52 is configured to perform ground type analysis and dirt degree analysis on the ground to be cleaned displayed in the ground image to be cleaned according to the first pixel feature information, and determine a ground type and a dirt degree of the ground to be cleaned. The strategy generating module 53 is configured to input the ground type and the dirt degree of the ground to be cleaned into a pre-trained cleaning strategy generating model for performing strategy generating processing, so as to generate a cleaning strategy for the ground to be cleaned.
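The three-module device of fig. 5 can be sketched as a pipeline. The injected callables are placeholders for the real module implementations (feature acquisition 51, feature analysis 52, strategy generation 53); their signatures are illustrative assumptions.

```python
class CleaningStrategyDevice:
    """Chains the feature acquisition, feature analysis, and strategy
    generation modules into one image-to-strategy pipeline."""
    def __init__(self, extract_features, analyze, generate_strategy):
        self.extract_features = extract_features      # module 51
        self.analyze = analyze                        # module 52
        self.generate_strategy = generate_strategy    # module 53

    def run(self, ground_image):
        features = self.extract_features(ground_image)
        ground_type, dirty_degree = self.analyze(features)
        return self.generate_strategy(ground_type, dirty_degree)

# Toy stand-ins: brightness-only features and a threshold-based analysis.
device = CleaningStrategyDevice(
    extract_features=lambda img: {"mean": sum(img) / len(img)},
    analyze=lambda f: ("tile", "light") if f["mean"] > 128 else ("carpet", "heavy"),
    generate_strategy=lambda t, d: {"type": t, "degree": d,
                                    "sweeps": 1 if d == "light" else 3},
)
print(device.run([200, 210, 190])["sweeps"])  # → 1
```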
In some embodiments of the present application, please refer to fig. 6, and fig. 6 is a schematic diagram of a first detailed structure of a cleaning strategy generating device of a cleaning robot provided in the embodiments of the present application. As shown in fig. 6, the cleaning strategy generating device of the cleaning robot further includes: a first analysis submodule 61 and a first determination submodule 62. The first analysis submodule 61 is configured to obtain second pixel feature information from a preset ground type-pixel feature information correspondence table, perform ground type feature comparison analysis on the first pixel feature information and the second pixel feature information, and calculate a ground type feature similarity between the first pixel feature information and the second pixel feature information, where the second pixel feature information is pixel feature information corresponding to any one ground type in the ground type-pixel feature information correspondence table. The first determining submodule 62 is configured to compare the ground type feature similarity with a preset first threshold, and if the ground type feature similarity is greater than the preset first threshold, determine the ground type corresponding to the second pixel feature information as the ground type of the ground to be cleaned.
The cleaning strategy generation device of the sweeping robot corresponds one-to-one with the cleaning strategy generation method of the sweeping robot described above, and the details are not repeated here.
In some embodiments of the present application, please refer to fig. 7, and fig. 7 is a schematic view of an electronic device for implementing a cleaning strategy generation method of a cleaning robot according to an embodiment of the present application. As shown in fig. 7, the electronic apparatus 7 of this embodiment includes: a processor 71, a memory 72 and a computer program 73 stored in said memory 72 and executable on said processor 71, such as a cleaning strategy generation program of a sweeping robot. The processor 71, when executing the computer program 73, implements the steps in the embodiments of the cleaning strategy generation method of each cleaning robot. Alternatively, the processor 71 implements the functions of the modules/units in the above-described device embodiments when executing the computer program 73.
Illustratively, the computer program 73 may be partitioned into one or more modules/units, which are stored in the memory 72 and executed by the processor 71 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 73 in the electronic device 7. For example, the computer program 73 may be divided into:
the characteristic acquisition module is used for acquiring a ground image to be cleaned and carrying out image pixel characteristic extraction processing on the ground image to be cleaned to acquire first pixel characteristic information of the ground image to be cleaned;
the characteristic analysis module is used for carrying out ground type analysis and dirt degree analysis on the ground to be cleaned displayed in the ground image to be cleaned according to the first pixel characteristic information, and determining the ground type and the dirt degree of the ground to be cleaned;
and the strategy generation module is used for inputting the ground type and the dirt degree of the ground to be cleaned into a pre-trained cleaning strategy generation model to perform strategy generation processing so as to generate the cleaning strategy of the ground to be cleaned.
The electronic device may include, but is not limited to, a processor 71, a memory 72. It will be appreciated by those skilled in the art that fig. 7 is merely an example of the electronic device 7, and does not constitute a limitation of the electronic device 7, and may include more or less components than those shown, or combine certain components, or different components, for example, the electronic device may also include input output devices, network access devices, buses, etc.
The Processor 71 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component, etc. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 72 may be an internal storage unit of the electronic device 7, such as a hard disk or a memory of the electronic device 7. The memory 72 may also be an external storage device of the electronic device 7, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the electronic device 7. Further, the memory 72 may also include both an internal storage unit and an external storage device of the electronic device 7. The memory 72 is used for storing the computer program and other programs and data required by the electronic device. The memory 72 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow in the methods of the embodiments described above can be realized by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, realizes the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB disk, a removable hard disk, a magnetic disk, an optical disk, a computer Memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier wave signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be suitably increased or decreased as required by legislation and patent practice in jurisdictions; for example, in some jurisdictions, computer-readable media may not include electrical carrier signals and telecommunications signals, in accordance with legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A sweeping strategy generation method of a sweeping robot is characterized by comprising the following steps:
acquiring a ground image to be cleaned, and performing image pixel characteristic extraction processing on the ground image to be cleaned to obtain first pixel characteristic information of the ground image to be cleaned;
according to the first pixel characteristic information, performing ground type analysis and dirt degree analysis on the ground to be cleaned displayed in the ground image to be cleaned, and determining the ground type and the dirt degree of the ground to be cleaned;
and inputting the ground type and the dirt degree of the ground to be cleaned into a pre-trained cleaning strategy generation model for strategy generation processing, and generating the cleaning strategy of the ground to be cleaned.
2. The sweeping robot sweeping strategy generating method according to claim 1, wherein the sweeping strategy includes at least one of information on number of times of sweeping, information on detergent addition, and information on sweeping manner, wherein the detergent addition information is indicated as an addition amount of detergent, and the sweeping manner information is indicated as at least one of a pure sweeping type, a pure suction type, a linear sweeping type, and a rotary sweeping type.
3. The method for generating a cleaning strategy of a cleaning robot according to claim 1, wherein the step of performing a ground type analysis and a contamination degree analysis on the ground to be cleaned displayed in the ground image to be cleaned according to the first pixel feature information to obtain the ground type and the contamination degree of the ground to be cleaned comprises:
acquiring second pixel characteristic information from a preset ground type-pixel characteristic information corresponding relation table, performing ground type characteristic comparison analysis on the first pixel characteristic information and the second pixel characteristic information, and calculating ground type characteristic similarity between the first pixel characteristic information and the second pixel characteristic information, wherein the second pixel characteristic information is pixel characteristic information corresponding to any one ground type in the ground type-pixel characteristic information corresponding relation table;
and comparing the ground type feature similarity with a preset first threshold, and if the ground type feature similarity is greater than the preset first threshold, determining the ground type corresponding to the second pixel feature information as the ground type of the ground to be cleaned.
4. The method for generating a cleaning strategy of a cleaning robot according to claim 1, wherein the step of performing a ground type analysis and a contamination degree analysis on the ground to be cleaned displayed in the ground image to be cleaned according to the first pixel feature information to obtain the ground type and the contamination degree of the ground to be cleaned comprises:
acquiring third pixel characteristic information from a preset dirty degree-pixel characteristic information corresponding relation table, performing dirty characteristic comparison analysis on the first pixel characteristic information and the third pixel characteristic information, and calculating dirty characteristic similarity between the first pixel characteristic information and the third pixel characteristic information, wherein the third pixel characteristic information is pixel characteristic information corresponding to any dirty degree in the dirty degree-pixel characteristic information corresponding relation table;
and comparing the dirty characteristic similarity with a preset second threshold, and if the dirty characteristic similarity is greater than the preset second threshold, determining the dirty degree corresponding to the third pixel characteristic information as the dirty degree of the ground to be cleaned.
5. The method for generating a cleaning strategy of a sweeping robot according to any one of claims 1 to 4, wherein after the step of performing a ground type analysis and a dirt level analysis on the ground to be cleaned displayed in the image of the ground to be cleaned according to the pixel feature information, the method further comprises:
if the ground type and the dirt degree of the ground to be cleaned cannot be determined, the ground image to be cleaned is fed back to a user operation end, so that a user is instructed to manually set a cleaning strategy of the ground to be cleaned at the user operation end according to the ground image to be cleaned.
6. The method for generating a cleaning strategy of a sweeping robot according to claim 5, wherein if the ground type and the dirt degree of the ground to be cleaned cannot be determined, the image of the ground to be cleaned is fed back to a user operation end to instruct a user to manually set the cleaning strategy of the ground to be cleaned according to the image of the ground to be cleaned at the user operation end, and the method further comprises:
constructing a training sample for optimizing the cleaning strategy generation model according to the first pixel characteristic information of the ground image to be cleaned and the cleaning strategy of the ground to be cleaned, wherein the training sample comprises the pixel characteristic information and the cleaning strategy;
and taking the pixel characteristic information in the training sample as the input of the cleaning strategy generation model and the cleaning strategy in the training sample as the output of the cleaning strategy generation model, and carrying out neural network training on the cleaning strategy generation model so as to optimize the cleaning strategy generation model.
7. The utility model provides a clean strategy generating device of robot of sweeping floor which characterized in that, clean strategy generating device of robot of sweeping floor includes:
the characteristic acquisition module is used for acquiring a ground image to be cleaned and carrying out image pixel characteristic extraction processing on the ground image to be cleaned to acquire first pixel characteristic information of the ground image to be cleaned;
the characteristic analysis module is used for carrying out ground type analysis and dirt degree analysis on the ground to be cleaned displayed in the ground image to be cleaned according to the first pixel characteristic information, and determining the ground type and the dirt degree of the ground to be cleaned;
and the strategy generation module is used for inputting the ground type and the dirt degree of the ground to be cleaned into a pre-trained cleaning strategy generation model to perform strategy generation processing so as to generate the cleaning strategy of the ground to be cleaned.
8. The cleaning strategy generation device of a cleaning robot according to claim 7, further comprising:
the first analysis submodule is used for acquiring second pixel characteristic information from a preset ground type-pixel characteristic information corresponding relation table, performing ground type characteristic comparison analysis on the first pixel characteristic information and the second pixel characteristic information, and calculating the ground type characteristic similarity between the first pixel characteristic information and the second pixel characteristic information, wherein the second pixel characteristic information is pixel characteristic information corresponding to any ground type in the ground type-pixel characteristic information corresponding relation table;
and the first determining submodule is used for comparing the ground type feature similarity with a preset first threshold value, and if the ground type feature similarity is greater than the preset first threshold value, determining the ground type corresponding to the second pixel feature information as the ground type of the ground to be cleaned.
9. An electronic device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the cleaning strategy generation method of the cleaning robot according to any one of claims 1 to 6 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored, which, when being executed by a processor, implements the steps of the cleaning strategy generation method of the sweeping robot according to any one of claims 1 to 6.
CN202210095116.2A 2022-01-26 2022-01-26 Sweeping strategy generation method, device, equipment and storage medium of sweeping robot Pending CN114343504A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210095116.2A CN114343504A (en) 2022-01-26 2022-01-26 Sweeping strategy generation method, device, equipment and storage medium of sweeping robot

Publications (1)

Publication Number Publication Date
CN114343504A true CN114343504A (en) 2022-04-15

Family

ID=81094061

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210095116.2A Pending CN114343504A (en) 2022-01-26 2022-01-26 Sweeping strategy generation method, device, equipment and storage medium of sweeping robot

Country Status (1)

Country Link
CN (1) CN114343504A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103439973A (en) * 2013-08-12 2013-12-11 桂林电子科技大学 Household cleaning robot capable of establishing map by self and cleaning method
CN108053390A (en) * 2017-10-31 2018-05-18 珠海格力电器股份有限公司 Inner tank of washing machine clears up processing method and processing device
CN109998429A (en) * 2018-01-05 2019-07-12 艾罗伯特公司 Mobile clean robot artificial intelligence for context aware
CN111643014A (en) * 2020-06-08 2020-09-11 深圳市杉川机器人有限公司 Intelligent cleaning method and device, intelligent cleaning equipment and storage medium
CN113156928A (en) * 2020-01-03 2021-07-23 苏州宝时得电动工具有限公司 Method for automatically updating data model from mobile equipment, terminal and server
WO2021174851A1 (en) * 2020-03-05 2021-09-10 美智纵横科技有限责任公司 State control method, robot vacuum cleaner, and computer storage medium
CN113598652A (en) * 2021-06-16 2021-11-05 深圳甲壳虫智能有限公司 Robot control method, robot control device, cleaning robot and storage medium

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114794958A (en) * 2022-06-27 2022-07-29 山西嘉世达机器人技术有限公司 Cleaning device, spraying apparatus, spraying control method, spraying control apparatus, and storage medium
CN115429155A (en) * 2022-07-29 2022-12-06 云鲸智能(深圳)有限公司 Control method, device and system of cleaning robot and storage medium
CN115429155B (en) * 2022-07-29 2023-09-29 云鲸智能(深圳)有限公司 Control method, device and system of cleaning robot and storage medium
CN115486762A (en) * 2022-08-08 2022-12-20 深圳市景创科技电子股份有限公司 Control method of sweeping equipment based on nine-axis sensor, sweeping equipment and medium
CN115444323A (en) * 2022-09-05 2022-12-09 广东浩海环保设备有限公司 Energy-saving control system and method for carpet extractor
CN115373408A (en) * 2022-10-26 2022-11-22 科大讯飞股份有限公司 Cleaning robot, control method, device, equipment and storage medium thereof

Similar Documents

Publication Publication Date Title
CN114343504A (en) Sweeping strategy generation method, device, equipment and storage medium of sweeping robot
US10540531B2 (en) Image identification method, terminal and non-volatile storage medium
CN111178197B (en) Mass R-CNN and Soft-NMS fusion based group-fed adherent pig example segmentation method
CN110321910B (en) Point cloud-oriented feature extraction method, device and equipment
Bello et al. Contour extraction of individual cattle from an image using enhanced Mask R-CNN instance segmentation method
CN105868708B (en) A kind of images steganalysis method and device
CN112842184B (en) Cleaning method and cleaning robot
CN109829467A (en) Image labeling method, electronic device and non-transient computer-readable storage medium
CN110097596B (en) Object detection system based on opencv
CN110533654A (en) The method for detecting abnormality and device of components
CN102346854A (en) Method and device for carrying out detection on foreground objects
CN108932509A (en) A kind of across scene objects search methods and device based on video tracking
CN111643014A (en) Intelligent cleaning method and device, intelligent cleaning equipment and storage medium
CN102915432A (en) Method and device for extracting vehicle-bone microcomputer image video data
CN108959998A (en) Two-dimensional code identification method, apparatus and system
CN106933861A (en) A kind of customized across camera lens target retrieval method of supported feature
CN105373813A (en) Equipment state image monitoring method and device
CN114169381A (en) Image annotation method and device, terminal equipment and storage medium
Rachna et al. Detection of Tuberculosis bacilli using image processing techniques
CN112232246A (en) Garbage detection and classification method and device based on deep learning
CN116977937A (en) Pedestrian re-identification method and system
CN114227717A (en) Intelligent inspection method, device, equipment and storage medium based on inspection robot
Thammasorn et al. Real-time method for counting unseen stacked objects in mobile
Chen et al. Image segmentation based on mathematical morphological operator
CN112541383B (en) Method and device for identifying weed area

Legal Events

Date Code Title Description
PB01 Publication (application publication date: 20220415)
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication