CN114424916A - Cleaning mode selection method, intelligent cleaning device, computer storage medium - Google Patents


Info

Publication number: CN114424916A
Application number: CN202111500223.0A
Authority: CN (China)
Prior art keywords: working scene, current working, cleaning mode, neural network, cleaning
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventor: 谢濠键
Current Assignee: Beijing Stone Innovation Technology Co ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Original Assignee: Beijing Stone Innovation Technology Co ltd
Application filed by Beijing Stone Innovation Technology Co ltd
Priority to: CN202111500223.0A
Publication of: CN114424916A


Classifications

    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L - DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 - Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40 - Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011 - Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 - Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/04 - Architecture, e.g. interconnection topology
    • G06N3/045 - Combinations of networks

Abstract

The invention discloses a cleaning mode selection method for an intelligent cleaning device, comprising the following steps: acquiring a current working scene image of the intelligent cleaning device, the image including a landmark object of the current working scene; inputting the acquired image into a working scene convolutional neural network classification model of the intelligent cleaning device, the model being built from a training sample set in which each pre-collected working scene image is labeled with a corresponding room attribute; determining the room attribute of the current working scene from the landmark object included in the current working scene image and the pre-collected working scene images in the training sample set; and adopting a corresponding cleaning mode according to the room attribute of the current working scene.

Description

Cleaning mode selection method, intelligent cleaning device, computer storage medium
This application is a divisional application of Chinese patent application No. 201811295518.7, filed on November 1, 2018, and entitled "Intelligent cleaning equipment, cleaning mode selection method, computer storage medium".
Technical Field
The invention relates to the technical field of intelligent cleaning, in particular to a cleaning mode selection method of intelligent cleaning equipment, the intelligent cleaning equipment and a computer storage medium.
Background
Existing intelligent cleaning devices generally rely on their sensing devices to provide position and motion-state information to the control system, thereby enabling path planning, obstacle avoidance, and the like. However, an existing intelligent cleaning device cannot directly identify the room it is in; the user must divide the map through a mobile terminal to tell the device where it is and which cleaning mode it should adopt. This approach is not intelligent enough and requires excessive manual intervention, resulting in lower cleaning efficiency.
Disclosure of Invention
In this summary, concepts in a simplified form are introduced that are further described in the detailed description. This summary of the invention is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
In order to at least partially solve the above problem, an embodiment of the present invention provides a cleaning mode selection method for an intelligent cleaning device, including:
acquiring a current working scene image of the intelligent cleaning equipment, wherein the current working scene image comprises a landmark object of a current working scene;
inputting the collected current working scene image into a working scene convolution neural network classification model of the intelligent cleaning equipment, wherein the working scene convolution neural network classification model is established according to a training sample set, and a pre-collected working scene image in the training sample set is marked with a corresponding room attribute label;
determining the room attribute of the current working scene according to the landmark object of the current working scene included in the current working scene image and the pre-collected working scene images in the training sample set;
and adopting a corresponding cleaning mode according to the room attribute of the current working scene.
Optionally, adopting a corresponding cleaning mode according to the room attribute of the current working scene includes: adopting a corresponding first cleaning mode according to the room attribute, wherein the first cleaning mode is configured such that the intelligent cleaning device employs different cleaning forces for different room attributes.
Optionally, the working scene convolutional neural network classification model is an adaptive convolutional neural network classification model that performs incremental learning from new training samples.
Optionally, the pre-collected working scene images are images captured under various illumination, angle, and focal-length conditions.
An embodiment of the present invention provides an intelligent cleaning device, which includes a memory, a processor and a computer program stored on the memory and running on the processor, wherein the processor implements the steps of the convolutional neural network-based cleaning mode selection method according to an embodiment of the present invention when executing the program.
Embodiments of the present invention also provide a computer storage medium having a computer program stored thereon, where the program is executed by a processor to implement the steps of the convolutional neural network-based cleaning mode selection method according to the present invention.
Drawings
The following drawings are included to provide a further understanding of the invention. The drawings illustrate embodiments of the invention and, together with their descriptions, serve to explain the principles of the invention. In the drawings:
FIG. 1 is a flow chart of a convolutional neural network-based cleaning pattern selection method according to the present invention.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, to one skilled in the art that the invention may be practiced without one or more of these specific details; in other instances, well-known features have not been described in order to avoid obscuring the invention. Although preferred embodiments are described in detail below, the invention is capable of other embodiments and should not be construed as limited to the embodiments set forth herein.
FIG. 1 shows a flow diagram of a convolutional neural network-based cleaning pattern selection method according to the present invention. Specifically, the selection method comprises the following steps:
s1: acquiring a current working scene image of the intelligent cleaning equipment;
s2: inputting the collected current working scene image into a working scene convolution neural network classification model of the intelligent cleaning equipment to determine the type of the current working scene, wherein the working scene convolution neural network classification model is established according to a training sample set, and a pre-collected working scene image in the training sample set is marked with a corresponding working scene type label;
s3: and adopting a corresponding cleaning mode according to the type of the current working scene.
With the convolutional neural network-based cleaning mode selection method for an intelligent cleaning device, the type of the device's current working scene can be obtained by acquiring an image of that scene and feeding it into the trained working scene convolutional neural network classification model, after which a corresponding cleaning mode is adopted. The invention makes cleaning mode selection more efficient, intelligent, and diversified, meeting consumers' needs.
It should be noted that, the construction of the working scenario convolutional neural network classification model in step S2 and the training of the classification model by using the training sample set are already well-established technologies in the art, and a brief description is given here of a preferred embodiment.
First, the pre-collected working scene images in the training sample set may be loaded before the intelligent cleaning device leaves the factory, or may be gathered on site in the user's home during home initialization; each pre-collected working scene image is labeled with a corresponding working scene type. The pre-collected working scene images are captured under various illumination, angle, and focal-length conditions.
Secondly, the construction process of the work scene convolutional neural network classification model is as follows:
the work scene convolutional neural network classification model comprises C convolutional layers, F full-link layers and a softmax classifier, wherein C is more than or equal to 2 and less than or equal to 5, and F is more than or equal to 1 and less than or equal to 3; performing convolution, pooling and normalization processing in each convolution layer; the convolution kernel size of each convolution layer is ci, the step length is sci, wherein ci is more than or equal to 1 and less than or equal to 10, sci is more than or equal to 1 and less than or equal to 5, and i is more than or equal to 1 and less than or equal to C; performing convolution processing on the image to obtain ki characteristics, wherein ki is more than or equal to 1 and less than or equal to 256; the largest pooling is used in the pooling treatment of each convolution layer, the size is pi × pi, the step length is spi × spi, pi is more than or equal to 1 and less than or equal to 10, spi is more than or equal to 1 and less than or equal to 5, and i is more than or equal to 1 and less than or equal to C; normalizing the image subjected to pooling treatment; inputting a result obtained after the image is processed by the previous convolution layer into the next convolution layer; after the C-th convolutional layer, unfolding the obtained features into a one-dimensional vector, inputting the one-dimensional vector into a first full-link layer, then inputting the result of the first full-link layer into a second full-link layer, repeating the steps, obtaining a logits value after the F-th full-link layer, finally inputting the obtained logits value into a softmax classifier, obtaining the probability value of the picture belonging to each category, and obtaining a cross entropy loss function through the probability value and the real label calculation, thereby completing the construction of a convolutional neural network model; wherein, the number of the neurons contained in each full connection 
layer is fj, fj is more than or equal to 1 and less than or equal to 768, and j is more than or equal to 1 and less than or equal to F.
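The hyperparameter ranges above translate directly into a configuration check. A minimal sketch, with illustrative class and field names (none are given in the patent):

```python
# Validates a candidate architecture against the ranges stated above:
# 2 <= C <= 5 conv layers, 1 <= F <= 3 fully connected layers, and
# per-layer bounds on kernel size, stride, feature maps, pooling, neurons.
from dataclasses import dataclass

@dataclass
class SceneCNNConfig:
    conv_kernel: list    # ci per conv layer, 1 <= ci <= 10
    conv_stride: list    # sci per conv layer, 1 <= sci <= 5
    conv_features: list  # ki feature maps per conv layer, 1 <= ki <= 256
    pool_size: list      # pi per conv layer, 1 <= pi <= 10
    pool_stride: list    # spi per conv layer, 1 <= spi <= 5
    fc_neurons: list     # fj per fully connected layer, 1 <= fj <= 768

    def validate(self) -> bool:
        C = len(self.conv_kernel)  # number of convolutional layers
        F = len(self.fc_neurons)   # number of fully connected layers
        if not (2 <= C <= 5 and 1 <= F <= 3):
            return False
        return (all(1 <= c <= 10 for c in self.conv_kernel)
                and all(1 <= s <= 5 for s in self.conv_stride)
                and all(1 <= k <= 256 for k in self.conv_features)
                and all(1 <= p <= 10 for p in self.pool_size)
                and all(1 <= s <= 5 for s in self.pool_stride)
                and all(1 <= f <= 768 for f in self.fc_neurons))
```

A two-conv-layer, one-fully-connected-layer configuration passes; a single conv layer fails the 2 ≤ C ≤ 5 bound.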
Training the working scene convolutional neural network classification model: training uses gradient descent with a variable learning rate, and the number of training iterations is between 50,000 and 100,000.
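A step-decay schedule is one common realization of a "variable learning rate"; the base rate, decay factor, and interval below are assumptions, as the patent fixes none of them:

```python
def variable_learning_rate(step, base_lr=0.01, decay=0.5, decay_every=20000):
    """Halve the learning rate every `decay_every` gradient-descent steps.
    Illustrative values only; the patent states just that a variable rate
    is used over 50,000-100,000 training iterations."""
    return base_lr * decay ** (step // decay_every)
```

Over a 100,000-iteration run with these values, the rate would be halved five times.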
The acquired current working scene image is preprocessed and fed through the first C + F layers of the working scene convolutional neural network classification model trained as above to obtain the logits; the index of the maximum logit is the predicted label, which completes the recognition.
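The argmax step described above can be sketched as follows (the function name is illustrative):

```python
def predict_label(logits):
    """Return the index of the maximum logit, i.e. the predicted scene label."""
    return max(range(len(logits)), key=logits.__getitem__)
```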
And then the intelligent cleaning equipment adopts a corresponding cleaning mode according to the identified type of the current working scene.
In one embodiment, the working scene convolutional neural network classification model may be an adaptive convolutional neural network classification model that performs incremental learning from new training samples. The adaptive model may add an input layer, a center layer, and an excitation layer on top of the original working scene convolutional neural network classification model: the input layer represents samples misclassified during the original network's training, the center layer represents the clusters into which all misclassified samples fall, and the excitation layer represents a sample pattern, which can either be output directly based on the center layer's output or activate the original classification model for further pattern recognition.
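A loose, hypothetical sketch of this three-layer extension: the input layer is read as a memory of misclassified samples, the center layer as a nearest-centroid match, and the excitation layer as the choice between answering directly and deferring to the base model. All names, the Euclidean distance, and the threshold are illustrative assumptions, not details from the patent:

```python
class AdaptiveClassifier:
    """Hypothetical sketch of the adaptive extension described above."""

    def __init__(self, base_predict, threshold=1.0):
        self.base_predict = base_predict  # the original CNN classifier
        self.memory = []                  # "input layer": (features, label)
        self.threshold = threshold        # match radius for the center layer

    def learn_mistake(self, features, correct_label):
        """Incremental learning: memorize a misclassified sample."""
        self.memory.append((features, correct_label))

    def predict(self, features):
        # "Center layer": find the closest memorized sample within threshold.
        best = None
        for stored, label in self.memory:
            d = sum((a - b) ** 2 for a, b in zip(stored, features)) ** 0.5
            if d <= self.threshold and (best is None or d < best[0]):
                best = (d, label)
        # "Excitation layer": answer directly, or activate the base model.
        if best is not None:
            return best[1]
        return self.base_predict(features)
```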
The following will continue to describe the classification of the types of the work scenes involved in steps S2 and S3 and the contents of the corresponding cleaning modes, which are also important contents of the present invention.
In one embodiment, the work scenario type tags may include room property tags that are used to characterize the spatial location of the intelligent cleaning device.
Further, the convolutional neural network-based cleaning mode selection method according to the present invention may include the steps of:
step S2 includes the step of determining the room properties of the current working scene; step S3 includes adopting a corresponding first cleaning mode according to the room attribute; the first cleaning mode is configured as a step in which the intelligent cleaning apparatus employs different cleaning power for different room properties.
For example, for places such as a kitchen or balcony: when the current working scene image acquired by the intelligent cleaning device contains a landmark object such as a range hood, the working scene convolutional neural network classification model assigns it the room attribute label for the kitchen; when the image contains a landmark object such as a potted plant, the model assigns the room attribute label for the balcony. In either case, the fan power can be increased or the area cleaned twice. For places such as a bedroom: when the image contains a landmark object such as a bed, the model assigns the room attribute label for the bedroom, and the fan can be turned off during cleaning so as not to disturb the user's rest.
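The kitchen/balcony/bedroom examples suggest a simple mapping from room attribute to first-cleaning-mode settings. The table values below illustrate the text rather than being prescribed by it:

```python
# Illustrative first-cleaning-mode table; labels and settings are examples.
ROOM_MODE = {
    "kitchen": {"fan": "high", "passes": 2},  # range hood landmark
    "balcony": {"fan": "high", "passes": 2},  # potted plant landmark
    "bedroom": {"fan": "off",  "passes": 1},  # bed landmark; avoid noise
}
DEFAULT_ROOM_MODE = {"fan": "normal", "passes": 1}

def first_cleaning_mode(room_attribute):
    """Map the recognized room attribute to fan power and cleaning passes."""
    return ROOM_MODE.get(room_attribute, DEFAULT_ROOM_MODE)
```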
In one embodiment, the work scene type tags may include tiny object property tags that are used to characterize the environment surrounding the intelligent cleaning device.
Further, the convolutional neural network-based cleaning mode selection method according to the present invention may include the steps of:
step S2 includes a step of determining attributes of tiny objects of the current working scene; step S3 includes adopting a corresponding second cleaning mode according to the attributes of the tiny objects; the second cleaning mode is configured as a step of walking down for cleaning while the intelligent cleaning apparatus avoids the minute object or is close to the minute object.
For example, easily pushed objects such as trash cans, toys, and slippers, as well as animal waste, can be avoided directly, while general obstacles (which cause no damage even if bumped) can be cleaned at a reduced walking speed.
It should be noted that if the current working scene image acquired by the intelligent cleaning device contains one of the tiny objects in the example above, after analysis by the working scene convolutional neural network classification model the image carries the tiny-object attribute label for easily pushed objects; if the image contains a general obstacle that poses no risk to the user, it carries the tiny-object attribute label for general obstacles. The distinction between general obstacles and easily moved objects can thus be customized based on accumulated experience or user needs.
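The second cleaning mode reduces to a two-way decision over the tiny-object attribute label; the label strings below are hypothetical:

```python
# Labels marking objects to detour around; the set contents are examples.
AVOID_LABELS = {"trash_can", "toy", "slipper", "pet_waste"}

def second_cleaning_mode(tiny_object_attribute):
    """Detour around easily pushed objects and animal waste; otherwise
    decelerate and keep cleaning near harmless general obstacles."""
    if tiny_object_attribute in AVOID_LABELS:
        return "detour"
    return "decelerate"
```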
In one embodiment, the work scenario type tag may include a floor material property tag that is used to characterize the floor material on which the intelligent cleaning device is located.
Further, the convolutional neural network-based cleaning mode selection method according to the present invention may include the steps of:
step S2 includes the step of determining the ground material attribute of the current work scene; step S3 includes adopting a corresponding third cleaning mode according to the ground material attribute; the third cleaning mode is configured as a step in which the intelligent cleaning device adopts different cleaning power and/or different cleaning dryness and humidity for different floor material attributes.
For example, when the intelligent cleaning device is on floor tile, a wet cleaning mode can be adopted, such as turning on the water outlet of the device's mopping function; if the tile is dirty, cleaning force can also be increased, for instance by raising fan power or cleaning twice. When the device is on a wooden floor, a dry mode can be adopted, such as turning off the water outlet or reducing the water output of the mopping function.
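The tile/wood examples can be captured as a small third-cleaning-mode table; the labels and settings are illustrative of the text above, not prescribed by the patent:

```python
# Illustrative floor-material table: wet-clean tile, dry-clean wood.
FLOOR_MODE = {
    "tile": {"mop_water": "on",  "fan": "normal"},
    "wood": {"mop_water": "off", "fan": "normal"},
}

def third_cleaning_mode(floor_attribute, dirty=False):
    """Pick wet/dry settings from the floor material; raise force if dirty."""
    mode = dict(FLOOR_MODE.get(floor_attribute, FLOOR_MODE["wood"]))
    if dirty:
        mode["fan"] = "high"  # alternatively, repeat the cleaning pass
    return mode
```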
It should be noted that if the current working scene image acquired by the intelligent cleaning device shows a wooden floor, after analysis by the working scene convolutional neural network classification model the image carries the floor material attribute label for wood flooring; if the image shows floor tile, it carries the floor material attribute label for tile.
The invention also provides an intelligent cleaning device, which comprises a memory, a processor and a computer program stored on the memory and running on the processor, wherein the processor executes the program to realize the following steps:
s1: acquiring a current working scene image of the intelligent cleaning equipment;
s2: inputting the collected current working scene image into a working scene convolution neural network classification model of the intelligent cleaning equipment to determine the type of the current working scene, wherein the working scene convolution neural network classification model is established according to a training sample set, and a pre-collected working scene image in the training sample set is marked with a corresponding working scene type label;
s3: and adopting a corresponding cleaning mode according to the type of the current working scene.
In one embodiment, the processor when executing the program performs the steps of: the working scene convolutional neural network classification model is an adaptive convolutional neural network classification model that performs incremental learning from new training samples.
In one embodiment, the processor when executing the program performs the steps of: the work scene type tags include room attribute tags that are used to characterize the spatial location of the intelligent cleaning device.
In one embodiment, the processor when executing the program performs the steps of: step S2 includes determining room attributes of the current work scenario; step S3 includes adopting a corresponding first cleaning mode according to the room attribute; the first cleaning mode is configured such that the intelligent cleaning device employs different cleaning forces for different room attributes.
In one embodiment, the processor when executing the program performs the steps of: the work scene type tags include tiny object attribute tags that are used to characterize the environment surrounding the intelligent cleaning device.
In one embodiment, the processor when executing the program performs the steps of: step S2 includes determining the tiny-object attributes of the current working scene; step S3 includes adopting a corresponding second cleaning mode according to the tiny-object attributes; the second cleaning mode is configured such that the intelligent cleaning device either avoids the tiny object or decelerates for cleaning when close to it.
In one embodiment, the processor when executing the program performs the steps of: the working scene type labels include a floor material attribute label, which characterizes the floor material on which the intelligent cleaning device is located.
In one embodiment, the processor when executing the program performs the steps of: step S2 includes determining the floor material attribute of the current working scene; step S3 includes adopting a corresponding third cleaning mode according to the floor material attribute; the third cleaning mode is configured such that the intelligent cleaning device employs different cleaning forces and/or wet or dry cleaning for different floor material attributes.
In one embodiment, the processor when executing the program performs the steps of: the different cleaning forces are realized by adjusting the fan of the intelligent cleaning device and/or the number of cleaning passes.
The present invention also provides a computer storage medium having a computer program stored thereon, the computer program, when executed by a processor, performing the steps of:
s1: acquiring a current working scene image of the intelligent cleaning equipment;
s2: inputting the collected current working scene image into a working scene convolution neural network classification model of the intelligent cleaning equipment to determine the type of the current working scene, wherein the working scene convolution neural network classification model is established according to a training sample set, and a pre-collected working scene image in the training sample set is marked with a corresponding working scene type label;
s3: and adopting a corresponding cleaning mode according to the type of the current working scene.
In one embodiment, the computer program when executed by a processor implements the steps of: the working scene convolutional neural network classification model is an adaptive convolutional neural network classification model that performs incremental learning from new training samples.
In one embodiment, the computer program when executed by a processor implements the steps of:
the work scene type tags include room attribute tags that are used to characterize the spatial location of the intelligent cleaning device.
In one embodiment, the computer program when executed by a processor implements the steps of: step S2 includes determining room attributes of the current work scenario; step S3 includes adopting a corresponding first cleaning mode according to the room attribute; the first cleaning mode is configured such that the intelligent cleaning device employs different cleaning forces for different room attributes.
In one embodiment, the computer program when executed by a processor implements the steps of: the work scene type tags include tiny object attribute tags that are used to characterize the environment surrounding the intelligent cleaning device.
In one embodiment, the computer program when executed by a processor implements the steps of: step S2 includes determining the tiny-object attributes of the current working scene; step S3 includes adopting a corresponding second cleaning mode according to the tiny-object attributes; the second cleaning mode is configured such that the intelligent cleaning device either avoids the tiny object or decelerates for cleaning when close to it.
In one embodiment, the computer program when executed by a processor implements the steps of: the working scene type labels include a floor material attribute label, which characterizes the floor material on which the intelligent cleaning device is located.
In one embodiment, the computer program when executed by a processor implements the steps of: step S2 includes determining the floor material attribute of the current working scene; step S3 includes adopting a corresponding third cleaning mode according to the floor material attribute; the third cleaning mode is configured such that the intelligent cleaning device employs different cleaning forces and/or wet or dry cleaning for different floor material attributes.
In one embodiment, the computer program when executed by a processor implements the steps of: the different cleaning forces are realized by adjusting the fan of the intelligent cleaning device and/or the number of cleaning passes.
The features of the above embodiments may be arbitrarily combined, and for the sake of brevity, all possible combinations of the features in the above embodiments are not described, but should be construed as being within the scope of the present specification as long as there is no contradiction between the combinations of the features.
Unless defined otherwise, technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. Terms such as "component" and the like, when used herein, can refer to either a single part or a combination of parts. Terms such as "mounted," "disposed," and the like, as used herein, may refer to one component as being directly attached to another component or one component as being attached to another component through intervening components. Features described herein in one embodiment may be applied to another embodiment, either alone or in combination with other features, unless the feature is otherwise inapplicable or otherwise stated in the other embodiment.
The present invention has been described in terms of the above embodiments, but it should be understood that the above embodiments are for purposes of illustration and description only and are not intended to limit the invention to the scope of the described embodiments. Furthermore, it will be understood by those skilled in the art that the present invention is not limited to the embodiments described above, and that many variations and modifications may be made in accordance with the teachings of the present invention, which variations and modifications fall within the scope of the present invention as claimed.

Claims (6)

1. A cleaning mode selection method for an intelligent cleaning device, comprising:
acquiring a current working scene image of the intelligent cleaning equipment, wherein the current working scene image comprises a landmark object of a current working scene;
inputting the collected current working scene image into a working scene convolution neural network classification model of the intelligent cleaning equipment, wherein the working scene convolution neural network classification model is established according to a training sample set, and a pre-collected working scene image in the training sample set is marked with a corresponding room attribute label;
determining the room attribute of the current working scene according to the landmark object of the current working scene included in the current working scene image and the pre-collected working scene images in the training sample set;
and adopting a corresponding cleaning mode according to the room attribute of the current working scene.
2. The cleaning mode selection method according to claim 1, wherein adopting the corresponding cleaning mode according to the room attribute of the current working scene comprises: adopting a corresponding first cleaning mode according to the room attribute, wherein the first cleaning mode is configured such that the intelligent cleaning device employs different cleaning forces for different room attributes.
3. The cleaning mode selection method according to claim 1 or 2, wherein the working scene convolutional neural network classification model is an adaptive convolutional neural network classification model that performs incremental learning based on new training samples.
4. The cleaning mode selection method according to claim 1 or 2, wherein the pre-collected working scene images are images captured under various illumination, angle, and focus conditions.
5. An intelligent cleaning device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the program, implements the steps of the method of any one of claims 1 to 4.
6. A computer storage medium having a computer program stored thereon, wherein the program, when executed by a processor, implements the steps of the method of any one of claims 1 to 4.
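The pipeline in claims 1 and 2 (acquire an image, classify the working scene with a trained model, then adopt the mode matching the room attribute) can be sketched as follows. This is a minimal illustration only: the classifier stub stands in for the patent's convolutional neural network, and the room labels, landmark names, and cleaning parameters are hypothetical values, not the patent's implementation.

```python
# Illustrative sketch of the claimed selection flow (claims 1-2).
# classify_scene() is a stand-in for the trained working scene CNN
# classification model; all labels and parameters are hypothetical.

ROOM_TO_MODE = {
    "kitchen":  {"suction": "max",      "water": "off"},   # heavy debris
    "bathroom": {"suction": "high",     "water": "high"},  # wet mopping
    "bedroom":  {"suction": "standard", "water": "low"},   # light, quiet clean
}

def classify_scene(image):
    """Stand-in for the working scene CNN classification model.

    A real model would map the image, with its landmark object
    (e.g. a stove or a bathtub), to a room attribute learned from
    the labeled training sample set.
    """
    landmarks = image.get("landmarks", [])
    if "stove" in landmarks:
        return "kitchen"
    if "bathtub" in landmarks:
        return "bathroom"
    return "bedroom"

def select_cleaning_mode(image):
    """Claim 1: determine the room attribute of the current working
    scene, then adopt the corresponding cleaning mode (claim 2)."""
    room = classify_scene(image)
    return room, ROOM_TO_MODE[room]

room, mode = select_cleaning_mode({"landmarks": ["stove", "sink"]})
print(room, mode["suction"])  # kitchen max
```

In a deployed system the lookup table would correspond to the first cleaning mode of claim 2, mapping each room attribute to a distinct cleaning intensity.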
CN202111500223.0A 2018-11-01 2018-11-01 Cleaning mode selection method, intelligent cleaning device, computer storage medium Pending CN114424916A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111500223.0A CN114424916A (en) 2018-11-01 2018-11-01 Cleaning mode selection method, intelligent cleaning device, computer storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111500223.0A CN114424916A (en) 2018-11-01 2018-11-01 Cleaning mode selection method, intelligent cleaning device, computer storage medium
CN201811295518.7A CN109452914A (en) 2018-11-01 2018-11-01 Intelligent cleaning equipment, cleaning mode selection method, computer storage medium

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201811295518.7A Division CN109452914A (en) 2018-11-01 2018-11-01 Intelligent cleaning equipment, cleaning mode selection method, computer storage medium

Publications (1)

Publication Number Publication Date
CN114424916A true CN114424916A (en) 2022-05-03

Family

ID=65609180

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202111500223.0A Pending CN114424916A (en) 2018-11-01 2018-11-01 Cleaning mode selection method, intelligent cleaning device, computer storage medium
CN201811295518.7A Pending CN109452914A (en) 2018-11-01 2018-11-01 Intelligent cleaning equipment, cleaning mode selection method, computer storage medium

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201811295518.7A Pending CN109452914A (en) 2018-11-01 2018-11-01 Intelligent cleaning equipment, cleaning mode selection method, computer storage medium

Country Status (1)

Country Link
CN (2) CN114424916A (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111759226B (en) * 2019-04-02 2022-12-20 青岛塔波尔机器人技术股份有限公司 Sweeping robot control method and sweeping robot
CN110251004B (en) * 2019-07-16 2022-03-11 深圳市杉川机器人有限公司 Sweeping robot, sweeping method thereof and computer-readable storage medium
CN110674731A (en) * 2019-09-22 2020-01-10 江苏悦达专用车有限公司 Road cleanliness quantification method based on deep learning
CN111012261A (en) * 2019-11-18 2020-04-17 深圳市杉川机器人有限公司 Sweeping method and system based on scene recognition, sweeping equipment and storage medium
CN111248815B (en) * 2020-01-16 2021-07-16 珠海格力电器股份有限公司 Method, device and equipment for generating working map and storage medium
CN111476098A (en) * 2020-03-06 2020-07-31 珠海格力电器股份有限公司 Method, device, terminal and computer readable medium for identifying target area
CN111528739A (en) * 2020-05-09 2020-08-14 小狗电器互联网科技(北京)股份有限公司 Sweeping mode switching method and system, electronic equipment, storage medium and sweeper
CN112690704B (en) * 2020-12-22 2022-05-10 珠海一微半导体股份有限公司 Robot control method, control system and chip based on vision and laser fusion
CN114305208A (en) * 2021-12-17 2022-04-12 深圳市倍思科技有限公司 Driving method, device, equipment, program product and system of intelligent cleaning equipment
WO2023125698A1 (en) * 2021-12-28 2023-07-06 美智纵横科技有限责任公司 Cleaning device, and control method and control apparatus therefor
CN114468857A (en) * 2022-02-17 2022-05-13 美智纵横科技有限责任公司 Control method and device of cleaning equipment, cleaning equipment and readable storage medium

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105809146A (en) * 2016-03-28 2016-07-27 北京奇艺世纪科技有限公司 Image scene recognition method and device
US20170032189A1 (en) * 2015-07-31 2017-02-02 Xiaomi Inc. Method, apparatus and computer-readable medium for image scene determination
CN106446930A (en) * 2016-06-28 2017-02-22 沈阳工业大学 Deep convolutional neural network-based robot working scene identification method
CN107092254A (en) * 2017-04-27 2017-08-25 北京航空航天大学 A kind of design method for the Household floor-sweeping machine device people for strengthening study based on depth
CN107622244A (en) * 2017-09-25 2018-01-23 华中科技大学 A kind of indoor scene based on depth map becomes more meticulous analytic method
CN107688856A (en) * 2017-07-24 2018-02-13 清华大学 Indoor Robot scene active identification method based on deeply study
CN107690672A (en) * 2017-07-25 2018-02-13 深圳前海达闼云端智能科技有限公司 Training data generation method, generating means and its image, semantic dividing method
CN107944386A (en) * 2017-11-22 2018-04-20 天津大学 Visual scene recognition methods based on convolutional neural networks
CN108125622A (en) * 2017-12-15 2018-06-08 珊口(上海)智能科技有限公司 Control method, system and the clean robot being applicable in
CN108523768A (en) * 2018-03-12 2018-09-14 苏州大学 Household cleaning machine people's control system based on adaptive strategy optimization
CN108710847A (en) * 2018-05-15 2018-10-26 北京旷视科技有限公司 Scene recognition method, device and electronic equipment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106155049A (en) * 2015-04-15 2016-11-23 小米科技有限责任公司 Intelligent cleaning equipment and bootstrap technique, guiding stake, intelligent cleaning system
CN106228177A (en) * 2016-06-30 2016-12-14 浙江大学 Daily life subject image recognition methods based on convolutional neural networks
WO2018038552A1 (en) * 2016-08-25 2018-03-01 엘지전자 주식회사 Mobile robot and control method therefor
CN106821157A (en) * 2017-04-14 2017-06-13 小狗电器互联网科技(北京)股份有限公司 The cleaning method that a kind of sweeping robot is swept the floor
CN108594692A (en) * 2017-12-18 2018-09-28 深圳市奇虎智能科技有限公司 A kind of cleaning equipment control method, device, computer equipment and storage medium
CN108681752B (en) * 2018-05-28 2023-08-15 电子科技大学 Image scene labeling method based on deep learning


Also Published As

Publication number Publication date
CN109452914A (en) 2019-03-12


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination