CN114468866A - Mopping robot control method, device, equipment and medium based on artificial intelligence - Google Patents


Info

Publication number
CN114468866A
CN114468866A (application CN202210386861.2A)
Authority
CN
China
Prior art keywords
mop
dirt
image
dirty
type
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210386861.2A
Other languages
Chinese (zh)
Other versions
CN114468866B
Inventor
马莉
谌世凤
毕宏博
周佳庆
周春大
林进玉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fenglin Technology Shenzhen Co ltd
Original Assignee
Fenglin Technology Shenzhen Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fenglin Technology Shenzhen Co ltd filed Critical Fenglin Technology Shenzhen Co ltd
Priority to CN202210386861.2A priority Critical patent/CN114468866B/en
Publication of CN114468866A publication Critical patent/CN114468866A/en
Application granted granted Critical
Publication of CN114468866B publication Critical patent/CN114468866B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/28 Floor-scrubbing machines, motor-driven
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011 Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor

Landscapes

  • Electric Vacuum Cleaner (AREA)

Abstract

The application discloses an artificial-intelligence-based mopping robot control method, device, equipment and medium. The method comprises the following steps: acquiring a mop image of a mop to be cleaned, and performing dirt judgment on the mop to be cleaned according to the mop image to obtain a dirt judgment result; if the dirt judgment result indicates a dirty mop, performing dirt detection on the mop to be cleaned according to the mop image to obtain a dirt detection result; and controlling the mopping robot to clean the mop to be cleaned according to the cleaning parameters corresponding to the dirt detection result. This solves the technical problem of the low accuracy with which prior-art mopping robots are controlled when cleaning their mops.

Description

Mopping robot control method, device, equipment and medium based on artificial intelligence
Technical Field
The present application relates to the field of artificial intelligence technologies, and in particular, to a method, an apparatus, a device, and a medium for controlling a floor-mopping robot based on artificial intelligence.
Background
With the continuous development of artificial intelligence, its applications have become increasingly widespread. At present, when cleaning its mop, a mopping robot usually uses fixed cleaning parameters, such as a fixed cleaning duration. Under a fixed cleaning duration, however, a relatively clean mop is often still being washed after it is already clean, which wastes water, while a relatively dirty mop is often not washed thoroughly. The accuracy with which a mopping robot is currently controlled to clean its mop is therefore low.
Disclosure of Invention
The main purpose of the application is to provide an artificial-intelligence-based mopping robot control method, device, equipment and medium, aiming to solve the technical problem that in the prior art the control accuracy of controlling a mopping robot to clean its mop is low.
In order to achieve the above object, the present application provides a control method for a floor mopping robot based on artificial intelligence, which includes:
acquiring a mop image of a mop to be cleaned, and performing dirt judgment on the mop to be cleaned according to the mop image to obtain a dirt judgment result;
if the dirt judgment result indicates a dirty mop, performing dirt detection on the mop to be cleaned according to the mop image to obtain a dirt detection result;
and controlling the mopping robot to clean the mop to be cleaned according to the cleaning parameters corresponding to the dirt detection result.
The application further provides an artificial-intelligence-based mopping robot control device. The device is a virtual device and comprises:
a dirt judgment module, configured to acquire a mop image of a mop to be cleaned, and perform dirt judgment on the mop to be cleaned according to the mop image to obtain a dirt judgment result;
a dirt detection module, configured to perform dirt detection on the mop to be cleaned according to the mop image if the dirt judgment result indicates a dirty mop, so as to obtain a dirt detection result;
and a mop cleaning control module, configured to control the mopping robot to clean the mop to be cleaned according to the dirt detection result.
The present application further provides an electronic device, the electronic device including: a memory, a processor, and a program of the artificial intelligence based floor-mopping robot control method stored on the memory and executable on the processor, the program of the artificial intelligence based floor-mopping robot control method being executable by the processor to implement the steps of the artificial intelligence based floor-mopping robot control method as described above.
The present application also provides a computer-readable storage medium having stored thereon a program for implementing an artificial intelligence-based mopping robot control method, which when executed by a processor, implements the steps of the artificial intelligence-based mopping robot control method as described above.
The present application also provides a computer program product comprising a computer program which, when executed by a processor, performs the steps of the artificial intelligence based floor-mopping robot control method as described above.
Compared with the prior-art technical means of cleaning the mop for a fixed cleaning duration, the application acquires a mop image of the mop to be cleaned and performs dirt judgment on the mop to be cleaned according to the mop image to obtain a dirt judgment result; if the dirt judgment result indicates a dirty mop, it performs dirt detection on the mop to be cleaned according to the mop image to obtain a dirt detection result; and it controls the mopping robot to clean the mop to be cleaned according to the cleaning parameters corresponding to the dirt detection result. The application can thus first judge whether the mop is dirty and, if so, perform dirt detection, so that the cleaning parameters required for this wash can be estimated from the dirt detection result and the mopping robot can be accurately controlled to clean the mop with the estimated parameters, making the cleaning parameters (for example, the cleaning duration) as close as possible to the parameters actually required for this wash. This overcomes the prior-art technical defects that, under a fixed cleaning duration, a relatively clean mop is still washed after it is already clean, which wastes water, while a relatively dirty mop is not washed thoroughly, and improves the control accuracy of controlling the mopping robot to clean the mop.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
In order to illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below; those skilled in the art can obviously obtain other drawings from these drawings without inventive effort.
FIG. 1 is a schematic flow chart of a first embodiment of a control method of a floor mopping robot based on artificial intelligence according to the present application;
FIG. 2 is a schematic flow chart of a second embodiment of the control method of the floor mopping robot based on artificial intelligence according to the present application;
fig. 3 is a schematic device structural diagram of a hardware operating environment related to a floor-mopping robot control method based on artificial intelligence in the embodiment of the present application.
The implementation of the objectives, functional features, and advantages of the present application will be further described with reference to the accompanying drawings.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, embodiments of the present application are described in detail below with reference to the accompanying drawings. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Example one
In a first embodiment of the artificial-intelligence-based mopping robot control method of the present application, referring to fig. 1, the control method includes:
step S10, acquiring a mop image corresponding to a mop to be cleaned, and judging the dirt of the mop to be cleaned according to the mop image to obtain a dirt judging result;
step S20, if the dirty judgment result is a dirty mop, performing dirty detection on the mop to be cleaned according to the mop image to obtain a dirty detection result;
and step S30, controlling the mopping robot to perform mop cleaning on the mop to be cleaned according to the cleaning parameters corresponding to the dirt detection result.
As one example, steps S10 to S30 include: acquiring a mop image of the mop to be cleaned; inputting the mop image into a preset dirty-mop judgment model to judge whether the mop to be cleaned is a dirty mop, obtaining a dirt judgment result; if the dirt judgment result indicates a dirty mop, inputting the mop image into a preset dirt detection model to detect the dirt parameters of the mop to be cleaned, obtaining a dirt detection result, where a dirt parameter may be a dirt degree or a dirt type; looking up the corresponding cleaning parameters according to the dirt detection result, where a cleaning parameter may be a cleaning duration or a cleaning agent type; and controlling the mopping robot to clean the mop to be cleaned according to the control instruction corresponding to the cleaning parameters.
As an example, the dirt detection result may be the detected dirt parameters, and looking up the corresponding cleaning parameters according to the dirt detection result includes:
looking up the cleaning parameters corresponding to the detected dirt parameters according to the mapping relation between dirt parameters and cleaning parameters.
Wherein performing dirt detection on the mop to be cleaned according to the mop image includes:
step A10, extracting the dirt degree characteristic in the mop image according to the dirt degree characteristic extraction model; and carrying out dirt degree detection on the mop to be cleaned according to the dirt degree characteristic:
in this embodiment, it should be noted that the preset contamination detection model includes a contamination degree detection model for performing contamination degree detection, and the contamination degree detection model includes a contamination degree feature extraction model for extracting contamination degree features.
As an example, step a10 includes: inputting a mop image pixel matrix corresponding to the mop image into a dirt degree feature extraction model, and mapping the mop image pixel matrix to a preset first feature dimension through the dirt degree feature extraction model to obtain a dirt degree feature, wherein the dirt degree feature is a feature carrying dirt degree information of a mop, and the dirt degree feature can be a high-dimensional matrix output by the dirt degree feature extraction model or an output feature vector; and detecting the dirt degree of the mop to be cleaned according to the dirt degree characteristic, so that the detected dirt degree corresponding to the mop to be cleaned can be obtained.
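The mapping from pixel matrix to dirt degree feature can be sketched minimally as follows; a plain linear projection stands in for the trained dirt degree feature extraction model, and the projection, feature dimension and thresholds are illustrative assumptions.

```python
# Illustrative sketch of step A10: a linear projection as a stand-in for
# the trained dirt degree feature extraction model.
import numpy as np

def extract_dirt_degree_feature(pixel_matrix, projection, feature_dim=32):
    """Map an (H, W) mop image pixel matrix to a fixed feature dimension."""
    flat = pixel_matrix.reshape(-1).astype(np.float64)  # flatten H*W pixels
    feature = projection @ flat                          # (feature_dim, H*W) @ (H*W,)
    assert feature.shape == (feature_dim,)
    return feature

def detect_dirt_degree(feature, thresholds=(1.0, 3.0)):
    """Toy degree decision from the feature norm: low / medium / high."""
    score = np.linalg.norm(feature)
    if score < thresholds[0]:
        return "low"
    return "medium" if score < thresholds[1] else "high"
```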
Step B10, extracting the dirt type feature in the mop image according to a dirt type feature extraction model; and performing dirt type detection on the mop to be cleaned according to the dirt type feature.
In this embodiment, it should be noted that the preset contamination detection model includes a contamination type detection model for performing contamination type detection, and the contamination type detection model includes a contamination type feature extraction model for extracting a contamination type feature.
As an example, step B10 includes: inputting the mop image pixel matrix corresponding to the mop image into the dirt type feature extraction model, and mapping the mop image pixel matrix to a preset second feature dimension through the dirt type feature extraction model to obtain a dirt type feature, where the dirt type feature carries the dirt type information of the mop and may be a high-dimensional matrix or a feature vector output by the dirt type feature extraction model; and performing dirt type detection on the mop to be cleaned according to the dirt type feature, so that the detected dirt type corresponding to the mop to be cleaned can be obtained.
Wherein the mop image includes a mop mopping area image and a mop non-mopping area image, and the dirt degree feature extraction model includes a first feature extractor and a second feature extractor; extracting the dirt degree feature in the mop image according to the dirt degree feature extraction model includes:
step A11, extracting the color distribution characteristics of the mop floor area image according to the first characteristic extractor;
step A12, extracting the brightness distribution characteristic of the image of the non-mopping area of the mop according to the second characteristic extractor;
step a13, generating the dirty degree feature according to the color distribution feature and the brightness distribution feature.
In this embodiment, it should be noted that when the mop to be cleaned is a light-colored mop, e.g. a white or light-gray mop, the color distribution of the mopping area image (the area of the mop in contact with the floor) will differ from that of a clean mop. When the mop has not been used for a long time, the mopping area may be clean while dust often accumulates in the non-mopping area (usually the upper side of the mop that does not contact the floor); if the mop is simply wetted and then used, the mopping effect is poor. Accumulated dust affects the image brightness of the mop non-mopping area image: the dust obscures the original color of the mop, so the captured non-mopping area image appears darker.
As an example, steps A11 to A13 include: inputting the pixel matrix of the mop mopping area image into the first feature extractor, which maps it to a color distribution feature; inputting the pixel matrix of the mop non-mopping area image into the second feature extractor, which maps it to a brightness distribution feature; and splicing the color distribution feature and the brightness distribution feature to obtain the dirt degree feature. The embodiment of the application thus considers the dirt degree of the mop from both mop color and mop brightness, so that the dirt degree feature carries more dirt degree information about the mop, provides more decision basis for dirt degree detection, and improves the accuracy of dirt degree detection.
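The splicing in steps A11 to A13 can be sketched as follows; the two extractors here are simple channel-statistics and histogram stand-ins for the trained first and second feature extractors, chosen only to illustrate the concatenation.

```python
# Sketch of steps A11-A13: two extractors run on different mop regions and
# their outputs are concatenated into one dirt degree feature. The extractor
# bodies are illustrative stand-ins for trained models.
import numpy as np

def color_distribution_feature(mopping_region):
    """First extractor stand-in: per-channel mean and std of the (H, W, 3)
    floor-contact region."""
    return np.concatenate([mopping_region.mean(axis=(0, 1)),
                           mopping_region.std(axis=(0, 1))])

def brightness_distribution_feature(non_mopping_region):
    """Second extractor stand-in: normalized gray-level histogram of the
    (H, W) non-mopping region."""
    hist, _ = np.histogram(non_mopping_region, bins=8, range=(0, 256),
                           density=True)
    return hist

def dirt_degree_feature(mopping_region, non_mopping_region):
    # Step A13: splice (concatenate) the two features into one vector.
    return np.concatenate([color_distribution_feature(mopping_region),
                           brightness_distribution_feature(non_mopping_region)])
```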
Wherein the dirt type feature extraction model includes a third feature extractor and a fourth feature extractor, and extracting the dirt type feature in the mop image according to the dirt type feature extraction model includes:
step B11, extracting the color distribution characteristics of the mop floor area image according to the third characteristic extractor;
step B12, extracting the mop reflection degree characteristic of the mop floor area image according to the fourth characteristic extractor;
and step B13, generating the dirt type characteristic according to the color distribution characteristic and the mop reflection characteristic.
In this embodiment, it should be noted that different kinds of dirt usually give the mop different colors; for example, oil stains and dust stains differ in color to some extent. The mop also reflects light differently depending on the dirt: the reflectance of oil stains is clearly greater than that of dust stains. The mop mopping area image in this embodiment may therefore be captured under relatively strong illumination.
As an example, steps B11 to B13 include: inputting the pixel matrix of the mop mopping area image into the third feature extractor, which maps it to a color distribution feature; inputting the pixel matrix of the mop mopping area image into the fourth feature extractor, which maps it to a mop reflectance feature; and splicing the color distribution feature and the mop reflectance feature to obtain the dirt type feature. The embodiment of the application thus considers the dirt type of the mop from both mop color and mop reflectance, so that the dirt type feature carries more dirt type information about the mop, provides more decision basis for dirt type detection, and improves the accuracy of dirt type detection.
Wherein the dirt detection result includes a detected dirt degree and a detected dirt type, and the cleaning parameters include a target cleaning duration; controlling the mopping robot to clean the mop to be cleaned according to the cleaning parameters corresponding to the dirt detection result includes:
step S31, determining the target cleaning time length corresponding to the mop to be cleaned according to the detected dirt type and the detected dirt degree;
and step S32, controlling the mopping robot to mop and clean the mopping cloth to be cleaned according to the target cleaning time length.
In this embodiment, it should be noted that dirt types are divided into a stubborn stain type and a non-stubborn stain type; for example, oil stains may be classified as stubborn, while dust stains may be classified as non-stubborn, and stubborn stains clearly require a longer cleaning duration. Likewise, the higher the dirt degree, the longer the required cleaning duration.
As one example, steps S31 to S32 include: looking up the standard cleaning duration corresponding to the detected dirt degree according to the mapping relation between dirt degree and cleaning duration; looking up the standard cleaning duration compensation coefficient corresponding to the detected dirt type according to the mapping relation between dirt type and cleaning duration compensation coefficient; determining the target cleaning duration from the standard cleaning duration and the compensation coefficient, where the target cleaning duration may be obtained by multiplying the two; and generating a corresponding cleaning control instruction according to the target cleaning duration, and controlling the mopping robot to clean the mop to be cleaned according to the cleaning control instruction. The embodiment of the application thus determines the cleaning duration accurately from the dirt degree and the dirt type.
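The duration computation in steps S31 to S32 amounts to two table lookups and a multiplication; the mapping tables below are illustrative values, not values taken from the patent.

```python
# Worked sketch of steps S31-S32. The table entries are illustrative
# assumptions: standard duration by dirt degree, compensation coefficient
# by dirt type (stubborn stains wash longer).
STANDARD_DURATION_MIN = {"low": 2.0, "medium": 5.0, "high": 8.0}
DURATION_COEFFICIENT = {"stubborn": 1.5, "non_stubborn": 1.0}

def target_cleaning_duration(detected_degree, detected_type):
    """Target duration = standard duration x compensation coefficient."""
    standard = STANDARD_DURATION_MIN[detected_degree]
    coeff = DURATION_COEFFICIENT[detected_type]
    return standard * coeff
```

For example, a heavily soiled mop with a stubborn (oil-type) stain would be washed for 8.0 x 1.5 = 12 minutes under these illustrative tables.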
Wherein the dirt detection result includes a detected dirt degree and a detected dirt type, and the cleaning parameters include a target cleaning duration and a target cleaning agent type; controlling the mopping robot to clean the mop to be cleaned according to the cleaning parameters corresponding to the dirt detection result includes:
step C10, determining the target cleaning time length corresponding to the mop to be cleaned according to the detected dirt type and the detected dirt degree;
step C20, determining a target cleaning agent type corresponding to the detected dirt type according to the corresponding relation between the cleaning agent types corresponding to the dirt types;
and C30, controlling the mopping robot to mop and clean the mop to be cleaned according to the target cleaning time length and the target cleaning agent type.
In the present embodiment, it should be noted that, to achieve a better cleaning effect, different types of cleaning agents are usually needed for different dirt; for example, oil stains need to be cleaned with a specific cleaning agent, while dust can be cleaned with water.
As an example, steps C10 to C30 include: looking up the standard cleaning duration corresponding to the detected dirt degree according to the mapping relation between dirt degree and cleaning duration; looking up the standard cleaning duration compensation coefficient corresponding to the detected dirt type according to the mapping relation between dirt type and cleaning duration compensation coefficient; determining the target cleaning duration from the standard cleaning duration and the compensation coefficient; determining the target cleaning agent type corresponding to the detected dirt type according to the mapping relation between dirt type and cleaning agent type; and generating a corresponding cleaning control instruction according to the target cleaning duration and the target cleaning agent type, and controlling the mopping robot to clean the mop to be cleaned according to the cleaning control instruction. This achieves the purpose of accurately determining the cleaning duration and the cleaning agent type from the dirt degree and the dirt type, so that a more accurate cleaning control instruction can be generated and the cleaning effect on the mop is improved.
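Combining the duration with the agent lookup of step C20 yields the full control instruction; the agent names follow the oil-stain/dust example in the text and are otherwise illustrative assumptions.

```python
# Sketch of steps C20-C30: the dirt type also selects a cleaning agent.
# Agent names are illustrative, following the oil/dust example above.
AGENT_BY_DIRT_TYPE = {"oil": "degreasing_detergent", "dust": "water"}

def build_cleaning_instruction(duration_min, detected_type):
    """Combine target duration and agent type into one control instruction."""
    return {"duration_min": duration_min,
            "agent": AGENT_BY_DIRT_TYPE[detected_type]}
```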
Compared with the prior-art technical means of cleaning the mop for a fixed cleaning duration, the artificial-intelligence-based mopping robot control method acquires a mop image of the mop to be cleaned and performs dirt judgment on the mop to be cleaned according to the mop image to obtain a dirt judgment result; if the dirt judgment result indicates a dirty mop, it performs dirt detection on the mop to be cleaned according to the mop image to obtain a dirt detection result; and it controls the mopping robot to clean the mop to be cleaned according to the cleaning parameters corresponding to the dirt detection result. The embodiment of the application can thus first judge whether the mop is dirty and, if so, perform dirt detection, so that the cleaning parameters required for this wash can be estimated from the dirt detection result and the mopping robot can be accurately controlled to clean the mop with the estimated parameters, making the cleaning parameters (such as the cleaning duration) as close as possible to the parameters actually required for this wash. This overcomes the prior-art technical defects that, under a fixed cleaning duration, a relatively clean mop is still washed after it is already clean, which wastes water, while a relatively dirty mop is not washed thoroughly, and improves the control accuracy of controlling the mopping robot to clean the mop.
Example two
Further, referring to fig. 2, based on the first embodiment, in another embodiment of the artificial-intelligence-based mopping robot control method of the present application, before the step of extracting the dirt type feature in the mop image according to the dirt type feature extraction model, the method further includes:
step D10, acquiring a training sample and a dirty type identifier corresponding to the training sample, and extracting a positive example mop image sample and a corresponding negative example mop image sample corresponding to the training sample from oil-contaminated mop image data and non-oil-contaminated mop image data based on the dirty type identifier;
step D20, extracting the features of the training sample based on the feature extraction model of the type of the dirt to be trained, and obtaining the training dirt type features corresponding to the training sample;
step D30, respectively performing feature extraction on the positive example mop image sample and the negative example mop image sample based on a to-be-trained dirt type feature extraction model to obtain a positive example dirt degree feature corresponding to the positive example mop image sample and a negative example dirt degree feature corresponding to the negative example mop image sample;
step D40, constructing a contrast learning loss based on the degree of difference between the training stain type feature and the positive example stain degree feature and the degree of difference between the training stain type feature and the negative example stain degree feature;
and D50, optimizing the to-be-trained dirt type feature extraction model based on the comparison learning loss to obtain the dirt type feature extraction model.
In this embodiment, it should be noted that the dirt type identifier is a sample label identifying the dirt type of the training sample; for example, a sample label of 1 may identify an oil-stained mop and a sample label of 0 a non-oil-stained mop. The oil-stained mop image data comprises at least one oil-stained mop image, and the non-oil-stained mop image data comprises at least one non-oil-stained mop image.
As an example, steps D10 to D50 include: judging from the dirt type identifier whether the training sample is an oil-stained mop image; if so, selecting the positive example mop image sample for the training sample from the oil-stained mop image data and the negative example mop image sample from the non-oil-stained mop image data; if not, selecting the negative example mop image sample from the oil-stained mop image data and the positive example mop image sample from the non-oil-stained mop image data; performing feature extraction on the training sample, the positive example mop image sample and the negative example mop image sample respectively to obtain the training dirt type feature, the positive example dirt type feature and the negative example dirt type feature; constructing a contrastive learning loss through a preset contrastive learning loss calculation formula based on the difference between the training dirt type feature and the positive example dirt type feature and the difference between the training dirt type feature and the negative example dirt type features; and judging whether the contrastive learning loss has converged: if so, taking the to-be-trained dirt type feature extraction model as the dirt type feature extraction model; if not, updating the to-be-trained dirt type feature extraction model according to the model gradient calculated from the contrastive learning loss, and returning to the step of acquiring a training sample and the dirt type identifier corresponding to the training sample, until the calculated contrastive learning loss converges.
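The label-driven positive/negative selection in step D10 can be sketched as follows; the pool structure and the sampling details are illustrative assumptions.

```python
# Sketch of step D10: an oil-stain training sample (label 1) takes its
# positive example from the oil-stain pool and its negatives from the
# non-oil-stain pool, and vice versa for label 0.
import random

def select_contrastive_samples(dirt_type_id, oil_pool, non_oil_pool,
                               num_negatives=4, rng=random):
    """dirt_type_id: 1 = oil-stained sample, 0 = non-oil-stained sample."""
    pos_pool, neg_pool = ((oil_pool, non_oil_pool) if dirt_type_id == 1
                          else (non_oil_pool, oil_pool))
    positive = rng.choice(pos_pool)
    negatives = [rng.choice(neg_pool) for _ in range(num_negatives)]
    return positive, negatives
```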
As an example, the preset contrastive learning loss calculation formula is as follows:
$$L = -\log \frac{\exp\big(\mathrm{sim}(u_A, u_B)\big)}{\exp\big(\mathrm{sim}(u_A, u_B)\big) + \sum_{i=1}^{M} \exp\big(\mathrm{sim}(u_A, u_B^{i})\big)}$$
where L is the contrastive learning loss, u_A is the training dirt type feature, u_B is the positive example dirt type feature, u_B^i is the i-th negative example dirt type feature, and M is the number of negative example features. When the distance between the positive example feature and the training feature is small enough and the distance between each negative example feature and the training feature is large enough, the contrastive learning loss converges. A dirt type feature extraction model updated on this loss acquires the ability to pull the dirt type feature it generates toward positive example features and push it away from negative example features, so that the model generates different dirt type features for samples of different sample categories (positive or negative example). The generated dirt type feature therefore carries sample category information, which increases the information content of the features produced by feature extraction, provides more decision basis for dirt type detection, and improves the accuracy of dirt type detection.
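A numerical sketch of a contrastive loss of this form, consistent with the convergence behavior described above (the loss shrinks when the training feature is close to the positive example and far from the negatives); the cosine similarity and the temperature are illustrative choices, not details taken from the patent.

```python
# InfoNCE-style contrastive loss sketch: softmax cross-entropy with the
# positive example at index 0. Cosine similarity and temperature are
# illustrative assumptions.
import numpy as np

def contrastive_loss(u_a, u_b_pos, u_b_negs, temperature=0.1):
    def sim(x, y):  # cosine similarity between two feature vectors
        return float(np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y)))

    logits = np.array([sim(u_a, u_b_pos) / temperature] +
                      [sim(u_a, n) / temperature for n in u_b_negs])
    # -log softmax probability of the positive, via a stable log-sum-exp.
    m = logits.max()
    return float(-logits[0] + np.log(np.exp(logits - m).sum()) + m)
```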
EXAMPLE III
The embodiment of the present application further provides a control device of a floor mopping robot based on artificial intelligence, the control device of a floor mopping robot based on artificial intelligence includes:
the dirt distinguishing module is used for acquiring a mop image corresponding to the mop to be cleaned, and distinguishing dirt of the mop to be cleaned according to the mop image to obtain a dirt distinguishing result;
the dirt detection module is used for carrying out dirt detection on the mop to be cleaned according to the image of the mop if the dirt judgment result is that the mop is dirty, so as to obtain a dirt detection result;
and the mop cleaning control module is used for controlling the mopping robot to perform mop cleaning on the mop to be cleaned according to the dirt detection result.
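The cooperation of the three modules can be sketched as a simple control flow. The function names below are illustrative placeholders, not the patent's API:

```python
def control_mop_cleaning(mop_image, judge_dirt, detect_dirt, clean_mop):
    """Dirt judging -> dirt detection -> mop cleaning control.
    Returns None when the mop is judged clean and no wash is triggered."""
    if not judge_dirt(mop_image):              # dirt judging module
        return None
    detection_result = detect_dirt(mop_image)  # dirt detection module
    return clean_mop(detection_result)         # mop cleaning control module
```

The early return captures the gating role of the dirt judging module: the (costlier) detection and cleaning steps run only for mops judged dirty.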
Optionally, the contamination detection module is further configured to:
extracting the dirt degree characteristic in the mop image according to the dirt degree characteristic extraction model; carrying out dirt degree detection on the mop to be cleaned according to the dirt degree characteristic; and/or
Extracting the dirt type characteristics in the mop image according to a dirt type characteristic extraction model; and carrying out dirt type detection on the mop to be cleaned according to the dirt type characteristics.
Optionally, the mop image comprises a mop mopping area image and a mop non-mopping area image, the dirt degree feature extraction model comprises a first feature extractor and a second feature extractor, and the dirt detection module is further configured to:
extracting color distribution characteristics of the mop mopping area image according to the first characteristic extractor;
extracting the brightness distribution characteristics of the image of the non-mopping area of the mop cloth according to the second characteristic extractor;
and generating the dirt degree characteristic according to the color distribution characteristic and the brightness distribution characteristic.
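A hand-crafted stand-in for the two extractors might look like this. The patent's extractors are (learned) model components; the histogram-based color and brightness statistics below are only an assumption used to make the two-part feature concrete.

```python
import numpy as np

def color_distribution(mop_area_rgb, bins=8):
    """Per-channel color histogram of the mopping-area image (first extractor stand-in)."""
    return np.concatenate([
        np.histogram(mop_area_rgb[..., c], bins=bins, range=(0, 256))[0]
        for c in range(3)
    ]).astype(float)

def brightness_distribution(non_mop_area_gray, bins=8):
    """Brightness histogram of the non-mopping-area image (second extractor stand-in)."""
    return np.histogram(non_mop_area_gray, bins=bins, range=(0, 256))[0].astype(float)

def dirt_degree_feature(mop_area_rgb, non_mop_area_gray):
    """Concatenate the two distributions into one dirt degree feature."""
    return np.concatenate([color_distribution(mop_area_rgb),
                           brightness_distribution(non_mop_area_gray)])
```

The non-mopping area serves as a brightness reference, so changes in ambient lighting affect both parts of the feature and can be discounted downstream.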
Optionally, the stain type feature extraction model includes a third feature extractor and a fourth feature extractor, and the stain detection module is further configured to:
extracting color distribution characteristics of the mop mopping area image according to the third characteristic extractor;
according to the fourth feature extractor, extracting the mop reflection degree feature of the mop mopping area image;
and generating the dirt type characteristic according to the color distribution characteristic and the mop reflection degree characteristic.
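The mop reflection degree feature plausibly exploits the tendency of oily residue to produce specular highlights; a crude illustrative proxy (an assumption, not the patent's learned fourth extractor) is the fraction of near-saturated pixels in the mopping-area image:

```python
import numpy as np

def reflection_degree(mop_area_gray, threshold=240):
    """Fraction of near-saturated pixels as a rough specular-reflection proxy
    (fourth extractor stand-in; the threshold of 240 is an arbitrary choice)."""
    return float((np.asarray(mop_area_gray) >= threshold).mean())
```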
Optionally, the dirt detection result comprises a detected dirt degree and a detected dirt type, the cleaning parameter comprises a target cleaning duration, and the mop cleaning control module is further configured to:
determining the target cleaning time corresponding to the mop to be cleaned according to the detected dirt type and the detected dirt degree;
and controlling the mopping robot to perform mop cleaning on the mop to be cleaned according to the target cleaning time length.
Optionally, the dirt detection result includes a detected dirt degree and a detected dirt type, the cleaning parameter includes a target cleaning duration and a target cleaning agent type, and the mop cleaning control module is further configured to:
determining the target cleaning time corresponding to the mop to be cleaned according to the detected dirt type and the detected dirt degree;
determining a target cleaning agent type corresponding to the detected dirt type according to the correspondence between dirt types and cleaning agent types;
and controlling the mopping robot to perform mop cleaning on the mopping cloth to be cleaned according to the target cleaning time and the type of the target cleaning agent.
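The mapping from detection results to cleaning parameters can be sketched with simple lookup tables. Every concrete duration and agent below is an invented placeholder; the patent only specifies that such correspondences exist.

```python
# Hypothetical lookup tables; the actual correspondences are device-specific.
TARGET_CLEANING_MINUTES = {
    ("oil stain", "heavy"): 12, ("oil stain", "light"): 6,
    ("dust", "heavy"): 8, ("dust", "light"): 3,
}
TARGET_CLEANING_AGENT = {"oil stain": "degreasing agent", "dust": "neutral detergent"}

def cleaning_parameters(detected_dirt_type, detected_dirt_degree):
    """Target cleaning duration from (type, degree); agent type from type alone."""
    minutes = TARGET_CLEANING_MINUTES[(detected_dirt_type, detected_dirt_degree)]
    agent = TARGET_CLEANING_AGENT[detected_dirt_type]
    return minutes, agent
```

Note the asymmetry described above: the duration depends on both the detected dirt type and degree, while the cleaning agent depends on the dirt type only.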
Optionally, the floor mopping robot control device based on artificial intelligence is further configured to:
acquiring a training sample and a dirty type identifier corresponding to the training sample, and extracting a positive example mop image sample and a corresponding negative example mop image sample corresponding to the training sample from oil-contaminated mop image data and non-oil-contaminated mop image data based on the dirty type identifier;
performing feature extraction on the training sample based on a to-be-trained dirt type feature extraction model to obtain training dirt type features corresponding to the training sample;
respectively performing feature extraction on the positive example mop image sample and the negative example mop image sample based on a to-be-trained dirt type feature extraction model to obtain a positive example dirt degree feature corresponding to the positive example mop image sample and a negative example dirt degree feature corresponding to the negative example mop image sample;
constructing a contrast learning loss based on the degree of difference between the training stain type feature and the positive case stain degree feature and the degree of difference between the training stain type feature and the negative case stain degree feature;
and optimizing the to-be-trained dirt type feature extraction model based on the comparison learning loss to obtain the dirt type feature extraction model.
The artificial-intelligence-based floor mopping robot control device provided by the present application adopts the artificial-intelligence-based floor mopping robot control method of the above embodiment, and solves the technical problem of low control accuracy in controlling the floor mopping robot to clean the mop. Compared with the prior art, the beneficial effects of the floor mopping robot control device provided by this embodiment of the present application are the same as those of the artificial-intelligence-based floor mopping robot control method provided by the above embodiment, and the other technical features of the device are the same as those disclosed in the method embodiment, which are not repeated herein.
EXAMPLE IV
An embodiment of the present application provides an electronic device, and the electronic device includes: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to execute the artificial-intelligence-based floor mopping robot control method of the first embodiment.
Referring now to FIG. 3, shown is a schematic diagram of an electronic device suitable for use in implementing embodiments of the present disclosure. The electronic devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., car navigation terminals), and the like, and fixed terminals such as digital TVs, desktop computers, and the like. The electronic device shown in fig. 3 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 3, the electronic device may include a processing apparatus (e.g., a central processing unit, a graphics processor, etc.) that may perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) or a program loaded from a storage apparatus into a random access memory (RAM). The RAM also stores various programs and data necessary for the operation of the electronic device. The processing apparatus, the ROM, and the RAM are connected to one another via a bus. An input/output (I/O) interface is also connected to the bus.
Generally, the following systems may be connected to the I/O interface: input devices including, for example, touch screens, touch pads, keyboards, mice, image sensors, microphones, accelerometers, gyroscopes, and the like; output devices including, for example, Liquid Crystal Displays (LCDs), speakers, vibrators, and the like; storage devices including, for example, magnetic tape, hard disk, etc.; and a communication device. The communication means may allow the electronic device to communicate wirelessly or by wire with other devices to exchange data. While the figures illustrate an electronic device with various systems, it is to be understood that not all illustrated systems are required to be implemented or provided. More or fewer systems may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means, or installed from a storage means, or installed from a ROM. The computer program, when executed by a processing device, performs the above-described functions defined in the methods of the embodiments of the present disclosure.
The electronic equipment provided by the application adopts the control method of the floor mopping robot based on artificial intelligence in the embodiment, and solves the technical problem of low control accuracy of controlling the floor mopping robot to clean the mop. Compared with the prior art, the beneficial effects of the electronic device provided by the embodiment of the application are the same as the beneficial effects of the control method of the floor mopping robot based on artificial intelligence provided by the embodiment, and other technical features of the electronic device are the same as those disclosed by the embodiment method, which are not repeated herein.
It should be understood that portions of the present disclosure may be implemented in hardware, software, firmware, or a combination thereof. In the foregoing description of embodiments, the particular features, structures, materials, or characteristics may be combined in any suitable manner in any one or more embodiments or examples.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
EXAMPLE V
The present embodiment provides a computer-readable storage medium having computer-readable program instructions stored thereon for performing the method for controlling a floor-mopping robot based on artificial intelligence in the first embodiment.
The computer readable storage medium provided by the embodiments of the present application may be, for example, a USB flash disk, but is not limited thereto; it may be any electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In this embodiment, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable storage medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
The computer-readable storage medium may be embodied in an electronic device; or may be present alone without being incorporated into the electronic device.
The computer readable storage medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquire a mop image corresponding to a mop to be cleaned, and perform dirt judgment on the mop to be cleaned according to the mop image to obtain a dirt judgment result; if the dirt judgment result is that the mop is dirty, perform dirt detection on the mop to be cleaned according to the mop image to obtain a dirt detection result; and control the mopping robot to perform mop cleaning on the mop to be cleaned according to the cleaning parameters corresponding to the dirt detection result.
Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including object oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules described in the embodiments of the present disclosure may be implemented by software or hardware. The names of the modules do not, in some cases, constitute a limitation on the modules themselves.
The computer readable storage medium provided by the application stores computer readable program instructions for executing the above-mentioned control method of the floor mopping robot based on artificial intelligence, and solves the technical problem of low control accuracy for controlling the floor mopping robot to clean the mopping cloth. Compared with the prior art, the beneficial effects of the computer-readable storage medium provided by the embodiment of the application are the same as the beneficial effects of the control method of the floor-mopping robot based on artificial intelligence provided by the embodiment, and are not repeated herein.
EXAMPLE VI
The present application also provides a computer program product comprising a computer program which, when executed by a processor, performs the steps of the artificial intelligence based floor-mopping robot control method as described above.
The computer program product solves the technical problem that the control accuracy for controlling the mopping robot to clean the mopping cloth is low. Compared with the prior art, the beneficial effects of the computer program product provided by the embodiment of the application are the same as the beneficial effects of the control method of the floor-mopping robot based on artificial intelligence provided by the embodiment, and are not repeated herein.
The above description is only a preferred embodiment of the present application, and not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings, or which are directly or indirectly applied to other related technical fields, are included in the scope of the present application.

Claims (10)

1. A floor mopping robot control method based on artificial intelligence is characterized by comprising the following steps:
acquiring a mop image corresponding to a mop to be cleaned, and judging the dirt of the mop to be cleaned according to the mop image to obtain a dirt judgment result;
if the dirt judgment result is that the mop is dirty, performing dirt detection on the mop to be cleaned according to the mop image to obtain a dirt detection result;
and controlling the mopping robot to perform mop cleaning on the mop to be cleaned according to the cleaning parameters corresponding to the dirt detection result.
2. The artificial intelligence based mopping robot controlling method according to claim 1, wherein the soil detection of the mops to be cleaned according to the mop image comprises:
extracting the dirt degree characteristic in the mop image according to the dirt degree characteristic extraction model; carrying out dirt degree detection on the mop to be cleaned according to the dirt degree characteristic; and/or
Extracting the dirt type characteristics in the mop image according to a dirt type characteristic extraction model; and carrying out dirt type detection on the mop to be cleaned according to the dirt type characteristics.
3. The artificial intelligence based mopping robot control method according to claim 2, wherein the mop cloth image comprises a mop cloth mopping area image and a mop cloth non-mopping area image, the dirty degree feature extraction model comprises a first feature extractor and a second feature extractor, and the extracting the dirty degree feature in the mop cloth image according to the dirty degree feature extraction model comprises:
extracting color distribution characteristics of the mop mopping area image according to the first characteristic extractor;
extracting the brightness distribution characteristics of the image of the non-mopping area of the mop cloth according to the second characteristic extractor;
and generating the dirt degree characteristic according to the color distribution characteristic and the brightness distribution characteristic.
4. The artificial intelligence based mopping robot control method of claim 2, wherein the soil type feature extraction model includes a third feature extractor and a fourth feature extractor,
the method for extracting the stain type characteristics in the mop image according to the stain type characteristic extraction model comprises the following steps:
extracting color distribution characteristics of the mop mopping area image according to the third characteristic extractor;
according to the fourth feature extractor, extracting the mop reflection degree feature of the mop mopping area image;
and generating the dirt type characteristic according to the color distribution characteristic and the mop reflection degree characteristic.
5. The artificial-intelligence-based floor mopping robot control method according to claim 1, wherein the dirt detection result comprises a detected dirt degree and a detected dirt type, the cleaning parameters comprise a target cleaning duration, and the controlling the floor mopping robot to perform mop cleaning on the mop to be cleaned according to the cleaning parameters corresponding to the dirt detection result comprises:
determining the target cleaning time corresponding to the mop to be cleaned according to the detected dirt type and the detected dirt degree;
and controlling the mopping robot to perform mop cleaning on the mop to be cleaned according to the target cleaning time length.
6. The artificial-intelligence-based floor mopping robot control method according to claim 1, wherein the dirt detection result comprises a detected dirt degree and a detected dirt type, the cleaning parameters comprise a target cleaning duration and a target cleaning agent type, and the controlling the floor mopping robot to perform mop cleaning on the mop to be cleaned according to the cleaning parameters corresponding to the dirt detection result comprises:
determining the target cleaning time corresponding to the mop to be cleaned according to the detected dirt type and the detected dirt degree;
determining a target cleaning agent type corresponding to the detected dirt type according to the correspondence between dirt types and cleaning agent types;
and controlling the mopping robot to mop and clean the mop to be cleaned according to the target cleaning time and the type of the target cleaning agent.
7. The artificial intelligence-based floor mopping robot control method according to claim 2, wherein before the step of extracting the stain type feature in the mop image according to the stain type feature extraction model, the artificial intelligence-based floor mopping robot control method further comprises:
acquiring a training sample and a dirty type identifier corresponding to the training sample, and extracting a positive example mop image sample and a corresponding negative example mop image sample corresponding to the training sample from oil-contaminated mop image data and non-oil-contaminated mop image data based on the dirty type identifier;
performing feature extraction on the training sample based on a to-be-trained dirt type feature extraction model to obtain training dirt type features corresponding to the training sample;
respectively performing feature extraction on the positive example mop image sample and the negative example mop image sample based on a to-be-trained dirt type feature extraction model to obtain a positive example dirt degree feature corresponding to the positive example mop image sample and a negative example dirt degree feature corresponding to the negative example mop image sample;
constructing a contrast learning loss based on the degree of difference between the training stain type feature and the positive case stain degree feature and the degree of difference between the training stain type feature and the negative case stain degree feature;
and optimizing the dirty type feature extraction model to be trained based on the comparison learning loss to obtain the dirty type feature extraction model.
8. An artificial-intelligence-based floor mopping robot control device, characterized in that the artificial-intelligence-based floor mopping robot control device comprises:
the dirt distinguishing module is used for acquiring a mop image corresponding to the mop to be cleaned, and distinguishing dirt of the mop to be cleaned according to the mop image to obtain a dirt distinguishing result;
the dirt detection module is used for carrying out dirt detection on the mop to be cleaned according to the image of the mop if the dirt judgment result is that the mop is dirty, so as to obtain a dirt detection result;
and the mop cleaning control module is used for controlling the mopping robot to perform mop cleaning on the mop to be cleaned according to the dirt detection result.
9. An electronic device, characterized in that the electronic device comprises:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the steps of the artificial intelligence based mopping robot control method of any one of claims 1-7.
10. A computer-readable storage medium, wherein a program for implementing the artificial intelligence-based mopping robot control method is stored on the computer-readable storage medium, and the program for implementing the artificial intelligence-based mopping robot control method is executed by a processor to implement the steps of the artificial intelligence-based mopping robot control method as recited in any one of claims 1 to 7.
CN202210386861.2A 2022-04-14 2022-04-14 Mopping robot control method, device, equipment and medium based on artificial intelligence Active CN114468866B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210386861.2A CN114468866B (en) 2022-04-14 2022-04-14 Mopping robot control method, device, equipment and medium based on artificial intelligence

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210386861.2A CN114468866B (en) 2022-04-14 2022-04-14 Mopping robot control method, device, equipment and medium based on artificial intelligence

Publications (2)

Publication Number Publication Date
CN114468866A true CN114468866A (en) 2022-05-13
CN114468866B CN114468866B (en) 2022-07-15

Family

ID=81487887

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210386861.2A Active CN114468866B (en) 2022-04-14 2022-04-14 Mopping robot control method, device, equipment and medium based on artificial intelligence

Country Status (1)

Country Link
CN (1) CN114468866B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114983283A (en) * 2022-07-04 2022-09-02 麦岩智能科技(北京)有限公司 Self-cleaning method of intelligent cleaning robot
CN115429162A (en) * 2022-07-27 2022-12-06 云鲸智能(深圳)有限公司 Cleaning method, control device, base station, cleaning system and storage medium for mopping piece

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110236455A (en) * 2019-01-08 2019-09-17 云鲸智能科技(东莞)有限公司 Control method, device, equipment and the storage medium of floor-mopping robot
CN111246204A (en) * 2020-03-24 2020-06-05 昆山丘钛微电子科技有限公司 Relative brightness deviation-based dirt detection method and device
CN111358342A (en) * 2020-02-17 2020-07-03 添可智能科技有限公司 Self-cleaning control method of cleaning equipment, cleaning equipment and storage medium
CN112734766A (en) * 2020-12-14 2021-04-30 王富才 Cleaning parameter adjusting method and system of photovoltaic cleaning robot based on artificial intelligence
CN112890683A (en) * 2021-01-13 2021-06-04 美智纵横科技有限责任公司 Cleaning method, device, equipment and computer readable storage medium
CN113017506A (en) * 2021-03-25 2021-06-25 深圳市银星智能科技股份有限公司 Mop cleaning method and maintenance station for a cleaning robot
CN113273933A (en) * 2021-05-24 2021-08-20 美智纵横科技有限责任公司 Cleaning robot, control method and device thereof, and storage medium
CN113963290A (en) * 2021-09-24 2022-01-21 深圳市九洲电器有限公司 Video target detection method, device, equipment and readable storage medium

Also Published As

Publication number Publication date
CN114468866B (en) 2022-07-15


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant