CN112749753A - Electric equipment control method and device, electric equipment and storage medium - Google Patents

Electric equipment control method and device, electric equipment and storage medium

Info

Publication number
CN112749753A
Authority
CN
China
Prior art keywords
image
detected
pollutant
features
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110061859.3A
Other languages
Chinese (zh)
Other versions
CN112749753B (en)
Inventor
宋士奇
汪进
李保水
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gree Electric Appliances Inc of Zhuhai
Original Assignee
Gree Electric Appliances Inc of Zhuhai
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Gree Electric Appliances Inc of Zhuhai filed Critical Gree Electric Appliances Inc of Zhuhai
Priority to CN202110061859.3A
Publication of CN112749753A
Application granted
Publication of CN112749753B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B08 CLEANING
    • B08B CLEANING IN GENERAL; PREVENTION OF FOULING IN GENERAL
    • B08B13/00 Accessories or details of general applicability for machines or apparatus for cleaning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G06F18/253 Fusion techniques of extracted features
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B3/00 Audible signalling systems; Audible personal calling systems
    • G08B3/10 Audible signalling systems; Audible personal calling systems using electric transmission; using electromagnetic transmission
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07 Target detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The present application relates to an electrical equipment control method and apparatus, an electrical equipment, and a storage medium. The method comprises the following steps: acquiring an image to be detected that is captured of a region to be detected of the electrical equipment; performing feature extraction on the image to be detected to obtain image features of the image to be detected, and performing target detection based on the image features to obtain a pollutant detection result for the region to be detected; and when the pollutant detection result satisfies a preset condition, issuing prompt information prompting that the electrical equipment should be cleaned. With this method, pollutants inside the electrical equipment can be detected automatically, and the user can be intelligently prompted to perform cleaning.

Description

Electric equipment control method and device, electric equipment and storage medium
Technical Field
The present application relates to the field of intelligent control technologies, and in particular, to an electrical device control method, an electrical device control apparatus, an electrical device, and a storage medium.
Background
After electrical equipment has been in use for a period of time, pollutants such as dust adhere to it, and if it is not cleaned its performance suffers. Take the air conditioner, a common cooling and heating appliance, as an example: during operation, dust, foreign particles, and other pollutants in the external environment enter the unit with the airflow, so dirt easily accumulates in the indoor unit. This reduces the cleanliness of the blown air and threatens the user's health, so the indoor unit needs to be cleaned regularly.
At present, the indoor unit is cleaned manually, and the user must also judge when to clean it, which is often difficult. This causes the following problems: cleaning too frequently wastes time and labor, while cleaning only after long intervals makes it hard to ensure cleanliness.
Disclosure of Invention
In view of the above, it is necessary to provide an electrical equipment control method, an electrical equipment control apparatus, an electrical equipment, and a storage medium that can give a cleaning prompt.
An electrical equipment control method, the method comprising:
acquiring an image to be detected that is captured of a region to be detected of electrical equipment;
performing feature extraction on the image to be detected to obtain image features of the image to be detected, and performing target detection based on the image features to obtain a pollutant detection result for the region to be detected; and
issuing, when the pollutant detection result satisfies a preset condition, prompt information prompting that the electrical equipment should be cleaned.
An electrical equipment control apparatus, the apparatus comprising:
an acquisition module configured to acquire an image to be detected that is captured of a region to be detected of electrical equipment;
a detection module configured to perform feature extraction on the image to be detected to obtain image features of the image to be detected, and to perform target detection based on the image features to obtain a pollutant detection result for the region to be detected; and
a control module configured to issue, when the pollutant detection result satisfies a preset condition, prompt information prompting that the electrical equipment should be cleaned.
An electrical equipment comprising a memory and a processor, the memory storing a computer program, wherein the processor, when executing the computer program, implements the following steps:
acquiring an image to be detected that is captured of a region to be detected of the electrical equipment;
performing feature extraction on the image to be detected to obtain image features of the image to be detected, and performing target detection based on the image features to obtain a pollutant detection result for the region to be detected; and
issuing, when the pollutant detection result satisfies a preset condition, prompt information prompting that the electrical equipment should be cleaned.
A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the following steps:
acquiring an image to be detected that is captured of a region to be detected of electrical equipment;
performing feature extraction on the image to be detected to obtain image features of the image to be detected, and performing target detection based on the image features to obtain a pollutant detection result for the region to be detected; and
issuing, when the pollutant detection result satisfies a preset condition, prompt information prompting that the electrical equipment should be cleaned.
According to the above electrical equipment control method and apparatus, electrical equipment, and storage medium, an image to be detected captured of a region to be detected of the electrical equipment is acquired; feature extraction is performed on the image to obtain its image features; target detection is performed based on the image features to obtain a pollutant detection result for the region to be detected; and when the pollutant detection result satisfies a preset condition, prompt information prompting that the electrical equipment should be cleaned is issued. In this way, pollutants inside the electrical equipment can be detected automatically, the cleaning opportunity can be judged from the pollutant detection result, and when cleaning is judged to be needed the user is intelligently prompted to clean. The user does not need to judge the cleaning opportunity, which reduces labor.
Drawings
FIG. 1 is a diagram illustrating an application environment of a control method of an electric device according to an embodiment;
FIG. 2 is a schematic flow chart illustrating a method for controlling an electrical device according to an embodiment;
FIG. 3 is a schematic flow chart diagram illustrating a method for training a target detection model in one embodiment;
FIG. 4 is a diagram illustrating the structure of an object detection model in one embodiment;
FIG. 5 is a block diagram showing the structure of an electrical equipment control device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The electrical equipment control method provided by the present application can be applied to the electrical equipment 100 shown in FIG. 1. The electrical equipment 100 comprises a control unit 101, an image acquisition unit 102, and a voice unit 103, where the image acquisition unit 102 and the voice unit 103 are each communicatively connected to the control unit 101. The image acquisition unit 102 captures an image of the region to be detected of the electrical equipment; the control unit 101 acquires the captured image, detects pollutants in it, and, when the detection result satisfies a preset condition, sends a control instruction to the voice unit 103 so that the voice unit 103 issues a cleaning prompt. The image acquisition unit 102 may be a camera, and the voice unit 103 may be a speaker.
In one embodiment, the electrical equipment 100 may further include a WIFI communication unit 104 communicatively connected to the control unit 101. The WIFI communication unit 104 is also communicatively connected to an Internet of Things (IOT) server 105, and a user terminal 106 communicates with the IOT server 105 over a network. When the detection result satisfies the preset condition, the control unit 101 may additionally send a control instruction to the WIFI communication unit 104, so that the WIFI communication unit 104 sends the cleaning prompt information to the user terminal 106 through the IOT server 105, and cleaning-service-related information is pushed to the user via the user terminal 106.
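As an illustration of the push path just described (control unit to WIFI communication unit to IOT server to user terminal), the following Python sketch posts a cleaning prompt to an IoT server so that it can be relayed to the bound user terminal. The endpoint URL, payload fields, and use of HTTP are assumptions for illustration; the patent does not specify the protocol or message format.

# Hypothetical sketch of pushing a cleaning prompt through the IoT server.
# The endpoint URL and payload fields are assumptions, not details from the patent.
import requests

IOT_SERVER_URL = "https://iot.example.com/api/notify"   # assumed endpoint

def push_cleaning_prompt(device_id: str, pollutant_count: int) -> None:
    """Send the cleaning prompt; the IoT server relays it to the associated user terminal."""
    payload = {
        "device_id": device_id,
        "event": "cleaning_prompt",
        "pollutant_count": pollutant_count,
        "message": "Pollutants detected in the indoor unit; cleaning is recommended.",
    }
    response = requests.post(IOT_SERVER_URL, json=payload, timeout=5)
    response.raise_for_status()   # surface transport errors to the caller

push_cleaning_prompt("ac-living-room", pollutant_count=12)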
In one embodiment, as shown in FIG. 2, an electrical equipment control method is provided. The method is described here as applied to an electrical equipment and includes the following steps S202 to S206.
S202: acquiring an image to be detected that is captured of the region to be detected of the electrical equipment.
The electrical equipment in the present application may be, but is not limited to, an air conditioner, a vacuum cleaner, or other equipment prone to dust deposition. Taking an air conditioner as an example, the region to be detected may be the interior of the air conditioner indoor unit, and the image to be detected is an image of that region. Specifically, a camera may be installed on the inner side of the indoor unit to capture images of the region to be detected.
S204: performing feature extraction on the image to be detected to obtain the image features of the image to be detected, and performing target detection based on the image features to obtain a pollutant detection result for the region to be detected.
It can be understood that when pollutants such as dust appear in the region to be detected, the corresponding image to be detected contains pollutant information. Performing target detection on the image to be detected identifies the pollutants in it, and the recognition result is taken as the pollutant detection result for the region to be detected.
In one embodiment, a target detection model may be established in advance for detecting pollutants in the image, where the pollutants may include, but are not limited to, dust, dirt, dust particles, airborne matter, and other types. Specifically, the image to be detected is input into the target detection model; the model extracts the image features of the image to be detected, performs mapping based on the image features, and obtains the position information of each detected pollutant in the image to be detected.
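For illustration only, the following sketch shows the detection step treated as a black box that maps a captured image to bounding boxes with confidence scores. It uses an off-the-shelf torchvision detector as a stand-in; the patent's own model (described below with FIG. 4) is not this network, and the 0.5 score threshold is an assumption.

# Stand-in inference sketch: image in, per-object boxes and scores out.
# torchvision's Faster R-CNN is used only as a placeholder for the patent's model.
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

model = fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

image = torch.rand(3, 256, 256)              # placeholder for the captured image
with torch.no_grad():
    out = model([image])[0]                  # dict with "boxes", "labels", "scores"
keep = out["scores"] >= 0.5                  # assumed confidence threshold
boxes = out["boxes"][keep]                   # positions of detected objects in the image
print(f"{len(boxes)} objects detected")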
S206: when the pollutant detection result satisfies the preset condition, issuing prompt information prompting that the electrical equipment should be cleaned.
The pollutant detection result satisfying the preset condition can be understood as the pollutants in the region to be detected having reached a level that requires cleaning, i.e., the cleaning opportunity, so the electrical equipment issues a cleaning prompt. The preset condition may be set according to the actual situation and is not limited here.
In one embodiment, the prompt information may be voice prompt information; specifically, a voice prompt may be issued directly through a voice unit (such as a speaker) of the electrical equipment to prompt the user to clean it. In other embodiments, the prompt information may instead be sent through the IOT server to a user terminal on which an associated application (APP) is installed, to prompt the user to clean the electrical equipment.
According to the above electrical equipment control method, an image to be detected captured of the region to be detected of the electrical equipment is acquired; feature extraction is performed on the image to obtain its image features; target detection is performed based on the image features to obtain a pollutant detection result for the region to be detected; and when the pollutant detection result satisfies a preset condition, prompt information prompting that the electrical equipment should be cleaned is issued. In this way, pollutants inside the electrical equipment can be detected automatically, the cleaning opportunity can be judged from the pollutant detection result, and when cleaning is judged to be needed the user is intelligently prompted to clean, without having to judge the cleaning opportunity by himself or herself, which reduces labor.
In one embodiment, when the pollutant detection result satisfies the preset condition, the method further includes: sending cleaning-service-related information to a terminal associated with the electrical equipment.
The cleaning-service-related information may include, but is not limited to, cleaning operation guidance and cleaning reservation service information, to guide the user to clean the electrical equipment better or to provide a professional on-site cleaning service.
In one embodiment, the pollutant detection result includes a pollutant count, and when the pollutant count reaches a count threshold, the pollutant detection result is judged to satisfy the preset condition.
The pollutant count is the number of targets detected in the image to be detected. Specifically, by performing target detection on the image to be detected, the position information of each target in the image can be obtained; the position information represents where each target is located in the image, and the number of target positions is taken as the number of targets. The pollutant count reaching the count threshold indicates that there are enough pollutants to require cleaning. The count threshold may be set according to actual requirements and is not limited here.
In other embodiments, when the pollutant density in the region to be detected reaches a density threshold, the pollutant detection result is judged to satisfy the preset condition. The pollutant density can be determined as the ratio of the pollutant count to the area of the region to be detected; when it reaches the density threshold, the pollutant density is high enough to require cleaning. The density threshold may be set according to actual requirements and is not limited here.
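A minimal sketch of the two decision rules just described, assuming concrete threshold values; both the values and the function name are illustrative only.

# Preset-condition check: the count reaches a count threshold, or the density
# (count divided by the area of the region) reaches a density threshold.
# The threshold values below are assumptions.
def needs_cleaning(pollutant_count: int,
                   region_area_cm2: float,
                   count_threshold: int = 20,
                   density_threshold: float = 0.05) -> bool:
    """Return True when the pollutant detection result satisfies the preset condition."""
    if pollutant_count >= count_threshold:
        return True
    density = pollutant_count / region_area_cm2      # pollutants per square centimetre
    return density >= density_threshold

if needs_cleaning(pollutant_count=12, region_area_cm2=150.0):
    print("Issue cleaning prompt")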
In one embodiment, the step of performing feature extraction on the image to be detected to obtain the image features of the image to be detected and performing target detection based on the image features to obtain the pollutant detection result for the region to be detected may specifically include: performing feature extraction on the image to be detected through a target detection model to obtain the image features of the image to be detected, and performing target detection based on the image features to obtain the pollutant detection result for the region to be detected.
The target detection model is a model for detecting pollutant targets in an image. In one embodiment, as shown in FIG. 3, the training method of the target detection model may include the following steps S302 to S306.
S302: acquiring a sample image captured of the region to be detected and its corresponding annotation information, where the annotation information includes: the position information and category information of each pollutant labeled in the sample image.
Each pollutant in the sample image is labeled; for a labeled pollutant, the position information represents its true position and the category information represents its true category. In one embodiment, the labeled category information may contain only one category, indicating that the labeled object is a real pollutant rather than image background. In another embodiment, the labeled category information may comprise a plurality of categories, such as dust, dirt, dust particles, and airborne matter; that is, the pollutant category is subdivided to represent the true fine-grained category of the labeled pollutant.
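As an illustration of this labeling format, the annotation of one sample image could be represented as below; the field names, file path, and category strings are assumptions, not a format defined by the patent.

# Hypothetical annotation record: each labeled pollutant carries position
# information (a bounding box) and category information.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class PollutantLabel:
    box: Tuple[float, float, float, float]   # (x, y, width, height) in image pixels
    category: str                            # e.g. "dust", "dirt", "dust_particle", "airborne_matter"

@dataclass
class SampleAnnotation:
    image_path: str
    labels: List[PollutantLabel]

sample = SampleAnnotation(
    image_path="samples/indoor_unit_0001.jpg",   # hypothetical path
    labels=[PollutantLabel(box=(34.0, 80.0, 12.0, 9.0), category="dust"),
            PollutantLabel(box=(102.0, 41.0, 20.0, 15.0), category="dirt")])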
S304: performing target detection on the sample image through the target detection model to be trained to obtain detection information corresponding to the sample image, where the detection information includes: the position information and category information of each pollutant detected from the sample image.
The sample image is input into the target detection model to be trained; the model extracts the image features of the sample image, performs mapping based on the features, and outputs the detection information corresponding to the sample image. For a detected pollutant, the position information represents its predicted position, and the category information comprises the predicted category and its probability.
S306: adjusting the parameters of the target detection model to be trained based on the annotation information and the detection information of the sample image until a training end condition is met, to obtain the target detection model.
A loss function is established based on the error between the annotation information and the detection information of the sample image, and the parameters of the target detection model are adjusted according to the value of the loss function. The training end condition may be that the value of the loss function is smaller than a preset threshold, that the accuracy on test samples meets a preset requirement, or that the number of iterations reaches a preset number. The preset threshold, the preset requirement, and the preset number may be set according to actual requirements and are not limited here.
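A schematic training loop under the stopping criteria listed above. The optimizer, learning rate, and stopping values are assumptions, and model, train_loader, and compute_loss are placeholders for the patent's model, labeled samples, and loss function.

# Schematic training loop: adjust parameters from the error between detection
# information and annotation information, and stop when the loss falls below a
# threshold or the iteration count is reached. All names are placeholders.
import torch

def train(model, train_loader, compute_loss,
          loss_threshold=0.01, max_iterations=10000, lr=1e-3):
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    iteration = 0
    while iteration < max_iterations:
        for images, annotations in train_loader:
            detections = model(images)                      # predicted positions/categories
            loss = compute_loss(detections, annotations)    # error vs. labeled ground truth
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
            iteration += 1
            if loss.item() < loss_threshold or iteration >= max_iterations:
                return model
    return model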
In the above embodiment, the target detection model is trained; after training is completed, the image to be detected is input into the trained target detection model to obtain the position information and category information of each pollutant in the image. According to this method, the pollutant condition in the region to be detected is intelligently identified through the target detection technology of computer vision, which reduces labor and brings convenience to users.
In one embodiment, the target detection model comprises a feature extraction network, a feature fusion network, and a recognition network. The step of performing feature extraction on the image to be detected through the target detection model to obtain the image features and performing target detection based on the image features to obtain the pollutant detection result for the region to be detected may specifically include: performing feature extraction on the image to be detected through the feature extraction network to obtain first features of at least two scales; performing feature extraction on the image to be detected through the feature fusion network to obtain second features of at least two scales, where the scales of the second features correspond to the scales of the first features, and fusing each first feature with the second feature of the corresponding scale to obtain a fused feature of each scale, where the image features comprise the fused features; and recognizing each fused feature through the recognition network to obtain the pollutant detection result for the region to be detected.
FIG. 4 shows a schematic structural diagram of the target detection model in one embodiment, in which the feature extraction network includes convolutional layers and six groups of residual layers, the feature fusion network includes five groups of convolutional layers, and the recognition network uses a Softmax function. Specifically, the resolution of the image to be detected is adjusted to 256 × 256 and the image is fed to the feature extraction network as the input image. The input image first passes through a first convolution (8 convolution kernels of size 3 × 3) and then a second convolution (16 convolution kernels of size 3 × 3 with a stride of two pixels), producing a 128 × 128 feature map. This 128 × 128 feature map is then processed by six groups of residual layers in sequence (the groups are executed 2, 4, … times respectively) to increase the network depth, yielding first feature maps at the scales 64 × 64, 32 × 32, 16 × 16, 8 × 8, 4 × 4, and 2 × 2. Apart from the number of convolution kernels and the resolution of the feature maps, the six residual groups have similar structures, and downsampling between groups is performed by convolution kernels of the corresponding size with a stride of two pixels, achieving the same effect as a pooling layer. In parallel, the input image passes through five groups of convolutional layers in sequence, producing second feature maps at the scales 2 × 2, 4 × 4, 8 × 8, 16 × 16, and 32 × 32. The second feature maps extracted by these convolutional layers are concatenated (cascaded) with the first feature maps of the corresponding scales extracted by the residual layers to fuse the feature information of each scale; in this process, the convolutional layers are mainly used to resample the feature maps, and the fused features are divided into five groups according to scale (2 × 2, 4 × 4, 8 × 8, 16 × 16, and 32 × 32). The Softmax function is then used to recognize the five groups of fused features of different scales, yielding the target position information and target category information.
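The topology just described can be summarized in code. The PyTorch sketch below follows only what the paragraph states (a stride-2 stem from 256 × 256 to 128 × 128, six residual stages producing first feature maps from 64 × 64 down to 2 × 2, a parallel convolutional branch producing second feature maps from 32 × 32 down to 2 × 2, concatenation of same-scale maps, and a Softmax over the per-scale predictions); the channel widths, the number of residual blocks per stage, and the prediction head layout are assumptions rather than the patent's exact network.

# A minimal PyTorch sketch of the FIG. 4 structure. Only the topology follows
# the description; channel widths, block counts, and the head are assumptions.
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1))

    def forward(self, x):
        return torch.relu(x + self.body(x))          # identity shortcut

def downsample(in_ch, out_ch):
    # stride-2 convolution used in place of a pooling layer, as in the description
    return nn.Sequential(nn.Conv2d(in_ch, out_ch, 3, stride=2, padding=1), nn.ReLU(inplace=True))

class SketchDetector(nn.Module):
    def __init__(self, num_classes=4):                # e.g. dust / dirt / dust particle / airborne matter
        super().__init__()
        # Stem: 256x256 input -> 128x128 map (8 kernels of 3x3, then 16 kernels of 3x3 with stride 2)
        self.stem = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(inplace=True),
                                  downsample(8, 16))
        # Six residual stages producing first features at scales 64, 32, 16, 8, 4, 2
        chans = [16, 32, 64, 64, 64, 64, 64]          # channel widths are assumptions
        self.stages = nn.ModuleList(
            [nn.Sequential(downsample(chans[i], chans[i + 1]), ResidualBlock(chans[i + 1]))
             for i in range(6)])
        # Parallel branch: five conv groups taking the input image to scales 32, 16, 8, 4, 2
        self.branch = nn.ModuleList(
            [nn.Sequential(downsample(3, 16), downsample(16, 16), downsample(16, 16))] +
            [downsample(16, 16) for _ in range(4)])
        # One 1x1 head per fused scale: 4 box values + per-class scores for each cell
        self.heads = nn.ModuleList([nn.Conv2d(64 + 16, 4 + num_classes, 1) for _ in range(5)])

    def forward(self, x):                             # x: (N, 3, 256, 256)
        firsts, f = [], self.stem(x)
        for stage in self.stages:
            f = stage(f)
            firsts.append(f)                          # scales 64, 32, 16, 8, 4, 2
        seconds, s = [], x
        for group in self.branch:
            s = group(s)
            seconds.append(s)                         # scales 32, 16, 8, 4, 2
        outputs = []
        for first, second, head in zip(firsts[1:], seconds, self.heads):
            fused = torch.cat([first, second], dim=1)        # cascade same-scale features
            pred = head(fused)
            boxes, logits = pred[:, :4], pred[:, 4:]
            outputs.append((boxes, logits.softmax(dim=1)))   # per-cell category probabilities
        return outputs

preds = SketchDetector()(torch.rand(1, 3, 256, 256))
print([p[0].shape[-1] for p in preds])                # fused scales: [32, 16, 8, 4, 2]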
In this embodiment, the residual layers increase the network depth so that feature information such as the contour and texture of small targets like dust can be extracted more accurately, and the features extracted by the residual layers are fused with the features extracted by the convolutional layers of the corresponding scale, so that the fused features contain both the target's contour and texture information and its position information, which helps improve the accuracy of target detection.
In one embodiment, the step of recognizing each fused feature through the recognition network to obtain the pollutant detection result for the region to be detected may specifically include: recognizing each fused feature through the recognition network to obtain a recognition result and its probability for each fused feature, where the recognition result includes the position information and category information of each pollutant; and obtaining the pollutant detection result for the region to be detected according to the probabilities of the recognition results.
The fused features of the various scales are recognized through the recognition network to obtain a recognition result and probability for each scale; that is, the fused feature of each scale corresponds to one piece of position information, one piece of category information, and one probability, and the category information and position information corresponding to the maximum probability are output as the detection result, which can further improve the accuracy of target detection.
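A small sketch of the selection rule just described: among the recognition results from the different fused scales, output the position and category information with the maximum probability. The result fields are hypothetical.

# Pick, across the per-scale recognition results, the position/category
# information with the highest probability. Field names are assumptions.
def select_final_detection(per_scale_results):
    """per_scale_results: list of dicts like {"position": (x, y, w, h), "category": str, "prob": float}."""
    return max(per_scale_results, key=lambda r: r["prob"])

best = select_final_detection([
    {"position": (30, 42, 10, 8), "category": "dust", "prob": 0.61},
    {"position": (31, 40, 12, 9), "category": "dust", "prob": 0.87},
])
print(best["category"], best["prob"])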
In one embodiment, the pollutant detection result includes the position information of each pollutant in the image to be detected, and the position information of each pollutant in the electrical equipment is obtained from the position information of each pollutant in the image to be detected and a predetermined conversion relation.
Taking an air conditioner as an example, a camera is installed on the inner side of the indoor unit to capture images of the region to be detected. A conversion relation exists between the camera coordinate system of the camera and the image coordinate system of the image, and another conversion relation exists between the camera coordinate system and the world coordinate system; these conversion relations can be predetermined in any feasible way. Therefore, after the position information of each pollutant in the image to be detected is obtained, the position of each pollutant can be converted from the image coordinate system to the world coordinate system according to the predetermined conversion relation, giving the position of each pollutant within the indoor unit, such as its distances to the upper, lower, left, right, front, and rear surfaces of the unit, so that the pollutant can be located more accurately. After the position information of each pollutant in the electrical equipment is obtained, it can be carried in the issued prompt information so that cleaning can be more targeted.
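A minimal numpy sketch of the conversion just described, mapping a pollutant's position in the image coordinate system to the world coordinate system of the indoor unit. The intrinsic matrix, camera pose, and assumed depth of the observed surface are hypothetical values; the patent only states that the conversion relation is predetermined, not how it is obtained.

# Image coordinates -> world coordinates via a predetermined conversion relation.
# K, R, t, and the surface depth below are assumed example values.
import numpy as np

K = np.array([[600.0,   0.0, 128.0],
              [  0.0, 600.0, 128.0],
              [  0.0,   0.0,   1.0]])      # camera intrinsics (assumed)
R = np.eye(3)                              # world-to-camera rotation (assumed)
t = np.array([0.0, 0.0, 0.0])              # world-to-camera translation (assumed)
DEPTH = 0.15                               # assumed distance (m) from camera to the dusty surface

def pixel_to_world(u: float, v: float) -> np.ndarray:
    """Map a pollutant position (u, v) in the image to world coordinates."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])   # back-project to a camera-frame ray
    p_cam = ray * (DEPTH / ray[2])                   # scale the ray to the assumed depth
    return R.T @ (p_cam - t)                         # camera frame -> world frame

print(pixel_to_world(96.0, 160.0))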
It should be understood that, although the steps in the flowcharts of the above embodiments are shown in the order indicated by the arrows, they are not necessarily executed in that order. Unless explicitly stated otherwise, there is no strict order restriction, and the steps may be executed in other orders. Moreover, at least some of the steps in the flowcharts may comprise multiple sub-steps or stages, which are not necessarily executed at the same time but may be executed at different moments, and their execution order is not necessarily sequential; they may be executed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
In one embodiment, as shown in FIG. 5, an electrical equipment control device 500 is provided, comprising an acquisition module 510, a detection module 520, and a control module 530, wherein:
the acquisition module 510 is configured to acquire an image to be detected that is captured of the region to be detected of the electrical equipment;
the detection module 520 is configured to perform feature extraction on the image to be detected to obtain the image features of the image to be detected, and to perform target detection based on the image features to obtain a pollutant detection result for the region to be detected; and
the control module 530 is configured to issue, when the pollutant detection result satisfies a preset condition, prompt information prompting that the electrical equipment should be cleaned.
In one embodiment, the control module 530 is further configured to send cleaning-service-related information to a terminal associated with the electrical equipment when the pollutant detection result satisfies the preset condition.
In one embodiment, the pollutant detection result includes a pollutant count, and the control module 530 is further configured to judge that the pollutant detection result satisfies the preset condition when the pollutant count reaches the count threshold.
In one embodiment, when performing feature extraction on the image to be detected to obtain the image features of the image to be detected and performing target detection based on the image features to obtain the pollutant detection result for the region to be detected, the detection module 520 is specifically configured to: perform feature extraction on the image to be detected through a target detection model to obtain the image features, and perform target detection based on the image features to obtain the pollutant detection result for the region to be detected.
In one embodiment, the training method of the target detection model includes: acquiring a sample image captured of the region to be detected and its corresponding annotation information, where the annotation information includes the position information and category information of each pollutant labeled in the sample image; performing target detection on the sample image through the target detection model to be trained to obtain detection information corresponding to the sample image, where the detection information includes the position information and category information of each pollutant detected from the sample image; and adjusting the parameters of the target detection model to be trained based on the annotation information and the detection information of the sample image until a training end condition is met, to obtain the target detection model.
In one embodiment, the target detection model comprises a feature extraction network, a feature fusion network, and a recognition network; the step of performing feature extraction on the image to be detected through the target detection model to obtain the image features and performing target detection based on the image features to obtain the pollutant detection result for the region to be detected may specifically include: performing feature extraction on the image to be detected through the feature extraction network to obtain first features of at least two scales; performing feature extraction on the image to be detected through the feature fusion network to obtain second features of at least two scales, where the scales of the second features correspond to the scales of the first features, and fusing each first feature with the second feature of the corresponding scale to obtain a fused feature of each scale, where the image features comprise the fused features; and recognizing each fused feature through the recognition network to obtain the pollutant detection result for the region to be detected.
In one embodiment, the step of recognizing each fused feature through the recognition network to obtain the pollutant detection result for the region to be detected may specifically include: recognizing each fused feature through the recognition network to obtain a recognition result and its probability for each fused feature, where the recognition result includes the position information and category information of each pollutant; and obtaining the pollutant detection result for the region to be detected according to the probabilities of the recognition results.
In one embodiment, the pollutant detection result includes the position information of each pollutant in the image to be detected, and the apparatus further includes: a position determining module configured to obtain the position information of each pollutant in the electrical equipment according to the position information of each pollutant in the image to be detected and the predetermined conversion relation.
For specific limitations of the electrical equipment control device, reference may be made to the above limitations of the electrical equipment control method, which are not repeated here. Each module in the electrical equipment control device may be implemented wholly or partially by software, hardware, or a combination thereof. The modules may be embedded in, or independent of, a processor of the computer device in hardware form, or stored in a memory of the computer device in software form, so that the processor can invoke and execute the operations corresponding to each module.
In one embodiment, an electrical device is provided, which includes a memory and a processor, wherein the memory stores a computer program, and the processor implements the steps of the above method embodiments when executing the computer program.
In one embodiment, a computer device is provided, comprising a memory and a processor, the memory having stored therein a computer program, the processor implementing the steps of the above-described method embodiments when executing the computer program.
In an embodiment, a computer-readable storage medium is provided, on which a computer program is stored, which computer program, when being executed by a processor, carries out the steps of the above-mentioned method embodiments.
In one embodiment, a computer program product or computer program is provided that includes computer instructions stored in a computer-readable storage medium. The computer instructions are read by a processor of a computer device from a computer-readable storage medium, and the computer instructions are executed by the processor to cause the computer device to perform the steps in the above-mentioned method embodiments.
It should be understood that the terms "first", "second", and the like in the above embodiments are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features. In the description of numerical ranges, the term "plurality" means two or more.
It will be understood by those skilled in the art that all or part of the processes of the methods in the above embodiments can be implemented by a computer program instructing related hardware; the program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the above method embodiments. Any reference to memory, storage, database, or other medium used in the embodiments provided herein can include at least one of non-volatile and volatile memory. Non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical storage, or the like. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM can take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM).
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of these technical features are described, but any such combination should be considered within the scope of this specification as long as it contains no contradiction.
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but this should not be construed as limiting the scope of the invention. It should be noted that those skilled in the art can make several variations and modifications without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. An electrical equipment control method, characterized in that the method comprises:
acquiring an image to be detected that is captured of a region to be detected of electrical equipment;
performing feature extraction on the image to be detected to obtain image features of the image to be detected, and performing target detection based on the image features to obtain a pollutant detection result for the region to be detected; and
issuing, when the pollutant detection result satisfies a preset condition, prompt information prompting that the electrical equipment should be cleaned.
2. The method of claim 1, further comprising, when the pollutant detection result satisfies the preset condition: sending cleaning-service-related information to a terminal associated with the electrical equipment.
3. The method of claim 1, wherein the pollutant detection result comprises a pollutant count, and the pollutant detection result is judged to satisfy the preset condition when the pollutant count reaches a count threshold.
4. The method of claim 1, wherein the performing feature extraction on the image to be detected to obtain the image features of the image to be detected and performing target detection based on the image features to obtain the pollutant detection result for the region to be detected comprises:
performing feature extraction on the image to be detected through a target detection model to obtain the image features of the image to be detected, and performing target detection based on the image features to obtain the pollutant detection result for the region to be detected; the training method of the target detection model comprises the following steps:
acquiring a sample image captured of the region to be detected and its corresponding annotation information, wherein the annotation information comprises: position information and category information of each pollutant labeled in the sample image;
performing target detection on the sample image through the target detection model to be trained to obtain detection information corresponding to the sample image, wherein the detection information comprises: position information and category information of each pollutant detected from the sample image; and
adjusting parameters of the target detection model to be trained based on the annotation information and the detection information of the sample image until a training end condition is met, to obtain the target detection model.
5. The method of claim 4, wherein the target detection model comprises: a feature extraction network, a feature fusion network, and a recognition network; and the performing feature extraction on the image to be detected through the target detection model to obtain the image features of the image to be detected and performing target detection based on the image features to obtain the pollutant detection result for the region to be detected comprises:
performing feature extraction on the image to be detected through the feature extraction network to obtain first features of at least two scales;
performing feature extraction on the image to be detected through the feature fusion network to obtain second features of at least two scales, wherein the scales of the second features correspond to the scales of the first features, and fusing each first feature with the second feature of the corresponding scale to obtain a fused feature of each scale, wherein the image features comprise the fused features; and
recognizing each fused feature through the recognition network to obtain the pollutant detection result for the region to be detected.
6. The method of claim 5, wherein the recognizing each fused feature through the recognition network to obtain the pollutant detection result for the region to be detected comprises:
recognizing each fused feature through the recognition network to obtain a recognition result and a probability corresponding to each fused feature, wherein the recognition result comprises position information and category information of each pollutant; and
obtaining the pollutant detection result for the region to be detected according to the probability of each recognition result.
7. The method of any one of claims 1 to 6, wherein the pollutant detection result comprises position information of each pollutant in the image to be detected, and the method further comprises:
obtaining position information of each pollutant in the electrical equipment according to the position information of each pollutant in the image to be detected and a predetermined conversion relation.
8. An electrical equipment control device, characterized in that the device comprises:
an acquisition module configured to acquire an image to be detected that is captured of a region to be detected of the electrical equipment;
a detection module configured to perform feature extraction on the image to be detected to obtain image features of the image to be detected, and to perform target detection based on the image features to obtain a pollutant detection result for the region to be detected; and
a control module configured to issue, when the pollutant detection result satisfies a preset condition, prompt information prompting that the electrical equipment should be cleaned.
9. An electrical device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any one of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 7.
CN202110061859.3A 2021-01-18 2021-01-18 Electrical equipment control method and device, electrical equipment and storage medium Active CN112749753B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110061859.3A CN112749753B (en) 2021-01-18 2021-01-18 Electrical equipment control method and device, electrical equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110061859.3A CN112749753B (en) 2021-01-18 2021-01-18 Electrical equipment control method and device, electrical equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112749753A (en) 2021-05-04
CN112749753B CN112749753B (en) 2024-04-26

Family

ID=75652345

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110061859.3A Active CN112749753B (en) 2021-01-18 2021-01-18 Electrical equipment control method and device, electrical equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112749753B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107606744A (en) * 2017-10-09 2018-01-19 珠海格力电器股份有限公司 Monitoring device, method and air-conditioning for air-conditioning
WO2019100289A1 (en) * 2017-11-23 2019-05-31 Harman International Industries, Incorporated Method and system for speech enhancement
CN108253587A (en) * 2017-11-29 2018-07-06 珠海格力电器股份有限公司 The method of adjustment and device of air cleanliness
JP2020072311A (en) * 2018-10-29 2020-05-07 オリンパス株式会社 Information acquisition device, information acquisition method, information acquisition program, and information acquisition system
CN111442472A (en) * 2020-04-10 2020-07-24 青岛海尔空调器有限总公司 Air conditioner filter screen cleaning prompting method and air conditioner
CN111780349A (en) * 2020-07-15 2020-10-16 珠海格力电器股份有限公司 Self-cleaning method and device for air conditioner heat exchanger and air conditioner

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116274170A (en) * 2023-03-27 2023-06-23 中建三局第一建设工程有限责任公司 Control method, system and related device of laser cleaning equipment
CN116274170B (en) * 2023-03-27 2023-10-13 中建三局第一建设工程有限责任公司 Control method, system and related device of laser cleaning equipment

Also Published As

Publication number Publication date
CN112749753B (en) 2024-04-26

Similar Documents

Publication Publication Date Title
CN109785337B (en) In-column mammal counting method based on example segmentation algorithm
CN112131936B (en) Inspection robot image recognition method and inspection robot
WO2019214309A1 (en) Model test method and device
CN111643014A (en) Intelligent cleaning method and device, intelligent cleaning equipment and storage medium
CN111223129A (en) Detection method, detection device, monitoring equipment and computer readable storage medium
CN113516651A (en) Welding joint defect detection method and device based on residual error network
WO2020026643A1 (en) Information processing device, information processing method and information processing program
CN112749753B (en) Electrical equipment control method and device, electrical equipment and storage medium
CN112699940A (en) Vehicle cleaning associated resource recommendation method and device and storage medium
CN110827263B (en) Magnetic shoe surface defect detection system and detection method based on visual identification technology
CN108805884A (en) A kind of mosaic area's detection method, device and equipment
CN112884697A (en) Method for identifying wafer map and computer readable recording medium
CN112991343A (en) Method, device and equipment for identifying and detecting macular region of fundus image
CN117351472A (en) Tobacco leaf information detection method and device and electronic equipment
CN110704459B (en) Cleaning method, medium and system for zombie vehicle data in parking lot
CN109523509B (en) Method and device for detecting heading stage of wheat and electronic equipment
CN116525133A (en) Automatic collection method, system, electronic equipment and medium for nucleic acid
CN116485749A (en) Self-encoder-based method for identifying dirt in lens module
CN116380193A (en) Material level state detection system for realizing intelligent workshop
CN116167969A (en) Lens smudge detection method, device, vehicle, storage medium and program product
US11120541B2 (en) Determination device and determining method thereof
JP2021119802A (en) Sweeping control method
JP2022142018A (en) Stain determination system of air conditioner, stain determination method and program
CN110189301B (en) Foreign matter detection method for generator stator core steel sheet stacking platform
CN114115269A (en) Method and device for determining cleaning path and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant