CN113673404A - Clothes state identification method and device, electronic equipment and storage medium - Google Patents

Clothes state identification method and device, electronic equipment and storage medium Download PDF

Info

Publication number
CN113673404A
Authority
CN
China
Prior art keywords
state
target
clothes
corrected
recognition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110932886.3A
Other languages
Chinese (zh)
Inventor
熊剑
陈翀
宋德超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gree Electric Appliances Inc of Zhuhai
Zhuhai Lianyun Technology Co Ltd
Original Assignee
Gree Electric Appliances Inc of Zhuhai
Zhuhai Lianyun Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Gree Electric Appliances Inc of Zhuhai, Zhuhai Lianyun Technology Co Ltd filed Critical Gree Electric Appliances Inc of Zhuhai
Priority to CN202110932886.3A priority Critical patent/CN113673404A/en
Publication of CN113673404A publication Critical patent/CN113673404A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Control Of Washing Machine And Dryer (AREA)

Abstract

The application relates to a clothes state identification method and device, electronic equipment and a storage medium. The method comprises the following steps: acquiring image information of target clothes in a drum; pre-identifying the image information to obtain a pre-identification result of the target clothes; determining a target rotating speed of the drum, the target rotating speed being the rotating speed of the drum at the moment the image information was acquired; and correcting the pre-identification result according to the target rotating speed to obtain a final identification result. By correcting the pre-identification result according to the target rotating speed of the drum, the method can accurately separate clothes states that pre-identification alone cannot reliably distinguish, thereby improving the accuracy of identifying the state of clothes in the drum.

Description

Clothes state identification method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of image recognition technologies, and in particular, to a method and an apparatus for recognizing a clothing state, an electronic device, and a storage medium.
Background
In the development of washing machine products, clothes need to be dried after washing finishes. During drying, the drum of the washing machine rotates while hot air is introduced into it, thereby drying the clothes inside. If the clothes stay unfolded in the drum during drying, they come out without wrinkles; the rotation speed of the drum therefore has to be controlled so that, with the help of gravity, the clothes unfold inside the drum. Unfolding the clothes in turn requires sensible control of the drum speed, which requires knowing the state of the clothes in the drum. Clothes in the drum can be in one of four states: rolling, beating, unfolded, or adhering to the drum wall. The scene inside the drum is therefore captured with a camera, and the clothes state is identified from the images.
In the related art, the rolling state and the beating state look highly similar and are difficult for recognition algorithms to distinguish in practice, so the accuracy of identifying the state of clothes in the drum is low.
No effective solution has yet been proposed for this technical problem of low state-identification accuracy for clothes in the drum.
Disclosure of Invention
In order to solve the technical problem of low accuracy of state identification of clothes in a drum, the application provides a clothes state identification method and device, electronic equipment and a storage medium.
In a first aspect, an embodiment of the present application provides a clothes state identification method, including:
acquiring image information of target clothes in a drum;
pre-identifying the image information to obtain a pre-identification result of the target clothes;
determining a target rotating speed of the drum, wherein the target rotating speed is the rotating speed of the drum when the image information is acquired;
and correcting the pre-recognition result according to the target rotating speed to obtain a final recognition result.
Optionally, as in the foregoing method, the pre-recognizing the image information to obtain a pre-recognition result of the target clothing includes:
inputting the image information into a clothes state recognition model to obtain the pre-recognition result of the target clothes, wherein the pre-recognition result comprises: a one-to-one correspondence between laundry states and pre-recognition weights, the pre-recognition weights being used to indicate a probability that a state of the target laundry is a corresponding predicted state.
Optionally, as in the foregoing method, the modifying the pre-recognition result according to the target rotation speed to obtain a final recognition result includes:
generating a corresponding target correction function according to the target rotating speed;
and correcting the pre-recognition result through the target correction function to obtain a final recognition result.
Optionally, as in the foregoing method, the pre-recognition result includes: a one-to-one correspondence between laundry states and pre-recognition weights, the pre-recognition weights being used to indicate a probability that a state of the target laundry is a corresponding predicted state, the generating a corresponding target correction function according to the target rotation speed comprising:
determining at least two clothes states to be corrected which need parameter correction in all the clothes states;
determining a weighting value uniquely corresponding to each clothes state to be corrected according to the target rotating speed;
and obtaining the target correction function according to each weighted value.
Optionally, as in the foregoing method, the clothes states to be corrected include a first clothes state to be corrected and a second clothes state to be corrected, and the determining of the weighting value uniquely corresponding to each clothes state to be corrected according to the target rotating speed comprises the following steps:
calculating a first weighting value g corresponding to the first clothes state to be corrected and a second weighting value s corresponding to the second clothes state to be corrected according to the following formula:
g = 1 / (1 + e^(r - C)),    s = 1 / (1 + e^(C - r))
wherein e is the base of the natural logarithm, r is the target rotating speed, and C is a constant term; the first clothes state to be corrected is the rolling state and the second clothes state to be corrected is the beating state.
Optionally, as in the foregoing method, the modifying the pre-recognition result by the target modification function to obtain a final recognition result includes:
for each clothing state to be corrected, determining a weight to be corrected corresponding to the clothing state to be corrected in the pre-recognition result, and obtaining a corrected output value corresponding to the clothing state to be corrected after weighting the weight to be corrected through a weighting value corresponding to the clothing state to be corrected in the target correction function, wherein the weight to be corrected is the pre-recognition weight corresponding to the clothing state to be corrected;
and taking the state of the clothes to be corrected corresponding to the maximum target corrected output value in all the corrected output values as the final recognition result.
Optionally, as in the foregoing method, the pre-recognition result includes: the method further includes a one-to-one correspondence relationship between laundry states and pre-recognition weights, where the pre-recognition weights are used to indicate probabilities that states of the target laundry are corresponding predicted states, and after the pre-recognition is performed on the image information to obtain a pre-recognition result of the target laundry, the method further includes:
determining a target clothes state without parameter correction in all clothes states;
taking the target laundry state as the final recognition result in the case where the largest target pre-recognition weight among all the pre-recognition weights is the pre-recognition weight of the target laundry state.
In a second aspect, an embodiment of the present application provides a clothes state identification device, including:
an acquisition module for acquiring image information of target laundry in the drum;
the pre-recognition module is used for pre-recognizing the image information to obtain a pre-recognition result of the target clothes;
the determining module is used for determining a target rotating speed of the drum, wherein the target rotating speed is the rotating speed of the drum when the image information is acquired;
and the result module is used for correcting the pre-recognition result according to the target rotating speed to obtain a final recognition result.
In a third aspect, an embodiment of the present application provides an electronic device, including: a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with each other through the communication bus;
the memory is used for storing a computer program;
the processor, when executing the computer program, is configured to implement the method according to any of the preceding claims.
In a fourth aspect, the present application provides a computer-readable storage medium, which includes a stored program, where the program is executed to perform the method according to any one of the preceding claims.
Compared with the prior art, the technical scheme provided by the embodiment of the application has the following advantages:
the method provided by the embodiment of the application can correct the pre-recognition result according to the target rotating speed of the drum so as to accurately distinguish the clothes states which cannot be accurately distinguished in the pre-recognition result, and further the aim of improving the accuracy of clothes state recognition of clothes in the drum is achieved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without inventive exercise.
Fig. 1 is a schematic flowchart of a clothes state identification method according to an embodiment of the present application;
fig. 2 is a schematic flowchart of a clothes state identification method according to another embodiment of the present application;
fig. 3 is a schematic flowchart of a clothes state identification method according to another embodiment of the present application;
fig. 4 is a block diagram of a clothes state recognition apparatus according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
According to an aspect of an embodiment of the present application, a laundry state recognition method is provided. Optionally, in the present embodiment, the laundry state recognition method may be applied in a hardware environment constituted by a terminal and a server. The server is connected to the terminal through a network and can provide a laundry state identification service for the terminal or for a client installed on the terminal; a database may be provided on the server, or independently of it, to supply data storage services to the server.
The network may include, but is not limited to, at least one of: a wired network, a wireless network. The wired network may include, but is not limited to, at least one of: a wide area network, a metropolitan area network, a local area network. The wireless network may include, but is not limited to, at least one of: WIFI (Wireless Fidelity), Bluetooth. The terminal may be, but is not limited to, a PC, a mobile phone, a tablet computer, and the like.
The clothes state identification method in the embodiment of the present application may be executed by the server, by the terminal, or by both. When executed by the terminal, it may also be executed by a client installed on the terminal.
Taking execution by the terminal as an example, fig. 1 shows the clothes state identification method provided by an embodiment of the present application, which includes the following steps:
step S101, image information of the target laundry in the drum is acquired.
The laundry state identification method in the present embodiment may be applied in any scenario where the laundry state must be identified by an appliance that dries laundry, for example drying clothes in a washing machine drum or in a dryer, among others. In the embodiments of the present application, the method is described taking drying clothes in a washing machine as an example; where no contradiction arises, it applies equally to other scenarios in which the laundry state must be obtained.
Taking the scenario of drying clothes in a washing machine drum as an example, the pre-recognition result for the clothes is corrected in combination with the drum rotation speed, so as to improve the accuracy of laundry state recognition.
An image acquisition device (for example, a camera) arranged in the drum can capture images inside the drum, thereby acquiring the image information of the target laundry in the drum.
And S102, pre-identifying the image information to obtain a pre-identification result of the target clothes.
After the image information is acquired, the image information can be pre-identified to acquire a pre-identification result of the target clothes.
The pre-recognition may be to input the image information into a pre-trained clothes state recognition model, and then obtain a pre-recognition result indicating the clothes state of the target clothes.
Further, the clothes state identification model may be unable to accurately distinguish certain similar clothes states; when the actual state is one of those similar states, the pre-identification result therefore carries a large error.
And step S103, determining a target rotating speed of the roller, wherein the target rotating speed is the rotating speed of the roller when the image information is acquired.
When the image information is acquired, the target rotating speed of the drum can be obtained from a rotation speed acquisition device mounted on the drum, or by reading the operating state of the drive unit that controls the drum's rotation.
And step S104, correcting the pre-recognition result according to the target rotating speed to obtain a final recognition result.
Since the laundry in the drum tends toward different states at different drum rotation speeds, once the target rotation speed of the drum is obtained, the pre-recognition result can be corrected on that basis to obtain the final recognition result.
Optionally, the pre-recognition result may include a pre-recognition weight corresponding to each candidate state, and correcting the pre-recognition result according to the target rotation speed may proceed from a correction value derived from the target rotation speed. The correction value can be applied in either of two ways:
the correction value may be corrected by adding or subtracting the pre-recognition weight in the pre-recognition result, for example: when the correction value corresponding to the target rotation speed is a0, the clothes state recognition model cannot accurately distinguish the clothes state a1 (the pre-recognition weight is a1) from the clothes state a2 (the pre-recognition weight is a2), the pre-recognition weight is a1 and the pre-recognition weight is a2, which are greater than the pre-recognition weights of other clothes states, and the probability that the clothes state of the target clothes is a1 is higher as the correction value corresponding to the target rotation speed is higher, the pre-recognition result is corrected by the target rotation speed, the weight after the clothes state a1 is corrected is a1+ a0, and the weight after the clothes state a2 is corrected is a2+1-a 0. Further, the laundry state corresponding to the greater value of a1+ a0 and a2+1-a0 may be selected as the final recognition result.
Alternatively, the correction value may be applied multiplicatively to the pre-recognition weights. Suppose again that the correction value corresponding to the target rotation speed is a0, the model cannot accurately distinguish state A1 (pre-recognition weight a1) from state A2 (pre-recognition weight a2), a1 and a2 exceed the pre-recognition weights of all other states, and a larger correction value indicates a higher probability of state A1. Correcting the pre-recognition result by the target rotation speed then gives a corrected weight of a1 × a0 for state A1 and a2 × (1 - a0) for state A2, and the state corresponding to the larger of a1 × a0 and a2 × (1 - a0) can be selected as the final recognition result.
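As an illustrative sketch (not the patent's own implementation), the two correction styles described above can be written out as follows; the function names and the correction value a0 are assumptions:

```python
def correct_additive(a1, a2, a0):
    """Additive correction: shift the two confusable pre-recognition
    weights a1 and a2 by a0 and (1 - a0) respectively."""
    return a1 + a0, a2 + 1 - a0


def correct_multiplicative(a1, a2, a0):
    """Multiplicative correction: scale the two confusable pre-recognition
    weights a1 and a2 by a0 and (1 - a0) respectively."""
    return a1 * a0, a2 * (1 - a0)


def pick_state(w1, w2, state1="A1", state2="A2"):
    """Select the clothes state whose corrected weight is larger."""
    return state1 if w1 > w2 else state2
```

With a1 = 0.35, a2 = 0.45 and a0 = 0.7, the multiplicative style yields corrected weights of about 0.245 and 0.135, so state A1 would be chosen.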
By the method in the embodiment, the pre-recognition result can be corrected according to the target rotating speed of the drum, so that the clothes states which cannot be accurately distinguished in the pre-recognition result can be accurately distinguished, and the aim of improving the accuracy of clothes state recognition of the clothes in the drum is fulfilled.
As an alternative implementation manner, as in the foregoing method, the step S102 of performing pre-recognition on the image information to obtain a pre-recognition result of the target clothes includes the following steps:
step S201, inputting image information into a clothes state identification model to obtain a pre-identification result of the target clothes, wherein the pre-identification result comprises: the clothes state and the pre-recognition weight are in one-to-one correspondence, and the pre-recognition weight is used for indicating the probability that the state of the target clothes is the corresponding prediction state.
After the image information is obtained, the image information may be input into a clothes state recognition model to obtain a pre-recognition result of the target clothes, where the clothes state recognition model may be obtained after the model to be trained is trained by the training image information and the recognition accuracy meets a preset requirement.
After the image information is input into the clothes state recognition model, the resulting pre-recognition result contains a one-to-one correspondence between laundry states and pre-recognition weights. That is, for laundry states I, II, III and IV there are corresponding pre-recognition weights I, II, III and IV, and pre-recognition weight I + pre-recognition weight II + pre-recognition weight III + pre-recognition weight IV = 1.
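The patent does not specify how the model produces these weights; a common choice, assumed here purely for illustration, is a softmax over the model's raw scores, which guarantees the four weights sum to 1 as described:

```python
import math


def softmax(logits):
    """Turn raw model scores into pre-recognition weights that sum to 1."""
    m = max(logits)                        # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]
```

For example, scores for the four states (rolling, beating, unfolded, wall-adhered) map to four weights whose sum is exactly 1, with the largest score receiving the largest weight.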
By the method in the embodiment, the pre-recognition result of the target clothes can be obtained through the prediction of the clothes state recognition model, and the efficiency of recognizing the clothes state of the target clothes is improved.
As an alternative embodiment of the foregoing method, as shown in fig. 2, the step of correcting the pre-recognition result according to the target rotation speed to obtain the final recognition result includes the following steps:
step S301, generating a corresponding target correction function according to the target rotating speed;
and S302, correcting the pre-recognition result through a target correction function to obtain a final recognition result.
After the target rotation speed is obtained, a target correction function for correcting the pre-recognition result can be generated based on the target rotation speed.
The target correction function may be a function for processing a pre-recognition weight of a laundry state that cannot be accurately distinguished in the pre-recognition result. Therefore, the accuracy of the final recognition result is improved after the pre-recognition weight of the clothes state which cannot be accurately distinguished is corrected through the target correction function.
As an alternative implementation, as shown in fig. 3, in the foregoing method the pre-recognition result includes a one-to-one correspondence between clothes states and pre-recognition weights, and the step S301 of generating a corresponding target correction function according to the target rotation speed includes the following steps:
step S401, determining at least two clothes states to be corrected which need parameter correction in all the clothes states.
At least two laundry states to be corrected, which cannot be accurately distinguished by the laundry state recognition model, may be determined in advance among all the laundry states.
The clothes states to be corrected can be determined by running the clothes state recognition model over different image information: whenever the probability or frequency with which the model mistakes one clothes state P for another clothes state Q exceeds a preset value, P and Q can be marked as clothes states to be corrected. For example, suppose the clothes states are rolling, beating, unfolded, and wall-adhered, and the model is evaluated on 1000 preset images. If rolling is misidentified as beating 100 times and beating is misidentified as rolling 100 times, while the unfolded state and the wall-adhered state are each misidentified only once, then the error frequency between the rolling and beating states far exceeds that of the unfolded and wall-adhered states, and the two clothes states to be corrected so determined are rolling and beating.
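A sketch of this selection step, assuming a confusion-count table gathered from such an evaluation run (the data layout, state names, and threshold are illustrative assumptions):

```python
def states_to_correct(confusions, total, threshold=0.05):
    """Select the clothes states whose mutual misidentification rate
    exceeds a preset value.

    confusions maps (true_state, predicted_state) pairs to error counts
    observed over `total` evaluation images.
    """
    flagged = set()
    for (true_state, predicted_state), count in confusions.items():
        if true_state != predicted_state and count / total > threshold:
            # Both members of a confusable pair need parameter correction.
            flagged.update((true_state, predicted_state))
    return flagged
```

On the example counts from the text (100 rolling/beating errors each, one error each for the other two states, over 1000 images), this flags exactly the rolling and beating states.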
And S402, determining a unique corresponding weighted value of each clothes state to be corrected according to the target rotating speed.
After the target rotating speed is determined, the weighting value uniquely corresponding to each clothes state to be corrected can be calculated by a preset calculation scheme. That is, once the target rotating speed is known, a unique weighting value is determined for each clothes state to be corrected; for the other clothes states, no weighting value needs to be calculated.
In step S403, a target correction function is obtained according to each weighted value.
After the weighting value corresponding to each clothes state to be corrected has been calculated, a target correction function that weights the weight to be corrected of each such state can be constructed from all the weighting values. For example, when there are two clothes states to be corrected with weights to be corrected p and q, the weighting value corresponding to the first being g and the weighting value corresponding to the second being s, the target correction function is:
F(p, q) = (g × p, s × q)
By the method in this embodiment, a target correction function for correcting the weights to be corrected of the clothes states to be corrected can be obtained, so that the final recognition result can subsequently be derived from it.
As an alternative embodiment, the method as described above, the state of the garment to be modified comprises: the step S402 of determining a unique weighting value corresponding to each laundry condition to be corrected according to the target rotation speed includes the following steps:
calculating a first weighting value g corresponding to the first clothes state to be corrected and a second weighting value s corresponding to the second clothes state to be corrected according to the following formula:
g = 1 / (1 + e^(r - C)),    s = 1 / (1 + e^(C - r))
wherein e is the base of the natural logarithm, r is the target rotating speed, and C is a constant term; the first clothes state to be corrected is the rolling state and the second clothes state to be corrected is the beating state.
When the clothes states to be corrected comprise only the first and second clothes states to be corrected, only these two states need correcting. C may be taken as the critical rotation speed commonly used when controlling the drum, for example 40 r/min: when the drum rotates at exactly C, the probability that the laundry is rolling equals the probability that it is beating. Below C the first weighting value g, for the rolling state, dominates and grows as the target rotation speed r decreases; above C the second weighting value s, for the beating state, dominates and grows as r increases.
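Since the published formula image is not reproduced in this text, the following sigmoid pair is a reconstruction consistent with the worked example given later (r = 39 r/min and C = 40 r/min yield g ≈ 0.73 and s ≈ 0.27, which the description rounds to 0.7 and 0.3):

```python
import math


def weighting_values(r, C=40.0):
    """Speed-dependent weighting values: g favors the rolling state below
    the critical speed C, s favors the beating state above it.
    By construction g + s = 1, and g = s = 0.5 when r = C."""
    g = 1.0 / (1.0 + math.exp(r - C))   # first weighting value (rolling)
    s = 1.0 / (1.0 + math.exp(C - r))   # second weighting value (beating)
    return g, s
```

A design note: the sigmoid makes the correction smooth around the critical speed rather than a hard switch, so small speed-measurement noise near C cannot flip the weighting abruptly.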
By the method in this embodiment, the first weighting value g corresponding to the rolling state and the second weighting value s corresponding to the beating state can be calculated quickly, making it easier to improve the accuracy of distinguishing the rolling and beating states later on.
As an alternative implementation manner, as in the foregoing method, the step S302 of correcting the pre-recognition result through the target correction function to obtain the final recognition result includes the following steps:
step S601, for each clothing state to be corrected, determining a weight to be corrected corresponding to the clothing state to be corrected in the pre-recognition result, and obtaining a corrected output value corresponding to the clothing state to be corrected after weighting the weight to be corrected according to a weighted value corresponding to the clothing state to be corrected in the target correction function, wherein the weight to be corrected is the pre-recognition weight corresponding to the clothing state to be corrected.
For each clothes state to be corrected, the weight to be corrected (that is, the pre-recognition weight corresponding to that state) is first located among all the pre-recognition weights in the pre-recognition result; it is then weighted by the weighting value corresponding to the same state in the target correction function, which yields the corrected output value for that clothes state to be corrected.
For example, suppose the pre-recognition weights for the rolling, beating, unfolded, and wall-adhered states in the pre-recognition result are 0.35, 0.45, 0.1, and 0.1 respectively. Taken alone, the pre-recognition result would indicate the beating state, so the product of the pre-recognition weights and the correction function is computed and the result determined from the corrected values. Feedback from the washing machine controller gives a drum rotation speed of 39 r/min; with C = 40 r/min the correction terms are g = 0.7 and s = 0.3, and with p = 0.35 and q = 0.45 the corrected output values for the rolling and beating states are approximately 0.25 and 0.14 respectively.
Step S602, the clothes state to be corrected corresponding to the maximum target corrected output value in all the corrected output values is used as the final recognition result.
After all the corrected output values are obtained, the final recognition result can be selected from the clothes states to be corrected; that is, the clothes state to be corrected corresponding to the maximum corrected output value is taken as the final recognition result. For example, continuing the foregoing example, the corrected output values of the rolling state and the beating state are 0.25 and 0.14; the maximum corrected output value is 0.25, so the rolling state is taken as the final recognition result.
By the method in this embodiment, the maximum target corrected output value can be calculated based on the target correction function and the final recognition result determined from it, thereby improving the accuracy of the final recognition result.
As an alternative implementation, as in the foregoing method, the pre-recognition result includes: a one-to-one correspondence between clothes states and pre-recognition weights, the pre-recognition weights being used to indicate the probability that the state of the target clothes is the corresponding predicted state. After the image information is pre-recognized in step S102 to obtain the pre-recognition result of the target clothes, the method further includes the following steps:
and step S701, determining a target clothes state without parameter correction in all clothes states.
In step S702, when the largest target pre-recognition weight among all the pre-recognition weights is the pre-recognition weight of the target laundry state, the target laundry state is set as the final recognition result.
After all the clothes states are determined, the target clothes states that do not require parameter correction can be identified among them.
The target laundry state may be a laundry state that the laundry state recognition model can accurately distinguish among all laundry states.
After the pre-recognition weight corresponding to each laundry state is determined, the pre-recognition weight corresponding to the target laundry state can be determined, and in the case that the pre-recognition weight of the target laundry state is the largest target pre-recognition weight among all the pre-recognition weights, the target laundry state can be used as the final recognition result. For example, when the pre-recognition weights of the rolling state, the beating state, the unfolding state, and the adhesion state are 0.12, 0.13, 0.45, and 0.3, respectively, and the unfolding state and the adhesion state are both target laundry states, the pre-recognition weight of the unfolding state is the largest, so the unfolding state is taken as the final recognition result.
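This shortcut path can be sketched as follows; the function and state names are illustrative, not from the patent:

```python
def recognize(pre_weights: dict, target_states: set):
    """If the state with the largest pre-recognition weight is a target
    state (one needing no parameter correction), return it directly as
    the final recognition result; otherwise return None to signal that
    the correction steps are still required."""
    best = max(pre_weights, key=pre_weights.get)
    if best in target_states:
        return best  # shortcut: final recognition result, no correction
    return None      # fall through to the correction procedure

pre = {"rolling": 0.12, "beating": 0.13, "unfolding": 0.45, "adhesion": 0.30}
result = recognize(pre, target_states={"unfolding", "adhesion"})
```

In the example above the largest weight (0.45) belongs to the unfolding state, a target state, so the correction steps are skipped entirely.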
By the method in this embodiment, when the state with the maximum pre-recognition weight is a target clothes state, the final recognition result can be obtained directly from the pre-recognition result without executing the related steps in steps S103-S104, which can effectively improve recognition efficiency.
As shown in fig. 4, according to an embodiment of another aspect of the present application, there is also provided a laundry state recognition apparatus including:
an acquisition module 1 for acquiring image information of target laundry in a drum;
the pre-recognition module 2 is used for pre-recognizing the image information to obtain a pre-recognition result of the target clothes;
the determining module 3 is used for determining a target rotating speed of the roller, wherein the target rotating speed is the rotating speed of the roller when the image information is acquired;
and the result module 4 is used for correcting the pre-recognition result according to the target rotating speed to obtain a final recognition result.
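The four modules above can be sketched as one pipeline. The model and drum interfaces (predict, capture_image, speed_at_capture) are illustrative placeholders, and the correction step again assumes a logistic weighting consistent with the earlier worked example:

```python
import math

class LaundryStateRecognizer:
    """Sketch of the four-module pipeline: acquisition, pre-recognition,
    rotating-speed determination, and result correction."""
    C = 40.0  # assumed constant term of the correction function (r/min)

    def __init__(self, model, drum):
        self.model = model  # clothes state recognition model (placeholder)
        self.drum = drum    # drum controller with speed feedback (placeholder)

    def run(self):
        image = self.drum.capture_image()     # acquisition module
        pre = self.model.predict(image)       # pre-recognition module
        r = self.drum.speed_at_capture()      # determining module
        # result module: weight the two states to be corrected, then argmax
        g = 1.0 / (1.0 + math.exp(r - self.C))
        corrected = dict(pre)
        corrected["rolling"] *= g
        corrected["beating"] *= 1.0 - g
        return max(corrected, key=corrected.get)

class StubDrum:
    def capture_image(self):
        return object()   # placeholder for a camera frame
    def speed_at_capture(self):
        return 39.0       # drum speed (r/min) when the frame was captured

class StubModel:
    def predict(self, image):
        return {"rolling": 0.35, "beating": 0.45,
                "unfolding": 0.1, "adhesion": 0.1}

result = LaundryStateRecognizer(StubModel(), StubDrum()).run()
```

With the stub values from the worked example, the corrected rolling-state output is the largest, so the pipeline returns the rolling state.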
Specifically, the specific process of implementing the functions of each module in the apparatus according to the embodiment of the present invention may refer to the related description in the method embodiment, and is not described herein again.
According to another embodiment of the present application, there is also provided an electronic device. As shown in fig. 5, the electronic device may include: a processor 1501, a communication interface 1502, a memory 1503, and a communication bus 1504, wherein the processor 1501, the communication interface 1502, and the memory 1503 communicate with each other through the communication bus 1504.
A memory 1503 for storing a computer program;
the processor 1501 is configured to implement the steps of the above-described method embodiments when executing the program stored in the memory 1503.
The bus mentioned in the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus.
The communication interface is used for communication between the electronic equipment and other equipment.
The memory may include a Random Access Memory (RAM) or a Non-Volatile Memory (NVM), such as at least one disk memory. Optionally, the memory may also be at least one storage device located remotely from the processor.
The Processor may be a general-purpose Processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; but also Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other Programmable logic devices, discrete Gate or transistor logic devices, discrete hardware components.
The embodiment of the present application further provides a computer-readable storage medium, where the storage medium includes a stored program, and when the program runs, the method steps of the above method embodiment are executed.
It is noted that, in this document, relational terms such as "first" and "second," and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The foregoing are merely exemplary embodiments of the present invention, which enable those skilled in the art to understand or practice the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A clothes state recognition method is characterized by comprising the following steps:
acquiring image information of target clothes in a drum;
pre-identifying the image information to obtain a pre-identification result of the target clothes;
determining a target rotating speed of the roller, wherein the target rotating speed is the rotating speed of the roller when the image information is acquired;
and correcting the pre-recognition result according to the target rotating speed to obtain a final recognition result.
2. The method according to claim 1, wherein the pre-recognizing the image information to obtain the pre-recognition result of the target clothes comprises:
inputting the image information into a clothes state recognition model to obtain the pre-recognition result of the target clothes, wherein the pre-recognition result comprises: a one-to-one correspondence between laundry states and pre-recognition weights, the pre-recognition weights being used to indicate a probability that a state of the target laundry is a predicted state.
3. The method of claim 1, wherein the modifying the pre-recognition result according to the target rotation speed to obtain a final recognition result comprises:
generating a corresponding target correction function according to the target rotating speed;
and correcting the pre-recognition result through the target correction function to obtain a final recognition result.
4. The method of claim 3, wherein the pre-recognition result comprises: a one-to-one correspondence between laundry states and pre-recognition weights, the pre-recognition weights being used to indicate a probability that a state of the target laundry is a corresponding predicted state, the generating a corresponding target correction function according to the target rotation speed comprising:
determining at least two clothes states to be corrected which need parameter correction in all the clothes states;
determining a weighted value uniquely corresponding to the state of each piece of clothes to be corrected according to the target rotating speed;
and obtaining the target correction function according to each weighted value.
5. The method according to claim 4, wherein the laundry state to be corrected comprises: the determining of the unique corresponding weighted value of each laundry state to be corrected according to the target rotating speed comprises the following steps:
calculating a first weighting value g corresponding to the first clothes state to be corrected and a second weighting value s corresponding to the second clothes state to be corrected according to the following formula:
[formula defining g and s in terms of r and C, reproduced only as image FDA0003211692210000021 in the original publication]
wherein e is the base number of the natural logarithm, r is the target rotating speed, C is a constant term, the first clothes state to be corrected is a rolling state, and the second clothes state to be corrected is a beating state.
6. The method according to claim 4, wherein the modifying the pre-recognition result by the target modification function to obtain a final recognition result comprises:
for each clothing state to be corrected, determining a weight to be corrected corresponding to the clothing state to be corrected in the pre-recognition result, and obtaining a corrected output value corresponding to the clothing state to be corrected after weighting the weight to be corrected through a weighting value corresponding to the clothing state to be corrected in the target correction function, wherein the weight to be corrected is the pre-recognition weight corresponding to the clothing state to be corrected;
and taking the state of the clothes to be corrected corresponding to the maximum target corrected output value in all the corrected output values as the final recognition result.
7. The method of claim 1, wherein the pre-recognition result comprises: the method further includes a one-to-one correspondence relationship between laundry states and pre-recognition weights, where the pre-recognition weights are used to indicate probabilities that states of the target laundry are corresponding predicted states, and after the pre-recognition is performed on the image information to obtain a pre-recognition result of the target laundry, the method further includes:
determining a target clothes state without parameter correction in all clothes states;
taking the target laundry state as the final recognition result in the case where the largest target pre-recognition weight among all the pre-recognition weights is the pre-recognition weight of the target laundry state.
8. A clothing state recognition apparatus, comprising:
an acquisition module for acquiring image information of target laundry in the drum;
the pre-recognition module is used for pre-recognizing the image information to obtain a pre-recognition result of the target clothes;
the determining module is used for determining a target rotating speed of the roller, wherein the target rotating speed is the rotating speed of the roller when the image information is acquired;
and the result module is used for correcting the pre-recognition result according to the target rotating speed to obtain a final recognition result.
9. An electronic device, comprising: the system comprises a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory are communicated with each other through the communication bus;
the memory is used for storing a computer program;
the processor, when executing the computer program, implementing the method of any of claims 1 to 7.
10. A computer-readable storage medium, characterized in that the storage medium comprises a stored program, wherein the program when executed performs the method of any of the preceding claims 1 to 7.
CN202110932886.3A 2021-08-13 2021-08-13 Clothes state identification method and device, electronic equipment and storage medium Pending CN113673404A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110932886.3A CN113673404A (en) 2021-08-13 2021-08-13 Clothes state identification method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110932886.3A CN113673404A (en) 2021-08-13 2021-08-13 Clothes state identification method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113673404A true CN113673404A (en) 2021-11-19

Family

ID=78543010

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110932886.3A Pending CN113673404A (en) 2021-08-13 2021-08-13 Clothes state identification method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113673404A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114753096A (en) * 2022-04-18 2022-07-15 无锡小天鹅电器有限公司 Method, device, equipment and medium for adjusting washing parameters
CN114753096B (en) * 2022-04-18 2023-11-24 无锡小天鹅电器有限公司 Washing parameter adjusting method, device, equipment and medium

Similar Documents

Publication Publication Date Title
US20230203733A1 (en) Drum washing machine, and control method and apparatus for same
EP3279382A1 (en) Clothes dryer and control method therefor
CN113673404A (en) Clothes state identification method and device, electronic equipment and storage medium
CN111027428B (en) Training method and device for multitasking model and electronic equipment
CN113005709B (en) Clothes weighing method and device of washing machine, computer readable medium and washing machine
CN108881736B (en) Aperture correction method and device
WO2017054103A1 (en) Pairing electronic devices
CN1906572B (en) Methods and apparatus for generating a delay using a counter
KR102552463B1 (en) Application processor, system on chip and booting method of device
CN115665542A (en) Picture processing method based on scene self-recognition and related device
CN112941861B (en) Drying control method and control device of dryer
CN104601884A (en) Photograph method and terminal
US20240069891A1 (en) Electronic device bios updates
CN107026978A (en) IMAQ control method and device
CN110409118B (en) Drum control method and related device
CN110373858B (en) Rinsing control method, device, terminal and computer readable medium
JP7489653B2 (en) Presentation method and presentation system
CN113584837B (en) Control method of circulation fan, circulation fan and computer readable storage medium
CN114541079B (en) Control method of washing machine, control system and storage medium
US20220243375A1 (en) Method and device for controlling laundry equipment, and storage medium
CN112114972B (en) Data inclination prediction method and device
TWI813326B (en) Method and system for inferring apparatus fingerprint
CN107566347B (en) Rolling code learning detection method and device, equipment and computer readable storage medium
CN113417122B (en) Clothes drying equipment, control method and device thereof and storage medium
TWI819061B (en) Electronic system for adaptively adjusting allocation of memory area and method of operating the electronic system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination