CN111273581B - Control method of intelligent wardrobe and related product - Google Patents

Control method of intelligent wardrobe and related product

Info

Publication number
CN111273581B
CN111273581B CN202010076808.3A
Authority
CN
China
Prior art keywords
target
clothes
garment
sensor device
care
Prior art date
Legal status
Active
Application number
CN202010076808.3A
Other languages
Chinese (zh)
Other versions
CN111273581A (en)
Inventor
余承富
Current Assignee
Shenzhen Danale Technology Co ltd
Original Assignee
Shenzhen Danale Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Danale Technology Co ltd filed Critical Shenzhen Danale Technology Co ltd
Priority to CN202010076808.3A priority Critical patent/CN111273581B/en
Publication of CN111273581A publication Critical patent/CN111273581A/en
Application granted granted Critical
Publication of CN111273581B publication Critical patent/CN111273581B/en

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/04 Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/042 Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
    • G05B19/0423 Input/output
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/20 Pc systems
    • G05B2219/25 Pc structure of the system
    • G05B2219/25257 Microcontroller

Abstract

The application discloses a control method of an intelligent wardrobe and a related product, wherein the intelligent wardrobe comprises a sensor device, and the method comprises the following steps: collecting information of a target garment by the sensor device; determining, by the sensor device, a care plan corresponding to the target garment according to the information of the target garment; and controlling, by the sensor device, the intelligent wardrobe to care for the target garment according to the care plan. With this technical scheme, the sensor device collects the clothes information and different care plans are then applied according to that information, which helps improve the intelligence of clothes care and makes clothes care more reasonable and healthier.

Description

Control method of intelligent wardrobe and related product
Technical Field
The application relates to the technical field of computer vision, in particular to a control method of an intelligent wardrobe and a related product.
Background
Clothes reflect a person's taste, and neat, well-kept clothes can not only improve a person's confidence but also improve quality of life. People therefore pay increasing attention to their clothes, and clothes care has gradually become a topic of general concern.
However, at present all clothes are cared for in the same way, which may damage the clothes and shorten their service life. The care mode is therefore unreasonable and unhealthy, and the user experience of caring for clothes is also reduced.
Disclosure of Invention
The embodiment of the application provides a control method of an intelligent wardrobe and a related product, in which clothes information is collected through a sensor device and different care plans are then applied according to the clothes information, which is beneficial to improving the intelligence of clothes care and making clothes care more reasonable and healthier.
In a first aspect, an embodiment of the present application provides a control method for an intelligent wardrobe, where the intelligent wardrobe includes a sensor device, and the method includes:
collecting information of a target garment by the sensor device;
determining, by the sensor device, a care plan corresponding to the target garment according to the information of the target garment;
controlling, by the sensor device, the intelligent wardrobe to care for the target garment according to the care plan.
In a second aspect, an embodiment of the present application provides a control apparatus for an intelligent wardrobe, where the intelligent wardrobe includes a sensor device, the apparatus includes a processing unit, and the processing unit is configured to:
collect information of a target garment through the sensor device;
determine, through the sensor device, a care plan corresponding to the target garment according to the information of the target garment; and
control, through the sensor device, the intelligent wardrobe to care for the target garment according to the care plan.
In a third aspect, embodiments of the present application provide an intelligent wardrobe comprising a processor, a memory, a communication interface, and one or more programs stored in the memory and configured to be executed by the processor, the programs comprising instructions for performing some or all of the steps described in the method according to the first aspect of embodiments of the present application.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium storing a computer program, where the computer program is executed by a processor to implement part or all of the steps described in the method according to the first aspect of the present application.
In a fifth aspect, embodiments of the present application provide a computer program product, where the computer program product includes a non-transitory computer-readable storage medium storing a computer program, where the computer program is operable to cause a computer to perform some or all of the steps described in the method according to the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
According to the technical scheme, information of the target clothes is collected through the sensor device; a care plan corresponding to the target clothes is determined through the sensor device according to the information of the target clothes; and the intelligent wardrobe is controlled through the sensor device to care for the target clothes according to the care plan. It can thus be seen that, with the technical scheme provided by the application, clothes information is collected through the sensor device and different care plans are then applied according to the clothes information, which is beneficial to improving the intelligence of clothes care and making clothes care more reasonable and healthier.
These and other aspects of the present application will be more readily apparent from the following description of the embodiments.
Drawings
In order to illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and those skilled in the art can obtain other drawings from these drawings without creative effort.
Fig. 1A is a schematic structural diagram of an intelligent wardrobe provided in an embodiment of the present application;
fig. 1B is a schematic structural diagram of a sensor device according to an embodiment of the present application;
fig. 1C is a schematic diagram illustrating an operating principle of a sensor device according to an embodiment of the present disclosure;
FIG. 2 is a schematic flow chart of a control method of an intelligent wardrobe provided in an embodiment of the present application;
FIG. 3 is a schematic flow chart of another control method for an intelligent wardrobe provided in the embodiments of the present application;
fig. 4 is a schematic structural diagram of an intelligent wardrobe provided in an embodiment of the present application;
fig. 5 is a schematic structural diagram of a control device of an intelligent wardrobe according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application. It is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of them. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
Referring to fig. 1A, fig. 1A is a schematic structural diagram of an intelligent wardrobe according to an embodiment of the present disclosure. As shown in fig. 1A, the intelligent wardrobe includes a wardrobe body 101, a rack body 102, and sensor devices 103. The installation positions and the number of the sensor devices 103 are not specifically limited in this application and can be chosen according to actual requirements; for example, the sensor devices 103 may be installed on the top or the side of the intelligent wardrobe, or outside the intelligent wardrobe.
The sensor device 103 may be used to collect information about clothes (including clothes being nursed in an intelligent wardrobe and clothes to be nursed outside the intelligent wardrobe), environment inside the intelligent wardrobe, environment outside the intelligent wardrobe, and the like.
The sensor device 103 may be an image sensor device (e.g., a camera), a humidity sensor device, a temperature sensor device, a wind sensor device, or the like, or a combination of multiple sensor devices, which may be adjusted according to actual needs and is not limited in this application.
The intelligent wardrobe in the present application may include one or more of the following components: processor, memory, transceiver, etc.
A processor may include one or more processing cores. The processor connects various parts of the entire intelligent wardrobe using various interfaces and lines, and performs the various functions of the intelligent wardrobe and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory and calling data stored in the memory. Optionally, the processor may be implemented in hardware using at least one of Digital Signal Processing (DSP), a Field-Programmable Gate Array (FPGA), and a Programmable Logic Array (PLA). The processor may integrate one or more of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, the user interface, application programs, and the like; the GPU is used for rendering and drawing display content; and the modem is used to handle wireless communications. It is to be understood that the modem may also be implemented by a communication chip alone, without being integrated into the processor.
The Memory may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). Optionally, the memory includes a non-transitory computer-readable medium. The memory may be used to store instructions, programs, code, code sets, or instruction sets. The memory may include a program storage area and a data storage area, wherein the program storage area may store instructions for implementing an operating system, which may be an Android system (including deeply developed systems based on Android), an iOS system developed by Apple (including deeply developed systems based on iOS), or another system, as well as instructions for implementing at least one function and instructions for implementing the various method embodiments described below. The data storage area may also store data created by the intelligent wardrobe during use.
In the traditional computer vision technology, information is acquired by a sensor and then sent to a back end for processing: a signal processing module performs effect processing, and the information is then transmitted from the signal processing module to a computer vision module for further processing.
Referring to fig. 1B, fig. 1B is a schematic structural diagram of a sensor device according to an embodiment of the present disclosure. As shown in fig. 1B, different from the mechanism in which a conventional sensor collects data and sends it to a back-end device, the sensor device provided in the present application combines the sensor with a computer vision module and performs data processing locally; that is, the sensor device collects and analyzes data to obtain a recognition result and performs specific control based on the recognition result, and the internal algorithm of the sensor device may be updated and optimized through a platform. The sensor device can acquire information of a specific target through the information acquisition module and transmit the acquired information to the sensor/computer vision module; the sensor/computer vision module may process the information and then perform a series of specific operations based on the processing result. In addition, the sensor device can also transmit the acquired original information, or the information processed by the sensor/computer vision module, to the back end, and the back end further processes the received data (effect processing).
Referring to fig. 1C, fig. 1C is a schematic diagram illustrating an operating principle of a sensor device according to an embodiment of the present disclosure. As shown in fig. 1C, the sensor device includes an information acquisition module, a front-end processing module, and a computer vision chip. The front-end processing module includes at least one sensor unit, an analog signal processing circuit, and an analog-to-digital conversion circuit; the computer vision chip includes a digital signal processor, a memory, and at least one artificial intelligence processor.
The at least one sensor unit is connected with the analog signal processing circuit and used for receiving a measuring signal (namely, information acquired by the information acquisition module), converting the measuring signal into an electric signal and transmitting the electric signal to the analog signal processing circuit.
The analog signal processing circuit is connected with the analog-to-digital conversion circuit and used for processing the analog signal of the electric signal and transmitting the analog processing result to the analog-to-digital conversion circuit.
The analog-to-digital conversion circuit is used for converting the analog processing result into a digital signal and outputting the digital signal.
The digital signal processor is used for performing digital signal processing according to the electric signal generated by the front-end processing module and outputting a digital signal processing result.
The memory is used for storing the digital signal processing result and comprises a sharing area and n exclusive areas.
The sharing area is used for storing various kinds of information that need specific signal processing (such as format conversion and effect processing), for example for caring for different clothes, controlling different intelligent wardrobes, and acquiring different information for specific processing. Taking image information as an example, the sensor device may include a pixel unit array (i.e., the information acquisition module), an analog signal processing circuit, an analog-to-digital conversion circuit, a control circuit, an interface circuit, and the like. External light irradiates the pixel unit array and produces a photoelectric effect, generating corresponding charges in the pixel unit array; that is, the image sensing unit acquires an optical signal and converts it into an electric signal. The electric signal is subjected to analog signal processing, the analog processing result is converted into a digital signal under the control of the clock circuit, and the control circuit transmits the digital signal to the sharing area of the memory through the interface circuit.
The exclusive area is used for storing specific information, and the specific information may include information of a specific target (for example, for different target clothes, when the intelligent wardrobe is controlled, specific and differential control is performed), and information of a specific type (for example, some collected specific information may be directly processed by the artificial intelligence processor without front-end processing).
The artificial intelligence processor is used for acquiring specific information or digital signal processing results from the memory and executing corresponding artificial intelligence processing operation according to the specific information or digital signal processing results.
Referring to fig. 2, fig. 2 is a schematic flow chart of a control method of an intelligent wardrobe provided in an embodiment of the present application, where the control method of the intelligent wardrobe is applicable to an intelligent wardrobe, and the intelligent wardrobe includes a sensor device.
As shown in fig. 2, the control method of the intelligent wardrobe can be applied to the intelligent wardrobe shown in fig. 1A, and the control method of the intelligent wardrobe includes the following operations.
S201, collecting information of the target clothes through the sensor device.
The target clothes can be clothes already stored and hung in the intelligent wardrobe or clothes not yet stored and hung in the intelligent wardrobe.
S202, determining a care scheme corresponding to the target clothes according to the information of the target clothes through the sensor equipment;
it can be understood that the care modes of the clothes made of different materials, different types and different states are different. For example, cotton clothes and polyester clothes require different care modes; underwear and outerwear require different care modes; the clothes with high humidity and the clothes with low humidity need different care modes.
The nursing scheme is preset according to actual nursing requirements corresponding to the materials and/or types of the clothes.
The material of the target clothes includes, but is not limited to, pure wool (e.g., woolen sweater, cashmere sweater, camel sweater, cashmere sweater, rabbit wool blended sweater, camel wool blended sweater, yak wool blended sweater, etc.), blended fiber (e.g., wool/acrylic fiber, rabbit wool/acrylic fiber, mohair/acrylic fiber, camel hair/acrylic fiber, cashmere/brocade, nylon blended sweater, cashmere/silk blended sweater, etc.), pure chemical fiber (e.g., stretch nylon sweater, stretch polypropylene fiber sweater, stretch polyester sweater, acrylic bulked sweater, acrylic fiber/polyester, viscose/nylon blended sweater, etc.), interweaving fiber (e.g., wool acrylic fiber, rabbit hair acrylic fiber, wool and cotton yarn interweaving sweater), natural fiber, regenerated fiber, synthetic fiber, spandex, etc. In addition, the material of the target clothes can be classified according to other methods according to actual care requirements, and the material is not particularly limited in this application.
Among them, the types of the target clothes include, but are not limited to, classification by wearing combination (e.g., one-piece suit, coat, vest, skirt, pants), classification by use (e.g., underwear, coat, socks, etc.), classification by garment fabric and process (e.g., Chinese-style garments, Western-style garments, embroidered garments, woolen garments, silk garments, cotton garments, fur garments, knitted garments, down garments, etc.), classification by gender (men's garments, women's garments), classification by age (adult garments, further divided into men's garments, women's garments, and middle-aged and elderly garments; children's garments, further divided into baby garments, toddler garments, middle-child garments, older-child garments, teenager garments, etc.), classification by special function (heat-resistant firefighter garments, high-temperature operation garments, water-resistant diving garments, high-altitude flight garments, space garments, mountaineering garments, etc.), classification by the thickness and lining materials of the garment, classification by the washing effect of the garment (e.g., stone washing, rinsing, general washing, sand washing, enzyme washing, snow washing, etc.), and classification by wearing season (e.g., spring, summer, autumn, and winter garments). In addition, the types of the target clothes can be classified according to other methods according to actual care requirements, which is not particularly limited in this application.
S203, controlling the intelligent wardrobe to nurse the target clothes according to the nursing scheme through the sensor equipment.
It can be seen that, in the control method of the intelligent wardrobe provided by the embodiment of the application, information of the target clothes is collected through the sensor device; a care plan corresponding to the target clothes is then determined through the sensor device according to the information of the target clothes; and the intelligent wardrobe is controlled through the sensor device to care for the target clothes according to the care plan. Therefore, in the control method of the intelligent wardrobe provided by the application, clothes information is collected through the sensor device and different care plans are then applied according to the clothes information, which is beneficial to improving the intelligence of clothes care and making clothes care more reasonable and healthier.
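For orientation, the following is a minimal Python sketch of how the three operations S201 to S203 could be chained on the sensor device. All function and field names and the example rules are hypothetical illustrations; the patent does not prescribe any particular API, data format, or thresholds.

```python
from dataclasses import dataclass

@dataclass
class GarmentInfo:
    material: str        # e.g. "cotton"
    garment_type: str    # e.g. "shirt"
    humidity: float      # fabric humidity, 0..1

def collect_garment_info() -> GarmentInfo:
    # S201: stand-in for image/spectral/humidity acquisition by the sensor device
    return GarmentInfo(material="cotton", garment_type="shirt", humidity=0.35)

def select_care_plan(info: GarmentInfo) -> str:
    # S202: map garment information to a care plan (rules are illustrative only)
    if info.humidity > 0.6:
        return "dry_then_store"
    if info.material in ("wool", "silk"):
        return "gentle_care"
    return "standard_care"

def apply_care_plan(plan: str) -> None:
    # S203: here the sensor device would drive the wardrobe's actuators
    print(f"wardrobe executing care plan: {plan}")

if __name__ == "__main__":
    apply_care_plan(select_care_plan(collect_garment_info()))
```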
In one possible example, the collecting information of the target clothes by the sensor device includes: capturing an image of the target garment by the sensor device; and/or collecting spectral information and/or thermal spectrum information of the target clothes through the sensor device; and/or collecting humidity of the target laundry by the sensor device.
It can be seen that, in this example, the sensor device integrates multiple information-collection functions and can collect various kinds of information about the clothes, so that the care plan of the clothes can be determined comprehensively based on this information.
In one possible example, the determining, by the sensor device, a care plan corresponding to the target garment according to the information of the target garment includes: determining, by the sensor device, a material and/or type of the target garment from the image and/or spectral information of the target garment; and/or determining, by the sensor device, a temperature of the target garment from spectral information and/or energy spectral information of the target garment; and/or determining the bacteria and/or mite content of the target clothes according to the image and/or the spectrum information of the target clothes through the sensor equipment; and determining a care scheme corresponding to the target clothes according to at least one of the material, type, temperature, humidity, bacteria content and mite content of the target clothes through the sensor equipment.
Therefore, in this example, the sensor device can process the collected clothes information locally without sending it to the back end for processing; that is, the sensor collects the clothes information, determines the care plan of the clothes, and then controls the intelligent wardrobe to care for the clothes, which improves the intelligence of clothes care.
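As a concrete illustration of this local decision step, the sketch below combines a subset of the sensed attributes listed above into a care plan. The attribute names, thresholds, and plan labels are assumptions made for illustration only.

```python
def determine_care_plan(material: str, garment_type: str, humidity: float,
                        bacteria_level: float, mite_level: float) -> dict:
    """Illustrative mapping from sensed garment attributes to a care plan."""
    plan = {"wash": False, "dry": False, "sterilize": False, "health_level": "normal"}

    # Delicate materials get gentler handling; all rules here are examples only.
    if material in {"wool", "cashmere", "silk"}:
        plan["health_level"] = "gentle"

    # Specific garments (socks, underwear, infant clothes) get a higher care health level.
    if garment_type in {"socks", "underwear", "infant"}:
        plan["health_level"] = "high"

    # Damp garments are dried; contaminated garments are sterilized.
    if humidity > 0.6:
        plan["dry"] = True
    if bacteria_level > 0.5 or mite_level > 0.5:
        plan["sterilize"] = True
    return plan

# Example: a damp wool sweater with elevated mite content
print(determine_care_plan("wool", "sweater", humidity=0.7, bacteria_level=0.1, mite_level=0.8))
```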
In one possible example, the determining the material and/or type of the target garment according to the image of the target garment includes: extracting feature points of the key part features of the clothes in the image of the target clothes to obtain a first feature point set; extracting feature points of the global features of the clothes in the image of the target clothes to obtain a second feature point set; inputting the first feature point set into a preset neural network model to obtain a first evaluation value; inputting the second feature point set into the preset neural network model to obtain a second evaluation value; acquiring a first weight value corresponding to the clothes key part features and a second weight value corresponding to the clothes global features, wherein the first weight value is greater than the second weight value, and the sum of the first weight value and the second weight value is 1; performing weighting operation according to the first evaluation value, the second evaluation value, the first weight value and the second weight value to obtain a target evaluation value; acquiring a target image quality evaluation value corresponding to the image of the target clothes; determining a target material and/or target type evaluation adjustment coefficient corresponding to the target image quality evaluation value according to a mapping relation between a preset image quality evaluation value and a material and/or type evaluation adjustment coefficient of clothes; adjusting the target evaluation value according to the target material and/or target type evaluation adjustment coefficient to obtain a final evaluation value; and determining the material and/or type of the target clothes corresponding to the final evaluation value according to the mapping relation between the preset evaluation value and the material and/or type of the clothes.
Wherein, it can be understood that different clothes materials and/or types can be treated by different treatment modes. After the material and/or type of the target clothes are determined, the corresponding care scheme can be matched according to the material and/or type of the target clothes; the nursing scheme is preset according to actual nursing requirements corresponding to the materials and/or types of the clothes; and then controlling the intelligent wardrobe to nurse the target clothes according to the matched nursing scheme.
In this example, it can be seen that, the key part features and the global features of the target clothes are obtained from the image of the target clothes, and then the material and/or the type of the target clothes are determined through the comprehensive analysis of the key part features and the global features of the target clothes, which is beneficial to ensuring the accuracy of identifying the material and/or the type of the clothes.
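The weighted evaluation described above can be summarized in the following sketch. The preset neural network model is replaced by a stand-in scoring function, and the mapping tables are illustrative; only the weighting arithmetic (first weight greater than second weight, weights summing to 1) follows the text, and applying the adjustment coefficient multiplicatively is an assumption.

```python
import numpy as np

def score(feature_points: np.ndarray) -> float:
    # Stand-in for the preset neural network model; returns an evaluation value in [0, 1].
    return float(np.clip(feature_points.mean(), 0.0, 1.0))

def classify_material(key_part_feats: np.ndarray, global_feats: np.ndarray,
                      image_quality: float) -> str:
    e1 = score(key_part_feats)           # first evaluation value (key-part features)
    e2 = score(global_feats)             # second evaluation value (global features)
    w1, w2 = 0.7, 0.3                    # first weight > second weight, weights sum to 1
    target_eval = w1 * e1 + w2 * e2      # weighted target evaluation value

    # Mapping from image quality to an evaluation adjustment coefficient (illustrative).
    adjust = 1.1 if image_quality > 0.8 else 0.9
    final_eval = target_eval * adjust    # final evaluation value (multiplicative adjustment assumed)

    # Mapping from evaluation value ranges to garment material/type (illustrative).
    if final_eval > 0.66:
        return "pure wool"
    if final_eval > 0.33:
        return "blended fiber"
    return "pure chemical fiber"

print(classify_material(np.array([0.8, 0.9, 0.7]), np.array([0.5, 0.6, 0.4]), 0.85))
```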
In one possible example, the determining of the care plan corresponding to the target laundry according to at least one of material, type, temperature, humidity, bacteria content, and mite content of the target laundry includes: judging whether the target clothes are specific clothes according to the material and/or type of the target clothes, wherein the specific clothes comprise socks, underwear and infant clothes; matching a first care plan if the target garment is a specific garment; matching a second care regimen if the target garment is not a specific garment, wherein the first care regimen has a higher care health level than the second care regimen.
It can be seen that, in this example, it is identified whether the target clothes are specific clothes; when the target clothes are specific clothes, a care plan with a high health level is adopted, and when the target clothes are general clothes, a care plan with a lower health level is adopted. Different clothes are thus cared for with different care plans, which improves the intelligence of clothes care and makes clothes care more reasonable and healthier.
In one possible example, after determining the material and/or type of the target garment from the image of the target garment, the method includes: identifying whether the target clothes have stains according to the image of the target clothes; if the target clothes have stains, matching a corresponding cleaning mode and a corresponding decontamination care product according to the type of the stains and the material and/or type of the target clothes; and controlling the intelligent wardrobe to clean the target clothes by adopting the cleaning mode and the decontamination care product.
Therefore, in this example, whether the clothes are stained is identified from the clothes image, and the type of stain is further identified, so that a corresponding cleaning mode and decontamination care product are matched according to the type of stain. This helps improve the intelligence of clothes cleaning by the intelligent wardrobe and enhances the user experience.
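A minimal sketch of this stain-matching step might look as follows; the table entries, mode names, and product names are invented examples, not values taken from the patent.

```python
from typing import Optional, Tuple

# Illustrative lookup from (stain type, garment material) to a cleaning mode and a
# decontamination care product.
CLEANING_TABLE = {
    ("oil", "cotton"): ("warm_wash", "degreasing detergent"),
    ("oil", "silk"):   ("gentle_cold_wash", "silk-safe degreaser"),
    ("ink", "cotton"): ("soak_and_wash", "oxygen-based stain remover"),
}

def plan_cleaning(stain_type: Optional[str], material: str) -> Optional[Tuple[str, str]]:
    """Match a cleaning mode and decontamination care product for a detected stain."""
    if stain_type is None:          # no stain identified in the garment image
        return None
    # Fall back to a generic treatment when no specific entry exists.
    return CLEANING_TABLE.get((stain_type, material), ("standard_wash", "general detergent"))

print(plan_cleaning("oil", "silk"))   # ('gentle_cold_wash', 'silk-safe degreaser')
```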
In one possible example, after controlling the intelligent wardrobe to wash the target laundry using the wash mode and the decontamination care product, the method further comprises: and selecting a corresponding airing mode according to the material and/or type of the target clothes to dry the target clothes, wherein the airing mode comprises drying and normal-temperature airing.
Therefore, in this example, after the clothes are washed, a corresponding drying mode is selected according to the material and/or type of the clothes, which helps the intelligent wardrobe improve the intelligence of clothes drying and enhances the user experience.
In one possible example, the method further comprises: if the sensor equipment detects that a plurality of clothes to be cleaned exist in the intelligent wardrobe, determining a cleaning mode and a decontamination nursing product corresponding to each piece of clothes to be cleaned in the plurality of clothes to be cleaned according to the type of stains and the material and/or type of the clothes; controlling the intelligent wardrobe through the sensor device to regulate and control the positions of the plurality of clothes to be cleaned, and moving the clothes to be cleaned, which are cleaned in the same manner as the decontamination care product, in the plurality of clothes to be cleaned into the same working area of the intelligent wardrobe; and controlling the intelligent wardrobe to clean the target clothes in each working area by adopting a corresponding cleaning mode and a decontamination nursing product through the sensor equipment.
Therefore, in the example, the clothes with the same cleaning mode and the same decontamination care product are collected together for cleaning, which is beneficial to saving resources and clothes cleaning time and enhancing user experience.
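The grouping of clothes that share a cleaning mode and decontamination care product can be sketched as a simple key-based grouping; the field names and values below are assumptions for illustration.

```python
from collections import defaultdict

# Each garment to be washed, with its matched cleaning mode and decontamination
# care product (field names are assumptions for illustration).
garments = [
    {"id": 1, "mode": "warm_wash",   "product": "degreaser"},
    {"id": 2, "mode": "gentle_wash", "product": "silk-safe"},
    {"id": 3, "mode": "warm_wash",   "product": "degreaser"},
]

def group_by_treatment(items):
    """Group garments so that clothes sharing a cleaning mode and care product
    can be moved into the same working area of the wardrobe."""
    groups = defaultdict(list)
    for g in items:
        groups[(g["mode"], g["product"])].append(g["id"])
    return dict(groups)

print(group_by_treatment(garments))
# {('warm_wash', 'degreaser'): [1, 3], ('gentle_wash', 'silk-safe'): [2]}
```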
In one possible example, the controlling the intelligent wardrobe to regulate the positions of the plurality of clothes to be cleaned, and moving the clothes to be cleaned that share the same cleaning mode and decontamination care product into the same working area of the intelligent wardrobe, includes: acquiring a plurality of images of each working area in the intelligent wardrobe through the sensor device; performing overall region fusion on the plurality of images of each working area through the sensor device, determining the current position of each piece of clothes to be cleaned in the plurality of pieces of clothes to be cleaned, and determining, for each working area, the number of clothes to be cleaned whose cleaning mode and decontamination care product are the same; taking, through the sensor device, the working area containing the largest number of clothes to be cleaned with the same cleaning mode and decontamination care product as the cleaning area corresponding to that cleaning mode and decontamination care product; and moving, through the sensor device, the clothes in other working areas that require the same cleaning mode and decontamination care product from their current positions to the cleaning area, and moving the clothes in the cleaning area that require other cleaning modes and decontamination care products from their current positions to the corresponding other cleaning areas.
It can be seen that, in this example, the positions of all the clothes to be cleaned are obtained through the sensor device; for each cleaning mode and decontamination care product, the working area containing the largest number of clothes to be cleaned with that cleaning mode and decontamination care product is taken as the corresponding cleaning area; clothes requiring that cleaning mode and decontamination care product are moved from other working areas into the cleaning area; and clothes in the cleaning area requiring other cleaning modes and decontamination care products are moved to the corresponding other cleaning areas. This improves the intelligence of clothes movement during cleaning and enhances the user experience.
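The selection of a cleaning area per treatment and the resulting moves can be sketched as follows, assuming the per-garment positions and treatments have already been obtained by the region-fusion step; the data values are illustrative.

```python
from collections import Counter

# garment -> (current working area, required treatment); illustrative data only.
garments = {
    "g1": ("area_A", "warm_wash"),
    "g2": ("area_B", "warm_wash"),
    "g3": ("area_B", "gentle_wash"),
    "g4": ("area_A", "warm_wash"),
}

def plan_moves(garments):
    # For each treatment, pick the working area that already holds the most
    # garments needing that treatment; it becomes the cleaning area.
    counts = Counter((treatment, area) for area, treatment in garments.values())
    cleaning_area = {}
    for (treatment, area), _ in counts.most_common():
        cleaning_area.setdefault(treatment, area)

    # Garments not already in their treatment's cleaning area are moved there.
    return [(g, area, cleaning_area[t])
            for g, (area, t) in garments.items() if area != cleaning_area[t]]

print(plan_moves(garments))   # [('g2', 'area_B', 'area_A')]
```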
In one possible example, the method further comprises: detecting whether the content of mites and/or bacteria on the target clothes reaches a preset threshold value through the sensor equipment; if the content of the mites and/or bacteria on the target clothes reaches a preset threshold value, controlling the intelligent wardrobe to sterilize the target clothes in a corresponding sterilization mode through the sensor equipment according to the characteristics of the mites and/or bacteria on the target clothes, and releasing corresponding fragrant products.
It can be seen that, in this example, mites and/or bacteria on the clothes are detected through the sensor device, the clothes are then sterilized according to the characteristics of the mites and/or bacteria, and a fragrance is released after sterilization, which improves the intelligence of clothes care and helps enhance the user experience.
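A minimal sketch of the threshold check and the choice of sterilization mode and fragrance is given below; the thresholds, organism names, and treatment tables are assumptions for illustration.

```python
# Illustrative thresholds and treatment tables; the patent does not specify values.
MITE_THRESHOLD = 0.5
BACTERIA_THRESHOLD = 0.5

STERILIZATION_MODE = {"dust_mite": "hot_air_60C", "mold": "uv_irradiation"}
FRAGRANCE = {"dust_mite": "lavender", "mold": "citrus"}

def sterilize_if_needed(mite_level: float, bacteria_level: float, organism: str):
    """Return the sterilization mode and fragrance to release, or None if below threshold."""
    if mite_level < MITE_THRESHOLD and bacteria_level < BACTERIA_THRESHOLD:
        return None
    return STERILIZATION_MODE.get(organism, "ozone"), FRAGRANCE.get(organism, "neutral")

print(sterilize_if_needed(0.7, 0.2, "dust_mite"))   # ('hot_air_60C', 'lavender')
```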
In one possible example, the method further comprises: acquiring, by the sensor device, a label of the target garment from an image of the target garment; identifying, by the sensor device, a use occasion of the target garment according to the label of the target garment; and matching, by the sensor device, a corresponding care plan according to the use occasion of the target garment.
For example, if the clothing is identified as hospital clothing based on the clothing label, the care plan is mainly to enhance the sterilization process; if the garment is identified as factory clothing based on the garment tag, the care plan may need to consider static electricity removal processing, etc.
Therefore, in the example, the use occasions of the clothes are identified according to the labels of the clothes, and the corresponding nursing schemes are matched according to the use occasions of the clothes, so that the clothes nursing intelligence is improved, and the user experience is enhanced.
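The label-to-occasion-to-care-plan matching can be sketched as two small lookups, following the hospital and factory examples above; the label strings and plan names are invented for illustration.

```python
# Illustrative mapping from a recognized garment label to a use occasion, and from
# the occasion to the emphasis of the care plan (hospital -> enhanced sterilization,
# factory -> static electricity removal, as in the example in the text).
LABEL_TO_OCCASION = {"XX Hospital": "hospital", "XX Factory": "factory"}
OCCASION_TO_CARE = {"hospital": "enhanced_sterilization", "factory": "anti_static"}

def care_plan_from_label(label_text: str) -> str:
    occasion = LABEL_TO_OCCASION.get(label_text, "daily")
    return OCCASION_TO_CARE.get(occasion, "standard_care")

print(care_plan_from_label("XX Hospital"))   # enhanced_sterilization
```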
In one possible example, the method further comprises: detecting the temperature of the target clothes and the temperature of the current environment through the sensor device, and detecting the humidity of the target clothes and the humidity of the current environment through the sensor device; determining, through the sensor device, an optimal drying mode and an optimal drying temperature for drying the target clothes within a preset time length according to the material and/or type of the target clothes, the temperature of the current environment, the humidity of the target clothes, and the humidity of the current environment; and drying the target clothes through the sensor device according to the optimal drying mode and the optimal drying temperature.
Therefore, in the example, the optimal temperature and the optimal drying mode for drying the clothes are comprehensively determined according to the material, type, temperature and humidity of the clothes, the environmental temperature and the humidity, so that the clothes drying intelligence is improved, and the user experience is favorably enhanced.
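A sketch of how the drying mode and temperature could be derived from the garment and ambient measurements is shown below; the formula, temperature caps, and thresholds are assumptions, not values from the patent.

```python
def drying_settings(material: str, garment_humidity: float,
                    ambient_temp_c: float, ambient_humidity: float,
                    target_minutes: int = 60):
    """Illustrative choice of drying mode and temperature so the garment dries
    within the preset time; the heuristics are assumptions, not patent values."""
    # Heat-sensitive materials cap the drying temperature.
    max_temp = 40.0 if material in {"wool", "silk"} else 60.0

    # Wetter garments and damper, colder rooms need more heat to finish in time.
    demand = garment_humidity * 50 + ambient_humidity * 20 - (ambient_temp_c - 20)
    temp = min(max_temp, max(ambient_temp_c, demand))

    mode = "heated_drying" if temp > ambient_temp_c + 5 else "room_temperature_airing"
    return mode, round(temp, 1)

print(drying_settings("wool", garment_humidity=0.8, ambient_temp_c=18, ambient_humidity=0.7))
# ('heated_drying', 40.0)
```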
In one possible example, if there are a plurality of laundry to be dried, the method further includes: respectively determining the optimal wind power and the optimal temperature required by drying each piece of clothes to be dried within the same preset time by the sensor equipment; performing weight calculation through the sensor equipment according to the optimal wind power and the optimal temperature required by drying each piece of clothes to be dried to obtain the optimal wind power sequence and the optimal temperature sequence required by drying each piece of clothes to be dried; calculating the drying sequence of the clothes to be dried according to the influence weight coefficient of the preset wind power and temperature for drying the clothes to be dried and the optimal wind power sequence and the optimal temperature sequence required by drying each piece of clothes to be dried by the sensor equipment; controlling, by the sensor device, the intelligent wardrobe to adjust the position of the plurality of clothes to be dried according to the drying sequence.
For example, clothes that are easy to dry are hung at the end away from the fan and the heating device, and clothes that are not easy to dry are hung at the end close to the fan and the heating device.
Therefore, in the example, when a plurality of clothes to be dried exist, the drying sequence of the plurality of clothes to be dried is determined according to comprehensive analysis of a plurality of influence factors of clothes drying, and then the positions of the clothes are intelligently adjusted, so that the clothes drying intelligence is improved, and the user experience is favorably enhanced.
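The weighted ranking that yields the drying order can be sketched as follows; the per-garment wind and temperature requirements and the influence weight coefficients are assumed example values.

```python
# Per-garment optimal wind power and temperature needed to dry within the same
# preset time (values are illustrative).
requirements = {
    "down jacket": {"wind": 0.9, "temp": 55.0},
    "t-shirt":     {"wind": 0.3, "temp": 35.0},
    "jeans":       {"wind": 0.7, "temp": 50.0},
}

# Preset influence weight coefficients of wind power and temperature on drying
# (assumed values; the patent leaves them to configuration).
W_WIND, W_TEMP = 0.6, 0.4

def drying_order(reqs):
    """Rank garments by a weighted combination of their wind and temperature demands;
    garments that are hardest to dry are placed nearest the fan and heater."""
    def demand(item):
        r = item[1]
        return W_WIND * r["wind"] + W_TEMP * (r["temp"] / 60.0)   # normalize temperature
    return [name for name, _ in sorted(reqs.items(), key=demand, reverse=True)]

print(drying_order(requirements))   # ['down jacket', 'jeans', 't-shirt']
```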
Referring to fig. 3, fig. 3 is a schematic flow chart of a control method of an intelligent wardrobe provided in an embodiment of the present application, where the control method of the intelligent wardrobe is applicable to an intelligent wardrobe, and the intelligent wardrobe includes a sensor device.
As shown in fig. 3, the control method of the intelligent wardrobe can be applied to the intelligent wardrobe shown in fig. 1A, and the control method of the intelligent wardrobe includes the following operations.
S301, acquiring an image of the target clothes through the sensor device.
S302, determining the material and/or type of the target clothes according to the image of the target clothes through the sensor equipment.
S303, judging whether the target clothes are specific clothes or not according to the material and/or type of the target clothes through the sensor equipment, wherein the specific clothes comprise socks, underwear and infant clothes.
S304, if the target clothes are specific clothes, matching a first care plan through the sensor device.
S305, if the target clothes are not specific clothes, matching a second care scheme through the sensor equipment, wherein the care health level of the first care scheme is higher than that of the second care scheme.
It should be noted that the second care plan may be a general laundry care plan, while the first care plan may be a specific care plan, such as specific disinfection, specific sterilization, and the like.
S306, controlling the intelligent wardrobe to nurse the target clothes according to the nursing scheme through the sensor equipment.
It can be seen that, in the control method of the intelligent wardrobe provided by the embodiment of the application, an image of the target clothes is acquired through the sensor device, and it is then identified whether the target clothes are specific clothes; when the target clothes are specific clothes, a care plan with a high health level is adopted, and when the target clothes are general clothes, a care plan with a lower health level is adopted. Different clothes are thus cared for with different care plans, which improves the intelligence of clothes care and makes clothes care more reasonable and healthier.
Referring to fig. 4, fig. 4 is a schematic structural diagram of an intelligent wardrobe 400 according to an embodiment of the present application, consistent with the embodiments shown in fig. 2 and fig. 3. As shown in fig. 4, the intelligent wardrobe 400 includes an application processor 410, a memory 420, a communication interface 430, and one or more programs 421, wherein the one or more programs 421 are stored in the memory 420 and configured to be executed by the application processor 410, and the one or more programs 421 include instructions for performing any of the steps of the above method embodiments. Additionally, the intelligent wardrobe includes a sensor device.
In one possible example, the program 421 includes instructions for performing the following steps: collecting information of a target garment by the sensor device; determining, by the sensor device, a care plan corresponding to the target garment according to the information of the target garment; controlling, by the sensor device, the intelligent wardrobe to care for the target garment according to the care plan.
It can be seen that, in the intelligent wardrobe provided by the embodiment of the application, information of the target clothes is collected through the sensor device; a care plan corresponding to the target clothes is then determined through the sensor device according to the information of the target clothes; and the intelligent wardrobe is controlled through the sensor device to care for the target clothes according to the care plan. Therefore, the intelligent wardrobe collects clothes information through the sensor device and then applies different care plans according to the clothes information, which improves the intelligence of clothes care and makes clothes care more reasonable and healthier.
In one possible example, in terms of collecting information of a target garment by the sensor device, the instructions in the program 421 are specifically for performing the following operations: capturing an image of the target garment by the sensor device; and/or collecting spectral information and/or thermal spectrum information of the target clothes through the sensor device; and/or collecting humidity of the target laundry by the sensor device.
In one possible example, in terms of determining, by the sensor device, a care plan corresponding to the target garment from the information of the target garment, the instructions in the program 421 are specifically for performing the following operations: determining, by the sensor device, a material and/or type of the target garment from the image and/or spectral information of the target garment; and/or determining, by the sensor device, a temperature of the target garment from spectral information and/or energy spectral information of the target garment; and/or determining the bacteria and/or mite content of the target clothes according to the image and/or the spectrum information of the target clothes through the sensor equipment; and determining a care scheme corresponding to the target clothes according to at least one of the material, type, temperature, humidity, bacteria content and mite content of the target clothes through the sensor equipment.
In one possible example, in determining the material and/or type of the target garment from the image of the target garment, the instructions in the program 421 are specifically configured to: extracting feature points of the key part features of the clothes in the image of the target clothes to obtain a first feature point set; extracting feature points of the global features of the clothes in the image of the target clothes to obtain a second feature point set; inputting the first feature point set into a preset neural network model to obtain a first evaluation value; inputting the second feature point set into the preset neural network model to obtain a second evaluation value; acquiring a first weight value corresponding to the clothes key part features and a second weight value corresponding to the clothes global features, wherein the first weight value is greater than the second weight value, and the sum of the first weight value and the second weight value is 1; performing weighting operation according to the first evaluation value, the second evaluation value, the first weight value and the second weight value to obtain a target evaluation value; acquiring a target image quality evaluation value corresponding to the image of the target clothes; determining a target material and/or target type evaluation adjustment coefficient corresponding to the target image quality evaluation value according to a mapping relation between a preset image quality evaluation value and a material and/or type evaluation adjustment coefficient of clothes; adjusting the target evaluation value according to the target material and/or target type evaluation adjustment coefficient to obtain a final evaluation value; and determining the material and/or type of the target clothes corresponding to the final evaluation value according to the mapping relation between the preset evaluation value and the material and/or type of the clothes.
In one possible example, in determining a care plan corresponding to the target garment based on at least one of material, type, temperature, humidity, bacteria content, and mite content of the target garment, the instructions in the program 421 are specifically configured to: judging whether the target clothes are specific clothes according to the material and/or type of the target clothes, wherein the specific clothes comprise socks, underwear and infant clothes; matching a first care plan if the target garment is a specific garment; matching a second care regimen if the target garment is not a specific garment, wherein the first care regimen has a higher care health level than the second care regimen.
In one possible example, after determining the material and/or type of the target garment from the image of the target garment, the instructions in the program 421 are further specifically configured to: identifying whether the target clothes have stains according to the image of the target clothes; if the target clothes have stains, matching a corresponding cleaning mode and a corresponding decontamination care product according to the type of the stains and the material and/or type of the target clothes; and controlling the intelligent wardrobe to clean the target clothes by adopting the cleaning mode and the decontamination care product.
In one possible example, after controlling the intelligent wardrobe to wash the target laundry using the washing pattern and the decontamination care product, the instructions in the program 421 are further specifically configured to: and selecting a corresponding airing mode according to the material and/or type of the target clothes to dry the target clothes, wherein the airing mode comprises drying and normal-temperature airing.
In one possible example, the instructions in the program 421 are also to perform the following operations: if the sensor equipment detects that a plurality of clothes to be cleaned exist in the intelligent wardrobe, determining a cleaning mode and a decontamination nursing product corresponding to each piece of clothes to be cleaned in the plurality of clothes to be cleaned according to the type of stains and the material and/or type of the clothes; controlling the intelligent wardrobe through the sensor device to regulate and control the positions of the plurality of clothes to be cleaned, and moving the clothes to be cleaned, which are cleaned in the same manner as the decontamination care product, in the plurality of clothes to be cleaned into the same working area of the intelligent wardrobe; and controlling the intelligent wardrobe to clean the target clothes in each working area by adopting a corresponding cleaning mode and a decontamination nursing product through the sensor equipment.
In one possible example, in controlling the intelligent wardrobe to regulate the positions of the plurality of clothes to be cleaned and moving the clothes to be cleaned that share the same cleaning mode and decontamination care product into the same working area of the intelligent wardrobe, the instructions in the program 421 are specifically configured to perform the following operations: acquiring a plurality of images of each working area in the intelligent wardrobe through the sensor device; performing overall region fusion on the plurality of images of each working area through the sensor device, determining the current position of each piece of clothes to be cleaned in the plurality of pieces of clothes to be cleaned, and determining, for each working area, the number of clothes to be cleaned whose cleaning mode and decontamination care product are the same; taking, through the sensor device, the working area containing the largest number of clothes to be cleaned with the same cleaning mode and decontamination care product as the cleaning area corresponding to that cleaning mode and decontamination care product; and moving, through the sensor device, the clothes in other working areas that require the same cleaning mode and decontamination care product from their current positions to the cleaning area, and moving the clothes in the cleaning area that require other cleaning modes and decontamination care products from their current positions to the corresponding other cleaning areas.
In one possible example, the instructions in the program 421 are also to perform the following operations: detecting whether the content of mites and/or bacteria on the target clothes reaches a preset threshold value through the sensor equipment; if the content of the mites and/or bacteria on the target clothes reaches a preset threshold value, controlling the intelligent wardrobe to sterilize the target clothes in a corresponding sterilization mode through the sensor equipment according to the characteristics of the mites and/or bacteria on the target clothes, and releasing corresponding fragrant products.
In one possible example, the instructions in the program 421 are also configured to perform the following operations: acquiring, by the sensor device, a label of the target garment from an image of the target garment; identifying, by the sensor device, a use occasion of the target garment according to the label of the target garment; and matching, by the sensor device, a corresponding care plan according to the use occasion of the target garment.
In one possible example, the instructions in the program 421 are also configured to perform the following operations: detecting the temperature of the target clothes and the temperature of the current environment through the sensor device, and detecting the humidity of the target clothes and the humidity of the current environment through the sensor device; determining, through the sensor device, an optimal drying mode and an optimal drying temperature for drying the target clothes within a preset time length according to the material and/or type of the target clothes, the temperature of the current environment, the humidity of the target clothes, and the humidity of the current environment; and drying the target clothes through the sensor device according to the optimal drying mode and the optimal drying temperature.
In one possible example, if there are multiple clothes to be dried, the instructions in the program 421 are further configured to: respectively determining the optimal wind power and the optimal temperature required by drying each piece of clothes to be dried within the same preset time by the sensor equipment; performing weight calculation through the sensor equipment according to the optimal wind power and the optimal temperature required by drying each piece of clothes to be dried to obtain the optimal wind power sequence and the optimal temperature sequence required by drying each piece of clothes to be dried; calculating the drying sequence of the clothes to be dried according to the influence weight coefficient of the preset wind power and temperature for drying the clothes to be dried and the optimal wind power sequence and the optimal temperature sequence required by drying each piece of clothes to be dried by the sensor equipment; controlling, by the sensor device, the intelligent wardrobe to adjust the position of the plurality of clothes to be dried according to the drying sequence.
It should be noted that, for the specific implementation process of the present embodiment, reference may be made to the specific implementation process described in the foregoing method embodiment, and a description thereof is omitted here.
The above description has introduced the solution of the embodiments of the present application mainly from the perspective of the method-side implementation process. It is understood that the intelligent wardrobe includes corresponding hardware structures and/or software modules for performing the above functions. Those of skill in the art will readily appreciate that the units and algorithm steps of the examples described in connection with the embodiments provided herein can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and the design constraints of the technical solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments of the present application, the intelligent wardrobe may be divided into functional units according to the above method examples; for example, each functional unit may correspond to one function, or two or more functions may be integrated into one processing unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit. It should be noted that the division of units in the embodiments of the present application is schematic and is merely a division of logical functions; other division manners are possible in actual implementation.
Referring to fig. 5, fig. 5 is a block diagram illustrating functional units of a control device 500 of an intelligent wardrobe according to an embodiment of the present application. As shown in fig. 5, the control device of the intelligent wardrobe comprises a processing unit 501 and a communication unit 502, wherein the processing unit 501 is configured to execute any step of the above method embodiments and, when performing data transmission such as sending or receiving, optionally invokes the communication unit 502 to complete the corresponding operation. In addition, the control device 500 of the intelligent wardrobe is applied to an intelligent wardrobe including a sensor device, as described in detail below.
In one possible example, the processing unit 501 is configured to: collecting information of a target garment by the sensor device; determining, by the sensor device, a care plan corresponding to the target garment according to the information of the target garment; controlling, by the sensor device, the intelligent wardrobe to care for the target garment according to the care plan.
It can be seen that, in the control device of the intelligent wardrobe provided by the embodiments of the present application, information of the target garment is collected by the sensor device; a care plan corresponding to the target garment is then determined by the sensor device according to that information; and the sensor device controls the intelligent wardrobe to care for the target garment according to the care plan. In this way, the control device of the intelligent wardrobe collects garment information through the sensor device and applies different care plans according to that information, which helps improve the intelligence of garment care and makes garment care more reasonable and healthier.
In one possible example, in terms of collecting information of a target garment by the sensor device, the processing unit 501 is specifically configured to: capturing an image of the target garment by the sensor device; and/or collecting spectral information and/or thermal spectrum information of the target clothes through the sensor device; and/or collecting humidity of the target laundry by the sensor device.
In one possible example, in terms of determining, by the sensor device, a care plan corresponding to the target garment from the information of the target garment, the processing unit 501 is specifically configured to: determining, by the sensor device, a material and/or type of the target garment from the image and/or spectral information of the target garment; and/or determining, by the sensor device, a temperature of the target garment from at least one of spectral information and/or energy spectral information of the target garment; and/or determining the bacteria and/or mite content of the target clothes according to the image and/or the spectrum information of the target clothes through the sensor equipment; and determining a care scheme corresponding to the target clothes according to at least one of the material, type, temperature, humidity, bacteria content and mite content of the target clothes through the sensor equipment.
In one possible example, in terms of determining the material and/or type of the target garment from the image of the target garment, the processing unit 501 is specifically configured to: extracting feature points of the key part features of the clothes in the image of the target clothes to obtain a first feature point set; extracting feature points of the global features of the clothes in the image of the target clothes to obtain a second feature point set; inputting the first feature point set into a preset neural network model to obtain a first evaluation value; inputting the second feature point set into the preset neural network model to obtain a second evaluation value; acquiring a first weight value corresponding to the clothes key part features and a second weight value corresponding to the clothes global features, wherein the first weight value is greater than the second weight value, and the sum of the first weight value and the second weight value is 1; performing weighting operation according to the first evaluation value, the second evaluation value, the first weight value and the second weight value to obtain a target evaluation value; acquiring a target image quality evaluation value corresponding to the image of the target clothes; determining a target material and/or target type evaluation adjustment coefficient corresponding to the target image quality evaluation value according to a mapping relation between a preset image quality evaluation value and a material and/or type evaluation adjustment coefficient of clothes; adjusting the target evaluation value according to the target material and/or target type evaluation adjustment coefficient to obtain a final evaluation value; and determining the material and/or type of the target clothes corresponding to the final evaluation value according to the mapping relation between the preset evaluation value and the material and/or type of the clothes.
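A minimal sketch of this two-branch weighted evaluation is given below; the scoring function stands in for the preset neural network model, and the weight values, adjustment coefficients, and score-to-material bands are assumed placeholders rather than values taken from the present application:

```python
from typing import Callable, Sequence

# Illustrative sketch: evaluate key-part and global feature points separately,
# combine the two evaluations with weights w1 > w2 (w1 + w2 == 1), adjust by an
# image-quality coefficient, and map the final value to a material.
def classify_material(key_part_points: Sequence[float],
                      global_points: Sequence[float],
                      evaluate: Callable[[Sequence[float]], float],
                      image_quality: float) -> str:
    first_eval = evaluate(key_part_points)    # evaluation of key-part features
    second_eval = evaluate(global_points)     # evaluation of global features
    w1, w2 = 0.7, 0.3                         # assumed weights, w1 > w2, sum to 1
    target_eval = w1 * first_eval + w2 * second_eval
    # Adjustment coefficient looked up from the image quality (assumed values).
    adjustment = 1.1 if image_quality >= 0.8 else 0.9
    final_eval = target_eval * adjustment
    # Preset mapping from the final evaluation value to a material (assumed bands).
    if final_eval >= 0.75:
        return "cotton"
    if final_eval >= 0.5:
        return "wool"
    return "synthetic"
```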
In one possible example, in terms of determining a care plan corresponding to the target laundry according to at least one of material, type, temperature, humidity, bacteria content, and mite content of the target laundry, the processing unit 501 is specifically configured to: judging whether the target clothes are specific clothes according to the material and/or type of the target clothes, wherein the specific clothes comprise socks, underwear and infant clothes; matching a first care plan if the target garment is a specific garment; matching a second care regimen if the target garment is not a specific garment, wherein the first care regimen has a higher care health level than the second care regimen.
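As a simple illustration, with assumed garment-type strings and care-plan identifiers, the rule reduces to a set-membership check:

```python
# Illustrative sketch: garments in the "specific" set receive the first
# (stricter, higher hygiene level) care plan. Names are assumed examples.
SPECIFIC_TYPES = {"socks", "underwear", "infant_clothes"}

def select_care_plan(garment_type: str) -> str:
    return "first_care_plan" if garment_type in SPECIFIC_TYPES else "second_care_plan"
```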
In one possible example, after determining the material and/or type of the target garment from the image of the target garment, the processing unit 501 is specifically configured to: identifying whether the target clothes have stains according to the image of the target clothes; if the target clothes have stains, matching a corresponding cleaning mode and a corresponding decontamination care product according to the type of the stains and the material and/or type of the target clothes; and controlling the intelligent wardrobe to clean the target clothes by adopting the cleaning mode and the decontamination care product.
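A minimal sketch of the matching step, assuming an illustrative lookup table keyed by stain type and fabric; the stain types, fabrics, and product names are placeholders:

```python
# Illustrative sketch: pick a cleaning mode and decontamination care product
# from the stain type and the garment material. All entries are assumptions.
CLEANING_TABLE = {
    ("oil", "cotton"): ("hot_wash", "degreasing_detergent"),
    ("oil", "silk"):   ("gentle_wash", "mild_degreaser"),
    ("ink", "cotton"): ("soak_wash", "oxygen_based_stain_remover"),
}

def match_cleaning(stain_type: str, material: str) -> tuple[str, str]:
    """Return (cleaning mode, decontamination care product) for a stained garment."""
    return CLEANING_TABLE.get((stain_type, material), ("standard_wash", "general_detergent"))
```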
In one possible example, after controlling the intelligent wardrobe to wash the target laundry using the washing pattern and the decontamination care product, the processing unit 501 is specifically configured to: and selecting a corresponding airing mode according to the material and/or type of the target clothes to dry the target clothes, wherein the airing mode comprises drying and normal-temperature airing.
In one possible example, the processing unit 501 is specifically configured to: if the sensor device detects that there are multiple pieces of clothing to be cleaned in the intelligent wardrobe, determining a cleaning mode and a decontamination care product corresponding to each piece of clothing to be cleaned according to the type of stain and the material and/or type of the clothing; controlling, by the sensor device, the intelligent wardrobe to adjust the positions of the multiple pieces of clothing to be cleaned and move those pieces that share the same cleaning mode and decontamination care product into the same working area of the intelligent wardrobe; and controlling, by the sensor device, the intelligent wardrobe to clean the clothing in each working area with the corresponding cleaning mode and decontamination care product.
In one possible example, in terms of controlling the intelligent wardrobe to adjust the positions of the multiple pieces of clothing to be cleaned and move those that share the same cleaning mode and decontamination care product into the same working area of the intelligent wardrobe, the processing unit 501 is specifically configured to: acquiring, by the sensor device, multiple images of each working area in the intelligent wardrobe; performing, by the sensor device, whole-region fusion on the multiple images of each working area, determining the current position of each piece of clothing to be cleaned, and determining, for each working area, the number of pieces of clothing whose cleaning mode and decontamination care product are the same; taking, by the sensor device, the working area containing the largest number of pieces of clothing with the same cleaning mode and decontamination care product as the cleaning area corresponding to that cleaning mode and care product; and moving, by the sensor device, the clothing in other working areas that shares this cleaning mode and care product from its current position into that cleaning area, while moving clothing in that cleaning area that requires a different cleaning mode or care product from its current position into the corresponding other cleaning areas.
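The zone-assignment logic could be sketched as follows, assuming each garment is described by an identifier, its current working area, and its (cleaning mode, care product) pair; the data layout and tie-breaking are assumptions made for the example:

```python
from collections import Counter, defaultdict

# Illustrative sketch: for each (cleaning mode, care product) group, keep the
# working area that already holds the most garments of that group as its
# cleaning area, and list the garments that must be moved into it.
def assign_wash_zones(items: list[tuple[str, str, tuple[str, str]]]) -> dict:
    """items: (garment_id, current_zone, (cleaning_mode, care_product))."""
    counts = Counter((group, zone) for _, zone, group in items)
    best_zone = {}
    for (group, zone), n in counts.items():
        if group not in best_zone or n > counts[(group, best_zone[group])]:
            best_zone[group] = zone          # zone with the largest count wins
    moves = defaultdict(list)
    for gid, zone, group in items:
        if zone != best_zone[group]:
            moves[best_zone[group]].append(gid)   # garments to move into the cleaning area
    return {"cleaning_area": best_zone, "moves": dict(moves)}
```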
In one possible example, the processing unit 501 is further configured to: detecting, by the sensor device, whether the mite and/or bacteria content on the target garment reaches a preset threshold; and, if the mite and/or bacteria content on the target garment reaches the preset threshold, controlling, by the sensor device, the intelligent wardrobe to sterilize the target garment in a corresponding sterilization mode according to the characteristics of the mites and/or bacteria on the target garment, and releasing a corresponding fragrance product.
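A minimal sketch of the threshold check, with assumed threshold values, sterilization modes, and fragrance products:

```python
# Illustrative sketch: trigger sterilization when mite or bacteria counts
# reach preset thresholds. All values and names below are assumptions.
def sterilization_action(mite_count: int, bacteria_count: int,
                         mite_threshold: int = 100,
                         bacteria_threshold: int = 1000):
    """Return a (sterilization mode, fragrance product) pair, or None if below both thresholds."""
    if mite_count >= mite_threshold:
        return "heat_and_uv_sterilization", "cedar_fragrance"
    if bacteria_count >= bacteria_threshold:
        return "ozone_sterilization", "fresh_cotton_fragrance"
    return None
```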
In one possible example, the processing unit 501 is further configured to: acquiring, by the sensor device, a label of the target garment from an image of the target garment; identifying, by the sensor device, the use occasion of the target garment according to its label; and matching, by the sensor device, a corresponding care plan according to the use occasion of the target garment.
In one possible example, the processing unit 501 is further configured to: detecting, by the sensor device, the temperature of the target garment and the temperature of the current environment, and detecting, by the sensor device, the humidity of the target garment and the humidity of the current environment; determining, by the sensor device, the optimal drying mode and the optimal drying temperature for drying the target garment within a preset duration according to the material and/or type of the target garment, the temperature of the current environment, the humidity of the target garment, and the humidity of the current environment; and drying, by the sensor device, the target garment according to the optimal drying mode and the optimal drying temperature.
In one possible example, if there are multiple pieces of clothing to be dried, the processing unit 501 is further configured to: determining, by the sensor device, the optimal wind power and optimal temperature required to dry each piece of clothing to be dried within the same preset duration; performing, by the sensor device, a weighting calculation on the optimal wind power and optimal temperature required by each piece to obtain an optimal wind power sequence and an optimal temperature sequence for the clothing to be dried; calculating, by the sensor device, the drying sequence of the clothing to be dried according to the preset influence weight coefficients of wind power and temperature on drying and the optimal wind power sequence and optimal temperature sequence; and controlling, by the sensor device, the intelligent wardrobe to adjust the positions of the multiple pieces of clothing to be dried according to the drying sequence.
The control device 500 of the intelligent wardrobe may further include a storage unit 503 for storing program codes and data of the intelligent wardrobe. The processing unit 501 may be a processor, the communication unit 502 may be a touch display screen or a transceiver, and the storage unit 503 may be a memory.
It can be understood that, since the method embodiments and the apparatus embodiments are different presentations of the same technical concept, the description of the method embodiments in the present application applies correspondingly to the apparatus embodiments and is not repeated here.
Embodiments of the present application further provide a computer storage medium, where the computer storage medium stores a computer program for electronic data exchange, and the computer program enables a computer to execute part or all of the steps of any one of the methods described in the above method embodiments, and the computer includes an electronic device and an intelligent wardrobe.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any of the methods as described in the above method embodiments. The computer program product can be a software installation package, and the computer comprises electronic equipment and an intelligent wardrobe.
It should be noted that, for simplicity of description, the foregoing method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative; the division of the units is only a division of logical functions, and other divisions may be used in practice; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices, or units, and may be electrical or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable memory. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned memory includes various media capable of storing program code, such as a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing associated hardware, and the program may be stored in a computer-readable memory, which may include a flash memory disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or the like.
The foregoing detailed description of the embodiments of the present application has been presented to illustrate the principles and implementations of the present application, and the above description of the embodiments is only provided to help understand the method and the core concept of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (9)

1. A control method of an intelligent wardrobe, the intelligent wardrobe comprising a sensor device, the method comprising:
collecting information of a target garment by the sensor device;
determining, by the sensor device, a care plan corresponding to the target garment according to the information of the target garment;
controlling, by the sensor device, the intelligent wardrobe to care for the target garment according to the care plan;
the collecting, by the sensor device, information of the target laundry includes:
capturing an image of the target garment by the sensor device;
the determining, by the sensor device, a care plan corresponding to the target garment according to the information of the target garment includes:
determining, by the sensor device, a material and/or type of the target garment from the image of the target garment;
determining, by the sensor device, a care plan corresponding to the target garment according to the material and/or type of the target garment;
the determining the material and/or type of the target clothing according to the image of the target clothing comprises:
extracting feature points of the key part features of the clothes in the image of the target clothes to obtain a first feature point set;
extracting feature points of the global features of the clothes in the image of the target clothes to obtain a second feature point set;
inputting the first feature point set into a preset neural network model to obtain a first evaluation value;
inputting the second feature point set into the preset neural network model to obtain a second evaluation value;
acquiring a first weight value corresponding to the clothes key part features and a second weight value corresponding to the clothes global features, wherein the first weight value is greater than the second weight value, and the sum of the first weight value and the second weight value is 1;
performing weighting operation according to the first evaluation value, the second evaluation value, the first weight value and the second weight value to obtain a target evaluation value;
acquiring a target image quality evaluation value corresponding to the image of the target clothes;
determining a target material and/or target type evaluation adjustment coefficient corresponding to the target image quality evaluation value according to a mapping relation between a preset image quality evaluation value and a material and/or type evaluation adjustment coefficient of clothes;
adjusting the target evaluation value according to the target material and/or target type evaluation adjustment coefficient to obtain a final evaluation value;
determining the material and/or type of the target clothes corresponding to the final evaluation value according to the mapping relation between the preset evaluation value and the material and/or type of the clothes;
the determining of the care scheme corresponding to the target clothes according to the material and/or type of the target clothes comprises: judging whether the target clothes are specific clothes according to the material and/or type of the target clothes, wherein the specific clothes comprise socks, underwear and infant clothes; matching a first care plan if the target garment is a specific garment; matching a second care regimen if the target garment is not a specific garment, wherein the first care regimen has a higher care health level than the second care regimen.
2. The method of claim 1, wherein said collecting information of a target garment by said sensor device further comprises:
collecting spectral information and/or thermal spectrum information of a target garment by the sensor device;
and/or collecting humidity of the target laundry by the sensor device.
3. The method of claim 2, wherein the determining, by the sensor device, a care plan corresponding to the target garment from the information of the target garment further comprises:
determining the material and/or type of the target clothes through the sensor equipment according to the spectral information of the target clothes;
and/or determining, by the sensor device, a temperature of the target garment from spectral information and/or energy spectral information of the target garment;
and/or determining the bacteria and/or mite content of the target clothes according to the image and/or the spectrum information of the target clothes through the sensor equipment;
determining, by the sensor device, a care regimen corresponding to the target laundry also according to at least one of a temperature, a humidity, a bacteria content, a mite content of the target laundry.
4. The method of claim 3, wherein after determining the material and/or type of the target garment from the image of the target garment, the method comprises:
identifying whether the target clothes have stains according to the image of the target clothes;
if the target clothes have stains, matching a corresponding cleaning mode and a corresponding decontamination care product according to the type of the stains and the material and/or type of the target clothes;
and controlling the intelligent wardrobe to clean the target clothes by adopting the cleaning mode and the decontamination care product.
5. The method of claim 4, further comprising:
if the sensor equipment detects that a plurality of clothes to be cleaned exist in the intelligent wardrobe, determining a cleaning mode and a decontamination nursing product corresponding to each piece of clothes to be cleaned in the plurality of clothes to be cleaned according to the type of stains and the material and/or type of the clothes;
controlling the intelligent wardrobe through the sensor device to regulate and control the positions of the plurality of clothes to be cleaned, and moving the clothes to be cleaned, which are cleaned in the same manner as the decontamination care product, in the plurality of clothes to be cleaned into the same working area of the intelligent wardrobe;
and controlling the intelligent wardrobe to clean the target clothes in each working area by adopting a corresponding cleaning mode and a decontamination nursing product through the sensor equipment.
6. The method of claim 5, wherein controlling the intelligent wardrobe with the sensor device to regulate the position of the plurality of items of clothing to be cleaned to move items of clothing to be cleaned in the same manner as the decontaminated care product to the same work area of the intelligent wardrobe comprises:
acquiring a plurality of images of each working area in the intelligent wardrobe through the sensor equipment;
performing integral region fusion on the plurality of images of each working area through the sensor equipment, determining the current position of each piece of clothes to be cleaned in the plurality of pieces of clothes to be cleaned, and determining the number of the clothes to be cleaned, of which the corresponding cleaning mode is the same as that of a decontamination care product, in each working area;
taking a working area with the largest number of clothes to be cleaned, which has the same corresponding cleaning mode and the same decontamination care product, as a cleaning area corresponding to the cleaning mode and the decontamination care product through the sensor equipment;
moving, by the sensor device, the clothes to be cleaned in other working areas that share the same cleaning manner and decontamination care product from their current positions into the cleaning area, and moving clothes to be cleaned in the cleaning area that correspond to other cleaning manners and decontamination care products from their current positions into the corresponding other cleaning areas.
7. Control device of an intelligent wardrobe, characterized in that the intelligent wardrobe comprises a sensor device, the device comprising a processing unit for:
collecting information of a target garment by the sensor device;
and determining, by the sensor device, a care plan corresponding to the target garment according to the information of the target garment;
and controlling, by the sensor device, the intelligent wardrobe to care for the target garment according to the care plan;
the collecting, by the sensor device, information of the target laundry includes:
capturing an image of the target garment by the sensor device;
the determining, by the sensor device, a care plan corresponding to the target garment according to the information of the target garment includes:
determining, by the sensor device, a material and/or type of the target garment from the image of the target garment;
determining, by the sensor device, a care plan corresponding to the target garment according to the material and/or type of the target garment;
the determining the material and/or type of the target clothing according to the image of the target clothing comprises:
extracting feature points of the key part features of the clothes in the image of the target clothes to obtain a first feature point set;
extracting feature points of the global features of the clothes in the image of the target clothes to obtain a second feature point set;
inputting the first feature point set into a preset neural network model to obtain a first evaluation value;
inputting the second feature point set into the preset neural network model to obtain a second evaluation value;
acquiring a first weight value corresponding to the clothes key part features and a second weight value corresponding to the clothes global features, wherein the first weight value is greater than the second weight value, and the sum of the first weight value and the second weight value is 1;
performing weighting operation according to the first evaluation value, the second evaluation value, the first weight value and the second weight value to obtain a target evaluation value;
acquiring a target image quality evaluation value corresponding to the image of the target clothes;
determining a target material and/or target type evaluation adjustment coefficient corresponding to the target image quality evaluation value according to a mapping relation between a preset image quality evaluation value and a material and/or type evaluation adjustment coefficient of clothes;
adjusting the target evaluation value according to the target material and/or target type evaluation adjustment coefficient to obtain a final evaluation value;
determining the material and/or type of the target clothes corresponding to the final evaluation value according to the mapping relation between the preset evaluation value and the material and/or type of the clothes;
the determining of the care scheme corresponding to the target clothes according to the material and/or type of the target clothes comprises: judging whether the target clothes are specific clothes according to the material and/or type of the target clothes, wherein the specific clothes comprise socks, underwear and infant clothes; matching a first care plan if the target garment is a specific garment; matching a second care regimen if the target garment is not a specific garment, wherein the first care regimen has a higher care health level than the second care regimen.
8. An intelligent wardrobe comprising a processor, a memory, a communication interface, and one or more programs stored in the memory and configured to be executed by the processor, the programs comprising instructions for performing the steps in the method of any of claims 1-6.
9. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which is executed by a processor to implement the method of any one of claims 1-6.
CN202010076808.3A 2020-01-23 2020-01-23 Control method of intelligent wardrobe and related product Active CN111273581B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010076808.3A CN111273581B (en) 2020-01-23 2020-01-23 Control method of intelligent wardrobe and related product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010076808.3A CN111273581B (en) 2020-01-23 2020-01-23 Control method of intelligent wardrobe and related product

Publications (2)

Publication Number Publication Date
CN111273581A CN111273581A (en) 2020-06-12
CN111273581B true CN111273581B (en) 2021-11-09

Family

ID=71003557

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010076808.3A Active CN111273581B (en) 2020-01-23 2020-01-23 Control method of intelligent wardrobe and related product

Country Status (1)

Country Link
CN (1) CN111273581B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112746472A (en) * 2020-12-28 2021-05-04 珠海格力电器股份有限公司 Control method and device of clothes care cabinet, clothes care cabinet and storage medium
CN114155691B (en) * 2021-11-30 2023-01-24 珠海格力电器股份有限公司 Prompt message generation method and device and electronic equipment
CN114040118B (en) * 2021-12-24 2022-10-28 珠海格力电器股份有限公司 Clothes processing method, device, electronic equipment and computer readable storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104730930A (en) * 2015-01-16 2015-06-24 小米科技有限责任公司 Clothes sorting method and device and clothes washing method and device
CN106702667A (en) * 2016-12-22 2017-05-24 Tcl家用电器(合肥)有限公司 Washing machine and intelligent clothes washing method thereof
CN106884278A (en) * 2017-04-17 2017-06-23 东华大学 A kind of multifunctional intellectual laundry care machine
CN107177951A (en) * 2016-03-09 2017-09-19 青岛海尔洗衣机有限公司 A kind of washing machine Intelligent antibacterial method
CN107918780A (en) * 2017-09-01 2018-04-17 中山大学 A kind of clothes species and attributive classification method based on critical point detection
CN109344841A (en) * 2018-08-10 2019-02-15 北京华捷艾米科技有限公司 A kind of clothes recognition methods and device

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20030021348A (en) * 2001-09-05 2003-03-15 주식회사 엘지이아이 method for controlling washing in drum-type washing machine
JP4415674B2 (en) * 2002-01-29 2010-02-17 株式会社ニコン Image forming state adjusting system, exposure method, exposure apparatus, program, and information recording medium
JP6894725B2 (en) * 2017-03-09 2021-06-30 キヤノン株式会社 Image processing device and its control method, program, storage medium
CN108288267B (en) * 2018-01-17 2022-04-05 中国矿业大学 Dark channel-based non-reference evaluation method for image definition of scanning electron microscope
CN108446651A (en) * 2018-03-27 2018-08-24 百度在线网络技术(北京)有限公司 Face identification method and device
CN109671023B (en) * 2019-01-24 2023-07-21 江苏大学 Face image super-resolution secondary reconstruction method


Also Published As

Publication number Publication date
CN111273581A (en) 2020-06-12

Similar Documents

Publication Publication Date Title
CN111273581B (en) Control method of intelligent wardrobe and related product
CN107893309A (en) Washing methods and device, clothes washing method and device
CN106884278A (en) A kind of multifunctional intellectual laundry care machine
CN106032613B (en) Intelligent clothes washing management device and method thereof
CN103820971B (en) A kind of intelligence of the washing machine based on wireless automatic identification technology washing methods
CN203334040U (en) Intelligent washing machine based on Internet of Things
EP3682049B1 (en) Personalized laundry appliance
CN107123019A (en) A kind of VR shopping commending systems and method based on physiological data and Emotion identification
CN103534722A (en) Article utilization
CN109112774A (en) Control method, device, storage medium, program product and the washing machine of washing machine
CN205893692U (en) Washing machine's control system
CN114466954B (en) Machine control method and system based on object recognition
CN107974799A (en) A kind of method and washing machine of intelligent recognition washing clothing
CN107904860A (en) Washing machine undergarment processing method and processing device
WO2019076228A1 (en) Washing control method and washing machine
CN110019322A (en) A kind of clothing data collection analysis method and system
CN107604593A (en) Washing machine wastewater processing method and processing device
WO2019036974A1 (en) Smart washing method, and washing machine
WO2020253463A1 (en) Air conditioner control method and device, and air conditioner
CN108095212A (en) A kind of modularization intelligent clothes and its implementation
CN201384237Y (en) Automatic clothing matching cabinet
CN205557093U (en) Dirty degree discriminating gear of wearable clothing
US10037672B1 (en) Smart garments that identify user changes
EP3382085A1 (en) Apparatus and method for a washing machine
CN110547606A (en) Intelligent clothes management system, wardrobe with intelligent clothes management system and working method of intelligent clothes management system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant