CN111429210A - Method, device and equipment for recommending clothes - Google Patents

Method, device and equipment for recommending clothes

Info

Publication number
CN111429210A
CN111429210A (application CN202010166996.9A)
Authority
CN
China
Prior art keywords
clothes
user
recommended
wearing
scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010166996.9A
Other languages
Chinese (zh)
Inventor
高进宝
苏明月
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Haier Smart Technology R&D Co Ltd
Haier Smart Home Co Ltd
Original Assignee
Qingdao Haier Smart Technology R&D Co Ltd
Haier Smart Home Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Haier Smart Technology R&D Co Ltd, Haier Smart Home Co Ltd filed Critical Qingdao Haier Smart Technology R&D Co Ltd
Priority to CN202010166996.9A priority Critical patent/CN111429210A/en
Publication of CN111429210A publication Critical patent/CN111429210A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0631Item recommendations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/953Querying, e.g. by the use of web search engines
    • G06F16/9532Query formulation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/953Querying, e.g. by the use of web search engines
    • G06F16/9535Search customisation based on user profiles and personalisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0641Shopping interfaces
    • G06Q30/0643Graphical representation of items or shoppers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038Image mosaicing, e.g. composing plane images from plane sub-images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/197Matching; Classification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/32Indexing scheme for image data processing or generation, in general involving image mosaicing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • General Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Marketing (AREA)
  • Economics (AREA)
  • Development Economics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Mathematical Physics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Ophthalmology & Optometry (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The application relates to the technical field of clothing recommendation and discloses a method for clothing recommendation. The method comprises the following steps: receiving scene information; determining the corresponding wearing clothes according to the scene information; querying the user's existing clothes based on the corresponding wearing clothes; and determining and feeding back recommended wearing clothes according to the query result. The scene information is determined from a user instruction and the wearing-clothes information is obtained according to the scene information, so the user's existing clothes can be recommended for the scene the user specifies. Because the user's dressing scene is taken into account and the recommendation is based on clothes the user already owns, the user can choose more easily, and the experience of obtaining clothing recommendations is improved. The application also discloses a device and equipment for clothing recommendation.

Description

Method, device and equipment for recommending clothes
Technical Field
The present application relates to the field of clothing recommendation technologies, and for example, to a method, an apparatus, and a device for clothing recommendation.
Background
With the improvement of living standards, the types and styles of clothes have become more and more abundant. In daily life, people spend a lot of time every day deciding what to wear, and in most cases they pick and match clothes at random. This is a real trouble for people who want to look good but have little fashion sense or little time, especially because clothes differ in suitable season, style and color, and some clothes are simply forgotten because they are seldom worn. These factors together make choosing clothes a laborious task.
In the process of implementing the embodiments of the present disclosure, at least the following problem was found in the related art: when people select clothes, recommendation information is lacking; the existing clothing recommendations are mostly fixed combinations or products of particular brands, generally do not consider the user's dressing scene, and can therefore hardly meet the user's actual needs.
Disclosure of Invention
The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is not an extensive overview, nor is it intended to identify key or critical elements or to delineate the scope of such embodiments; rather, it serves as a prelude to the more detailed description that is presented later.
The embodiments of the present disclosure provide a method, an apparatus and equipment for clothing recommendation, so that the clothing information recommended to a user better fits the user's dressing scene.
In some embodiments, the method comprises:
receiving scene information;
determining corresponding wearing clothes according to the scene information;
inquiring the existing clothes based on the corresponding wearing clothes;
and determining and feeding back recommended wearing clothes according to the query result.
In some embodiments, the apparatus for clothing recommendation includes: a processor and a memory storing program instructions, the processor being configured to perform the above method for clothing recommendation when executing the program instructions.
In some embodiments, the equipment comprises the above-described apparatus for clothing recommendation.
The method, apparatus and equipment for clothing recommendation provided by the embodiments of the present disclosure can achieve the following technical effects: the scene information is determined from a user instruction and the wearing-clothes information is obtained according to the scene information, so the user's existing clothes can be recommended for the scene the user specifies; the user's dressing scene is taken into account, the user can conveniently choose from the existing clothes, and the experience of obtaining clothing recommendations is improved.
The foregoing general description and the following description are exemplary and explanatory only and are not restrictive of the application.
Drawings
One or more embodiments are illustrated by way of example in the accompanying drawings, which are not limiting; in the drawings, elements having the same reference numerals denote like elements, and wherein:
FIG. 1 is a schematic diagram of a method for clothing recommendation provided by an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of another method for clothing recommendation provided by an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of an apparatus for clothing recommendation provided by an embodiment of the present disclosure.
Detailed Description
So that the manner in which the features and elements of the disclosed embodiments can be understood in detail, a more particular description of the disclosed embodiments, briefly summarized above, may be had by reference to the embodiments, some of which are illustrated in the appended drawings. In the following description of the technology, for purposes of explanation, numerous details are set forth in order to provide a thorough understanding of the disclosed embodiments. However, one or more embodiments may be practiced without these details. In other instances, well-known structures and devices may be shown in simplified form in order to simplify the drawing.
The terms "first," "second," and the like in the description and in the claims, and the above-described drawings of embodiments of the present disclosure, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances such that embodiments of the present disclosure described herein may be made. Furthermore, the terms "comprising" and "having," as well as any variations thereof, are intended to cover non-exclusive inclusions.
The term "plurality" means two or more unless otherwise specified.
In the embodiment of the present disclosure, the character "/" indicates that the preceding and following objects are in an or relationship. For example, A/B represents: a or B.
The term "and/or" is an associative relationship that describes objects, meaning that three relationships may exist. For example, a and/or B, represents: a or B, or A and B.
As shown in FIG. 1, an embodiment of the present disclosure provides a method for clothing recommendation, including:
step S101, receiving scene information;
s102, determining corresponding wearing clothes according to scene information;
step S103, inquiring the existing clothes based on the corresponding wearing clothes;
and step S104, determining and feeding back recommended wearing clothes according to the query result.
By adopting the method for clothing recommendation, the corresponding wearing-clothes information can be obtained from the received scene information, and the user's existing clothes can be recommended for the scene the user specifies. The user's dressing scene is taken into account, the user can conveniently choose from the existing clothes, and the experience of obtaining clothing recommendations is improved.
Optionally, receiving the scene information includes: receiving voice information sent by a user, and identifying scene keywords in the voice information to obtain the scene information.
When a user wants to go to a certain scene, for example a party scene, a work scene, an interview scene or an evening scene, the user sends voice information such as "I want to go to a party, suggest an outfit", "party, clothes" or "interview, clothes". The place the user wants to go is then matched from the voice information; for example, if the voice information contains "party", the scene is matched as the user going to a party.
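As a simple illustration of this keyword step, the sketch below matches a recognized utterance against a small scene-keyword table; the keyword list, scene names and function name are assumptions made for illustration, not the contents of the patent's preset database.

```python
from typing import Optional

# Illustrative scene keywords; the real preset database is not disclosed in the text.
SCENE_KEYWORDS = {
    "party": "party",
    "work": "work",
    "interview": "interview",
    "evening": "evening",
    "sports": "sports",
}

def extract_scene(utterance: str) -> Optional[str]:
    """Return the first scene whose keyword appears in the recognized speech text."""
    text = utterance.lower()
    for keyword, scene in SCENE_KEYWORDS.items():
        if keyword in text:
            return scene
    return None  # no scene matched; the flow returns to receiving a new instruction

print(extract_scene("I want to go to a party, suggest an outfit"))  # -> "party"
```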
Optionally, determining the corresponding wearing clothes according to the scene information includes: matching, in a preset clothing database, the wearing clothes corresponding to the scene keywords in the scene information.
In some embodiments, the preset clothing database stores the wearing clothes corresponding to different scenes and the clothing attribute information of those clothes. Optionally, clothes with the same clothing attribute information are selected from the matched clothes to form an outfit scheme. Optionally, the same clothing attribute information includes one or more of the same style, the same model, the same suitable age, the same suitable scene, the same suitable weather, the same suitable gender, the same suitable height and the same suitable weight. Combining clothes that share attribute information makes the outfit scheme more reasonable and better suited to the user's needs.
Optionally, the user's existing clothes and the corresponding clothing attribute information are obtained by querying an existing-clothes database, or by reading the RFID (Radio Frequency Identification) tags of the clothes inside an intelligent wardrobe. Optionally, the clothing attribute information includes the style and model as well as attributes such as suitable age, suitable scene, suitable weather, suitable gender, suitable height and suitable weight, where the suitable age is a preset age value, the suitable scene is a preset scene, the suitable weather is preset weather, the suitable gender is a preset gender, the suitable height is a preset height value, and the suitable weight is a preset weight value. Presetting the age, scene, weather, gender, height and weight for each garment allows wearing clothes better suited to the user to be matched.
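A minimal sketch of such a garment record is given below; the field names, attribute ranges and the wardrobe_garments() stand-in are assumptions, and in practice the same records could come from the existing-clothes database or from the RFID reader of the intelligent wardrobe.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Garment:
    """One existing garment together with its clothing attribute information."""
    name: str
    style: str                                   # e.g. "formal", "sport"
    suitable_scenes: Tuple[str, ...]             # preset scenes the garment fits
    suitable_gender: Optional[str] = None        # None means unisex
    suitable_age: Tuple[int, int] = (0, 120)     # inclusive age range
    suitable_height_cm: Tuple[int, int] = (140, 200)
    suitable_weight_kg: Tuple[int, int] = (35, 120)

def wardrobe_garments() -> List[Garment]:
    """Stand-in for querying the existing-clothes database or the RFID reader."""
    return [
        Garment("white shirt", "formal", ("interview", "work")),
        Garment("blue shirt", "formal", ("interview", "work")),
        Garment("bright track suit", "sport", ("sports",), suitable_age=(16, 30)),
    ]

print([g.name for g in wardrobe_garments() if "interview" in g.suitable_scenes])
```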
Optionally, determining the recommended wearing clothes according to the query result includes:
determining all the wearing clothes as recommended wearing clothes under the condition that all the wearing clothes are found in the existing clothes; or,
under the condition that part of the wearing clothes is found in the existing clothes, the found wearing clothes are determined to be recommended wearing clothes; or,
determining the corresponding existing clothes as recommended wearing clothes under the condition that the found clothes in the existing clothes have a corresponding relation with the wearing clothes; or,
and under the condition that the wearing clothes are not found in the existing clothes, acquiring the purchasable clothes in the electronic mall according to the scene information, and determining the purchasable clothes as the recommended wearing clothes.
In some embodiments, according to the wearing clothes for an interview scene, all the clothes corresponding to the interview scene are matched in the existing clothes at the same time. For example, the interview scene includes a formal suit jacket, formal suit trousers, formal leather shoes, a white shirt, a tie and the like; if the existing clothes include all of these wearing clothes, all of them are fed back to the user as the recommended wearing clothes.
In some embodiments, only part of the wearing clothes corresponding to the interview scene is found in the existing clothes. For example, the interview scene includes a striped shirt, a gray shirt, a white shirt and the like, but the existing clothes only contain a white shirt; in that case only the white shirt is fed back to the user as the recommended wearing clothes.
In some embodiments, the wearing clothes corresponding to the interview scene are not found in the existing clothes, but clothes corresponding to those wearing clothes are found. For example, the interview scene includes a striped shirt; the existing clothes contain no striped shirt but do include a blue shirt that corresponds to it, so the blue shirt is fed back to the user as the recommended wearing clothes.
In some embodiments, when the wearing clothes corresponding to the interview scene are not found in the existing clothes, or no intelligent wardrobe is bound, or no existing clothes of the user are found, the search engine interface of an electronic mall (for example Taobao, JD.com or another online store) is called. Using the wearing clothes corresponding to the interview scene, such as a striped shirt, a gray shirt or a white shirt, as keywords, the matched search results are fed back to the user as recommended wearing clothes, and purchase links are also sent to the user so that the user can conveniently place an order.
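The four query-result branches above can be summarized in a small sketch; the garment names, the correspondence table and the search_mall() stand-in for the e-commerce search interface are illustrative assumptions, not the patent's actual data or API.

```python
from typing import Dict, List, Set

# Assumed correspondence table: a required garment and an owned equivalent.
EQUIVALENTS: Dict[str, str] = {"striped shirt": "blue shirt"}

def search_mall(items: List[str], scene: str) -> List[str]:
    """Stand-in for calling an electronic-mall search API with the items as keywords."""
    return [f"purchasable {item} for {scene}" for item in items]

def recommend(scene: str, scene_outfit: List[str], owned: Set[str]) -> List[str]:
    found = [g for g in scene_outfit if g in owned]
    if len(found) == len(scene_outfit):           # every required garment is owned
        return found
    substitutes = [EQUIVALENTS[g] for g in scene_outfit
                   if g not in owned and EQUIVALENTS.get(g) in owned]
    if found or substitutes:                      # partial match and/or corresponding items
        return found + substitutes
    return search_mall(scene_outfit, scene)       # nothing owned: fall back to the mall

print(recommend("interview",
                ["formal suit", "white shirt", "striped shirt"],
                {"white shirt", "blue shirt"}))   # -> ['white shirt', 'blue shirt']
```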
Optionally, the method for clothing recommendation further comprises: classifying the recommended wearing clothes and feeding back the classification result. Optionally, when the recommended wearing clothes are fed back to the user, they are classified, for example into existing clothes and purchasable clothes, and both categories are fed back to the user. This makes it easy for the user to either pick a garment from the existing clothes or purchase a suitable new garment.
Optionally, when searching through the electronic mall, the user attribute information, that is, the user's age, sex, height and weight, is also considered, and the wearing clothes matching the user attribute information are searched for and fed back to the user as recommended wearing clothes.
Optionally, the user attribute information is obtained by querying a user information database. Optionally, the user attribute information is entered when the user registers and includes the user's name, age, sex, height, weight, face image and the like; with this information, recommended clothing that better matches the user can be obtained.
Optionally, feeding back the recommended wearing clothes comprises: displaying the recommended wearing clothes to the user through a display interface; or sending the corresponding wearing clothes to a terminal device, so that the terminal device displays the recommended wearing clothes to the user.
In some embodiments, when the received scene information corresponds to multiple scene types, the scene types are fed back to the user, the user selects a specific scene type, and the corresponding wearing clothes are fed back according to the scene type the user selects. For example, when the scene the user wants to go to is a gathering, there are several types to choose from, such as a colleague gathering, a gathering of friends or a class reunion; the terminal device returns these classifications to the user on screen, and the user can make the further selection by text or by voice. In this way the wearing clothes corresponding to the scene can be matched more accurately.
Optionally, the method for clothing recommendation further comprises: matching the recommended wearing clothes according to one or more of the user's age, sex, height and weight, and feeding back a clothes matching scheme.
Optionally, clothes matching the user's age are selected from the recommended wearing clothes for matching, where matching the user's age means that the suitable age in the clothing attribute information includes the age of the user. In some embodiments, in a sports scene, the recommended wearing clothes obtained are, for example, sportswear, sports trousers and sports shoes; the user attribute information shows the user is 23 years old, so the bright-colored sportswear, sports trousers and sports shoes in the recommended wearing clothes that match the age of 23 are combined into an outfit, and the outfit scheme is fed back to the user.
Optionally, clothes matching the user's gender are selected from the recommended wearing clothes for matching, where matching the user's gender means that the suitable gender in the clothing attribute information is the same as the gender of the user. In some embodiments, in a sports scene, the recommended wearing clothes obtained are sportswear, sports trousers and sports shoes; the user attribute information shows the user is male, so the men's sportswear, men's sports trousers and men's sports shoes in the recommended wearing clothes are combined into an outfit, and the outfit scheme is fed back to the user.
Optionally, clothes matching both the user's age and gender are selected from the recommended wearing clothes for matching; the age and gender matching conditions are as defined above. In some embodiments, in a sports scene, the user attribute information shows the user is a 23-year-old male, so the bright-colored men's sportswear, men's sports trousers and men's sports shoes in the recommended wearing clothes that match a 23-year-old male are combined into an outfit, and the outfit scheme is fed back to the user.
Optionally, clothes matching the user's age, gender, height and weight are selected from the recommended wearing clothes for matching, where height matching means that the suitable height in the clothing attribute information includes the user's height, and weight matching means that the suitable weight includes the user's weight. In some embodiments, in a sports scene, the user information shows the user is female, 23 years old, 1.58 m tall and 48 kg; the women's bright-colored slim-fit sportswear and women's bright-colored sports shoes in the recommended wearing clothes that match these attributes are combined into an outfit, and the outfit scheme is fed back to the user.
In the above schemes, the recommended wearing clothes are matched according to one or more of the user's age, gender, height and weight; taking the user attribute information into account for a specific scene makes it possible to put together clothes that suit the user better and to give a more fitting and more accurate recommendation.
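A minimal sketch of this attribute filter is shown below; the garment dictionaries, attribute ranges and helper names are illustrative assumptions.

```python
from typing import Dict, List

def matches_user(garment: Dict, user: Dict) -> bool:
    """A garment matches when every range it declares contains the user's value."""
    ok_age = garment["age_range"][0] <= user["age"] <= garment["age_range"][1]
    ok_gender = garment["gender"] in (None, user["gender"])       # None means unisex
    ok_height = garment["height_range"][0] <= user["height_cm"] <= garment["height_range"][1]
    ok_weight = garment["weight_range"][0] <= user["weight_kg"] <= garment["weight_range"][1]
    return ok_age and ok_gender and ok_height and ok_weight

def build_outfit(recommended: List[Dict], user: Dict) -> List[str]:
    return [g["name"] for g in recommended if matches_user(g, user)]

user = {"age": 23, "gender": "female", "height_cm": 158, "weight_kg": 48}
recommended = [
    {"name": "women's bright slim-fit sportswear", "gender": "female",
     "age_range": (16, 30), "height_range": (150, 170), "weight_range": (40, 60)},
    {"name": "men's sports shoes", "gender": "male",
     "age_range": (16, 60), "height_range": (160, 190), "weight_range": (50, 100)},
]
print(build_outfit(recommended, user))  # only the garment matching the female user remains
```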
In practical application, as shown in FIG. 2, the specific steps by which the user obtains the recommended clothes are as follows:
step S201, receiving a user voice instruction;
step S202, judging whether scene information is matched; if yes, step S203 is executed, otherwise the process returns to step S201.
Step S203, inquiring a preset clothing database to obtain wearing clothing matched with scene information;
step S204, judging whether an intelligent wardrobe is connected; if yes, step S205 is executed, otherwise step S206 is executed.
step S205, querying the existing-clothes information in the intelligent wardrobe, taking the intersection of the wearing clothes and the existing clothes in the intelligent wardrobe as recommended wearing clothes, and at the same time searching the electronic mall for purchasable wearing clothes as recommended wearing clothes; then step S207 is executed;
step S206, searching for the purchasable wearing clothes in the electronic mall as recommended wearing clothes, and then executing step S207;
and step S207, displaying the recommended clothing to the user through a display interface after classifying the clothing.
Through the method for clothing recommendation disclosed in this embodiment, the user obtains the wearing clothes corresponding to the scene; the wearing clothes for a specific scene can be matched quickly and accurately, and more accurate, professional dressing advice is provided to the user.
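The sketch below strings the FIG. 2 steps together, including the step S207 classification into existing and purchasable clothes; all helper names and the stand-in mall search are assumptions.

```python
from typing import Dict, List, Optional, Set

def recommend_for_voice(utterance: str,
                        scene_db: Dict[str, List[str]],
                        wardrobe: Optional[Set[str]]) -> Dict[str, List[str]]:
    scene = next((s for s in scene_db if s in utterance.lower()), None)   # S202
    if scene is None:
        return {}                                      # back to S201: wait for a new instruction
    outfit = scene_db[scene]                           # S203: preset clothing database
    purchasable = [f"buy: {g}" for g in outfit]        # S205/S206: mall search stand-in
    owned = [g for g in outfit if g in wardrobe] if wardrobe else []      # S204/S205
    return {"existing clothes": owned, "purchasable clothes": purchasable}  # S207

print(recommend_for_voice("interview, clothes",
                          {"interview": ["formal suit", "white shirt"]},
                          {"white shirt"}))
```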
In some embodiments, after the recommended clothing is obtained, it may be merged with a real-time image of the user to implement virtual fitting. Optionally, the real-time image of the user is acquired by the terminal device; the terminal device uploads it to the server, the server performs the splicing and feeds the spliced data back to the terminal device, or the terminal device performs the splicing and display directly. The recommended clothing referred to here is the recommended wearing clothes determined in the foregoing embodiments.
Merging the recommended clothing obtained by the user with the corresponding real-time image of the user includes:
and acquiring a user head portrait, and splicing the recommended clothes acquired by the user with the user head portrait.
Optionally, acquiring a user head portrait includes: the method comprises the steps of obtaining a real-time image of a user, carrying out face detection on the real-time image of the user through a face classifier, and extracting a corresponding face area image to obtain a head portrait of the user.
Optionally, splicing the recommended clothing obtained by the user with the head portrait of the user includes:
the method comprises the steps of detecting a user head portrait through a human eye classifier, extracting a corresponding human eye region image, obtaining an eye distance according to the human eye region image, and splicing recommended clothes obtained by a user and the corresponding user head portrait according to the eye distance.
Optionally, splicing the recommended clothing obtained by the user with the corresponding head portrait of the user according to the eye distance includes:
obtaining a corresponding clothes image according to recommended clothes obtained by a user;
adjusting the size of the head portrait of the user according to the inter-eye distance, and then splicing the head portrait of the user with the corresponding clothes image; and/or,
and matching to obtain a clothes image corresponding to the eye distance in an image library corresponding to the recommended clothes obtained by the user, and then splicing the clothes image and the corresponding head portrait of the user.
Optionally, the inter-eye distance is the distance between the pupil center points of the two eyes in the human eye region image. Determining the pupil center point is mature prior art and is not described here again. After the coordinates of the two pupil center points are obtained, the distance between them can be calculated from the coordinates and used as the inter-eye distance. Calculating the distance between two points is likewise well-established prior art and is not described here.
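The detection steps above can be sketched with OpenCV's Haar cascade classifiers; approximating each pupil centre by the centre of the detected eye box is an assumption made here for brevity, since the patent treats pupil localisation as known prior art.

```python
import math
from typing import Optional

import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def eye_distance(frame) -> Optional[float]:
    """Detect the face region, then the eyes, and return the inter-eye distance in pixels."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.3, 5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]                               # face region, i.e. the user head portrait
    eyes = eye_cascade.detectMultiScale(gray[y:y + h, x:x + w])
    if len(eyes) < 2:
        return None
    # take the two leftmost eye boxes and use their centres as approximate pupil centres
    centres = sorted((ex + ew / 2, ey + eh / 2) for ex, ey, ew, eh in eyes)[:2]
    (x1, y1), (x2, y2) = centres
    return math.hypot(x2 - x1, y2 - y1)
```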
When a user performs virtual fitting through a terminal device, the image or video stream captured by the device often changes with the user's posture or with the distance between the user and the device, so the size of the user's head portrait changes: the closer the device is to the face, the larger the captured head portrait, and the farther away, the smaller. This is a particular problem when the user holds a device such as a smartphone or tablet. If the head portrait were spliced directly with the clothes picture, the head could appear too large or too small and spoil the fitting effect. Therefore, when splicing, the inter-eye distance is obtained from the human eye region image; each garment is preset with clothes images of different sizes, and a clothes image of the corresponding size is matched according to the inter-eye distance measured in the captured real-time image. Alternatively, when the inter-eye distance is larger than a set threshold, the user's head portrait is reduced until its inter-eye distance falls within the set range. The recommended clothes obtained by the user are then spliced with the adjusted head portrait, or the head portrait is spliced with the clothes image of the corresponding size. Because the size of the head portrait matches the size of the clothes image before splicing, the splicing effect is improved, the user can better try on the recommended clothes virtually, and the user experience of clothing recommendation is better.
Because the image resolution obtained by different users' terminal devices may differ, and the clothes images are not stored at high resolution in order to save storage space, a clothes image looks poor after being enlarged. To preserve image quality, when the inter-eye distance is larger than the set value corresponding to the clothes image, the head portrait is too large for the clothes image, so the head portrait is scaled down and the spliced image stays as sharp as possible. When the inter-eye distance is smaller than the set value corresponding to the clothes image, the head portrait is too small for the clothes image, so a clothes image of the corresponding size is matched instead, keeping the head portrait and the clothes image in proportion and the image sharp.
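A compact sketch of this size-matching rule follows; the reference eye distances, file names and threshold are assumed values, not the preset sizes from the patent.

```python
import cv2

# Assumed table of preset clothes images keyed by the inter-eye distance they were drawn for.
CLOTHES_SIZES = {40: "shirt_small.png", 60: "shirt_medium.png", 80: "shirt_large.png"}

def fit_head_to_clothes(head_img, eye_dist: float, target_eye_dist: float = 60.0):
    """Shrink the head portrait if its eye distance is too large, then pick the closest-size clothes image."""
    if eye_dist > target_eye_dist:
        scale = target_eye_dist / eye_dist            # reduce the head portrait, never enlarge the clothes image
        head_img = cv2.resize(head_img, None, fx=scale, fy=scale,
                              interpolation=cv2.INTER_AREA)
        eye_dist = target_eye_dist
    best = min(CLOTHES_SIZES, key=lambda d: abs(d - eye_dist))
    return head_img, CLOTHES_SIZES[best]
```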
Optionally, the virtual fitting may be shown as an image or as a video stream. For a video stream, the user can show the fitting effect in real time through a camera of the terminal device, for example the front-facing camera. When the fitting effect is shown as a video stream, a number of frames are extracted from the stream at a set time interval, and the user's head portrait is then combined with the clothes image. In some embodiments, the method for virtual fitting based on the recommended clothes further comprises detecting the trend of the inter-eye distance change:
computing an inter-eye distance change speed factor (the formula appears in the original only as an embedded equation image, Figure BDA0002407788240000091), and then computing the predicted inter-eye distance E at the next moment (embedded equation image, Figure BDA0002407788240000092). The change speed factor (embedded image, Figure BDA0002407788240000093) characterizes how fast the inter-eye distance is changing. In these formulas, Y_d is the inter-eye distance of the user head portrait captured at the current moment, Y_q is the inter-eye distance of the head portrait captured at the previous moment, S_min is the minimum value of the inter-eye distance change speed, S_max is the maximum value of the inter-eye distance change speed, T is the interval from the previous moment to the current moment, Y_t is the inter-eye distance of the head portrait captured at the t-th moment, Y_{t-1} is the inter-eye distance at the (t-1)-th moment, Y_max is the set maximum inter-eye distance value, Y_min is the set minimum inter-eye distance value, 2 ≤ t ≤ n with n a positive integer, and S_max > S_min. This prediction method takes into account both the movement trend between the terminal device and the user and the speed at which the inter-eye distance changes, so the inter-eye distance can be predicted accurately.
Optionally, the inter-eye distance change speed S is obtained by calculation (embedded equation image, Figure BDA0002407788240000101) from Y_t, Y_{t-1} and the interval time between them.
After obtaining the recommended clothes, the change trend of the user's inter-eye distance is detected; a clothes image of the corresponding size is then matched according to the predicted inter-eye distance E for the next moment, and the user's head portrait is spliced with the clothes image for each corresponding frame in the video stream to obtain the virtual-fitting video stream. When the user shows the virtual fitting effect through video stream data, the distance between the user and the terminal device keeps changing, so the size of the head portrait changes; to obtain a good splicing effect, the size of the clothes image must also switch as the head portrait changes. Predicting the trend of the inter-eye distance in the above way and then matching the corresponding clothes image according to the predicted value E before splicing allows the virtual fitting effect of the recommended clothes to be shown more smoothly as a video stream, further improving the user's experience of clothing recommendation.
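Because the two formulas exist only as embedded images, the sketch below is a plausible stand-in built solely from the variables defined in the text (Y_d, Y_q, T, S_min, S_max, Y_min, Y_max); it illustrates the idea of extrapolating the inter-eye distance with a clamped speed factor and is not the patented formula itself.

```python
def predict_eye_distance(y_d: float, y_q: float, t: float,
                         s_min: float, s_max: float,
                         y_min: float, y_max: float) -> float:
    """Assumed stand-in: extrapolate the inter-eye distance one interval ahead."""
    speed = (y_d - y_q) / t                                              # change per unit time
    factor = max(0.0, min(1.0, (abs(speed) - s_min) / (s_max - s_min)))  # clamp to [0, 1]
    predicted = y_d + factor * (y_d - y_q)                               # follow the current movement trend
    return max(y_min, min(y_max, predicted))                             # keep within the set range

print(predict_eye_distance(62.0, 58.0, 0.2, 5.0, 40.0, 30.0, 90.0))
```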
As shown in FIG. 3, an embodiment of the present disclosure provides an apparatus for clothing recommendation, including a processor (processor) 100 and a memory (memory) 101. Optionally, the apparatus may also include a communication interface (Communication Interface) 102 and a bus 103. The processor 100, the communication interface 102 and the memory 101 may communicate with each other via the bus 103. The communication interface 102 may be used for information transfer. The processor 100 may call logic instructions in the memory 101 to perform the method for clothing recommendation of the above-described embodiments.
In addition, the logic instructions in the memory 101 may be implemented in the form of software functional units and stored in a computer readable storage medium when the logic instructions are sold or used as independent products.
The memory 101, as a computer-readable storage medium, may be used for storing software programs and computer-executable programs, such as the program instructions/modules corresponding to the methods in the embodiments of the present disclosure. The processor 100 executes functional applications and data processing, i.e., implements the method for clothing recommendation in the above-described embodiments, by executing the program instructions/modules stored in the memory 101.
The memory 101 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the terminal device, and the like. In addition, the memory 101 may include a high-speed random access memory, and may also include a nonvolatile memory.
The apparatus for clothing recommendation described above can obtain the corresponding wearing-clothes information from the received scene information and recommend the user's existing clothes for the scene the user specifies; the user's dressing scene is taken into account, the user can conveniently choose from the existing clothes, and the experience of obtaining clothing recommendations is improved.
The embodiments of the present disclosure provide equipment comprising the above-described apparatus for clothing recommendation.
Optionally, the equipment comprises a server or a terminal device.
In some embodiments, the terminal device is a smartphone, a television, a speaker with a screen, a refrigerator with a screen, an intelligent wardrobe, or the like.
Optionally, the terminal device receives voice information, key information, touch screen information or image information, etc. input by the user.
Optionally, the recommended clothing is viewed through a display interface of the terminal device.
Optionally, when the terminal device is not an intelligent wardrobe, the terminal device binds the intelligent wardrobe to obtain the existing clothes and the corresponding clothes attribute information thereof.
Optionally, the user enters user attribute information into the user information database through the terminal device, and the user enters existing clothes and clothes attribute information corresponding to the existing clothes into the existing clothes database through the terminal device.
In some embodiments, when the equipment is a server, the server receives the voice information, key information, touch-screen information or image information input by the user through the terminal device.
Optionally, the user sets the clothing database in the server in advance.
Optionally, the server obtains existing clothes and corresponding clothes attribute information of the intelligent wardrobe through the terminal device or receives the existing clothes and corresponding clothes attribute information input by the user.
Optionally, the server displays the recommendation result to the user through a display interface of the terminal device.
The equipment provided by the embodiments of the present disclosure can obtain the corresponding wearing-clothes information from the received scene information and recommend the user's existing clothes for the scene the user specifies; the user's dressing scene is taken into account, the user can conveniently choose from the existing clothes, and the experience of obtaining clothing recommendations is improved.
The embodiments of the present disclosure provide a computer-readable storage medium storing computer-executable instructions configured to perform the above-described method for clothing recommendation.
The embodiments of the present disclosure provide a computer program product comprising a computer program stored on a computer-readable storage medium, the computer program comprising program instructions which, when executed by a computer, cause the computer to perform the above-described method for clothing recommendation.
The computer-readable storage medium described above may be a transitory computer-readable storage medium or a non-transitory computer-readable storage medium.
The technical solution of the embodiments of the present disclosure may be embodied in the form of a software product, where the computer software product is stored in a storage medium and includes one or more instructions to enable a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method of the embodiments of the present disclosure. And the aforementioned storage medium may be a non-transitory storage medium comprising: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes, and may also be a transient storage medium.
The above description and drawings sufficiently illustrate embodiments of the disclosure to enable those skilled in the art to practice them. Other embodiments may incorporate structural, logical, electrical, process, and other changes. The examples merely typify possible variations. Individual components and functions are optional unless explicitly required, and the sequence of operations may vary. Portions and features of some embodiments may be included in or substituted for those of others. Furthermore, the words used in the specification are words of description only and are not intended to limit the claims. As used in the description of the embodiments and the claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Similarly, the term "and/or" as used in this application is meant to encompass any and all possible combinations of one or more of the associated listed. Furthermore, the terms "comprises" and/or "comprising," when used in this application, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Without further limitation, an element defined by the phrase "comprising an …" does not exclude the presence of other like elements in a process, method or apparatus that comprises the element. In this document, each embodiment may be described with emphasis on differences from other embodiments, and the same and similar parts between the respective embodiments may be referred to each other. For methods, products, etc. of the embodiment disclosures, reference may be made to the description of the method section for relevance if it corresponds to the method section of the embodiment disclosure.
Those of skill in the art would appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software may depend upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosed embodiments. It can be clearly understood by the skilled person that, for convenience and brevity of description, the specific working processes of the system, the apparatus and the unit described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments disclosed herein, the disclosed methods, products (including but not limited to devices, apparatuses, etc.) may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units may be merely a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form. The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to implement the present embodiment. In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. In the description corresponding to the flowcharts and block diagrams in the figures, operations or steps corresponding to different blocks may also occur in different orders than disclosed in the description, and sometimes there is no specific order between the different operations or steps. For example, two sequential operations or steps may in fact be executed substantially concurrently, or they may sometimes be executed in the reverse order, depending upon the functionality involved. Each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.

Claims (10)

1. A method for clothing recommendation, comprising:
receiving scene information;
determining corresponding wearing clothes according to the scene information;
inquiring the existing clothes based on the corresponding wearing clothes;
and determining and feeding back recommended wearing clothes according to the query result.
2. The method of claim 1, wherein the receiving the scene information comprises:
and receiving voice information sent by a user, and identifying scene keywords according to the voice information to obtain the scene information.
3. The method of claim 1, wherein determining the corresponding clothing item according to the scene information comprises:
matching the wearing clothes corresponding to the scene keywords in the scene information in a preset clothes database.
4. The method of claim 1, wherein determining a recommended clothing item based on the query result comprises:
determining all the wearing clothes as recommended wearing clothes under the condition that all the wearing clothes are found in the existing clothes; or,
under the condition that part of the wearing clothes is found in the existing clothes, the found wearing clothes are determined to be recommended wearing clothes; or,
determining that the corresponding existing clothes are recommended wearing clothes under the condition that the found clothes in the existing clothes have a corresponding relation with the wearing clothes; or,
and under the condition that the wearing clothes are not found in the existing clothes, acquiring purchasable clothes in an electronic mall according to the scene information, and determining the purchasable clothes to be recommended wearing clothes.
5. The method of claim 1, wherein the feedback of the recommended clothing item comprises:
displaying the recommended wearing clothes to a user through a display interface; or,
and sending the corresponding wearing clothes to terminal equipment so that the terminal equipment displays the recommended wearing clothes to a user.
6. The method of any of claims 1 to 5, further comprising:
and matching the recommended wearing clothes according to one or more of the age, sex, height and weight of the user, and feeding back a clothes matching scheme.
7. The method of any of claims 1 to 5, further comprising:
and classifying the recommended wearing clothes and feeding back a classification result.
8. An apparatus for clothing recommendation, comprising a processor and a memory storing program instructions, characterized in that the processor is configured to perform the method for clothing recommendation according to any one of claims 1 to 7 when executing the program instructions.
9. Equipment, characterized in that it comprises the apparatus for clothing recommendation according to claim 8.
10. The equipment of claim 9, wherein the equipment comprises a server or a terminal device.
CN202010166996.9A 2020-03-11 2020-03-11 Method, device and equipment for recommending clothes Pending CN111429210A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010166996.9A CN111429210A (en) 2020-03-11 2020-03-11 Method, device and equipment for recommending clothes

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010166996.9A CN111429210A (en) 2020-03-11 2020-03-11 Method, device and equipment for recommending clothes

Publications (1)

Publication Number Publication Date
CN111429210A (en) 2020-07-17

Family

ID=71547669

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010166996.9A Pending CN111429210A (en) 2020-03-11 2020-03-11 Method, device and equipment for recommending clothes

Country Status (1)

Country Link
CN (1) CN111429210A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112417277A (en) * 2020-11-18 2021-02-26 青岛海尔科技有限公司 Clothes recommendation method and device, storage medium and electronic device
WO2023077782A1 (en) * 2021-11-08 2023-05-11 青岛海尔空调器有限总公司 Air conditioner control method and control apparatus, and air conditioner

Similar Documents

Publication Publication Date Title
CN108829764B (en) Recommendation information acquisition method, device, system, server and storage medium
US20180308149A1 (en) Systems and methods to curate, suggest and maintain a wardrobe
CN107665238B (en) Picture processing method and device for picture processing
CN105447047A (en) Method and device for establishing shooting template database and providing shooting recommendation information
US10007860B1 (en) Identifying items in images using regions-of-interest
CN111429210A (en) Method, device and equipment for recommending clothes
KR20190140597A (en) Social network-based smart wardrobe and costume recommendation method
CN111401306A (en) Method, device and equipment for recommending clothes putting on
KR20140124087A (en) System and method for recommending hair based on face and style recognition
CN111429207A (en) Method, device and equipment for recommending clothes
CN113487373A (en) Fitting mirror, terminal, clothing recommendation method and storage medium
US9953242B1 (en) Identifying items in images using regions-of-interest
CN112148912A (en) Method, device and equipment for recommending clothes
CN112287149A (en) Clothing matching recommendation method and device, mirror equipment and storage medium
CN111429206A (en) Method, device and equipment for recommending clothes
CN113377970A (en) Information processing method and device
KR20100005960A (en) Information system for fashion
CN111429213A (en) Method, device and equipment for simulating fitting of clothes
CN111524160A (en) Track information acquisition method and device, electronic equipment and storage medium
CN111429212A (en) Method, device and equipment for recommending clothes
CN114707614A (en) Target re-identification method, device, equipment and computer readable storage medium
CN111429208A (en) Method, device and equipment for recommending clothes
CN111429211A (en) Method, device and equipment for recommending clothes
CN111429205A (en) Method, device and equipment for recommending clothes
CN113569077A (en) Method, device and equipment for recommending clothes

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination