CN110909746A - Clothing recommendation method, related device and equipment - Google Patents

Clothing recommendation method, related device and equipment

Info

Publication number
CN110909746A
CN110909746A (application CN201811085483.4A)
Authority
CN
China
Prior art keywords
parameter set
clothing
apparel
matched
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811085483.4A
Other languages
Chinese (zh)
Inventor
董尔希
黄轩
王孝宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Intellifusion Technologies Co Ltd
Original Assignee
Shenzhen Intellifusion Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Intellifusion Technologies Co Ltd filed Critical Shenzhen Intellifusion Technologies Co Ltd
Priority to CN201811085483.4A priority Critical patent/CN110909746A/en
Publication of CN110909746A publication Critical patent/CN110909746A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/08: Learning methods
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00: Commerce
    • G06Q30/06: Buying, selling or leasing transactions
    • G06Q30/0601: Electronic shopping [e-shopping]
    • G06Q30/0621: Item configuration or customization
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/90: Determination of colour characteristics
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/40: Extraction of image or video features
    • G06V10/46: Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/462: Salient features, e.g. scale invariant feature transforms [SIFT]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Finance (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Accounting & Taxation (AREA)
  • Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Multimedia (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Development Economics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The embodiments of the invention disclose a clothing recommendation method and a related apparatus and device. The method comprises: acquiring an input image, wherein the input image comprises an image of the clothing to be matched; performing feature extraction on the input image to obtain a first parameter set, the first parameter set characterizing visual attributes of the clothing to be matched; and outputting a target garment that matches the clothing to be matched, wherein a second parameter set obtained by feature extraction on the image of the target garment matches the first parameter set and characterizes attributes of the target garment. By adopting the embodiments of the invention, clothing-matching recommendations can be provided to the user more efficiently: suitable outfits can be recommended according to the user's personal taste, and popular and celebrity outfit styles can also be recommended.

Description

Clothing recommendation method, related device and equipment
Technical Field
The invention relates to the fields of computer technology and image processing, and in particular to a clothing-matching recommendation method and a related apparatus and device.
Background
With the development of society, people's living standards have gradually risen and their expectations for how clothing is matched have grown; a well-matched outfit better reflects a person's taste. The growth of Internet e-commerce has made online shopping an important consumption channel, and online shoppers who care about outfit matching expect e-commerce sellers to recommend suitable outfits according to their personal taste. In the prior art, however, whether in online clothing stores or physical clothing stores, recommending suitable outfits to a user, or suggesting celebrity looks with a similar matching style, is essentially done manually. This is inefficient and cannot effectively meet users' demand for clothing-matching recommendations.
Disclosure of Invention
The technical problem to be solved by the embodiments of the present invention is to provide a clothing recommendation method, a clothing recommendation apparatus, a clothing recommendation device, and a computer-readable storage medium, so as to address the technical problem of how to improve the efficiency of recommending clothing matches to a user.
In order to solve the above technical problem, a first aspect of an embodiment of the present invention discloses a method for recommending clothing, comprising:
acquiring an input image, wherein the input image comprises an image of the clothing to be matched;
performing feature extraction on the input image to obtain a first parameter set, wherein the first parameter set characterizes visual attributes of the clothing to be matched;
performing feature extraction on the input image through a first neural network to obtain a first parameter set;
the first neural network is trained using a first training data set, the first training data set comprising: a plurality of clothing images and a parameter set corresponding to each clothing image, wherein the parameters in each parameter set correspond to visual attributes of the clothing;
the plurality of apparel images may include apparel images for which a user evaluation value on the network is above a first threshold;
the plurality of clothing images can also comprise clothing images uploaded by a client or clothing images purchased or collected by a user on a shopping platform;
the plurality of clothes images are subjected to image enhancement processing;
outputting a target garment matched with the garment to be matched, wherein a second parameter set obtained by performing feature extraction on the image of the target garment matches the first parameter set, the second parameter set characterizing attributes of the target garment;
outputting the target garment matched with the garment to be matched comprises:
searching a second parameter set matched with the first parameter set through a second neural network;
outputting the target clothes corresponding to the second parameter set;
the second neural network is trained using a second training data set, the second training data set comprising: a third parameter set and a fourth parameter set; wherein the third parameter set comprises a parameter set corresponding to each of the plurality of clothing images output after being trained by the first neural network; the fourth parameter set comprises a parameter set corresponding to a clothing image collocated with clothing represented by the third parameter set;
the target clothes which output the clothes matched with the clothes to be matched further comprise:
searching a second parameter set matched with the first parameter set through a preset matching rule;
and outputting the target clothes corresponding to the second parameter set.
In a second aspect, an embodiment of the present invention discloses an apparel recommendation apparatus, including a unit for performing the method according to the first aspect.
In a third aspect, an embodiment of the present invention discloses an apparel recommendation device, including a processor, an input device, an output device, and a memory, where the processor, the input device, the output device, and the memory are connected to each other, where the memory is used to store an application program code, and the processor is configured to call the program code to perform the method according to the first aspect.
A fourth aspect of embodiments of the present invention discloses a computer-readable storage medium, in which a computer program is stored, the computer program comprising program instructions, which, when executed by a processor, cause the processor to perform the method according to the first aspect.
By implementing the embodiments of the invention, an image of the clothing a user wants to match can be acquired, feature extraction can be performed to obtain a parameter set of that image, a target-apparel parameter set matching it can be searched for, and a target garment matched with the clothing to be matched can be output. Matching apparel can thus be recommended to the user based on the clothing image the user inputs, and the user may choose to be recommended popular and celebrity outfit styles from the network, or matches that are the same as or similar to the user's own clothing-matching style. Since the plurality of clothing images used to train the first neural network may be clothing images on the network whose user evaluation value is higher than the first threshold, popular outfits and celebrity outfits on the network can be recommended to the user. The plurality of clothing images may also include clothing images uploaded by the client, or clothing images the user has purchased or collected on a shopping platform, so that matches identical or similar to the user's personal clothing-matching style can be recommended, saving the time the user spends on outfit matching.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings can be obtained from them by those skilled in the art without creative effort.
Wherein:
FIG. 1 is a system architecture diagram of a clothing recommendation method provided by an embodiment of the invention;
FIG. 2 is a flow chart of a clothing recommendation method according to an embodiment of the present invention;
FIG. 3 is a schematic flow chart of preprocessing a plurality of apparel images according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating the result of image enhancement processing according to an embodiment of the present invention;
FIG. 5 is a schematic flow chart of acquiring, from a network, apparel images whose user evaluation value is higher than a first threshold, according to an embodiment of the present invention;
FIG. 6 is a schematic flow chart of a clothing recommendation method that searches, through a second neural network, for a second parameter set matching the first parameter set, according to an embodiment of the present invention;
FIG. 7 is a schematic flow chart of a clothing recommendation method that searches for a second parameter set matching the first parameter set through a preset matching rule, according to an embodiment of the present invention;
FIG. 8 is a schematic structural diagram of a clothing recommendation apparatus provided in an embodiment of the present invention;
FIG. 9 is a schematic structural diagram of a clothing recommendation device provided in an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
In a specific implementation, the clothing recommendation method described in the embodiment of the present invention may be implemented by a terminal device with an image processing function, such as a mobile phone, a desktop computer, a laptop computer, and a wearable device, which is not limited herein. The embodiment of the present invention is described with a terminal device as an execution subject.
The embodiment of the invention provides a clothing recommendation method, which comprises the following steps: acquiring an input image, wherein the input image comprises an image of the clothing to be matched; performing feature extraction on the input image to obtain a first parameter set, wherein the first parameter set characterizes visual attributes of the clothing to be matched; and outputting a target garment matched with the garment to be matched, wherein a second parameter set obtained by performing feature extraction on the image of the target garment matches the first parameter set, the second parameter set characterizing attributes of the target garment. The embodiments of the invention also provide a corresponding clothing recommendation apparatus and clothing recommendation device. Each of these is described in detail below.
In order to better understand the clothing recommendation method, the related apparatus and the device provided by the embodiment of the invention, a system architecture of the clothing recommendation method applied to the embodiment of the invention is described below. Referring to fig. 1, fig. 1 is a schematic diagram of a system architecture of a clothing recommendation method according to an embodiment of the present invention. As shown in fig. 1, the system architecture may include one or more servers and a plurality of terminals (or devices), wherein:
the server may include, but is not limited to, a background server, a component server, a clothing recommendation system server, or a clothing recommendation software server, etc., and may communicate with a plurality of terminals through the internet. And the server sends the clothing recommendation result to the terminal. The terminal (or device) may be installed and run with an associated Client (Client) (e.g., a clothing recommendation Client, etc.). The Client (Client) refers to a program corresponding to the server and providing a local service to the user. Here, the local service may include, but is not limited to: upload apparel images, provide apparel recommendation interfaces, and the like.
Specifically, the client may include: locally running applications, functions running in a Web browser (also known as Web apps), and the like. For such a client, a corresponding server-side program needs to run on the server to provide the corresponding services, such as clothing image processing, clothing image feature extraction, and processing of the data generated during clothing image feature extraction.
The terminal in the embodiment of the present invention may include, but is not limited to, any handheld electronic product based on an intelligent operating system that can perform human-computer interaction with a user through an input device such as a keyboard, a virtual keyboard, a touch pad, a touch screen, or a voice control device, for example a smart phone, a tablet computer, or a personal computer. The intelligent operating system includes, but is not limited to, any operating system that enriches device functionality by providing various mobile applications to the mobile device, such as Android, iOS, or Windows Phone.
It should be noted that the system architecture of the clothing recommendation method provided by the present application is not limited to that shown in fig. 1.
In order to better understand the clothing recommendation method, the clothing recommendation device and the clothing recommendation apparatus provided by the embodiments of the present invention, a flow of a clothing recommendation method applicable to the embodiments of the present invention is described below. Referring to fig. 2, fig. 2 is a schematic flow chart of a clothing recommendation method according to an embodiment of the present invention, where the clothing recommendation method according to the embodiment of the present invention includes:
s101, acquiring an input image, wherein the input image comprises an image to be collocated;
optionally, the acquired input image may be a clothing image captured by a camera of the clothing recommendation device. That is, the clothing recommendation device is provided with a camera, and the user can start the camera of the clothing recommendation device to photograph the clothing to be matched, so that the clothing recommendation device obtains the input image.
Optionally, the user sends a clothing image stored locally on a terminal, or a clothing image currently captured by the terminal (for example, a camera, a mobile phone, or a tablet computer), to the clothing recommendation device, or the user sends a clothing image stored on a readable storage medium to the clothing recommendation device through that medium, so that the clothing recommendation device obtains the input image. Specifically, the clothing recommendation device provides a clothing-image upload interface for the user, so that the user can select the desired clothing image from a terminal or readable storage medium with which a communication connection has been established and upload it through this interface.
Of course, in the embodiment of the present invention, the clothing recommendation device may also acquire the clothing image in other manners, which is not limited herein. Apparel in embodiments of the present invention includes, but is not limited to, coats, pants, shoes, accessories, and the like.
S102, extracting features of the input image to obtain a first parameter set, wherein the first parameter set represents visual attributes of the clothing to be matched;
optionally, feature extraction is performed on the input image through a feature extraction algorithm to obtain a first parameter set, where the first parameter set characterizes visual attributes of the clothing to be matched. The feature extraction algorithm may include: the scale-invariant feature transform (SIFT), the gray-level co-occurrence matrix method, and the like.
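As an illustration of this kind of handcrafted feature extraction, the following minimal sketch combines averaged SIFT descriptors with a hue histogram as a stand-in for the first parameter set; it assumes OpenCV and NumPy are available, and the particular combination of features is an assumption for readability, not the method prescribed by the patent.
```python
import cv2
import numpy as np

def extract_first_parameter_set(image_path: str) -> np.ndarray:
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

    # SIFT keypoint descriptors (128-D each), averaged into one shape/texture vector.
    sift = cv2.SIFT_create()
    _, descriptors = sift.detectAndCompute(gray, None)
    shape_part = descriptors.mean(axis=0) if descriptors is not None else np.zeros(128)

    # Coarse colour description: a normalised 16-bin hue histogram.
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    hue_hist = cv2.calcHist([hsv], [0], None, [16], [0, 180]).flatten()
    colour_part = hue_hist / (hue_hist.sum() + 1e-8)

    # The concatenation plays the role of the "first parameter set" here.
    return np.concatenate([shape_part, colour_part])
```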
Optionally, feature extraction is performed on the input image through a first neural network to obtain a first parameter set, where the first parameter set characterizes visual attributes of the clothing to be matched. Before the input image is fed into the neural network, an attention model is applied, that is, the clothing part of the input image is framed so as to eliminate interference from background information in the input image. The first neural network is trained using a first training data set, the first training data set comprising a plurality of clothing images and a parameter set corresponding to each clothing image, where the parameters in each parameter set correspond to visual attributes of the clothing. Traditional feature extraction methods can only extract partial features of a clothing image, whereas a neural network can extract its global features, and the features extracted by the neural network represent the corresponding clothing image more accurately. A neural network also has learning ability, and a trained neural network can directly output the feature parameter set corresponding to a clothing image, so the manual computation process can be omitted.
Alternatively, the first neural network may be a deep neural network (DNN), a convolutional neural network (CNN), a recurrent neural network (RNN), or the like; these are not exhaustively listed here.
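Purely as an illustration, the sketch below shows one way such a first neural network could look, assuming PyTorch and torchvision are available; the ResNet-18 backbone, the number of attribute outputs (NUM_ATTRIBUTES) and the multi-label loss are assumptions, since the patent does not fix a specific architecture.
```python
import torch
import torch.nn as nn
from torchvision import models

NUM_ATTRIBUTES = 32  # assumed size of the first parameter set

class AttributeExtractor(nn.Module):
    """Maps a clothing image to scores for colour, collar, sleeve, length,
    texture, pattern and material attributes (multi-label)."""
    def __init__(self, num_attributes: int = NUM_ATTRIBUTES):
        super().__init__()
        backbone = models.resnet18(weights=None)  # any CNN backbone would do
        backbone.fc = nn.Linear(backbone.fc.in_features, num_attributes)
        self.backbone = backbone

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 3, H, W) cropped clothing images; output scores in [0, 1].
        return torch.sigmoid(self.backbone(x))

# Training sketch: minimise a multi-label loss against annotated parameter sets.
# model = AttributeExtractor()
# loss = nn.BCELoss()(model(images), attribute_targets)
```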
Specifically, the plurality of clothing images in the first training data set are preprocessed when the first neural network is trained; please refer to FIG. 3, which is a schematic flow chart of preprocessing the plurality of clothing images. First, data cleaning is performed on the plurality of clothing images, and duplicate images are screened out using a data-cleaning algorithm (for example, a difference hash algorithm). Then, image enhancement is performed on the de-duplicated clothing images (for example, horizontal flipping, rotation, adding noise, and the like); the result of the image enhancement is shown in FIG. 4, an example diagram in which the upper-left image is the original image and the other three images are obtained by horizontally flipping the original image and/or changing its brightness, saturation and contrast. Finally, the attention model is applied to the clothing images after image enhancement. It should be noted that the attention model mainly frames the main part of an image; for example, in an image of a person in a landscape, only the person's clothing information is needed, so the person is framed in the image to reduce interference from background information.
It should be further noted that the first neural network may be obtained by training on the server, or may be a third-party trained network model called by the server, which is not limited herein. Data cleaning of the plurality of clothing images screens out duplicate images and improves the efficiency of neural network training; image enhancement yields multiple images of the same garment at different angles and with different brightness, saturation and contrast, which expands the training set and improves the generalization ability of the neural network; and applying the attention model to frame the relevant part of the image reduces interference from background information and thus the computational load of the neural network.
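A minimal sketch of this preprocessing pipeline, assuming Pillow is available, is given below; the hash size, Hamming-distance threshold and enhancement factors are illustrative assumptions.
```python
from PIL import Image, ImageEnhance, ImageOps

def dhash(image: Image.Image, hash_size: int = 8) -> int:
    # Difference hash: shrink to (hash_size+1) x hash_size, compare neighbouring
    # pixels row-wise and pack the comparisons into an integer fingerprint.
    small = image.convert("L").resize((hash_size + 1, hash_size))
    pixels = list(small.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

def deduplicate(paths, max_hamming: int = 5):
    # Keep an image only if its hash differs enough from every image kept so far.
    kept, hashes = [], []
    for path in paths:
        h = dhash(Image.open(path))
        if all(bin(h ^ other).count("1") > max_hamming for other in hashes):
            kept.append(path)
            hashes.append(h)
    return kept

def augment(image: Image.Image):
    # Horizontal flip plus brightness/saturation/contrast jitter, as in FIG. 4.
    flipped = ImageOps.mirror(image)
    jittered = ImageEnhance.Contrast(
        ImageEnhance.Color(
            ImageEnhance.Brightness(flipped).enhance(1.2)).enhance(1.1)).enhance(0.9)
    return [image, flipped, jittered]
```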
Optionally, the plurality of clothing images may be clothing images on the network whose user evaluation value is higher than a first threshold; please refer to FIG. 5, which is a schematic flow chart of acquiring, from the network, clothing images whose user evaluation value is higher than the first threshold. First, hotspot information on the network is acquired; then it is judged whether the hotspot information is about clothing matching, and if so, text sentiment analysis is performed on the comments of the corresponding clothing images; then it is judged whether the user evaluation value of the clothing-matching image is higher than the first threshold, and if so, the image is selected for use. In the embodiment of the invention, clothing images whose user evaluation value is higher than the first threshold represent popular and celebrity outfit styles on the network.
Alternatively, the first threshold may be any value from 60 to 100 on a percentage scale, such as 60, 70, 80 or 90.
Alternatively, the first threshold may also be two stars, three stars, four stars, and so on, under a five-star rating system.
Alternatively, the first threshold may be 2A, 3A, 4A, and so on, under a 5A rating system.
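The filtering step described above could be sketched roughly as follows; the sentiment_score helper is a stand-in for any text sentiment model, and the percentage-scale threshold of 80 is an assumed value.
```python
from typing import Callable, Iterable, List, Tuple

FIRST_THRESHOLD = 80  # assumed threshold on a 0-100 percentage scale

def user_evaluation_value(comments: List[str],
                          sentiment_score: Callable[[str], float]) -> float:
    # sentiment_score returns a value in [0, 1] per comment; the comment scores
    # are averaged and rescaled to a 0-100 user evaluation value.
    if not comments:
        return 0.0
    return 100.0 * sum(sentiment_score(c) for c in comments) / len(comments)

def select_hot_outfit_images(candidates: Iterable[Tuple[str, List[str]]],
                             sentiment_score: Callable[[str], float]) -> List[str]:
    # candidates: (image_path, comments) pairs from hotspot posts about outfit matching.
    return [image for image, comments in candidates
            if user_evaluation_value(comments, sentiment_score) > FIRST_THRESHOLD]
```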
Optionally, the plurality of clothing images may also include clothing images uploaded by the client or clothing images purchased or collected by the user on the shopping platform.
In addition, the visual attributes of the apparel described in the embodiments of the present invention may include at least one of color (e.g., red, green, blue), neckline design (e.g., V-neck, round neck, stand-up collar), sleeve design (e.g., sleeveless, short sleeve, long sleeve, bell sleeve), length design, texture design (e.g., horizontal stripes, vertical stripes), pattern design, and material (e.g., chiffon, silk, pure cotton).
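For readability only, a parameter set covering these attribute families might be organised as in the sketch below; the field names and example values are assumptions, and in the embodiments the parameter set is produced by feature extraction rather than written by hand.
```python
from dataclasses import dataclass

@dataclass
class ApparelParameterSet:
    colour: str     # e.g. "white"
    neckline: str   # e.g. "round neck"
    sleeve: str     # e.g. "short sleeve"
    length: str     # e.g. "regular"
    texture: str    # e.g. "vertical stripes"
    pattern: str    # e.g. "plain"
    material: str   # e.g. "pure cotton"

# Illustrative instance loosely based on the white pure-cotton shirt example used
# later in the description; the fields not mentioned there hold assumed values.
white_cotton_shirt = ApparelParameterSet(
    colour="white", neckline="round neck", sleeve="short sleeve",
    length="regular", texture="plain", pattern="plain", material="pure cotton")
```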
S103, outputting a target garment matched with the garment to be matched, wherein a second parameter set, obtained by performing feature extraction on the image of the target garment, matches the first parameter set, the second parameter set characterizing attributes of the target garment.
Optionally, outputting the target garment matched with the garment to be matched comprises: searching, through a second neural network, for a second parameter set matching the first parameter set; and outputting the target garment corresponding to the second parameter set. Referring to FIG. 6, FIG. 6 is a schematic flow chart of a clothing recommendation method that searches, through a second neural network, for a second parameter set matching the first parameter set. The second neural network is trained using a second training data set, the second training data set comprising a third parameter set and a fourth parameter set; the third parameter set comprises the parameter set corresponding to each of the plurality of clothing images output after training by the first neural network, and the fourth parameter set comprises the parameter set corresponding to a clothing image matched with the clothing represented by the third parameter set. The correspondence between the second parameter sets found by the second neural network and the first parameter set may be as shown in Table 1: the first parameter set in Table 1 may be any one of I1, I2, I3, …, whose elements A1, B1, C1, D1, … represent the clothing attributes described above; the second parameter sets found may be any of I1', I2', I3', …, whose elements A2, B2, C2, D2, … likewise represent clothing attributes; and P11, P12, P13, … are the matching probabilities with which each second parameter set matches the first parameter set. It should be noted that the first parameter set in Table 1 is the first parameter set of one garment to be matched, and the first parameter set may be a set containing different elements, as shown in Table 1.
TABLE 1 (rendered as an image in the original publication; it lists candidate second parameter sets I1', I2', I3', … for each first parameter set, together with their matching probabilities P11, P12, P13, …)
To better illustrate how embodiments of the present invention search, through the second neural network, for the second parameter sets matching the first parameter set, an example follows. When the garment to be matched is a white pure-cotton shirt, the first parameter set of the garment is (white, pure cotton, shirt, …), corresponding to I2 = (A1, B1, D1, …) in Table 1; that is, A1 indicates that the color attribute of the garment is white, B1 indicates that the material is pure cotton, and D1 indicates that the type of garment is a shirt. The second parameter sets obtained through the second neural network are (white, denim, overcoat, long sleeve), corresponding to I1' = (A2, B2, C2, D2, …) in Table 1, with a matching probability of P21; (black, denim, trousers), corresponding to I2' = (A3, B3, D3, …) in Table 1, with a matching probability of P22; (white, cotton, half-length skirt), corresponding to I3' = (A4, C4, D4, …) in Table 1, with a matching probability of P23; and so on. The second neural network has n output nodes and n output results. It should be noted that the second neural network may be obtained by training on the server, or may be a third-party trained network model called by the server, which is not limited herein.
The neural network can represent the feature attributes of clothing images well and can classify a large number of clothing images according to their attributes. Using the second neural network to search for second parameter sets matching the first parameter set makes it possible to find multiple target garments matched with the clothing to be matched, and also to output the matching probability with which each second parameter set matches the first parameter set; this probability data provides a selection reference for the user, so clothing recommendations can be provided to the user more efficiently.
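A rough sketch of such a second neural network is given below, assuming PyTorch and an assumed input dimension and candidate count; the softmax over n output nodes mirrors the per-candidate matching probabilities described above and is an illustrative choice, not the patent's prescribed design.
```python
import torch
import torch.nn as nn

class OutfitMatcher(nn.Module):
    """Scores n candidate second parameter sets for one first parameter set;
    one output node per candidate, as described for the second neural network."""
    def __init__(self, param_dim: int = 32, num_candidates: int = 100):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(param_dim, 128), nn.ReLU(),
            nn.Linear(128, num_candidates))

    def forward(self, first_set: torch.Tensor) -> torch.Tensor:
        # Returns a matching probability for every candidate second parameter set.
        return torch.softmax(self.net(first_set), dim=-1)

def top_matches(model: OutfitMatcher, first_set: torch.Tensor,
                candidate_ids: list, k: int = 5):
    # Rank candidates by matching probability, e.g. the P21, P22, P23 of Table 1.
    probs = model(first_set.unsqueeze(0)).squeeze(0)
    order = torch.argsort(probs, descending=True)[:k]
    return [(candidate_ids[int(i)], float(probs[i])) for i in order]
```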
Optionally, outputting the target garment matched with the garment to be matched comprises: searching for a second parameter set matching the first parameter set through a preset matching rule; and outputting the target garment corresponding to the second parameter set. Referring to FIG. 7, FIG. 7 is a schematic flow chart of a clothing recommendation method that searches for a second parameter set matching the first parameter set through a preset matching rule. The correspondence between the second parameter sets found through the preset matching rule and the first parameter set may be as shown in Table 1: the first parameter set may be any one of I1, I2, I3, I4, I5, I6, I7, …, whose elements A1, B1, C1, D1, … represent the clothing attributes described above, and the second parameter sets found through the preset matching rule may be any of I1', I2', I3', I4', I5', I6', I7', …, whose elements A2, B2, C2, D2, … likewise represent clothing attributes. For example, if A1 represents the color of the clothing and that color is red, and the preset matching rule states that red can be matched with white, black and gray, then the color represented by the corresponding parameter A2 in a second parameter set found through the preset matching rule should be one of white, black and gray. Further, if the clothing to be matched is a white pure-cotton shirt described by the first parameter set (white, pure cotton, shirt), and the preset matching rules specify that the colors matching white are white, black, red, pink and gray, that pure cotton can be matched with pure cotton, denim and chiffon, and that a shirt can be combined with an overcoat, trousers or a half-length skirt, then the second parameter sets corresponding to the target garments according to the preset matching rules may be parameter sets formed by combining these elements, such as (white, denim, overcoat), (black, denim, trousers) and (white, pure cotton, half-length skirt).
TABLE 1 (as above; the table is rendered as an image in the original publication)
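A minimal sketch of matching by preset rules is given below; the rule contents mirror the white-cotton-shirt example above and are illustrative assumptions rather than an exhaustive rule base.
```python
# Assumed rule base: for each attribute value of the first parameter set, the
# set of values the corresponding attribute of a matching garment may take.
MATCHING_RULES = {
    "colour":   {"white": {"white", "black", "red", "pink", "gray"}},
    "material": {"pure cotton": {"pure cotton", "denim", "chiffon"}},
    "type":     {"shirt": {"overcoat", "trousers", "half-length skirt"}},
}

def matches_by_rule(first_set: dict, candidate: dict) -> bool:
    # A candidate second parameter set is accepted only if every attribute of the
    # first parameter set that has a rule is satisfied by the candidate.
    for attribute, value in first_set.items():
        allowed = MATCHING_RULES.get(attribute, {}).get(value)
        if allowed is not None and candidate.get(attribute) not in allowed:
            return False
    return True

def find_target_apparel(first_set: dict, candidates: list) -> list:
    # candidates: (second_parameter_set, apparel_image_id) pairs from the apparel library.
    return [image_id for second_set, image_id in candidates
            if matches_by_rule(first_set, second_set)]

# Example: the white pure-cotton shirt matched against a black denim trouser entry.
shirt = {"colour": "white", "material": "pure cotton", "type": "shirt"}
trousers = {"colour": "black", "material": "denim", "type": "trousers"}
print(matches_by_rule(shirt, trousers))  # True under the assumed rules
```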
Of course, the output form of the target apparel in the embodiment of the invention may include: target apparel images, a list describing the target apparel attributes, the five target apparel images ranked highest by recommendation percentage together with their respective recommendation percentages, a voice clip describing the target apparel attributes, a video clip containing the target apparel, and so on, which is not limited herein.
According to the embodiments of the invention, matching apparel is recommended to the user based on the clothing image the user inputs. The plurality of clothing images used to train the first neural network may be clothing images on the network whose user evaluation value is higher than the first threshold, so popular and celebrity outfit styles on the network can be recommended to the user. The plurality of clothing images may also include clothing images uploaded by the client or clothing images the user has purchased or collected on a shopping platform, so matches identical or similar to the user's personal clothing-matching style can be recommended, saving the time the user spends on outfit matching. A neural network is used to extract features from the clothing images, and image enhancement is applied to the images fed to the neural network during training, which improves the generalization ability of the feature-extraction neural network model.
In order to better implement the above solution of the embodiments of the present invention, the present invention further provides a clothing recommendation apparatus, which is described in detail below with reference to the accompanying drawings; details described for the method above apply correspondingly to the apparatus below.
As shown in FIG. 8, which is a schematic structural diagram of a clothing recommendation apparatus provided in an embodiment of the present invention, the clothing recommendation apparatus may include: an acquisition unit 100, an extraction unit 102 and an output unit 104, wherein:
an acquisition unit 100, configured to acquire an input image, where the input image comprises an image of the clothing to be matched;
an extracting unit 102, configured to perform feature extraction on the input image to obtain a first parameter set, where the first parameter set represents a visual attribute of the clothing to be matched;
an output unit 104, configured to output a target garment matched with the garment to be matched, wherein a second parameter set obtained by performing feature extraction on an image of the target garment matches the first parameter set, the second parameter set characterizing attributes of the target garment.
Alternatively, the extraction unit 102 may include:
the first feature extraction unit is used for extracting features of the input image through a first neural network to obtain a first parameter set; the first neural network is trained using a first training data set, the first training data set comprising: a plurality of clothing images and a parameter set corresponding to each clothing image, wherein the parameters in each parameter set correspond to visual attributes of the clothing.
And the first training unit is used for training the first neural network.
Optionally, the extracting unit 102 may further include:
the second feature extraction unit is used for extracting features of the input image through a feature extraction algorithm to obtain a first parameter set; the feature extraction algorithm may include: scale-invariant feature transform (SIFT), gray level co-occurrence matrix method, and the like.
Alternatively, the output unit 104 may include:
the first searching unit is used for searching a second parameter set matched with the first parameter set through a second neural network; the second neural network is trained using a second training data set, the second training data set comprising: a third parameter set and a fourth parameter set; wherein the third parameter set comprises a parameter set corresponding to each of the plurality of clothing images output after being trained by the first neural network; the fourth parameter set comprises a parameter set corresponding to a clothing image collocated with clothing represented by the third parameter set;
a second training unit for training a second neural network;
and the first output subunit is used for outputting the target clothes corresponding to the second parameter set.
Optionally, the output unit 104 may further include:
the second searching unit is used for searching a second parameter set matched with the first parameter set through a preset matching rule;
and the second output subunit is used for outputting the target clothes corresponding to the second parameter set.
It should be noted that the clothing recommendation apparatus 10 in the embodiment of the present invention is the clothing recommendation apparatus in the embodiments of FIG. 2 to FIG. 7, and for the functions of each unit in the clothing recommendation apparatus 10, reference may be made to the specific implementation manners of the embodiments of FIG. 2 to FIG. 7 in the method embodiments described above, which are not described herein again.
In order to better implement the above solution of the embodiments of the present invention, the present invention further provides a clothing recommendation device, which is described in detail below with reference to the accompanying drawings:
as shown in fig. 9, which is a schematic structural diagram of a clothing recommendation device provided in an embodiment of the present invention, the clothing recommendation device 110 may include a processor 1101, an input unit 1102, an output unit 1103, a memory 1104 and a communication unit 1105, and the processor 1101, the input unit 1102, the output unit 1103, the memory 1104 and the communication unit 1105 may be connected to each other through a bus 1106. The memory 1104 may be a high-speed RAM memory or a non-volatile memory (e.g., at least one disk memory). The memory 1104 may optionally be at least one memory system located remotely from the processor 1101. The memory 1104 is used for storing application program codes, which may include an operating system, a network communication module, a user interface module, and a clothing recommendation program, and the communication unit 1105 is used for information interaction with external units; the processor 1101 is configured to call the program code, and perform the following steps:
receiving, through the communication unit 1105, an input image uploaded by a user at a client, and acquiring the input image by the processor 1101, wherein the input image comprises an image of the clothing to be matched;
performing feature extraction on the input image through a processor 1101 to obtain a first parameter set, where the first parameter set represents a visual attribute of the clothing to be matched;
outputting, through the output unit 1103, a target garment matched with the garment to be matched, and displaying the target garment on a display corresponding to the clothing recommendation device;
optionally, the target apparel may also be sent to the client through the communication unit 1105, and the target apparel is displayed on the client.
Optionally, the target apparel displayed on the display or on the client may include: target apparel images, a list describing the target apparel attributes, the five target apparel images ranked highest by recommendation percentage together with their respective recommendation percentages, a voice clip describing the target apparel attributes, a video clip containing the target apparel, and so on.
It should be noted that the clothing recommendation device 110 in the embodiment of the present invention is the clothing recommendation device in the embodiments of fig. 2 to fig. 7, and specific reference may be made to specific implementation manners of the embodiments of fig. 2 to fig. 7 in the above method embodiments, which is not described herein again.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a usb disk, a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
The above disclosure describes only preferred embodiments of the present invention and certainly cannot be taken to limit the scope of the claims of the invention; equivalent changes made according to the claims of the present invention still fall within the scope covered by the invention.

Claims (10)

1. A clothing recommendation method, comprising:
acquiring an input image, wherein the input image comprises an image of the clothing to be matched;
performing feature extraction on the input image to obtain a first parameter set, wherein the first parameter set characterizes visual attributes of the clothing to be matched;
outputting a target garment matched with the garment to be matched, wherein a second parameter set obtained by performing feature extraction on the image of the target garment matches the first parameter set, the second parameter set characterizing visual attributes of the target garment.
2. The method of claim 1, wherein performing feature extraction on the input image to obtain the first parameter set comprises:
performing feature extraction on the input image through a first neural network to obtain a first parameter set;
the first neural network is trained using a first training data set, the first training data set comprising: a plurality of clothing images and a parameter set corresponding to each clothing image, wherein the parameters in each parameter set correspond to visual attributes of the clothing.
3. The method of claim 2, wherein the plurality of apparel images comprises apparel images for which a user-rating value on the network is above a first threshold.
4. The method of claim 2, wherein the plurality of apparel images comprises apparel images uploaded by a client or apparel images purchased or collected by a user at a shopping platform.
5. The method of any of claims 2-4, wherein the plurality of apparel images are subjected to image enhancement processing.
6. The method of claim 1, wherein outputting a target garment matched with the garment to be matched comprises:
searching a second parameter set matched with the first parameter set through a second neural network;
outputting the target clothes corresponding to the second parameter set;
the second neural network is trained using a second training data set, the second training data set comprising: a third parameter set and a fourth parameter set; the third parameter set comprises a parameter set corresponding to each of the plurality of clothing images extracted by the first neural network; the fourth parameter set comprises a parameter set corresponding to a clothing image collocated with clothing represented by the third parameter set.
7. The method of claim 1, wherein outputting a target garment matched with the garment to be matched comprises:
searching a second parameter set matched with the first parameter set through a preset matching rule;
and outputting the target clothes corresponding to the second parameter set.
8. An apparel recommendation device comprising means for performing the method of any of claims 1-7.
9. An apparel recommendation device comprising a processor, an input device, an output device, and a memory, the processor, the input device, the output device, and the memory being interconnected, wherein the memory is configured to store application program code, and the processor is configured to invoke the program code to perform the method of any of claims 1-7.
10. A computer-readable storage medium, wherein the computer storage medium stores a computer program comprising program instructions which, when executed by a processor, cause the processor to perform the method of any of claims 1-7.
CN201811085483.4A 2018-09-18 2018-09-18 Clothing recommendation method, related device and equipment Pending CN110909746A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811085483.4A CN110909746A (en) 2018-09-18 2018-09-18 Clothing recommendation method, related device and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811085483.4A CN110909746A (en) 2018-09-18 2018-09-18 Clothing recommendation method, related device and equipment

Publications (1)

Publication Number Publication Date
CN110909746A true CN110909746A (en) 2020-03-24

Family

ID=69813494

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811085483.4A Pending CN110909746A (en) 2018-09-18 2018-09-18 Clothing recommendation method, related device and equipment

Country Status (1)

Country Link
CN (1) CN110909746A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111582979A (en) * 2020-04-29 2020-08-25 上海风秩科技有限公司 Clothing matching recommendation method and device and electronic equipment
CN112163926A (en) * 2020-09-24 2021-01-01 深圳莱尔托特科技有限公司 Clothing chest size matching method, device, equipment and storage medium
CN113222971A (en) * 2021-05-31 2021-08-06 深圳市蝶讯网科技股份有限公司 Method for browsing styles by colors and collocation, computer equipment and storage medium
WO2023185787A1 (en) * 2022-03-31 2023-10-05 华为技术有限公司 Article matching method and related device
WO2023207681A1 (en) * 2022-04-28 2023-11-02 人工智能设计研究所有限公司 Method and apparatus for intelligent clothing matching, and electronic device and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106504064A (en) * 2016-10-25 2017-03-15 清华大学 Clothes classification based on depth convolutional neural networks recommends method and system with collocation
CN107993131A (en) * 2017-12-27 2018-05-04 广东欧珀移动通信有限公司 Wear to take and recommend method, apparatus, server and storage medium
CN108109049A (en) * 2017-12-29 2018-06-01 广东欧珀移动通信有限公司 Clothing matching Forecasting Methodology, device, computer equipment and storage medium
CN108230082A (en) * 2017-06-16 2018-06-29 深圳市商汤科技有限公司 The recommendation method and apparatus of collocation dress ornament, electronic equipment, storage medium
US20180218433A1 (en) * 2017-01-27 2018-08-02 Robert Penner System and Method for Fashion Recommendations

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106504064A (en) * 2016-10-25 2017-03-15 清华大学 Clothes classification based on depth convolutional neural networks recommends method and system with collocation
US20180218433A1 (en) * 2017-01-27 2018-08-02 Robert Penner System and Method for Fashion Recommendations
CN108230082A (en) * 2017-06-16 2018-06-29 深圳市商汤科技有限公司 The recommendation method and apparatus of collocation dress ornament, electronic equipment, storage medium
CN107993131A (en) * 2017-12-27 2018-05-04 广东欧珀移动通信有限公司 Wear to take and recommend method, apparatus, server and storage medium
CN108109049A (en) * 2017-12-29 2018-06-01 广东欧珀移动通信有限公司 Clothing matching Forecasting Methodology, device, computer equipment and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
刘玮 (Liu Wei) et al.: "Object Feature Models and Visual Attention Models in Computer Vision" (《计算机视觉中的目标特征模型和视觉注意模型》), Huazhong University of Science and Technology Press, 30 September 2016 *
曾民族 (Zeng Minzu) et al.: "Knowledge Technology and Its Applications" (《知识技术及其应用》), 30 November 2005 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111582979A (en) * 2020-04-29 2020-08-25 上海风秩科技有限公司 Clothing matching recommendation method and device and electronic equipment
CN112163926A (en) * 2020-09-24 2021-01-01 深圳莱尔托特科技有限公司 Clothing chest size matching method, device, equipment and storage medium
CN112163926B (en) * 2020-09-24 2024-04-09 深圳莱尔托特科技有限公司 Clothing chest size matching method, device, equipment and storage medium
CN113222971A (en) * 2021-05-31 2021-08-06 深圳市蝶讯网科技股份有限公司 Method for browsing styles by colors and collocation, computer equipment and storage medium
WO2023185787A1 (en) * 2022-03-31 2023-10-05 华为技术有限公司 Article matching method and related device
WO2023207681A1 (en) * 2022-04-28 2023-11-02 人工智能设计研究所有限公司 Method and apparatus for intelligent clothing matching, and electronic device and storage medium

Similar Documents

Publication Publication Date Title
US11227008B2 (en) Method, system, and device of virtual dressing utilizing image processing, machine learning, and computer vision
CN110909746A (en) Clothing recommendation method, related device and equipment
US10019779B2 (en) Browsing interface for item counterparts having different scales and lengths
US10109051B1 (en) Item recommendation based on feature match
US10346893B1 (en) Virtual dressing room
US9607010B1 (en) Techniques for shape-based search of content
CN107993131B (en) Putting-through recommendation method, device, server and storage medium
CN108229559B (en) Clothing detection method, clothing detection device, electronic device, program, and medium
CN108829764A (en) Recommendation information acquisition methods, device, system, server and storage medium
CN111325226B (en) Information presentation method and device
CN106202317A (en) Method of Commodity Recommendation based on video and device
US20200342320A1 (en) Non-binary gender filter
CN106202316A (en) Merchandise news acquisition methods based on video and device
JP2020522072A (en) Fashion coordination recommendation method and device, electronic device, and storage medium
JP2020502662A (en) Intelligent automatic cropping of images
CN111767817B (en) Dress collocation method and device, electronic equipment and storage medium
US10026176B2 (en) Browsing interface for item counterparts having different scales and lengths
US20190325497A1 (en) Server apparatus, terminal apparatus, and information processing method
CN109597907A (en) Dress ornament management method and device, electronic equipment, storage medium
CN112905889A (en) Clothing searching method and device, electronic equipment and medium
US11854069B2 (en) Personalized try-on ads
US20210150243A1 (en) Efficient image sharing
CN111429543B (en) Material generation method and device, electronic equipment and medium
CN113377970A (en) Information processing method and device
CN112862558A (en) Method and system for generating product detail page and data processing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200324