CN111915400A - Personalized clothing recommendation method and device based on deep learning - Google Patents

Info

Publication number
CN111915400A
Authority
CN
China
Prior art keywords
commodity
user
data
clothing
personalized
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010751386.5A
Other languages
Chinese (zh)
Other versions
CN111915400B (en)
Inventor
王国军
马鋆钰
邢萧飞
陈淑红
彭滔
汪建旭
韦霁纯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Southern Power Grid Internet Service Co ltd
Ourchem Information Consulting Co ltd
Original Assignee
Guangzhou University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou University filed Critical Guangzhou University
Priority to CN202010751386.5A priority Critical patent/CN111915400B/en
Publication of CN111915400A publication Critical patent/CN111915400A/en
Application granted granted Critical
Publication of CN111915400B publication Critical patent/CN111915400B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0631Item recommendations

Landscapes

  • Business, Economics & Management (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The embodiment of the invention provides a personalized clothing recommendation method and device based on deep learning, a readable storage medium and a computing device. The method comprises the following steps: collecting commodity clothing data of a merchant; extracting a first characteristic according to the commodity clothing data of the merchant; constructing a commodity similarity matrix according to the first characteristics; collecting commodity clothing data personalized by a user; extracting a second characteristic according to the commodity clothing data personalized by the user; distributing characteristic weight according to the distribution condition of the second characteristic; adjusting the similarity matrix according to the feature weight; and after the user selects one commodity garment, selecting the commodity garment matched with the currently selected commodity garment from the commodity garments of the merchant according to the adjusted similarity matrix, and recommending the commodity garment to the user.

Description

Personalized clothing recommendation method and device based on deep learning
Technical Field
The invention relates to the technical field of deep learning, in particular to a personalized clothing recommendation method and device based on deep learning, a readable storage medium and a computing device.
Background
A recommendation system mines, from massive data and according to a user's requirements and interests, the items the user is interested in (such as information, services, and articles) through a recommendation algorithm, and presents the results to the user as a personalized list. The core of a recommendation system is the recommendation algorithm, which exploits the binary relationship between users and items to find items a user may be interested in, based on the user's historical behavior records or on similarity relationships. After more than twenty years of development, recommendation systems have been successfully applied in many fields; the application scenarios most commonly mentioned at the RecSys conference, such as online video, social networks, online music, e-commerce, and internet advertising, are the stages on which recommendation systems have been deployed at scale and have been the main experimental scenarios for research and industrial application in recent years.
The core of Deep Learning (DL) is to learn the intrinsic regularities and representation levels of sample data; the information obtained in this learning process is very helpful for interpreting data such as text, images, and sound. Its ultimate aim is to give machines human-like analysis and learning abilities so that they can recognize data such as characters, images, and sound. Deep learning is a complex class of machine learning algorithms and has achieved results in speech and image recognition that far exceed the prior related art.
In recent years, deep learning has developed rapidly in the field of recommendation systems: deep learning techniques extract the latent features of users and of items, and generate recommended items for users based on those latent features to complete the recommendation task. The neural networks used in deep learning can learn not only the latent feature representations of users and items, but also the complex nonlinear interaction features between them, allowing user preferences to be analyzed in depth, some problems of traditional recommendation methods to be solved, and better recommendations to be realized.
Research on personalized clothing recommendation has drawn the attention of experts and clothing enterprises, and personalized clothing recommendation methods have emerged accordingly. However, consumers' personalized demands are variable and affected by many factors, so obtaining them is a difficult task, and the existing methods each have their own disadvantages:
The features used are often mixed together and cannot be explained. Because deep learning networks are not inherently interpretable, some garment-matching recommendation systems do not distinguish between commodity features and simply mix them together as the recommendation basis; such features are generally called unexplainable features. This produces many unexplainable recommendation combinations, so the recommendation accuracy is not high and a good recommendation effect is not achieved.
Personalized factors of the user are not considered. If only the feature information of the commodity is extracted and commodities are recommended by those features alone, the user's personalized requirements are ignored. Yet users generally have their own shopping preferences, such as preferred colors, preferred trouser lengths, or a preference for suspenders or short sleeves; moreover, the shopping preferences of men and women differ greatly. Given these factors, recommendation based on clothing features alone cannot fully satisfy the personalized needs of the user.
Combining the above two considerations, how to combine the features of the commodity garments with the personalized features of the user is a research question of great value.
Disclosure of Invention
Therefore, the invention provides a personalized clothing recommendation method and device based on deep learning, as well as a readable storage medium and a computing device. A neural network is constructed using a deep learning strategy to provide an effective clothing matching method aimed at the individual user; the method can provide the user with a personalized clothing matching scheme that is visually coordinated and aesthetically pleasing, and has high practical value.
According to an aspect of an embodiment of the present invention, there is provided a personalized clothing recommendation method based on deep learning, including:
collecting commodity clothing data of a merchant;
extracting a first characteristic according to the commodity clothing data of the merchant;
constructing a commodity similarity matrix according to the first characteristics;
collecting commodity clothing data personalized by a user;
extracting a second characteristic according to the commodity clothing data personalized by the user;
distributing characteristic weight according to the distribution condition of the second characteristic;
adjusting the similarity matrix according to the feature weight;
and after the user selects one commodity garment, selecting the commodity garment matched with the currently selected commodity garment from the commodity garments of the merchant according to the adjusted similarity matrix, and recommending the commodity garment to the user.
Optionally, extracting a first feature according to the article clothing data of the merchant includes:
and inputting the commodity clothing data of the merchant into an InfoGAN model, and extracting first characteristics which do not influence each other.
Optionally, inputting the commodity clothing data of the merchant into an InfoGAN model, and extracting first features that do not affect each other, including:
inputting the merchant's merchandise clothing data into a generator;
the generator generates a first sample and inputs the first sample to the discriminator;
the discriminator returns the discrimination result to the generator;
when the discriminator cannot identify the first sample as the sample generated by the generator, acquiring a first interpretable hidden variable c;
and determining first characteristics which do not influence each other according to the first interpretable hidden variable c.
Optionally, extracting a second feature according to the commodity clothing data personalized by the user, including:
inputting the commodity clothing data individualized by the user and the noise data into a GAN model to obtain a sample set;
and inputting the sample set into an InfoGAN model, and extracting second features which do not influence each other.
Optionally, inputting the commodity clothing data personalized by the user and the noise data into a GAN model to obtain a sample set, including:
training a discriminator according to the commodity clothing data personalized by the user; and
inputting noise data into a generator, and judging a sample generated by the generator by the discriminator;
acquiring a second sample discriminated by the discriminator;
generating a sample set from the second sample.
Optionally, inputting the sample set into an InfoGAN model, and extracting second features that do not affect each other, including:
inputting the sample set into a generator;
the generator generates a third sample and inputs the third sample to the discriminator;
the discriminator returns the discrimination result to the generator;
when the discriminator cannot identify the third sample as the sample generated by the generator, acquiring a second interpretable hidden variable c;
and determining second characteristics which do not influence each other according to the second interpretable hidden variable c.
Optionally, the first feature and the second feature each include an image feature and a data dictionary feature.
Optionally, assigning a feature weight according to the distribution of the second feature includes:
counting the number of each label contained in the second characteristic;
and determining the weight of the single label according to the number of the single label and the total number of all the labels.
According to another aspect of the embodiments of the present invention, there is provided a personalized clothing recommendation device based on deep learning, including:
the merchant data collection module is used for collecting the commodity clothing data of the merchant;
the characteristic extraction module is used for extracting a first characteristic according to the commodity clothing data of the merchant;
the similarity calculation module is used for constructing a commodity similarity matrix according to the first characteristics;
the user data collection module is used for collecting the personalized commodity clothing data of the user;
the characteristic extraction module is also used for extracting a second characteristic according to the commodity clothing data personalized by the user;
the similarity calculation module is further used for distributing characteristic weight according to the distribution situation of the second characteristic; adjusting the similarity matrix according to the feature weight;
and the recommending module is used for selecting the commodity clothes matched with the commodity clothes currently selected by the user from the commodity clothes of the merchant and recommending the commodity clothes to the user according to the adjusted similarity matrix after the user selects one commodity clothes.
According to a further aspect of the embodiments of the present invention, there is provided a readable storage medium having executable instructions thereon, which when executed, cause a computing device to perform operations included in a method for deep learning based personalized clothing recommendation.
According to yet another aspect of the embodiments of the present invention, there is provided a computing device, including a processor and a memory, having executable instructions stored thereon, which when executed, cause the processor to perform operations included in a method for deep learning based personalized garment recommendation.
In the embodiment of the invention, commodity clothing data of a merchant is collected, first characteristics are extracted according to the commodity clothing data of the merchant, a commodity similarity matrix is constructed according to the first characteristics, personalized commodity clothing data of a user is collected, second characteristics are extracted according to the personalized commodity clothing data of the user, characteristic weights are distributed according to the distribution condition of the second characteristics, the similarity matrix is adjusted according to the characteristic weights, and after the user selects one commodity clothing, the commodity clothing matched with the currently selected commodity clothing of the user is selected from the commodity clothing of the merchant according to the adjusted similarity matrix and recommended to the user; the commodity clothing features and the user personalized features are combined and a recommendation system is set up, the hidden shopping preferences of users are fully considered, and different recommendation results can be generated according to the preferences of each user.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate exemplary embodiments of the invention and together with the description serve to explain the principles of the invention.
FIG. 1 is a schematic block diagram of an exemplary computing device 100;
FIG. 2 is a flow chart of a method for deep learning based personalized clothing recommendation according to an embodiment of the present invention;
FIG. 3 is an overall model architecture diagram according to an embodiment of the present invention;
FIG. 4 is a diagram of a feature extraction network architecture according to an embodiment of the present invention;
FIG. 5 is a diagram of a feature extraction network architecture according to yet another embodiment of the present invention;
FIGS. 6(a), 6(b) and 6(c) are garment recommendation effect diagrams according to an embodiment of the present invention;
fig. 7 is a block diagram of a personalized clothing recommendation device based on deep learning according to an embodiment of the invention.
Detailed Description
Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the invention are shown in the drawings, it should be understood that the invention can be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
FIG. 1 is a block diagram of an example computing device 100 arranged to implement a deep learning based personalized garment recommendation method in accordance with the present invention. In a basic configuration 102, computing device 100 typically includes system memory 106 and one or more processors 104. A memory bus 108 may be used for communication between the processor 104 and the system memory 106.
Depending on the desired configuration, the processor 104 may be any type of processing, including but not limited to: a microprocessor (μ P), a microcontroller (μ C), a Digital Signal Processor (DSP), or any combination thereof. The processor 104 may include one or more levels of cache, such as a level one cache 110 and a level two cache 112, a processor core 114, and registers 116. The example processor core 114 may include an Arithmetic Logic Unit (ALU), a Floating Point Unit (FPU), a digital signal processing core (DSP core), or any combination thereof. The example memory controller 118 may be used with the processor 104, or in some implementations the memory controller 118 may be an internal part of the processor 104.
Depending on the desired configuration, system memory 106 may be any type of memory, including but not limited to: volatile memory (such as RAM), non-volatile memory (such as ROM, flash memory, etc.), or any combination thereof. System memory 106 may include an operating system 120, one or more programs 122, and program data 124. In some implementations, the program 122 can be configured to execute instructions on an operating system by one or more processors 104 using program data 124.
Computing device 100 may also include an interface bus 140 that facilitates communication from various interface devices (e.g., output devices 142, peripheral interfaces 144, and communication devices 146) to the basic configuration 102 via the bus/interface controller 130. The example output device 142 includes a graphics processing unit 148 and an audio processing unit 150. They may be configured to facilitate communication with various external devices, such as a display terminal or speakers, via one or more a/V ports 152. Example peripheral interfaces 144 may include a serial interface controller 154 and a parallel interface controller 156, which may be configured to facilitate communication with external devices such as input devices (e.g., keyboard, mouse, pen, voice input device, touch input device) or other peripherals (e.g., printer, scanner, etc.) via one or more I/O ports 158. An example communication device 146 may include a network controller 160, which may be arranged to facilitate communications with one or more other computing devices 162 over a network communication link via one or more communication ports 164.
A network communication link may be one example of a communication medium. Communication media may typically be embodied by computer readable instructions, data structures, program modules, and may include any information delivery media, such as carrier waves or other transport mechanisms, in a modulated data signal. A "modulated data signal" may be a signal that has one or more of its data set or its changes made in such a manner as to encode information in the signal. By way of non-limiting example, communication media may include wired media such as a wired network or private-wired network, and various wireless media such as acoustic, Radio Frequency (RF), microwave, Infrared (IR), or other wireless media. The term computer readable media as used herein may include both storage media and communication media.
Computing device 100 may be implemented as part of a small-form factor portable (or mobile) electronic device such as a cellular telephone, a Personal Digital Assistant (PDA), a personal media player device, a wireless web-watch device, a personal headset device, an application specific device, or a hybrid device that include any of the above functions. Computing device 100 may also be implemented as a personal computer including both desktop and notebook computer configurations.
Among other things, one or more programs 122 of computing device 100 include instructions for performing a deep learning based personalized apparel recommendation method in accordance with the present invention.
Fig. 2 illustrates a flow chart of a method 200 for deep learning based personalized garment recommendation according to the present invention, the method 200 starting at step S210.
S210, collecting the commodity clothing data of the merchant.
The merchant's clothing data may include a large amount of clothing data that can be sold, for example, all the clothing data of a certain e-commerce website. The merchandise garment data includes two parts: the image data comprises image characteristics of the clothes of the commodity, and the data dictionary comprises a plurality of pieces of character description information of the commodity.
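A minimal Python sketch of how one such commodity record might be held in memory is given below; the field names and example values are illustrative assumptions, not terms defined by this disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class CommodityRecord:
    """One merchant commodity: an image plus a data dictionary of textual descriptions."""
    commodity_id: str
    image_path: str                                   # image data of the garment
    data_dict: dict = field(default_factory=dict)     # e.g. {"type": "shirt", "color": "red"}

# usage sketch:
# item = CommodityRecord("sku-001", "images/sku-001.jpg", {"type": "shirt", "color": "red"})
```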
S220, extracting a first feature according to the commodity clothing data of the merchant.
According to the embodiment of the invention, an Information Maximizing Generative Adversarial Network (InfoGAN) model is adopted for feature extraction; the InfoGAN model specifically extracts the image features of the commodity clothing data.
Further, step S220 includes: and inputting the commodity clothing data of the merchant into an InfoGAN model, and extracting first characteristics which do not influence each other.
Specifically, the first feature is extracted by:
inputting the merchant's merchandise clothing data into a generator;
the generator generates a first sample and inputs the first sample to the discriminator;
the discriminator returns the discrimination result to the generator;
when the discriminator can not identify the first sample as the sample generated by the generator, obtaining an interpretable hidden variable c;
and determining first characteristics which do not influence each other according to the interpretable hidden variable c.
S230, constructing a commodity similarity matrix according to the first characteristics.
S240, collecting the personalized commodity clothing data of the user.
The personalized commodity clothing data of the user comprises commodity clothing data corresponding to historical purchase records of the user, and in addition, the personalized commodity clothing data obtained in other forms such as shopping carts, browsing histories, collection records and the like of the user can also be included. The user-personalized merchandise garment data comprises two parts: the image data comprises image characteristics of the clothes of the commodity, and the data dictionary comprises a plurality of pieces of character description information of the commodity.
S250, extracting a second feature according to the commodity clothing data personalized by the user.
Step S250 specifically includes:
inputting the commodity clothing data individualized by the user and the noise data into a GAN model to obtain a sample set;
and inputting the sample set into an InfoGAN model, and extracting second features which do not influence each other.
The GAN model is used to perform feature filtering and sample filling on the historical purchase records; after the samples have been expanded by the GAN model, the second features, which do not affect each other, are extracted by the InfoGAN model.
Optionally, the noise data comprises a random picture unrelated to clothing.
Specifically, the step of obtaining the sample set according to the GAN model includes:
training a discriminator according to the commodity clothing data personalized by the user; and
inputting noise data into a generator, and judging a sample generated by the generator by the discriminator;
acquiring a second sample discriminated by the discriminator;
generating a sample set from the second sample.
S260, distributing characteristic weight according to the distribution situation of the second characteristic.
Further, step S260 includes:
counting the number of each label contained in the second characteristic;
and determining the weight of the single label according to the number of the single label and the total number of all the labels.
S270, adjusting the similarity matrix according to the characteristic weight.
S280, after the user selects one commodity garment, selecting the commodity garment matched with the currently selected commodity garment from the commodity garments of the merchant according to the adjusted similarity matrix, and recommending the commodity garment to the user.
The above steps are described in detail with reference to specific examples.
In this specific embodiment, when a user purchases a commodity, the similarity between that commodity and the other commodities in the data set is calculated; the user's own purchase preferences are extracted from the user's historical purchase records; the commodity similarities in the data set are then recalculated with the user's exclusive preferences as the reference; and the commodity with the highest similarity score in each category (jacket, lower garment, shoes and bag) is taken as the recommended commodity, so that a set of clothing matches is customized exclusively for the user. The overall model design is shown in FIG. 3.
In this embodiment, commodity information collected from a certain e-commerce website is used as the data set for the experiment, and the user's historical purchase records are used as the basis for extracting personalized features, according to which the recommendation results are adjusted; in this way a clothing-matching recommendation system for the user is realized.
The specific technical scheme is as follows:
step 1: a clothing matching data set R (goods database) is constructed.
Collect image data of clothing collocations and the corresponding commodity feature fields from e-commerce websites, and construct a clothing collocation data set R from these groups of data. Each group of data D_i comprises a commodity image P_i and an information data dictionary C_k of the commodity; the fields of C_k, such as commodity id, picture file storage name, commodity type and color, are split and extracted into q descriptive words {W_i | 0 ≤ i ≤ q−1}, where 2 ≤ q ≤ 50.
Scale each group of images in R to N×N, where N is 299, 598 or 1196; count the word frequencies of all descriptive words in R, and delete from C_k the descriptive words whose word frequency is lower than H, where 20 ≤ H ≤ 50.
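A minimal Python sketch of this preprocessing step, assuming Pillow for image resizing and a plain word-frequency filter; the file names and example word lists are illustrative.

```python
from collections import Counter
from PIL import Image

N = 299   # target resolution (299, 598 or 1196)
H = 20    # minimum word frequency to keep (20 <= H <= 50)

def resize_images(image_paths, size=N):
    """Scale every garment image in R to size x size pixels."""
    return [Image.open(p).convert("RGB").resize((size, size)) for p in image_paths]

def filter_rare_words(word_lists, min_freq=H):
    """Count word frequencies over all descriptive words in R and drop words rarer than H."""
    freq = Counter(w for words in word_lists for w in words)
    return [[w for w in words if freq[w] >= min_freq] for words in word_lists]

# usage sketch:
# images = resize_images(["item_001.jpg", "item_002.jpg"])
# cleaned = filter_rare_words([["red", "dress", "silk"], ["red", "shirt"]])
```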
Step 2: and constructing a commodity feature extraction network.
This network is used to extract features of the commodities in the data set that do not affect each other, such as color, shape, material and texture.
The feature extraction network adopts InfoGAN for feature extraction. InfoGAN is an improvement on the GAN model: by maximizing the mutual information between the generated picture and the input code, it obtains a decomposable feature representation through unsupervised learning.
GAN is a deep learning framework in which two modules, a generative model and a discriminative model, learn through a mutual game and thereby produce good outputs. The idea of the generative model G is to turn a noise vector into a vivid sample, while the discriminative model D must judge whether a fed sample is a real sample or a fake one. In this process of mutual progress, the ability of the discriminative model D to discriminate samples keeps improving, and the counterfeiting ability of the generative model G keeps improving as well, so that clearer and more realistic samples are generated.
The loss function of GAN is:
\min_G \max_D V(D, G) = \mathbb{E}_{x \sim p_{\mathrm{data}}(x)}[\log D(x)] + \mathbb{E}_{z \sim p_z(z)}[\log(1 - D(G(z)))]
where D(x) denotes the discriminator's judgment of a real sample x; the closer this result is to 1 the better, which corresponds to the log D(x) term. z is a random input and G(z) denotes a generated sample; for generated samples, the closer the discriminator's output D(G(z)) is to 0 the better. Therefore D mainly seeks to maximize the loss function, while G, conversely, seeks to minimize it.
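A minimal PyTorch sketch of the min–max objective above, written with binary cross-entropy as is common practice; the networks G and D are assumed to exist (with D ending in a sigmoid so its output lies in (0, 1)) and are not specified here.

```python
import torch
import torch.nn.functional as F

def gan_losses(D, G, real_batch, z):
    """One evaluation of the GAN objective: D maximizes V(D, G), G minimizes it."""
    fake_batch = G(z)
    d_real = D(real_batch)            # should approach 1 for real samples
    d_fake = D(fake_batch.detach())   # should approach 0; detach so this term does not update G

    # Discriminator loss = -[log D(x) + log(1 - D(G(z)))]
    d_loss = F.binary_cross_entropy(d_real, torch.ones_like(d_real)) \
           + F.binary_cross_entropy(d_fake, torch.zeros_like(d_fake))

    # Generator loss: G tries to make D output 1 on its fakes, i.e. to "cheat" D
    g_loss = F.binary_cross_entropy(D(fake_batch), torch.ones_like(d_fake))
    return d_loss, g_loss
```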
Within InfoGAN, the input vector z is divided into two parts, c and z'. c may be understood as an interpretable hidden variable (referred to in this invention as the data set), while z' may be understood as incompressible noise. By constraining the relation between c and the output (including the finally generated training result and the generated samples, produced by GAN training, that can cheat D), the dimensions of c are made to correspond to semantic features of the output, such as stroke thickness and slant.
The loss function for the InfoGAN is:
\min_G \max_D V_I(D, G) = V(D, G) - \lambda I(c; G(z, c))
V(D, G) is the loss function of the GAN model; compared with the original GAN, InfoGAN has the additional term λI(c; G(z, c)), which represents the mutual information between c and the output of G. Mutual information is the relative entropy between the joint distribution p(x, y) and the product of the marginal distributions p(x)p(y); it can be regarded as the amount of information that one random variable contains about another, i.e.:
I(X; Y) = \sum_{y} \sum_{x} p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)}
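The following numpy sketch only illustrates this definition on a small joint probability table; in InfoGAN itself the mutual information term λI(c; G(z, c)) is not computed from such a table but maximized through a variational lower bound estimated by an auxiliary network.

```python
import numpy as np

def mutual_information(p_xy):
    """I(X;Y) = sum over x, y of p(x,y) * log( p(x,y) / (p(x) * p(y)) ) for a joint table p_xy."""
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal p(x)
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal p(y)
    mask = p_xy > 0                         # terms with p(x,y) = 0 contribute nothing
    return float(np.sum(p_xy[mask] * np.log(p_xy[mask] / (p_x * p_y)[mask])))

# usage sketch: independent variables carry zero mutual information
# p = np.outer([0.5, 0.5], [0.3, 0.7]); mutual_information(p) -> 0.0 (up to float error)
```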
In the present invention, the similarity between the data set and the "fake" clothing pictures generated by the GAN model is used as this index.
The commodity feature extraction network is designed as shown in FIG. 4, where "commodity" represents real commodity data in the data set and c represents the "fake" commodity data trained by the GAN model. X is a simplified representation of the GAN training process: the generator G generates X and hands it to D to discriminate, D returns the result to G, and the generator G and the discriminator D play a game against each other until the generated X can "cheat" D; training then ends and the training sample c is produced.
The input data z is composed of the data set and random noise z'. A fake commodity is generated by the generator of the GAN model and then handed to the discriminator, which judges whether it is a commodity from the data set. A classifier, namely the feature extraction network, is added in this process; in practice the classifier and the discriminator D share parameters and differ only in the last layer, and by adjusting the parameters, feature labels that do not affect each other, such as color and shape, can be obtained.
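A minimal PyTorch sketch of this shared-parameter design: the discriminator output and the classifier output share a common backbone and differ only in their final layer. The layer sizes and the number of feature labels are illustrative assumptions, not values given in this disclosure.

```python
import torch.nn as nn

class SharedDQ(nn.Module):
    """Discriminator D and feature classifier (label head) sharing all layers but the last."""
    def __init__(self, n_labels=10):
        super().__init__()
        self.backbone = nn.Sequential(                       # parameters shared by D and the classifier
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.d_head = nn.Sequential(nn.Linear(128, 1), nn.Sigmoid())   # real/fake score
        self.q_head = nn.Linear(128, n_labels)                         # feature labels (color, shape, ...)

    def forward(self, x):
        h = self.backbone(x)
        return self.d_head(h), self.q_head(h)
```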
The resolution of each image in D_i is scaled to 299 × 299, giving a tensor of shape 299 × 299 × 3, and the images are then input into the feature extraction network (features network). The output of the feature extraction network is the feature F_k extracted from each image in group D_i (i.e., the Label in FIG. 4). The features F_k output by the network and the features W_i extracted from the data dictionary are put into a bag-of-words model (bag_of_words) to create a vocabulary representation.
Step 3: create a vector representation of the bag of words and build a similarity matrix.
This step calculates the similarity between all the commodities in the data set; this similarity is the reference index for commodity recommendation.
The bag-of-words model "bag_of_words" is converted into a vector representation using a CountVectorizer, a text feature extraction method that computes word frequencies by counting how often each word occurs in the "bag_of_words" column and writes out word vectors from those frequencies, thereby converting the text information into feature vectors. The word vectors are combined into a commodity matrix; once this matrix of word occurrence counts exists, a commodity similarity matrix can be obtained with a cosine similarity function. This matrix contains a decimal similarity value between every pair of commodities and therefore provides a basis for comparing the similarity of commodities.
The formula for calculating cosine similarity is as follows:
\cos(\theta) = \frac{u \cdot v}{\|u\|\,\|v\|} = \frac{\sum_{i=1}^{n} u_i v_i}{\sqrt{\sum_{i=1}^{n} u_i^2}\;\sqrt{\sum_{i=1}^{n} v_i^2}}
wherein u and v represent characteristic parameters of two different commodities.
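A minimal scikit-learn sketch of this step, assuming each commodity's bag_of_words entry is one space-separated string of its extracted labels F_k and W_i; the example strings are illustrative.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# one space-separated string of extracted labels per commodity
bag_of_words = [
    "red shirt cotton short-sleeve",
    "red dress silk long",
    "black shoes leather",
]

count_matrix = CountVectorizer().fit_transform(bag_of_words)   # word-frequency vectors
similarity_matrix = cosine_similarity(count_matrix)            # pairwise cosine scores

# similarity_matrix[i, j] is the similarity between commodity i and commodity j
print(similarity_matrix.round(3))
```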
Step 4: extract the user's historical purchase records.
By searching the data set for all the commodity ids in the user's historical purchase records, a history data set U is obtained, in which each group of data D_ui comprises a commodity image P_ui and the descriptive words W_ui of the commodity.
Step 5: construct a personalized feature extraction network.
This network is used to extract the user's personalized features. The personalized features are extracted from the user's historical purchase records and used as a reference to influence the final matching result, i.e., to customize, from the website's commodities, a whole matched outfit that the user genuinely likes. For example, if the user has purchased mostly black commodities and has bought berets several times, the subsequent matching results will be biased toward dark commodities, and the recommended hat will also be closer in shape to a beret.
The design of this scheme is shown in FIG. 5. A GAN model is used in the personalized feature extraction part: the history records are taken as the input of the GAN model, and through GAN training, "fake" history records that look genuine can be generated, while the number of generated results can be controlled through parameter adjustment. In this way the model produces clearer and more realistic samples from the user's historical purchase records and at the same time expands the sample set, so that the labels extracted by the feature extraction model carry the user's personalized features.
In the invention, the user history pictures P_ui and the noise z' are used as input; the resolution of each image is scaled to 299 × 299, giving a tensor of shape 299 × 299 × 3. Through the training of the GAN model, more fake commodity pictures are obtained that resemble the commodities the user has purchased and that reflect the user's purchasing characteristics more clearly and realistically. These pictures are input into the feature extraction network generated in Step 2, and the feature labels of the commodities are extracted.
Step 6: generate a user-specific weight assignment scheme statistically.
The extracted commodity feature labels are counted with a counting function, the largest weight is assigned to the label that occurs most often, and so on. For example, when 100 "red" tags, 50 "shorts" tags and 20 "shoes" tags are extracted from the user's historical purchase records, the weight of the "red" tag is 100/(100+50+20) ≈ 0.588 and the weight of the "shorts" tag is 50/(100+50+20) ≈ 0.294.
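A minimal Python sketch of this counting step; the tags below reproduce the counts in the example above.

```python
from collections import Counter

def label_weights(extracted_labels):
    """Weight of each label = its occurrence count / total count of all extracted labels."""
    counts = Counter(extracted_labels)
    total = sum(counts.values())
    return {label: n / total for label, n in counts.items()}

# usage sketch with the counts from the example above:
tags = ["red"] * 100 + ["shorts"] * 50 + ["shoes"] * 20
print(label_weights(tags))   # {'red': 0.588..., 'shorts': 0.294..., 'shoes': 0.117...}
```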
Step 7: recommend the top-N commodities.
The commodity similarity matrix obtained in Step 3 is recalculated with the weights obtained in Step 6, readjusting the similarities between commodities according to those weights; from every category of commodities, the one commodity with the highest similarity to the purchased commodities is selected; the commodity pictures are looked up by commodity id to form a matched set; and the pictures are recommended to the user.
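A minimal numpy sketch of this final step under two illustrative assumptions not spelled out above: the weight adjustment is taken as boosting each commodity's similarity score by the summed weights of the user-preference labels it carries, and categories are given as a simple per-item list.

```python
import numpy as np

def recommend_outfit(similarity_matrix, purchased_idx, item_labels, item_category, weights):
    """For each category, pick the item whose weight-adjusted similarity to a purchased item is highest."""
    scores = similarity_matrix[purchased_idx].astype(float)
    for i, labels in enumerate(item_labels):
        scores[i] *= 1.0 + sum(weights.get(label, 0.0) for label in labels)   # boost preferred labels

    best = {}
    for i, category in enumerate(item_category):
        if i == purchased_idx:
            continue
        if category not in best or scores[i] > scores[best[category]]:
            best[category] = i
    return best   # e.g. {"jacket": 3, "lower garment": 7, "shoes": 1, "bag": 5}
```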
The overall system loss function is:
E=EfpEp
FIG. 6(a) shows the recommendation effect of the prior art on the e-commerce website; FIG. 6(b) shows the recommendation effect of the solution provided according to the specific embodiment of the present disclosure; FIG. 6(c) shows the process of augmenting the samples and extracting features using the GAN model according to the specific embodiment of the present disclosure.
In the invention, the user history records and the noise z' are used as input; a fake commodity is generated by the generator of the GAN model and then delivered to the discriminator, which judges whether it is a commodity from the user's history records; a classifier, namely the feature extraction model, is added in this process.
The technical scheme provided by the invention has the following characteristics:
1. A new clothing-matching recommendation method customized for the individual is provided.
2. The GAN model is combined with the recommendation system, so that the personal preferences of the user can be extracted more accurately, and the cold-start problem can be relieved to a certain extent.
3. The method can make recommendations from picture information alone, so its requirements on the application platform and on user and commodity information are low, and it is easy to popularize and migrate.
Most garment personalized recommendation methods in the prior art are combined with an expert system or with aesthetic features, but for garments these reference bases are relatively subjective and produce the same recommendation results for the same items; even when the personalized characteristics of users are considered, they cannot generate recommendation results customized to the individual, in the manner of a private stylist. The invention has low requirements on data, needing only the user's history records and picture information, which makes it convenient to migrate and popularize. In addition, by using the GAN model, a Markov-chain-style learning mechanism can be avoided and sampling and inference can be carried out directly, which improves the efficiency of the GAN and widens its application scenarios. Various types of loss functions can be integrated into the GAN model, so that different types of loss functions can be designed for different tasks, all of which can be learned and optimized under the GAN framework.
As shown in fig. 7, an apparatus for recommending personalized clothing based on deep learning according to an embodiment of the present invention includes:
a merchant data collection module 710 for collecting the data of the goods clothing of the merchant;
a feature extraction module 720, configured to extract a first feature according to the commodity clothing data of the merchant;
the similarity calculation module 730 is used for constructing a commodity similarity matrix according to the first characteristic;
a user data collecting module 740 for collecting the personalized commodity clothing data of the user;
the feature extraction module 720 is further configured to extract a second feature according to the personalized commodity clothing data of the user;
the similarity calculation module 730 is further configured to assign a feature weight according to the distribution of the second feature; adjusting the similarity matrix according to the feature weight;
and the recommending module 750 is configured to, after the user selects one piece of commodity clothing, select, according to the adjusted similarity matrix, the commodity clothing matched with the currently selected commodity clothing of the user from the commodity clothing of the merchant and recommend the selected commodity clothing to the user.
Optionally, the feature extraction module 720 extracts a first feature according to the commodity clothing data of the merchant, including:
and inputting the commodity clothing data of the merchant into an InfoGAN model, and extracting first characteristics which do not influence each other.
Optionally, the feature extraction module 720 is configured to input the commodity clothing data of the merchant into the InfoGAN model, and when extracting the first feature that does not affect each other, specifically configured to:
inputting the merchant's merchandise clothing data into a generator;
the generator generates a first sample and inputs the first sample to the discriminator;
the discriminator returns the discrimination result to the generator;
when the discriminator can not identify the first sample as the sample generated by the generator, obtaining an interpretable hidden variable c;
and determining first characteristics which do not influence each other according to the interpretable hidden variable c.
Optionally, the feature extraction module 720 extracts a second feature according to the data of the personalized goods clothing of the user, including:
inputting the commodity clothing data individualized by the user and the noise data into a GAN model to obtain a sample set;
and inputting the sample set into an InfoGAN model, and extracting second features which do not influence each other.
Optionally, the feature extraction module 720 is configured to input the user-customized commodity clothing data and the noise data into a GAN model, and when obtaining the sample set, is specifically configured to:
training a discriminator according to the commodity clothing data personalized by the user; and
inputting noise data into a generator, and judging a sample generated by the generator by the discriminator;
acquiring a second sample discriminated by the discriminator;
generating a sample set from the second sample.
Optionally, the similarity calculation module 730 is configured to, when the feature weight is assigned according to the distribution condition of the second feature, specifically:
counting the number of each label contained in the second characteristic;
and determining the weight of the single label according to the number of the single label and the total number of all the labels.
It should be understood that the various techniques described herein may be implemented in connection with hardware or software or, alternatively, with a combination of both. Thus, the methods and apparatus of the present invention, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium, wherein, when the program is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the invention.
In the case of program code execution on programmable computers, the computing device will generally include a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. Wherein the memory is configured to store program code; the processor is configured to perform the various methods of the present invention according to instructions in the program code stored in the memory.
By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media store information such as computer readable instructions, data structures, program modules or other data. Communication media typically embody computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism, and include any information delivery media. Combinations of any of the above are also included within the scope of computer readable media.
It should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the invention and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that: that the invention as claimed requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules or units or components of the devices in the examples disclosed herein may be arranged in a device as described in this embodiment or alternatively may be located in one or more devices different from the devices in this example. The modules in the foregoing examples may be combined into one module or may be further divided into multiple sub-modules.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
Furthermore, some of the described embodiments are described herein as a method or combination of method elements that can be performed by a processor of a computer system or by other means of performing the described functions. A processor having the necessary instructions for carrying out the method or method elements thus forms a means for carrying out the method or method elements. Further, the elements of the apparatus embodiments described herein are examples of the following apparatus: the apparatus is used to implement the functions performed by the elements for the purpose of carrying out the invention.
As used herein, unless otherwise specified the use of the ordinal adjectives "first", "second", "third", etc., to describe a common object, merely indicate that different instances of like objects are being referred to, and are not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of this description, will appreciate that other embodiments can be devised which do not depart from the scope of the invention as described herein. Furthermore, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the appended claims. The present invention has been disclosed in an illustrative rather than a restrictive sense, and the scope of the present invention is defined by the appended claims.

Claims (10)

1. A personalized clothing recommendation method based on deep learning is characterized by comprising the following steps:
collecting commodity clothing data of a merchant;
extracting a first characteristic according to the commodity clothing data of the merchant;
constructing a commodity similarity matrix according to the first characteristics;
collecting commodity clothing data personalized by a user;
extracting a second characteristic according to the commodity clothing data personalized by the user;
distributing characteristic weight according to the distribution condition of the second characteristic;
adjusting the similarity matrix according to the feature weight;
and after the user selects one commodity garment, selecting the commodity garment matched with the currently selected commodity garment from the commodity garments of the merchant according to the adjusted similarity matrix, and recommending the commodity garment to the user.
2. The method of claim 1, wherein extracting a first feature from the merchant's merchandise clothing data comprises:
inputting the commodity clothing data of the merchant into an Information Maximizing Generative Adversarial Network (InfoGAN) model, and extracting first characteristics which do not influence each other.
3. The method as claimed in claim 2, wherein inputting the merchandise clothing data of the merchant into the InfoGAN model, and extracting the first feature without mutual influence comprises:
inputting the merchant's merchandise clothing data into a generator;
the generator generates a first sample and inputs the first sample to the discriminator;
the discriminator returns the discrimination result to the generator;
when the discriminator can not identify the first sample as the sample generated by the generator, obtaining an interpretable hidden variable c;
and determining first characteristics which do not influence each other according to the interpretable hidden variable c.
4. The method of claim 2, wherein extracting a second feature from the user-personalized merchandise garment data comprises:
inputting the commodity clothing data personalized by the user and the noise data into a GAN model to obtain a sample set;
and inputting the sample set into an InfoGAN model, and extracting second features which do not influence each other.
5. The method of claim 4, wherein inputting the user-personalized merchandise garment data and the noise data into a GAN model, resulting in a sample set, comprises:
training a discriminator according to the commodity clothing data personalized by the user; and
inputting noise data into a generator, and judging a sample generated by the generator by the discriminator;
acquiring a second sample discriminated by the discriminator;
generating a sample set from the second sample.
6. The method of any of claims 1-5, wherein the first feature and the second feature each comprise an image feature and a data dictionary feature.
7. The method of any of claims 1-5, wherein assigning feature weights based on the distribution of the second features comprises:
counting the number of each label contained in the second characteristic;
and determining the weight of the single label according to the number of the single label and the total number of all the labels.
8. A personalized clothing recommendation device based on deep learning is characterized by comprising:
the merchant data collection module is used for collecting the commodity clothing data of the merchant;
the characteristic extraction module is used for extracting a first characteristic according to the commodity clothing data of the merchant;
the similarity calculation module is used for constructing a commodity similarity matrix according to the first characteristics;
the user data collection module is used for collecting the personalized commodity clothing data of the user;
the characteristic extraction module is also used for extracting a second characteristic according to the commodity clothing data personalized by the user;
the similarity calculation module is further used for distributing characteristic weight according to the distribution situation of the second characteristic; adjusting the similarity matrix according to the feature weight;
and the recommending module is used for selecting the commodity clothes matched with the commodity clothes currently selected by the user from the commodity clothes of the merchant and recommending the commodity clothes to the user according to the adjusted similarity matrix after the user selects one commodity clothes.
9. A readable storage medium having executable instructions thereon that, when executed, cause a computing device to perform the operations included in any of claims 1 to 7.
10. A computing device, comprising:
a processor; and
a memory storing executable instructions that, when executed, cause the processor to perform the operations included in any of claims 1 to 7.
CN202010751386.5A 2020-07-30 2020-07-30 Personalized clothing recommendation method and device based on deep learning Active CN111915400B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010751386.5A CN111915400B (en) 2020-07-30 2020-07-30 Personalized clothing recommendation method and device based on deep learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010751386.5A CN111915400B (en) 2020-07-30 2020-07-30 Personalized clothing recommendation method and device based on deep learning

Publications (2)

Publication Number Publication Date
CN111915400A true CN111915400A (en) 2020-11-10
CN111915400B CN111915400B (en) 2022-03-22

Family

ID=73287958

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010751386.5A Active CN111915400B (en) 2020-07-30 2020-07-30 Personalized clothing recommendation method and device based on deep learning

Country Status (1)

Country Link
CN (1) CN111915400B (en)



Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150356353A1 (en) * 2013-01-10 2015-12-10 Thomson Licensing Method for identifying objects in an audiovisual document and corresponding device
CN105046515A (en) * 2015-06-26 2015-11-11 深圳市腾讯计算机系统有限公司 Advertisement ordering method and device
CN110246011A (en) * 2019-06-13 2019-09-17 中国科学技术大学 Interpretable fashion clothing personalized recommendation method

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112231583A (en) * 2020-11-11 2021-01-15 重庆邮电大学 E-commerce recommendation method based on dynamic interest group identification and generation of countermeasure network
CN112231583B (en) * 2020-11-11 2022-06-28 重庆邮电大学 E-commerce recommendation method based on dynamic interest group identification and generation of confrontation network
CN113793167A (en) * 2021-02-03 2021-12-14 北京沃东天骏信息技术有限公司 Method and apparatus for generating information
CN113822735A (en) * 2021-02-24 2021-12-21 北京沃东天骏信息技术有限公司 Goods recommendation method and device, storage medium and electronic equipment
CN113763114A (en) * 2021-03-04 2021-12-07 北京沃东天骏信息技术有限公司 Article information matching method and device and storage medium
CN115983921A (en) * 2022-12-29 2023-04-18 广州市玄武无线科技股份有限公司 Offline store commodity association combination method, device, equipment and storage medium
CN115983921B (en) * 2022-12-29 2023-11-14 广州市玄武无线科技股份有限公司 Off-line store commodity association combination method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN111915400B (en) 2022-03-22


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20221227

Address after: Room 301, No. 235, Kexue Avenue, Huangpu District, Guangzhou, Guangdong 510000

Patentee after: OURCHEM INFORMATION CONSULTING CO.,LTD.

Address before: No. 230, Waihuan West Road, Guangzhou University Town, Panyu, Guangzhou City, Guangdong Province, 510006

Patentee before: Guangzhou University

Effective date of registration: 20221227

Address after: 510000 room 606-609, compound office complex building, No. 757, Dongfeng East Road, Yuexiu District, Guangzhou City, Guangdong Province (not for plant use)

Patentee after: China Southern Power Grid Internet Service Co.,Ltd.

Address before: Room 301, No. 235, Kexue Avenue, Huangpu District, Guangzhou, Guangdong 510000

Patentee before: OURCHEM INFORMATION CONSULTING CO.,LTD.