CN111415221B - Clothing recommendation method and device based on interpretable convolutional neural network and terminal - Google Patents

Clothing recommendation method and device based on interpretable convolutional neural network and terminal

Info

Publication number
CN111415221B
Authority
CN
China
Prior art keywords
neural network
convolutional neural
clothing
image
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010193933.2A
Other languages
Chinese (zh)
Other versions
CN111415221A (en)
Inventor
金书季
肖若水
漆爽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing University of Post and Telecommunications
Original Assignee
Chongqing University of Post and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing University of Post and Telecommunications filed Critical Chongqing University of Post and Telecommunications
Priority to CN202010193933.2A priority Critical patent/CN111415221B/en
Publication of CN111415221A publication Critical patent/CN111415221A/en
Application granted granted Critical
Publication of CN111415221B publication Critical patent/CN111415221B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0631Item recommendations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/084Backpropagation, e.g. using gradient descent
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Abstract

The invention relates to the field of feature research for clothing recommendation, and in particular to a clothing recommendation method, device and terminal based on an interpretable convolutional neural network. The method comprises: preprocessing images and dividing the preprocessed images into a training set and a verification set; constructing a convolutional neural network and constructing feature matrices to extract clothing features from the acquired images; performing end-to-end iterative training on the convolutional neural network with the training set; and having the user input a picture into the trained convolutional neural network, which outputs the feature map of the picture, and recommending to the user clothing consistent with the feature map of the input picture. By visualizing the features of the recommended clothes, the method lets the customer see clearly which features of a garment they like, allows clothes to be pushed to the customer more accurately, and improves the user experience.

Description

Interpretable convolutional neural network-based garment recommendation method and device and terminal
Technical Field
The invention relates to the field of feature research for clothing recommendation, and in particular to a clothing recommendation method, device and terminal based on an interpretable convolutional neural network.
Background
With the development of e-commerce and the clothing market, most people choose to browse clothes of interest on the Internet. However, many people, especially those with a less developed aesthetic sense, struggle to cope with the full range of clothing brands and styles on offer. Therefore, when a customer selects clothes, the system should screen useful information out of the massive clothing data and present it to the customer.
Conventional recommendation systems are implemented with convolutional neural networks. The basic principle is to first extract the features of the clothes the customer likes with a convolutional neural network, then retrieve clothes that have those features, and finally recommend the retrieved clothes to the customer. Such recommendation systems greatly improve the user experience and promote the further development of the clothing market.
However, such recommendation systems also have a number of disadvantages. First, the convolutional neural network extracts features together with their context, i.e. the same filter may extract several features at once, some of which the customer likes and some of which the customer does not; garments retrieved from these features therefore contain large errors. Second, the system directly gives a recommendation result without giving the basis of the retrieval, so the customer can only passively accept the result and cannot learn from it to improve their aesthetic sense and select other clothes on their own.
Disclosure of Invention
In order to enable a user to search more accurately, the invention provides a clothing recommendation method, a clothing recommendation device and a clothing recommendation terminal based on an interpretable convolutional neural network, wherein the method, as shown in fig. 1, specifically comprises the following steps:
preprocessing the image, and dividing the preprocessed image into a training set and a verification set;
constructing a convolutional neural network and constructing a characteristic matrix to extract clothing characteristics of the acquired image;
performing end-to-end iterative training on the convolutional neural network by using a training set;
and the user inputs the picture into the trained convolutional neural network, the convolutional neural network can output the characteristic diagram of the picture, and the clothing consistent with the characteristic diagram of the picture input by the user is recommended to the user.
Further, as shown in fig. 2, the preprocessing the picture includes:
scaling the pictures so that the clothing feature regions in the pictures have a consistent size;
cropping and padding all the pictures so that the clothing feature region of each picture is centered and every picture has the same size.
Further, the process of acquiring the clothing characteristics of the image comprises the following steps:
the last convolutional layer of the convolutional neural network comprises a plurality of filters;
constructing n×m+1 feature matrices as a filter of the convolutional neural network;
the first n×m feature matrices correspond respectively to the feature at each pixel of the acquired image of n×m pixels, and the (n×m+1)-th feature matrix represents the case in which the feature to be extracted is absent from the image;
and in the end-to-end iterative training process of the convolutional neural network by utilizing the training set, selecting a characteristic matrix for each image by utilizing the back propagation process of the convolutional neural network, wherein the characteristic matrix forms the clothing characteristics of the image.
Further, in the end-to-end iterative training process of the convolutional neural network by using the training set, the loss function when the convolutional neural network is propagated in the forward direction is represented as:
Loss_f = -MI(X; T);
where Loss_f denotes the loss function between the feature map and the feature matrix, X denotes the feature map extracted by the last convolutional layer of the convolutional neural network, T denotes the feature matrix, and MI(·;·) denotes the operation of computing the information entropy (mutual information).
Further, in the end-to-end iterative training of the convolutional neural network with the training set, the back-propagation training of the convolutional neural network is expressed as:
∂Loss/∂x_ij = λ·∂Loss_f/∂x_ij + ∂L(y, ŷ)/∂x_ij;
where Loss denotes the total loss function of the convolutional neural network; x_ij denotes the pixel in row i and column j of the feature map; Loss_f denotes the loss function between the feature map and the feature matrix; λ denotes the filter threshold; L(y, ŷ) denotes the loss function measuring whether the prediction result is correct; y_k denotes the true label; ŷ_k denotes the predicted label; and k denotes the type of label.
The invention also provides a clothing recommendation device based on the interpretable convolutional neural network, which is characterized by comprising a data acquisition module, a data preprocessing module, a data characteristic analysis module and a clothing recommendation module, wherein:
the data acquisition module is used for acquiring training data and data to be predicted;
the data preprocessing module is used for zooming, cutting and filling the acquired picture;
the data characteristic analysis module is used for training the convolutional neural network according to the training data with the labels and inputting the data to be predicted without the labels into the trained convolutional neural network to obtain a characteristic diagram;
and the clothing recommending module is used for searching according to the characteristic diagram of the picture input by the user and recommending the clothing consistent with the characteristic diagram to the user.
Furthermore, the convolutional neural network in the data feature analysis module comprises an input layer, a plurality of convolutional layers and an output layer, wherein the last convolutional layer is connected with the output layer and comprises a filter formed by n×m+1 feature matrices, n×m being the pixel size of the input image; the Hadamard product of the feature map obtained by the last convolutional layer and the feature template is computed.
The invention also provides a clothing recommendation terminal based on the interpretable convolutional neural network, which is characterized by comprising any clothing recommendation device based on the interpretable convolutional neural network, wherein a user packages and uploads training data through the terminal, after a data characteristic analysis module of the clothing recommendation device based on the interpretable convolutional neural network finishes training of the convolutional neural network, the terminal prompts the user to input a picture to be predicted, the trained convolutional neural network obtains a characteristic diagram of the picture according to the picture input by the user, the data characteristic analysis module obtains a picture consistent with the characteristic diagram according to the characteristic diagram, and the terminal pushes the obtained picture to the user.
Furthermore, the training data that the user packages and uploads through the terminal comprise pictures of clothing the user likes and pictures of clothing the user dislikes.
Further, the terminal has a recovery initialization authority for the clothes recommendation device based on the interpretable convolutional neural network.
On one hand, the invention visualizes the features of the recommended clothes, so that the customer can see clearly which patterns and regions (that is, which features) of a garment they like and thus know exactly which clothing features appeal to them; on the other hand, the device can learn more accurately which clothes the customer likes, thereby improving the network model.
Drawings
FIG. 1 is a flow chart of a method for recommending clothing based on an interpretable convolutional neural network according to the present invention;
FIG. 2 is a flow chart of image preprocessing performed by the interpretable convolutional neural network-based clothing recommendation method of the present invention;
FIG. 3 is a schematic diagram of matching a feature matrix and a feature map in the interpretable convolutional neural network-based clothing recommendation method of the present invention;
fig. 4 is a schematic flowchart of an embodiment of the clothing recommendation terminal based on the interpretable convolutional neural network according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The invention provides a clothing recommendation method based on an interpretable convolutional neural network, which specifically comprises the following steps:
preprocessing the image, and dividing the preprocessed image into a training set and a verification set;
constructing a convolutional neural network and constructing a characteristic matrix to extract clothing characteristics of the acquired image;
performing end-to-end iterative training on the convolutional neural network by using a training set;
and the user inputs the picture into the trained convolutional neural network, the convolutional neural network can output the characteristic diagram of the picture, and the clothing consistent with the characteristic diagram of the picture input by the user is recommended to the user.
In this embodiment, the collected images carrying the like and dislike labels are preprocessed so that their size, shape, etc. conform to the designed convolutional neural network structure; in addition, 70% of the images are randomly selected as the training set (the remainder serve as the verification set), and the preprocessed images are converted, using the toolkit of the open-source deep-learning framework TensorFlow, into a format that TensorFlow can process.
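As a minimal sketch of this step, and assuming the labelled pictures are available as a list of file paths with a 1/0 like/dislike encoding, the 70/30 split and the TensorFlow input pipeline could look as follows; the function name, parameter defaults and label encoding are illustrative assumptions, not part of the patent.

    import random
    import tensorflow as tf

    def make_datasets(image_paths, labels, train_ratio=0.7, image_size=299, batch=32):
        # Randomly take 70% of the labelled pictures as the training set and
        # keep the remainder for verification, then wrap both in tf.data pipelines.
        pairs = list(zip(image_paths, labels))
        random.shuffle(pairs)
        split = int(len(pairs) * train_ratio)
        train_pairs, val_pairs = pairs[:split], pairs[split:]

        def load(path, label):
            img = tf.io.decode_jpeg(tf.io.read_file(path), channels=3)
            img = tf.image.resize_with_crop_or_pad(img, image_size, image_size)
            return tf.cast(img, tf.float32) / 255.0, label

        def to_dataset(p):
            paths, labs = zip(*p)
            ds = tf.data.Dataset.from_tensor_slices((list(paths), list(labs)))
            return (ds.map(load, num_parallel_calls=tf.data.AUTOTUNE)
                      .batch(batch)
                      .prefetch(tf.data.AUTOTUNE))

        return to_dataset(train_pairs), to_dataset(val_pairs)

    # train_ds, val_ds = make_datasets(paths, likes)  # likes: 1 = liked, 0 = disliked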
Further, the preprocessing comprises:
scaling the pictures so that the clothing feature regions in the pictures have a consistent size;
cropping and padding all the pictures so that the clothing feature region of each picture is centered and every picture has the same size.
In this embodiment, when the convolutional neural network is constructed, an iterative model from input to output is built: the number of hidden layers in the middle of the network, the function of each layer, the size and number of convolution kernels in each layer, the activation function of each layer and the feature matrices are set, and an appropriate learning rate, learning-rate decay algorithm, error optimization algorithm and parameter update algorithm are set as well.
The preprocessed training data set is imported into the convolutional neural network to drive end-to-end training. During training, all hidden layers in the network automatically learn the features of the input clothing images, and the parameters and weights of the convolution kernels are adjusted according to the error optimization algorithm. During parameter adjustment the filters learn from the image features: the Hadamard product of the feature matrix and the feature map extracted by a filter is computed in the fitting process, the feature template is obtained by training and learning during the forward propagation of the convolutional neural network, and the feature map thereby learns the favored clothing features. After training, the filters can find the clothing features the customer likes in an input image, realizing the learning of favored clothing features by the neural network. Referring to fig. 3, each convolutional layer has a plurality of filters, each filter is assigned a loss function, and each filter corresponds to the feature map it produces by convolution. The upward arrows indicate the forward-propagation algorithm, which extracts image features; the downward arrows indicate back propagation, which optimizes the parameters. That is, both the upward and downward arrows are used to train the network.
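Below is a minimal TensorFlow sketch of the Hadamard masking step described above; selecting the candidate template by its inner-product response is an assumption standing in for the patent's back-propagation-based template selection, and the function name is illustrative.

    import tensorflow as tf

    def mask_with_best_template(feature_maps, templates):
        # feature_maps: (batch, n, m, channels) output of a convolutional layer
        # templates:    (n*m + 1, n, m) candidate feature matrices
        # Pick, for every filter, the template with the largest inner-product
        # response, then apply it by an element-wise (Hadamard) product.
        scores = tf.einsum("bijc,tij->btc", feature_maps, templates)
        best = tf.argmax(scores, axis=1)              # (batch, channels)
        chosen = tf.gather(templates, best)           # (batch, channels, n, m)
        chosen = tf.transpose(chosen, [0, 2, 3, 1])   # (batch, n, m, channels)
        return feature_maps * chosen                  # Hadamard product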
In this embodiment, a specific convolutional neural network structure is provided, which includes:
the first convolutional layer: filters = 64, kernel_size = (3, 3);
the second convolutional layer: filters = 64, kernel_size = (3, 3);
the third layer, max pooling: pool_size = (2, 2), strides = (2, 2);
the fourth convolutional layer: filters = 128, kernel_size = (3, 3);
the fifth convolutional layer: filters = 128, kernel_size = (3, 3);
the sixth layer, max pooling: pool_size = (2, 2), strides = (2, 2);
the seventh convolutional layer: filters = 256, kernel_size = (3, 3);
the eighth convolutional layer: filters = 256, kernel_size = (3, 3);
the ninth convolutional layer: filters = 256, kernel_size = (1, 1);
the tenth layer, max pooling: pool_size = (2, 2), strides = (2, 2);
the eleventh convolutional layer: filters = 512, kernel_size = (3, 3);
the twelfth convolutional layer: filters = 512, kernel_size = (3, 3);
the thirteenth convolutional layer: filters = 512, kernel_size = (1, 1);
the fourteenth layer, max pooling: pool_size = (2, 2), strides = (2, 2);
the fifteenth layer, fully connected: dense = 2048, dropout = 0.5, activation function: ReLU;
the sixteenth layer, fully connected: dense = 1024, dropout = 0.5, activation function: ReLU;
the seventeenth layer, fully connected: dense = 2, activation function: softmax;
the activation function of each convolutional layer is the ReLU activation function;
in the above neural network structure, the first to sixth layers belong to the bottom layer, the extracted low-level features such as color, texture and shape are all low-level features, the extracted intermediate semantics, namely incomplete semantic features, are extracted from the seventh to tenth layers, and the feature graph obtained from the thirteenth convolutional layer needs to be subjected to Hadamard product noise filtering with a feature template, so that the filter in the convolutional layer represents the unique feature; the function of the fifteenth to sixteenth fully-connected layers is to prevent overfitting; the seventeenth layer full-connection layer is used as a classifier, whether the prediction result is the clothes liked by the client or not is output, when the prediction result is the liked clothes, the feature map of the ninth layer is visualized, and the basis for extracting the clothes features, namely predicting the liked clothes is shown on the feature map, so that the visualized feature map is shown to the client, and the client can understand how the recommendation system recommends.
When the training error of the convolutional neural network model is gradually reduced in the training process, the fact that the neural network is learning features is indicated, and when the error is reduced to an acceptable range, the training is finished.
During training, in the end-to-end iterative training of the convolutional neural network with the training set, the loss function of the convolutional neural network in forward propagation is expressed as:
Loss_f = -MI(X; T);
where Loss_f denotes the loss function between the feature map and the feature matrix, X denotes the feature map extracted by the last convolutional layer of the convolutional neural network, T denotes the feature matrix, and MI(·;·) denotes the operation of computing the information entropy (mutual information).
In the end-to-end iterative training of the convolutional neural network with the training set, the back-propagation training of the convolutional neural network is expressed as:
∂Loss/∂x_ij = λ·∂Loss_f/∂x_ij + ∂L(y, ŷ)/∂x_ij;
where Loss denotes the total loss function of the convolutional neural network; x_ij denotes the pixel in row i and column j of the feature map; Loss_f denotes the loss function between the feature map and the feature matrix; λ denotes the filter threshold; L(y, ŷ) denotes the loss function measuring whether the prediction result is correct; y_k denotes the true label; ŷ_k denotes the predicted label; and k denotes the type of label. In this embodiment the labels fall into two categories, like and dislike, with k = 1 denoting a label the user likes and k = 2 denoting a label the user dislikes.
Further, the total loss function of the convolutional neural network of the present invention is expressed as:
Loss = λ·Loss_f + L;
where L is the squared loss function, expressed as L = Σ_k (y_k − ŷ_k)².
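As a minimal sketch of this combined objective, and assuming the filter loss −MI(X; T) has already been computed per filter elsewhere, the total loss could be assembled as follows; the function name, the default value of λ and the one-hot label encoding are assumptions.

    import tensorflow as tf

    def total_loss(y_true, y_pred, loss_f, lam=0.1):
        # y_true, y_pred: one-hot true and predicted labels over the k classes
        # loss_f:         the per-filter feature loss -MI(X; T), computed elsewhere
        # lam:            the weighting / filter threshold lambda
        l_pred = tf.reduce_sum(tf.square(y_true - y_pred), axis=-1)   # squared loss L
        return lam * tf.reduce_mean(loss_f) + tf.reduce_mean(l_pred)  # Loss = lambda*Loss_f + L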
The trained network model and its parameters are saved, and the verification set is used to verify the accuracy with which the model identifies images; this accuracy is the accuracy of the model, and the more accurate the model, the more credible the finally obtained visualized favorite clothing features.
The customer can then take a new image of a favorite garment as input and feed it into the saved model for calculation; the model analyzes the features of the customer's favorite garment and visualizes those features.
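A minimal sketch of such a visualization with TensorFlow and Matplotlib is shown below; locating the last convolutional layer by type, averaging its feature map over channels, and treating index 0 of the softmax output as the "liked" class are assumptions made only for illustration.

    import numpy as np
    import tensorflow as tf
    import matplotlib.pyplot as plt

    def visualize_preferred_features(model, image):
        # Find the last convolutional layer, read out its feature map for the
        # given garment image, and overlay it on the picture as a heat map.
        conv_layers = [l for l in model.layers if isinstance(l, tf.keras.layers.Conv2D)]
        feat_model = tf.keras.Model(model.input, [conv_layers[-1].output, model.output])
        feat_map, pred = feat_model(image[None, ...])               # add batch dimension
        heat = tf.reduce_mean(feat_map[0], axis=-1, keepdims=True)  # average over filters
        heat = tf.image.resize(heat, image.shape[:2]).numpy()[..., 0]
        heat = (heat - heat.min()) / (heat.max() - heat.min() + 1e-8)
        plt.imshow(image)
        plt.imshow(heat, cmap="jet", alpha=0.4)
        plt.title("liked probability: %.2f" % float(pred[0, 0]))    # index 0 assumed = "like"
        plt.axis("off")
        plt.show()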
The embodiment also provides a clothing recommendation device based on the interpretable convolutional neural network, which comprises a data acquisition module, a data preprocessing module, a data characteristic analysis module and a clothing recommendation module, wherein:
the data acquisition module is used for acquiring training data and data to be predicted;
the data preprocessing module is used for zooming, cutting and filling the acquired picture;
the data characteristic analysis module is used for training the convolutional neural network according to the training data with the labels and inputting the data to be predicted without the labels into the trained convolutional neural network to obtain a characteristic diagram;
and the clothing recommending module is used for searching according to the characteristic diagram of the picture input by the user and recommending the clothing consistent with the characteristic diagram to the user.
In this embodiment, the data preprocessing module is further described. The data preprocessing module scales, crops and pads the acquired pictures. In practice, images collected from the web have non-uniform sizes and must be cropped to a size the designed convolutional neural network can process; here the uniform size is 299 by 299 pixels. When the garment lies close to one border of the image, the side close to the content is padded and the side far from the content is cropped, so that the clothing region is centered in the image. Finally, the whole image is resized to 299 x 299.
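A minimal TensorFlow sketch of this preprocessing follows; the bounding box locating the garment is an assumed input (the patent does not specify how the clothing region is found), and the function name is illustrative.

    import tensorflow as tf

    def preprocess(image, bbox=None, target=299):
        # Scale the picture, then crop/pad so that the clothing region is centred
        # and the final image is target x target (299 x 299) pixels.
        if bbox is not None:                       # bbox = (y0, x0, y1, x1) around the garment
            y0, x0, y1, x1 = bbox
            image = image[y0:y1, x0:x1, :]
        h = tf.cast(tf.shape(image)[0], tf.float32)
        w = tf.cast(tf.shape(image)[1], tf.float32)
        scale = float(target) / tf.maximum(h, w)   # scale the longer side to the target size
        new_hw = tf.cast(tf.round(tf.stack([h, w]) * scale), tf.int32)
        image = tf.image.resize(image, new_hw)
        # pad the short side (and crop any excess) symmetrically around the centre
        return tf.image.resize_with_crop_or_pad(image, target, target)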
In the data feature analysis module, when the convolutional neural network extracts features, the lower layers extract colors, textures and shapes, while the features extracted by the higher convolutional layers carry semantics. In brief, one cannot tell from the low-level features what has been extracted, whereas the high-level features are clothing features that people can recognize clearly.
To extract the features effectively, the data feature analysis module comprises a convolutional neural network in which an input layer, a plurality of convolutional layers and an output layer are connected in sequence, the last convolutional layer being connected with the output layer; the convolutional layer comprises a filter formed by n×m+1 feature matrices, where n×m is the pixel size of the input image, i.e. the size of a picture after preprocessing.
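One way to realize such a bank of n×m+1 candidate feature matrices is with position-specific templates plus one "feature absent" template, as sketched below; the template shape (an L1-distance decay controlled by the constants tau and beta) is an assumption borrowed from interpretable-CNN practice, not something specified in the patent.

    import numpy as np

    def build_feature_templates(n, m, tau=0.5, beta=4.0):
        # Templates 0 .. n*m-1 each peak at one position of an n x m feature map;
        # the last template is a constant negative matrix that stands for
        # "the feature to be extracted is absent from the image".
        templates = np.empty((n * m + 1, n, m), dtype=np.float32)
        rows, cols = np.meshgrid(np.arange(n), np.arange(m), indexing="ij")
        for i in range(n):
            for j in range(m):
                dist = np.abs(rows - i) + np.abs(cols - j)     # L1 distance to the centre (i, j)
                templates[i * m + j] = tau * np.maximum(1.0 - beta * dist / n, -1.0)
        templates[-1] = -tau
        return templates

    # e.g. a 14 x 14 feature map yields 14*14 + 1 = 197 candidate feature matrices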
In the process of training the convolutional neural network, each filter is respectively provided with a loss function, namely the loss function of the forward propagation process of each filter is independent, and the loss function is expressed as:
Loss_f = -MI(X; T);
where Loss_f denotes the loss function between the feature map and the feature matrix, X denotes the feature map extracted by the last convolutional layer of the convolutional neural network, T denotes the feature matrix, and MI(·;·) denotes the operation of computing the information entropy (mutual information).
In the process of training the convolutional neural network, when the convolutional neural network is reversely propagated, not only the accuracy of the final task but also the suitability of matching the feature matrix and the filter are considered, so that the loss function adopted by the reverse propagation is expressed as:
∂Loss/∂x_ij = λ·∂Loss_f/∂x_ij + ∂L(y, ŷ)/∂x_ij;
where Loss denotes the total loss function of the convolutional neural network; x_ij denotes the pixel in row i and column j of the feature map; Loss_f denotes the loss function between the feature map and the feature matrix; λ denotes the filter threshold; L(y, ŷ) denotes the loss function measuring whether the prediction result is correct; y_k denotes the true label; ŷ_k denotes the predicted label; and k denotes the type of label.
The invention also provides a clothing recommendation terminal based on the interpretable convolutional neural network, which comprises any clothing recommendation device based on the interpretable convolutional neural network, as shown in fig. 4, a user packages and uploads training data through the terminal, after a data characteristic analysis module of the clothing recommendation device based on the interpretable convolutional neural network finishes training of the convolutional neural network, the terminal prompts the user to input a picture to be predicted, the trained convolutional neural network obtains a characteristic diagram of the picture according to the picture input by the user, the data characteristic analysis module obtains a picture consistent with the characteristic diagram according to the characteristic diagram, and the terminal pushes the obtained picture to the user.
The user uploads two types of pictures through the terminal, wherein one type of pictures is favorite pictures of the user, and the other type of pictures is disliked pictures of the user.
Furthermore, when the terminal presents a retrieved picture, it also presents the clothing-feature basis of the retrieval; if the user does not like the retrieved picture, it can be marked as disliked through the terminal, and the data feature analysis module in the clothing recommendation device based on the interpretable convolutional neural network further optimizes the parameters of the convolutional neural network.
In addition, clothing popularity trends differ over time and clothing features change rapidly, so the terminal is also provided with the authority to restore and initialize the clothing recommendation device based on the interpretable convolutional neural network; after this authority is used, the user needs to upload pictures again and reinitialize the clothing recommendation device based on the interpretable convolutional neural network.
Preferably, in the present embodiment, the convolutional neural network used is Fast R-CNN.
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that various changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (8)

1. The clothing recommendation method based on the interpretable convolutional neural network is characterized by comprising the following steps:
preprocessing the image, and dividing the preprocessed image into a training set and a verification set;
the method comprises the following steps of constructing a convolutional neural network and constructing a characteristic matrix to extract clothing characteristics of the acquired image, and specifically comprises the following steps:
the last convolutional layer of the convolutional neural network comprises a plurality of filters;
constructing n×m+1 feature matrices as a filter of the convolutional neural network;
the first n×m feature matrices correspond respectively to the feature at each pixel of the acquired image of n×m pixels, and the (n×m+1)-th feature matrix represents the case in which the feature to be extracted is absent from the image;
performing end-to-end iterative training on the convolutional neural network by using a training set; in the process of carrying out end-to-end iterative training on the convolutional neural network by utilizing a training set, selecting a characteristic matrix for each image by utilizing the back propagation process of the convolutional neural network, wherein the characteristic matrix forms the clothing characteristics of the image;
and the user inputs the picture into the trained convolutional neural network, and the convolutional neural network can output the characteristic diagram of the picture and recommend the clothing consistent with the characteristic diagram of the picture input by the user to the user.
2. The interpretable convolutional neural network-based garment recommendation method of claim 1, wherein preprocessing the picture comprises:
zooming the picture to make the size of the characteristic part of the clothing in the picture consistent;
and cutting and filling the pictures, and enabling the clothing characteristic parts of the pictures to be centered and the size of each picture to be consistent through cutting and filling all the pictures.
3. The method for recommending clothing based on interpretable convolutional neural network of claim 1, wherein in the course of performing end-to-end iterative training on the convolutional neural network by using the training set, the loss function of the convolutional neural network in forward propagation is expressed as:
Loss_f = -MI(X; T);
where Loss_f denotes the loss function between the feature map and the feature matrix, X denotes the feature map extracted by the last convolutional layer of the convolutional neural network, T denotes the feature matrix, and MI(·;·) denotes the operation of computing the information entropy (mutual information).
4. The interpretable convolutional neural network-based garment recommendation method of claim 1, wherein in the end-to-end iterative training process for the convolutional neural network by using the training set, the training of the convolutional neural network back propagation is represented as:
∂Loss/∂x_ij = λ·∂Loss_f/∂x_ij + ∂L(y, ŷ)/∂x_ij;
where Loss denotes the total loss function of the convolutional neural network; x_ij denotes the pixel in row i and column j of the feature map; Loss_f denotes the loss function between the feature map and the feature matrix; λ denotes the filter threshold; L(y, ŷ) denotes the loss function measuring whether the prediction result is correct; y_k denotes the true label; ŷ_k denotes the predicted label; and k denotes the type of label.
5. The clothing recommendation device based on the interpretable convolutional neural network is characterized by comprising a data acquisition module, a data preprocessing module, a data characteristic analysis module and a clothing recommendation module, wherein:
the data acquisition module is used for acquiring training data and data to be predicted;
the data preprocessing module is used for zooming, cutting and filling the acquired picture;
the data feature analysis module is used for training a convolutional neural network according to the training data with labels and inputting data to be predicted without labels into the trained convolutional neural network to obtain a feature map, the convolutional neural network in the data feature analysis module comprises an input layer, a plurality of convolutional layers and an output layer, wherein the last convolutional layer is connected with the output layer, the last convolutional layer comprises a filter formed by n x m +1 feature matrixes, n x m is the pixel size of an input image, the first n x m feature matrixes respectively correspond to features of each pixel point in an acquired image with the size of n x m pixels, and the n x m +1 features represent whether the features to be extracted do not exist in the image or not; performing Hadamard product on the feature map and the feature matrix obtained by the last convolutional layer;
and the clothing recommending module is used for searching according to the characteristic diagram of the image input by the user and recommending the clothing consistent with the characteristic diagram to the user.
6. The clothing recommendation terminal based on the interpretable convolutional neural network is characterized by comprising the clothing recommendation device based on the interpretable convolutional neural network, a user uploads training data in a packaging mode through the terminal, after a data feature analysis module of the clothing recommendation device based on the interpretable convolutional neural network completes training of the convolutional neural network, the terminal prompts the user to input an image to be predicted, the trained convolutional neural network obtains a feature diagram of the image according to the image input by the user, the data feature analysis module obtains an image consistent with the feature diagram according to the feature diagram, and the terminal pushes the obtained image to the user.
7. The interpretable convolutional neural network-based garment recommendation terminal of claim 6, wherein the user package upload training data through the terminal comprises a user favorite garment picture and a user disliked garment picture.
8. The interpretable convolutional neural network-based garment recommendation terminal as claimed in claim 6, wherein the terminal has a recovery initialization right for the interpretable convolutional neural network-based garment recommendation device.
CN202010193933.2A 2020-03-19 2020-03-19 Clothing recommendation method and device based on interpretable convolutional neural network and terminal Active CN111415221B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010193933.2A CN111415221B (en) 2020-03-19 2020-03-19 Clothing recommendation method and device based on interpretable convolutional neural network and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010193933.2A CN111415221B (en) 2020-03-19 2020-03-19 Clothing recommendation method and device based on interpretable convolutional neural network and terminal

Publications (2)

Publication Number Publication Date
CN111415221A CN111415221A (en) 2020-07-14
CN111415221B true CN111415221B (en) 2023-04-07

Family

ID=71491221

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010193933.2A Active CN111415221B (en) 2020-03-19 2020-03-19 Clothing recommendation method and device based on interpretable convolutional neural network and terminal

Country Status (1)

Country Link
CN (1) CN111415221B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104616018A (en) * 2014-12-09 2015-05-13 四川大学 Improved Hopfield neural network based recognition method for clothes logo
CN105117739A (en) * 2015-07-29 2015-12-02 南京信息工程大学 Clothes classifying method based on convolutional neural network
CN110210567A (en) * 2019-06-06 2019-09-06 广州瑞智华创信息科技有限公司 A kind of image of clothing classification and search method and system based on convolutional neural networks
CN110209860A (en) * 2019-05-13 2019-09-06 山东大学 A kind of interpretable garment coordination method and device based on clothes attribute of template-directed
CN110246011A (en) * 2019-06-13 2019-09-17 中国科学技术大学 Interpretable fashion clothing personalized recommendation method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8977629B2 (en) * 2011-05-24 2015-03-10 Ebay Inc. Image-based popularity prediction
CN107392085B (en) * 2017-05-26 2021-07-02 上海精密计量测试研究所 Method for visualizing a convolutional neural network

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104616018A (en) * 2014-12-09 2015-05-13 四川大学 Improved Hopfield neural network based recognition method for clothes logo
CN105117739A (en) * 2015-07-29 2015-12-02 南京信息工程大学 Clothes classifying method based on convolutional neural network
CN110209860A (en) * 2019-05-13 2019-09-06 山东大学 A kind of interpretable garment coordination method and device based on clothes attribute of template-directed
CN110210567A (en) * 2019-06-06 2019-09-06 广州瑞智华创信息科技有限公司 A kind of image of clothing classification and search method and system based on convolutional neural networks
CN110246011A (en) * 2019-06-13 2019-09-17 中国科学技术大学 Interpretable fashion clothing personalized recommendation method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
傅夏生; 林毅. Clothing brand logo recognition based on Hopfield network. Modern Computer (Professional Edition), 2017, No. 06, pp. 20-23. *

Also Published As

Publication number Publication date
CN111415221A (en) 2020-07-14

Similar Documents

Publication Publication Date Title
CN111797321B (en) Personalized knowledge recommendation method and system for different scenes
CN111310063B (en) Neural network-based article recommendation method for memory perception gated factorization machine
CN112926396A (en) Action identification method based on double-current convolution attention
CN107330750A (en) A kind of recommended products figure method and device, electronic equipment
CN113297370B (en) End-to-end multi-modal question-answering method and system based on multi-interaction attention
CN110119479B (en) Restaurant recommendation method, restaurant recommendation device, restaurant recommendation equipment and readable storage medium
CN111984824A (en) Multi-mode-based video recommendation method
CN114780831A (en) Sequence recommendation method and system based on Transformer
CN112613548B (en) User customized target detection method, system and storage medium based on weak supervised learning
CN113761359B (en) Data packet recommendation method, device, electronic equipment and storage medium
CN115080865B (en) E-commerce data operation management system based on multidimensional data analysis
CN112487291A (en) Big data-based personalized news recommendation method and device
CN114741599A (en) News recommendation method and system based on knowledge enhancement and attention mechanism
CN112468853B (en) Television resource recommendation method and device, computer equipment and storage medium
CN107169830B (en) Personalized recommendation method based on clustering PU matrix decomposition
Du et al. Image recommendation algorithm combined with deep neural network designed for social networks
CN111415221B (en) Clothing recommendation method and device based on interpretable convolutional neural network and terminal
CN110147464B (en) Video recommendation method and device, electronic equipment and readable storage medium
CN110276283B (en) Picture identification method, target identification model training method and device
CN116703523A (en) Electronic commerce system based on big data and method thereof
CN109800424A (en) It is a kind of based on improving matrix decomposition and the recommended method across channel convolutional neural networks
CN115687760A (en) User learning interest label prediction method based on graph neural network
CN115238188A (en) Object recommendation method and system and object recommendation model system
CN114637920A (en) Object recommendation method and device
CN110399527B (en) Movie recommendation method, device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant