CN114707427B - Personalized modeling method of graph neural network based on effective neighbor sampling maximization - Google Patents

Personalized modeling method of graph neural network based on effective neighbor sampling maximization

Info

Publication number
CN114707427B
CN114707427B (application CN202210571867.7A)
Authority
CN
China
Prior art keywords
user
representation
module
fashion
item
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210571867.7A
Other languages
Chinese (zh)
Other versions
CN114707427A (en
Inventor
刘金环
侯磊
杜军威
于旭
马军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao University of Science and Technology
Original Assignee
Qingdao University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao University of Science and Technology filed Critical Qingdao University of Science and Technology
Priority to CN202210571867.7A priority Critical patent/CN114707427B/en
Publication of CN114707427A publication Critical patent/CN114707427A/en
Application granted granted Critical
Publication of CN114707427B publication Critical patent/CN114707427B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00Computer-aided design [CAD]
    • G06F30/20Design optimisation, verification or simulation
    • G06F30/27Design optimisation, verification or simulation using machine learning, e.g. artificial intelligence, neural networks, support vector machines [SVM] or training a model
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00Computer-aided design [CAD]
    • G06F30/10Geometric CAD
    • G06F30/12Geometric CAD characterised by design entry means specially adapted for CAD, e.g. graphical user interfaces [GUI] specially adapted for CAD
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2113/00Details relating to the application field
    • G06F2113/12Cloth

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Artificial Intelligence (AREA)
  • Software Systems (AREA)
  • Computational Linguistics (AREA)
  • Computational Mathematics (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Architecture (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Molecular Biology (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Image Analysis (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention discloses a personalized modeling method for a graph neural network based on effective neighbor sampling maximization, belonging to the technical field of personalized fashion recommendation systems. The invention further provides a personalized model comprising a composition module, a fashion item representation module, a user representation module, an upper garment-lower garment compatibility module, a user personalization module and a recommendation module; the fashion item representation module is connected with the upper garment-lower garment compatibility module, the fashion item representation module and the user representation module are connected with the user personalization module, and the upper garment-lower garment compatibility module and the user personalization module are connected with the recommendation module. The invention gives a concrete application of the graph neural network in the field of fashion recommendation and improves recommendation accuracy; by adopting different sampling probabilities in different layers of the graph neural network, it strengthens both the model's ability to represent target nodes and the robustness of the model.

Description

Personalized modeling method of graph neural network based on effective neighbor sampling maximization
Technical Field
The invention belongs to the technical field of personalized fashion recommendation systems, and particularly relates to a personalized, compatibility-aware modeling method of a graph neural network based on effective neighbor sampling maximization.
Background
In recent years, with the rapid development of the fashion industry, consumer demand for fashion recommendation has kept growing, creating a strong need for outfit matching. For garment compatibility modeling, current work falls into two broad categories: modeling the compatibility of two fashion items (such as an upper garment and a lower garment), and modeling the compatibility of a complete outfit (multiple fashion items). For modeling pairs of fashion items, most methods treat a compatible pair as a metric-learning problem by assuming that the two items lie close to each other in a latent space; later, as related research progressed and data-independent and data-dependent functions were proposed in succession, researchers began to use these functions to model the compatibility between fashion items. For outfit modeling, most current research treats the outfit as a sequence and models it with Bi-LSTM or RNN methods; with the wide adoption of graph neural networks, these have also begun to be applied to outfit modeling.
Accordingly, when garment compatibility is modeled in the fashion recommendation field, current methods mainly have the following three shortcomings. First, the degree of association between different garments is neglected and only visual or textual features of the garments are used for feature representation, which is not enough to obtain an accurate item feature representation. Second, modeling schemes that consider personalization factors ignore the influence of the user relationship network on the user, so an accurate user feature representation cannot be obtained and the recommendation system cannot make accurate clothing recommendations for different users. Third, schemes that model with a graph neural network generally aggregate all of the neighbor information of the target node and do not consider the influence of noise nodes on the representation of the target node.
Disclosure of Invention
The invention aims to provide an individualized modeling method of a graph neural network based on effective neighbor sampling maximization, so as to make up for the defects of the prior art.
In order to achieve the aim of the invention, the invention adopts the following specific technical scheme:
The personalized modeling method of the graph neural network based on effective neighbor sampling maximization comprises the following steps:
S1: first, construct a basic model, namely the personalization-compatibility model;
S2: according to the interactions between users and fashion items, construct a user-fashion item graph, a user-user interaction graph, a fashion item-user graph and a fashion item-fashion item interaction graph;
S3: according to the fashion item interaction graphs created in step S2, create a fashion item feature representation matrix, construct a graph neural network model with L layers in total, and feed the fashion item feature representation matrix into the graph neural network model to obtain the updated fashion item feature representation matrix, which contains the feature representation of the upper garment and the feature representation of the lower garment;
S4: create a user feature representation matrix according to the user interaction graphs created in step S2, and, using the graph neural network model constructed in step S3, feed the user feature representation matrix into the model to obtain the updated user feature representation matrix, i.e. the user feature representation;
S5: take the feature representation of the upper garment and the feature representation of the lower garment obtained in step S3, feed the two representations into the upper garment-lower garment compatibility module, and calculate the compatibility score of the upper garment and the lower garment;
S6: take the lower garment feature representation and the user feature representation obtained in steps S3 and S4 respectively, and feed the two representations into the user personalization module to obtain the user's preference score for the given lower garment;
S7: feed the compatibility score of the upper garment and the lower garment obtained in step S5 and the user's preference score for the given lower garment obtained in step S6 into the recommendation module to obtain the compatibility score of the upper garment and the lower garment with the personalization factor fused in.
Further, the basic model comprises a composition module, a fashion item representation module, a user representation module, an upper garment-lower garment compatibility module, a user personalization module and a recommendation module; the fashion item representation module is connected with the upper garment-lower garment compatibility module, the fashion item representation module and the user representation module are connected with the user personalization module, and the upper garment-lower garment compatibility module and the user personalization module are connected with the recommendation module.
Further, the composition module: constructs a user-fashion item graph, a user-user relation graph, a fashion item-user graph and a fashion item-fashion item graph according to the relations between users and fashion items;
the fashion item representation module: feeds the fashion item-user graph and the fashion item-fashion item graph, together with the randomly initialized user and item feature representation matrices, into the constructed graph neural network model to obtain the updated fashion item feature representations;
the user representation module: feeds the user-fashion item graph and the user-user relation graph, together with the randomly initialized user and item feature representation matrices, into the constructed graph neural network model to obtain the updated user feature representations;
the upper garment-lower garment compatibility module: takes the upper garment feature representation and the lower garment feature representation obtained from the fashion item representation module as input and calculates the degree of compatibility between the upper garment and the lower garment;
the user personalization module: takes the lower garment feature representation obtained from the fashion item representation module and the user feature representation obtained from the user representation module as input and calculates the user's degree of liking for the given lower garment;
the recommendation module: takes as input the degree of compatibility of the upper garment and the lower garment obtained from the upper garment-lower garment compatibility module and the user's degree of liking for the lower garment obtained from the user personalization module, calculates the compatibility score of the upper garment and the lower garment with the personalization factor fused in, and generates a recommendation for the user according to this score.
Furthermore, the fashion item representation module comprises a first representation submodule, a second representation submodule and a third representation submodule, wherein the first representation submodule and the second representation submodule are connected with the third representation submodule;
the first representation submodule: obtains the item feature representation at the item level from the fashion item-fashion item interaction space by means of the graph neural network model combined with an attention mechanism;
the second representation submodule: obtains the item feature representation at the user level from the fashion item-user interaction space by means of the graph neural network model combined with an attention mechanism;
the third representation submodule: takes the outputs of the first and second representation submodules as input and fuses the item feature representation obtained at the item level with the item feature representation obtained at the user level to give the final feature representation of the target fashion item.
Furthermore, the user representation module comprises a first representation submodule, a second representation submodule and a third representation submodule, wherein the first representation submodule and the second representation submodule are connected with the third representation submodule;
the first representation submodule: obtains the user feature representation at the item level from the user-fashion item interaction space by means of the graph neural network combined with an attention mechanism;
the second representation submodule: obtains the user feature representation at the user-interaction level from the user-user interaction space by means of the graph neural network combined with an attention mechanism;
the third representation submodule: takes the outputs of the first and second representation submodules as input and fuses the user feature representation obtained at the item level with the user feature representation obtained at the user-interaction level to give the final feature representation of the target user.
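As a rough illustration of how these modules could be composed, the following minimal PyTorch sketch mirrors only the connections described above; all class, attribute and method names (PersonalizedCompatibilityModel, item_rep, recommend, etc.) are hypothetical and are not the patent's actual implementation.

```python
import torch
import torch.nn as nn

class PersonalizedCompatibilityModel(nn.Module):
    """Sketch of the module wiring described above (names are illustrative)."""
    def __init__(self, item_rep, user_rep, compat, personal, recommend):
        super().__init__()
        self.item_rep = item_rep      # fashion item representation module
        self.user_rep = user_rep      # user representation module
        self.compat = compat          # upper garment-lower garment compatibility module
        self.personal = personal      # user personalization module
        self.recommend = recommend    # recommendation module

    def forward(self, graphs, user_idx, top_idx, bottom_idx):
        h_items = self.item_rep(graphs)                             # updated item representations
        h_users = self.user_rep(graphs)                             # updated user representations
        c = self.compat(h_items[top_idx], h_items[bottom_idx])      # top-bottom compatibility
        p = self.personal(h_users[user_idx], h_items[bottom_idx])   # user preference for the bottom
        return self.recommend(c, p)                                 # fused, personalization-aware score
```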
Further, in step S2, the specific method for creating the interaction graphs is as follows:
S21: first define two symbols: U = {u_1, u_2, ..., u_N}, the user set containing N users, and V = {v_1, v_2, ..., v_M}, the garment set containing M fashion items;
S22: create a dictionary-form fashion item-fashion item interaction graph G_vv = {v_i: N_vv(v_i), v_j: N_vv(v_j), ...}, where v_i and v_j denote target fashion-item nodes and N_vv(v_i), N_vv(v_j) are the neighbor node sets of v_i and v_j respectively;
S23: create a dictionary-form fashion item-user interaction graph G_vu = {v_i: N_vu(v_i), v_j: N_vu(v_j), ...}, where v_i and v_j denote target fashion-item nodes and N_vu(v_i), N_vu(v_j) are the neighbor node sets of v_i and v_j respectively;
S24: create a dictionary-form user-user interaction graph G_uu = {u_i: N_uu(u_i), u_j: N_uu(u_j), ...}, where u_i and u_j denote target user nodes and N_uu(u_i), N_uu(u_j) are the neighbor node sets of u_i and u_j respectively;
S25: create a dictionary-form user-fashion item interaction graph G_uv = {u_i: N_uv(u_i), u_j: N_uv(u_j), ...}, where u_i and u_j denote target user nodes and N_uv(u_i), N_uv(u_j) are the neighbor node sets of u_i and u_j respectively.
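A minimal sketch of how such dictionary-form (adjacency-list) graphs could be built from raw interaction pairs; the pair lists and the helper name build_dict_graph are assumptions for illustration only.

```python
from collections import defaultdict

def build_dict_graph(edges):
    """Build an undirected dictionary-form graph {node: set_of_neighbors} from (a, b) pairs."""
    graph = defaultdict(set)
    for a, b in edges:
        if a == b:              # no self-loops: a value set never contains its own key
            continue
        graph[a].add(b)
        graph[b].add(a)
    return dict(graph)

# Toy co-occurrence pairs between fashion items (item ids are illustrative).
G_vv = build_dict_graph([("v0", "v1"), ("v1", "v2"), ("v0", "v2")])
# G_vu, G_uu and G_uv are built the same way from item-user, user-user and user-item pairs.
print(len(G_vv["v0"]))   # the length of a value set is the degree of that target node
```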
Further, the step S3 specifically comprises:
S31: according to the created interaction graphs, construct the static feature matrix X_V of the fashion items;
wherein each row of the fashion item feature matrix X_V represents the feature of one fashion item v_i, denoted x_{v_i}, which is also the initial first-layer vector of item v_i in the graph neural network;
S32: at the fashion item level, aggregate the states of the neighbor nodes by means of the graph neural network and update the feature vector of item v_i; the state update yields the node representation h^(k)_{v_i,vv} of item v_i at the k-th layer;
S33: at the user level, aggregate the states of the neighbor nodes by means of the graph neural network and update the feature vector of item v_i; the state update yields the node representation h^(k)_{v_i,vu} of item v_i at the k-th layer;
S34: concatenate the representation h_{v_i,vv} of item v_i obtained at the fashion item level with the representation h_{v_i,vu} obtained at the user level to obtain the final representation h_{v_i} of item v_i.
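A compact sketch of steps S32-S34 under common GraphSAGE-style assumptions (concatenate the node's previous state with an aggregate of its neighbors, apply a learned linear layer and a sigmoid, then concatenate the item-level and user-level outputs); the exact aggregator and layer equations appear only as images in the original, so this is illustrative rather than the patent's precise formula.

```python
import torch
import torch.nn as nn

class TwoSpaceItemEncoder(nn.Module):
    """Illustrative item encoder: one update in the item-item space and one in the item-user space."""
    def __init__(self, dim):
        super().__init__()
        self.lin_vv = nn.Linear(2 * dim, dim)   # item-item space (S32)
        self.lin_vu = nn.Linear(2 * dim, dim)   # item-user space (S33)

    def layer_update(self, lin, h_self, h_neighbors):
        agg = h_neighbors.mean(dim=0)                        # placeholder for the weighted aggregator
        return torch.sigmoid(lin(torch.cat([h_self, agg])))  # sigma(W [h_self concat agg] + b)

    def forward(self, h_item, item_neighbors, user_neighbors):
        h_vv = self.layer_update(self.lin_vv, h_item, item_neighbors)  # S32: item-level view
        h_vu = self.layer_update(self.lin_vu, h_item, user_neighbors)  # S33: user-level view
        return torch.cat([h_vv, h_vu])                                 # S34: final item representation
```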
Further, the step S4 specifically comprises:
S41: according to the created interaction graphs, construct the static user feature matrix X_U;
wherein each row of the user feature matrix X_U represents the feature of one user u_i, denoted x_{u_i}, which is also the initial first-layer vector of user u_i in the graph neural network;
S42: at the fashion item level, aggregate the states of the neighbor nodes by means of the graph neural network and update the feature vector of user u_i; the state update yields the node representation h^(k)_{u_i,uv} of user u_i at the k-th layer;
S43: at the user interaction level, aggregate the states of the neighbor nodes by means of the graph neural network and update the feature vector of user u_i; the state update yields the node representation h^(k)_{u_i,uu} of user u_i at the k-th layer;
S44: concatenate the representation h_{u_i,uv} of user u_i obtained at the fashion item level with the representation h_{u_i,uu} obtained at the user interaction level to obtain the final representation h_{u_i} of user u_i.
Furthermore, when the graph neural network aggregates the information of neighbor nodes, a strategy of sampling effective nodes with a different probability p in each network layer is adopted, and the sampled nodes take part in the information aggregation. The detailed steps are as follows:
A1: the sampling probability is tied to the layer index of the graph neural network: each layer j of the network is assigned its own sampling probability p_j;
A2: for a target node o, sampling is performed at layer j of the network according to the sampling probability p_j, and the sampled nodes take part in the information aggregation of the target node; the number of sampled nodes is num = ceil(p * d), where d is the degree of the target node o and ceil is the round-up function;
A3: compute the similarity between the target node o and all of its neighbors, and take the num neighbors with the largest similarity to take part in the feature aggregation of the target node o.
Further, the step S5 specifically comprises:
S51: feed the upper garment feature representation h_t obtained in step S3 and the lower garment feature representation h_b into the upper garment-lower garment compatibility modeling module;
S52: calculate the compatibility score of the upper garment and the lower garment as c_tb = h_t · h_b, where · denotes the dot-product operation.
Further, the step S6 specifically comprises:
S61: feed the lower garment feature representation h_b obtained in step S3 and the user feature representation h_u obtained in step S4 into the user personalized modeling module;
S62: calculate the user's liking score for the given lower garment as p_ub = h_u · h_b.
Further, the step S7 specifically comprises:
S71: from the compatibility score c_tb of the upper garment and the lower garment obtained in step S52 and the user's liking score p_ub for the given lower garment obtained in step S62, compute the garment compatibility s_utb fused with the user's personalized preference by combining c_tb and p_ub under a non-negative trade-off parameter pi in (0, 1).
S72: compute the loss with the BPR loss
L_BPR = sum over quadruplets of -ln sigma( s_utb+ - s_utb- ),
where s_utb+ and s_utb- are respectively the personalization-fused compatibility of the upper garment t with the lower garment b+ and with the lower garment b-.
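A hedged sketch of S71-S72: the convex combination pi * c + (1 - pi) * p below is an assumed form of the trade-off (the patent only states that pi lies in (0, 1)), and the BPR term follows the standard formulation over (user, top, positive bottom, negative bottom) quadruplets.

```python
import torch
import torch.nn.functional as F

def fused_score(h_user, h_top, h_bottom, pi=0.5):
    """Assumed trade-off: pi * (top-bottom compatibility) + (1 - pi) * (user preference for bottom)."""
    compat = torch.dot(h_top, h_bottom)       # S52: dot-product compatibility
    pref = torch.dot(h_user, h_bottom)        # S62: dot-product user preference
    return pi * compat + (1.0 - pi) * pref

def bpr_loss(h_user, h_top, h_bottom_pos, h_bottom_neg, pi=0.5):
    """BPR: the positive bottom should score higher than the negative one for the same user and top."""
    s_pos = fused_score(h_user, h_top, h_bottom_pos, pi)
    s_neg = fused_score(h_user, h_top, h_bottom_neg, pi)
    return -F.logsigmoid(s_pos - s_neg)
```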
The method can be applied to the fields of fashion recommendation, clothes matching and the like.
The invention has the following advantages and beneficial effects:
(1) The invention gives a concrete application of the graph neural network in the field of fashion recommendation; by mining the interaction relations between users and fashion items and by means of graph representation learning, the feature representations of users and fashion items can be fully learned, thereby improving recommendation accuracy.
(2) The method adopts different sampling probabilities in different layers of the graph neural network, sampling as many effective nodes as possible and discarding noise nodes, which improves the model's ability to represent target nodes; at the same time, this probabilistic sampling strategy effectively alleviates the over-smoothing problem caused by too many layers in the graph neural network and improves the robustness of the model.
Drawings
FIG. 1 is a block diagram of the overall modeling process of the present invention;
FIG. 2 is a flow chart of a modeling method of use of the present invention.
Detailed Description
The invention will be further elucidated with reference to the drawings.
Example:
a user personalization and compatibility method of a graph neural network based on effective neighbor sampling maximization comprises the following steps:
First, a basic model is built (as shown in FIGS. 1-2). The model mainly comprises a composition module, a fashion item representation module, a user representation module, an upper garment-lower garment compatibility module, a user personalization module and a recommendation module. The composition module is connected with the fashion item representation module and the user representation module, the fashion item representation module is connected with the upper garment-lower garment compatibility module, the user representation module is connected with the user personalization module, and the upper garment-lower garment compatibility module and the user personalization module are connected with the recommendation module;
a composition module: constructs a user-fashion item graph, a user-user interaction graph, a fashion item-user graph and a fashion item-fashion item graph, so as to provide interaction graphs for the fashion item representation module and the user representation module;
fashion singleton representation module: and the method is used for inputting two interaction graphs of the fashion single item-user graph and the fashion single item-fashion single item graph as well as a randomly initialized user and fashion single item feature representation matrix into the constructed graph neural network model so as to obtain the updated fashion single item feature representation. The fashion singles representation module comprises: the first representation submodule, the second representation submodule and the third representation submodule are connected with the third representation submodule;
a first representation submodule: the method is used for obtaining the fashion single-item feature representation from the single-item level by means of a graph neural network and combining an attention mechanism from a fashion single-item interaction space;
a second representation sub-module: the system is used for obtaining feature representation of the single item from a user level by means of a graph neural network and combining an attention mechanism from a fashion single item-user interaction space;
a third representation submodule: and taking the output contents of the first representation submodule and the second representation submodule as input for fusing the single-item feature representation obtained from the single-item level and the single-item feature representation obtained from the user level to obtain the final feature representation of the target fashion single item.
A user representation module: and the method is used for inputting a user-fashion single-item diagram, two interaction diagrams of the user-user interaction diagram and a randomly initialized user and single-item feature representation matrix into the constructed diagram neural network model to obtain the updated user feature representation. The user representation module comprises: the first representation submodule, the second representation submodule and the third representation submodule are connected with the third representation submodule;
a first representation submodule: the system is used for obtaining user feature representation from a single-item level by means of a graph neural network and combining an attention mechanism from a user-fashion single-item interaction space;
the second representation submodule: used for obtaining the user feature representation at the user-interaction level from the user-user interaction space by means of the graph neural network combined with an attention mechanism;
a third representation submodule: and taking the output contents of the first representation submodule and the second representation submodule as input, and fusing the user feature representation obtained from the single-product level and the user feature representation obtained from the user interaction level to obtain the final feature representation of the target user.
Upper garment-lower garment compatibility module: the upper garment feature representation and the lower garment feature representation obtained from the fashion single representation module are used as input information and are sent into the module, and the compatibility degree of the upper garment and the lower garment is calculated;
a user personalization module: the lower clothing feature representation obtained from the fashion item representation module and the user feature representation obtained from the user representation module are used as input information and are sent into the module, and the user's liking degree of a given lower clothing is calculated;
a recommendation module: and taking the compatibility degree of the upper garment and the lower garment obtained by the upper garment-lower garment compatibility module and the user like degree of the lower garment obtained by the user personalization module as input, calculating the compatibility score of the upper garment and the lower garment integrated with the personalization factors, and generating recommendation for the user according to the score.
The concrete modeling method based on the model comprises the following steps:
S1: construct a user-fashion item graph, a user-user interaction graph, a fashion item-user graph and a fashion item-fashion item graph; the four constructed interaction graphs are undirected graphs, and the detailed way of constructing them is as follows:
S11: first define two symbols: U = {u_1, u_2, ..., u_N}, the user set containing N users, and V = {v_1, v_2, ..., v_M}, the garment set containing M fashion items;
S12: create a dictionary-form fashion item-fashion item interaction graph G_vv = {v_i: N_vv(v_i), v_j: N_vv(v_j), ...}, where v_i and v_j denote target fashion-item nodes and N_vv(v_i), N_vv(v_j) are their respective neighbor node sets;
S13: create a dictionary-form fashion item-user interaction graph G_vu = {v_i: N_vu(v_i), v_j: N_vu(v_j), ...}, where v_i and v_j denote target fashion-item nodes and N_vu(v_i), N_vu(v_j) are their respective neighbor node sets;
S14: create a dictionary-form user-user interaction graph G_uu = {u_i: N_uu(u_i), u_j: N_uu(u_j), ...}, where u_i and u_j denote target user nodes and N_uu(u_i), N_uu(u_j) are their respective neighbor node sets;
S15: create a dictionary-form user-fashion item interaction graph G_uv = {u_i: N_uv(u_i), u_j: N_uv(u_j), ...}, where u_i and u_j denote target user nodes and N_uv(u_i), N_uv(u_j) are their respective neighbor node sets.
Two points should be noted about the four interaction graphs constructed above: first, self-loops are not allowed, which in the dictionary form means that no value set contains an element equal to its key; second, the length of a value set is exactly the degree of its target node.
This embodiment adopts, inside the graph neural network, a strategy of sampling effective nodes with a different probability p in each network layer to take part in the information aggregation. The detailed steps are as follows:
A1: the sampling probability is tied to the layer index of the graph neural network: each layer j of the network is assigned its own sampling probability p_j;
A2: for a target node o, sampling is performed at layer j of the network according to the sampling probability p_j, and the sampled nodes take part in the information aggregation of the target node; the number of sampled nodes is num = ceil(p * d), where d is the degree of the target node o and ceil is the round-up function; the ceil function together with the probability p guarantees that 1 <= num <= d;
A3: compute the similarity between the target node o and all of its neighbors and take the num neighbors with the largest similarity to take part in the feature aggregation of the target node o. The similarity is computed as M = E * h_o, where h_o is the feature vector of the target node o and E is the feature matrix of o's neighbors, each row of which is the feature vector of one neighbor in the set; the operation dots h_o with every row of E, and from the resulting similarity vector M the indices of the num largest entries are selected; the corresponding nodes then take part in the feature aggregation of the target node o.
S2: with the four interaction graphs created in step S1, construct the static feature matrix X_V of the fashion items using the PyTorch framework as nn.Embedding(num_emb, emb_dim), where num_emb is the total number of fashion items and emb_dim is the dimension of the feature vector. Each row of the matrix X_V represents the feature of one fashion item v_i, denoted x_{v_i}, which is also the initial first-layer vector of item v_i in the graph neural network.
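The text names the PyTorch framework together with num_emb and emb_dim; a minimal sketch of that initialization follows, with example values that only mirror the description.

```python
import torch.nn as nn

num_emb = 1000    # total number of fashion items (example value)
emb_dim = 64      # feature-vector dimension (example value)

# Randomly initialized static feature matrix; row i is item v_i's first-layer vector in the GNN.
item_embedding = nn.Embedding(num_emb, emb_dim)
x_v0 = item_embedding.weight[0]   # initial feature vector of item v_0
```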
The information-aggregation process in the graph neural network is described in detail taking a fashion item (an upper garment t) as an example.
S21: aggregate information at the fashion item level:
h^(k)_{t,vv} = sigma( W^(k)_vv ( h^(k-1)_{t,vv} concat Agg_vv({ h^(k-1)_{v,vv} : v in N_vv(t) }) ) + b^(k)_vv ),
where h^(k)_{t,vv} is the representation vector of the upper garment t in the garment-garment space at the k-th layer of the graph neural network, Agg_vv is a weighted aggregator that performs the aggregation operation in the garment-garment space, N_vv(t) denotes all neighbors of the upper garment t, h^(k-1)_{v,vv} is the representation vector of a neighbor v in the garment-garment space at the (k-1)-th layer, W^(k)_vv and b^(k)_vv respectively denote the learnable weight and bias of the neural network, sigma is the sigmoid nonlinear activation function, and concat denotes the splicing (concatenation) operation.
The detailed neighbor aggregation at the fashion item level is
Agg_vv({...}) = sum over v in N_vv(t) of eta^(k)_{tv} * h^(k-1)_{v,vv},
where an attention mechanism is used in the information aggregation and eta^(k)_{tv} denotes the attention coefficient of the corresponding network layer. The attention coefficient is obtained from the (k-1)-th-layer representations of the garment t and of its neighbor (for example a user u in the garment-user space), which are scored and then normalized over all neighbors.
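One common way to realize such an attention coefficient is the score-then-softmax pattern sketched below in GAT style; the patent's exact attention formulas are given only as images, so treat this as an assumed instantiation rather than the patent's own equation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NeighborAttention(nn.Module):
    """Score each (target, neighbor) pair at layer k-1, softmax over neighbors, then weight-sum them."""
    def __init__(self, dim):
        super().__init__()
        self.score = nn.Linear(2 * dim, 1)    # a^T [h_target concat h_neighbor]

    def forward(self, h_target, h_neighbors):
        pairs = torch.cat([h_target.expand_as(h_neighbors), h_neighbors], dim=-1)
        e = F.leaky_relu(self.score(pairs)).squeeze(-1)       # raw attention scores
        eta = F.softmax(e, dim=0)                             # attention coefficients over neighbors
        return (eta.unsqueeze(-1) * h_neighbors).sum(dim=0)   # weighted aggregation Agg(...)
```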
S22: aggregate information at the user level:
h^(k)_{t,vu} = sigma( W^(k)_vu ( h^(k-1)_{t,vu} concat Agg_vu({ h^(k-1)_{u,vu} : u in N_vu(t) }) ) + b^(k)_vu ),
where h^(k)_{t,vu} is the representation vector of the upper garment t in the garment-user graph at the k-th layer of the graph neural network, Agg_vu is a weighted aggregator that performs the aggregation operation in the garment-user space, N_vu(t) denotes all neighbors of the upper garment t in that space, h^(k-1)_{u,vu} is the representation vector of a neighbor u of t in the garment-user space at the (k-1)-th layer, and W^(k)_vu and b^(k)_vu respectively denote the learnable weight and bias of the neural network of the garment-user space.
The detailed neighbor aggregation at the user level is
Agg_vu({...}) = sum over u in N_vu(t) of eta^(k)_{tu} * h^(k-1)_{u,vu},
where the aggregation likewise uses an attention mechanism and the attention coefficient eta^(k)_{tu} is obtained in the same way as in S21.
S23: concatenate the representation h^(L)_{t,vv} of the upper garment t obtained at the fashion item level with the representation h^(L)_{t,vu} obtained at the user level to obtain the final representation of the upper garment t:
h_t = h^(L)_{t,vv} concat h^(L)_{t,vu}.
In the same way, the final representation h_b of a lower garment b is obtained.
S3: with the four interaction graphs created in step S1, construct the static feature matrix X_U of the users using the PyTorch framework as nn.Embedding(num_emb, emb_dim), where num_emb here is the total number of users and emb_dim is the dimension of the feature vector. Each row of the matrix X_U represents the feature of one user u_i, denoted x_{u_i}, which is also the initial first-layer vector of user u_i in the graph neural network. The detailed information-aggregation process in the graph neural network is as follows:
S31: aggregate information at the fashion item level:
h^(k)_{u,uv} = sigma( W^(k)_uv ( h^(k-1)_{u,uv} concat Agg_uv({ h^(k-1)_{v,uv} : v in N_uv(u) }) ) + b^(k)_uv ),
where h^(k)_{u,uv} is the representation vector of the user u in the user-garment space at the k-th layer of the network, Agg_uv is the weighted aggregator employed in the invention, N_uv(u) denotes all neighbors of the user u, h^(k-1)_{v,uv} is the representation vector of a neighbor v of u in the user-garment space at the (k-1)-th layer, and W^(k)_uv and b^(k)_uv respectively denote the learnable weight and bias of the neural network.
The detailed neighbor aggregation at the fashion item level is
Agg_uv({...}) = sum over v in N_uv(u) of eta^(k)_{uv} * h^(k-1)_{v,uv},
where the aggregation likewise uses an attention mechanism and the attention coefficient eta^(k)_{uv} is obtained in the same way as in S21.
S32: aggregate information at the user interaction level:
h^(k)_{u,uu} = sigma( W^(k)_uu ( h^(k-1)_{u,uu} concat Agg_uu({ h^(k-1)_{u',uu} : u' in N_uu(u) }) ) + b^(k)_uu ),
where h^(k)_{u,uu} is the representation vector of the user u in the user interaction space at the k-th layer of the graph neural network, Agg_uu is the weighted aggregator employed in the invention, N_uu(u) denotes all neighbors of the user u, h^(k-1)_{u',uu} is the representation vector of a neighbor u' of u in the user interaction space at the (k-1)-th layer, and W^(k)_uu and b^(k)_uu respectively denote the learnable weight and bias of the neural network of the user interaction space.
The detailed neighbor aggregation at the user interaction level is
Agg_uu({...}) = sum over u' in N_uu(u) of eta^(k)_{uu'} * h^(k-1)_{u',uu},
where the aggregation likewise uses an attention mechanism and the attention coefficient eta is obtained in the same way as in S21.
S33: concatenate the user representation h^(L)_{u,uv} obtained at the fashion item level with the user representation h^(L)_{u,uu} obtained at the user interaction level to obtain the final user representation:
h_u = h^(L)_{u,uv} concat h^(L)_{u,uu}.
S4: feed the feature representation h_t of the upper garment t and the feature representation h_b of the lower garment b obtained in step S2 into the upper garment-lower garment compatibility module and calculate the compatibility score of the upper garment t and the lower garment b:
c_tb = h_t · h_b.
The simplest dot-product method is used in the above equation to calculate the compatibility score of the upper garment and the lower garment.
S5: feed the feature representation h_b of the lower garment b obtained in step S2 and the user feature representation h_u obtained in step S3 into the user personalization module and calculate the user's liking score for the given lower garment:
p_ub = h_u · h_b.
The simplest dot-product method is used in the above equation to calculate the user's liking score for the given lower garment.
S6: the fashion recommendation of the lower garment is mainly completed in the recommendation module. The detailed steps are as follows:
S61: with the compatibility score c_tb of the upper garment t and the lower garment b obtained in step S4 and the user u's liking score p_ub for the lower garment b obtained in step S5, the garment compatibility s_utb fused with the user's personalized preference is expressed as a weighted combination of c_tb and p_ub.
S62: construct quadruplets from the existing data set:
O = { (u, t, b+, b-) },
where O denotes the set of upper and lower garments, and a quadruplet (u, t, b+, b-) indicates that, given the upper garment t, the user u prefers the lower garment b+ over the lower garment b-.
S63: following the definition of the BPR loss function, the objective function is
L = sum over (u, t, b+, b-) in O of -ln sigma( s_utb+ - s_utb- ) + lambda * ||Theta||^2,
where lambda is a non-negative hyperparameter and Theta denotes all parameters appearing in the model.
S64: according to the calculated loss value, the parameters of the graph neural network are updated with stochastic gradient descent until the model converges, which completes the training of the model.
S65: throughout the experiments, the data set is divided into a training set, a validation set and a test set in the ratio 7:2:1; the training set is used to train the model parameters, the validation set to tune the hyperparameters of the model, and the test set to evaluate the performance of the model.
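A minimal training-loop sketch matching S62-S65 (quadruplet samples, BPR objective, stochastic gradient descent, a 7:2:1 split); the ToyScorer model, the quadruplet tensor and the use of weight decay in place of the explicit lambda * ||Theta||^2 term are all stand-ins, not the patent's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyScorer(nn.Module):
    """Stand-in for the full model: plain embeddings plus the fused score sketched earlier."""
    def __init__(self, n_users, n_items, dim, pi=0.5):
        super().__init__()
        self.users = nn.Embedding(n_users, dim)
        self.items = nn.Embedding(n_items, dim)
        self.pi = pi

    def score(self, u, t, b):
        hu, ht, hb = self.users(u), self.items(t), self.items(b)
        return self.pi * (ht * hb).sum(-1) + (1 - self.pi) * (hu * hb).sum(-1)

quads = torch.tensor([[0, 3, 7, 9], [1, 4, 2, 8], [0, 5, 6, 3]])   # (user, top, bottom+, bottom-)
n = len(quads)
train, valid, test = quads[: int(0.7 * n)], quads[int(0.7 * n): int(0.9 * n)], quads[int(0.9 * n):]

model = ToyScorer(n_users=2, n_items=10, dim=16)
opt = torch.optim.SGD(model.parameters(), lr=0.01, weight_decay=1e-4)  # weight decay ~ lambda * ||Theta||^2

for epoch in range(50):
    for u, t, bp, bn in train:
        loss = -F.logsigmoid(model.score(u, t, bp) - model.score(u, t, bn))  # BPR over one quadruplet
        opt.zero_grad()
        loss.backward()
        opt.step()
```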
The personalized modeling method of the graph neural network based on effective neighbor sampling maximization makes full use of the users, the fashion items and the interaction information between them, constructs the corresponding user, fashion item and user-fashion item interaction graphs, and, by means of the graph neural network together with the maximized neighbor-sampling strategy, represents the users and fashion items accurately, thereby realizing high-quality recommendations that fit the actual situation. The invention achieves more accurate entity feature representation and thus accomplishes the personalized fashion recommendation task.
Finally, although the description refers to embodiments, not every embodiment contains only a single technical solution; this manner of description is adopted only for clarity. Those skilled in the art should take the description as a whole, and the technical solutions in the embodiments may be suitably combined to form other embodiments that can be understood by those skilled in the art.

Claims (9)

1. The personalized modeling method of the graph neural network based on the effective neighbor sampling maximization is characterized by comprising the following steps of:
S1: first, construct a basic model; the basic model comprises a composition module, a fashion item representation module, a user representation module, an upper garment-lower garment compatibility module, a user personalization module and a recommendation module; the fashion item representation module is connected with the upper garment-lower garment compatibility module, the fashion item representation module and the user representation module are connected with the user personalization module, and the upper garment-lower garment compatibility module and the user personalization module are connected with the recommendation module;
S2: according to the interactions between users and fashion items, construct a user-fashion item graph, a user-user interaction graph, a fashion item-user graph and a fashion item-fashion item interaction graph;
S3: according to the fashion item interaction graphs created in step S2, create a fashion item feature representation matrix, construct a graph neural network model with L layers in total, and feed the fashion item feature representation matrix into the graph neural network model to obtain the updated fashion item feature representation matrix, which contains the feature representation of the upper garment and the feature representation of the lower garment;
S4: create a user feature representation matrix according to the user interaction graphs created in step S2, and, using the graph neural network model constructed in step S3, feed the user feature representation matrix into the model to obtain the updated user feature representation matrix, i.e. the user feature representation;
in S3 and S4, the graph neural network model adopts a strategy of sampling effective nodes in different network layers according to different probabilities p to take part in the information aggregation;
S5: take the feature representation of the upper garment and the feature representation of the lower garment obtained in step S3, feed the two representations into the upper garment-lower garment compatibility module, and calculate the compatibility score of the upper garment and the lower garment;
S6: take the lower garment feature representation and the user feature representation obtained in steps S3 and S4 respectively, and feed the two representations into the user personalization module to obtain the user's preference score for the given lower garment;
S7: feed the compatibility score of the upper garment and the lower garment obtained in step S5 and the user's preference score for the given lower garment obtained in step S6 into the recommendation module to obtain the compatibility score of the upper garment and the lower garment with the personalization factor fused in.
2. The personalized modeling method of claim 1, wherein the composition module: according to the relation between the user and the fashion single, constructing a user-fashion single graph, a user-user relation graph, a fashion single-user graph and a fashion single-fashion single graph;
the fashion singles representation module: is used for feeding the fashion item-user graph and the fashion item-fashion item graph, together with the randomly initialized user and item feature representation matrices, into the constructed graph neural network model to obtain the updated fashion item feature representations;
the user representation module: is used for feeding the user-fashion item graph and the user-user relation graph, together with the randomly initialized user and item feature representation matrices, into the constructed graph neural network model to obtain the updated user feature representations;
the upper garment-lower garment compatibility module: the upper garment feature representation and the lower garment feature representation obtained from the fashion single representation module are used as input information and are sent into the module, and the compatibility degree of the upper garment and the lower garment is calculated;
the user personalization module: the lower clothes feature representation obtained from the fashion single representation module and the user feature representation obtained from the user representation module are used as input information and are sent into the module, and the like degree of the user for the given lower clothes is calculated;
the recommendation module: and taking the compatibility degree of the upper garment and the lower garment obtained by the upper garment-lower garment compatibility module and the user like degree of the lower garment obtained by the user personalization module as input, calculating the compatibility score of the upper garment and the lower garment integrated with the personalization factors, and generating recommendation for the user according to the score.
3. The personalized modeling method of claim 1, wherein the fashion singles representation module comprises a first representation submodule, a second representation submodule, and a third representation submodule, wherein the first representation submodule and the second representation submodule are connected to the third representation submodule;
a first representation submodule: the method is used for obtaining single-item feature representation from a single-item level by means of a graph neural network model and combining an attention mechanism from a fashion single-item interaction space;
the second representation submodule: the system is used for obtaining the feature representation of the single product from the user level by means of a graph neural network model and combining an attention mechanism from a fashion single product-user interaction space;
a third representation submodule: and taking the output contents of the first representation submodule and the second representation submodule as input for fusing the single-item feature representation obtained from the single-item level and the single-item feature representation obtained from the user level to obtain the final feature representation of the target fashion single item.
4. The personalized modeling method of claim 1, wherein the user representation module comprises a first representation sub-module, a second representation sub-module, and a third representation sub-module, wherein the first representation sub-module and the second representation sub-module are connected to the third representation sub-module;
a first representation submodule: the system is used for obtaining user feature representation from a single-item level by means of a graph neural network and combining an attention mechanism from a user-fashion single-item interaction space;
the second representation submodule: is used for obtaining the user feature representation at the user-interaction level from the user-user interaction space by means of the graph neural network combined with an attention mechanism;
a third representation submodule: and taking the output contents of the first representation submodule and the second representation submodule as input, and fusing the user feature representation obtained from the single-product level and the user feature representation obtained from the user interaction level to obtain the final feature representation of the target user.
5. The personalized modeling method according to claim 1, wherein in step S2, the specific method for creating the interaction graphs is as follows:
S21: first, two symbolic representations are defined: U = {u_1, u_2, ..., u_N} denotes the user set containing N users, and I = {i_1, i_2, ..., i_M} denotes the garment set containing M fashion items;
S22: a dictionary-form fashion item-fashion item interaction graph G_II is created, in which i_p and i_q denote fashion item nodes, and N_II(i_p) and N_II(i_q) are the neighbor node sets of nodes i_p and i_q, respectively;
S23: a dictionary-form fashion item-user interaction graph G_IU is created, in which i_p denotes a fashion item node, u_q denotes a user node, and N_IU(i_p) and N_IU(u_q) are the neighbor node sets of nodes i_p and u_q, respectively;
S24: a dictionary-form user-user interaction graph G_UU is created, in which u_p and u_q denote user nodes, and N_UU(u_p) and N_UU(u_q) are the neighbor node sets of nodes u_p and u_q, respectively;
S25: a dictionary-form user-fashion item interaction graph G_UI is created, in which u_p denotes a user node, i_q denotes a fashion item node, and N_UI(u_p) and N_UI(i_q) are the neighbor node sets of nodes u_p and i_q, respectively.
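As a rough illustration of steps S21-S25, the dictionary-form graphs can be stored as mappings from each node to its neighbor node set. This is a sketch under assumed data: the edge lists, node naming scheme, and the helper name build_interaction_graph are invented for the example.

```python
from collections import defaultdict

def build_interaction_graph(edges):
    """Build a dictionary-form interaction graph: node -> set of neighbor nodes.

    edges is an iterable of (node_a, node_b) pairs; the graph is stored
    symmetrically so both endpoints see each other as neighbors.
    """
    graph = defaultdict(set)
    for a, b in edges:
        graph[a].add(b)
        graph[b].add(a)
    return graph

# Toy interaction records (illustrative only)
item_item_edges = [("top_1", "bottom_3"), ("top_1", "bottom_7")]   # items co-occurring in outfits
item_user_edges = [("bottom_3", "user_2"), ("top_1", "user_5")]    # item-user interactions
user_user_edges = [("user_2", "user_5")]                           # e.g. follow relations

G_ii = build_interaction_graph(item_item_edges)   # fashion item - fashion item graph (S22)
G_iu = build_interaction_graph(item_user_edges)   # fashion item - user graph (S23)
G_uu = build_interaction_graph(user_user_edges)   # user - user graph (S24)
# User - fashion item view (S25): the same edges keyed by the user nodes
G_ui = {u: nbrs for u, nbrs in G_iu.items() if u.startswith("user")}
```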
6. The personalized modeling method according to claim 1, wherein the step S3 specifically comprises:
S31: the static feature matrix X_I of the fashion items is constructed according to the created interaction graphs;
wherein each row of the fashion item feature matrix X_I represents the feature representation x_i of a fashion item i, which is also the initial vector of item i at the first layer of the graph neural network;
S32: at the fashion item level, the states of the neighbor nodes are aggregated by means of the graph neural network and the feature vector of item i is updated; through state updating, the node representation h_i^(k) of item i at the k-th layer is obtained;
S33: at the user level, the states of the neighbor nodes are aggregated by means of the graph neural network and the feature vector of item i is updated; through state updating, the node representation z_i^(k) of item i at the k-th layer is obtained;
S34: the item representation h_i^(k) obtained at the fashion item level and the item representation z_i^(k) obtained at the user level are concatenated to obtain the final item representation e_i = [h_i^(k); z_i^(k)].
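A compact sketch of steps S31-S34 follows: a static feature dictionary is propagated through K aggregation layers over the item-item graph (fashion item level) and the item-user graph (user level), and the two resulting views are concatenated. The mean aggregator, the 0.5 blending weight, the tanh nonlinearity, and all function names are simplifying assumptions; the claimed method additionally weights neighbors with an attention mechanism, as in the earlier sketch.

```python
import numpy as np

def aggregate_layer(states, graph):
    """One aggregation layer: each node blends its own state with the mean of its neighbors.

    states : dict node id -> (d,) feature vector (items and/or users)
    graph  : dict node id -> set of neighbor node ids
    """
    new_states = dict(states)
    for node, neighbors in graph.items():
        rows = [states[n] for n in neighbors if n in states]
        if node in states and rows:
            msg = np.mean(rows, axis=0)                      # aggregated neighbor message
            new_states[node] = np.tanh(0.5 * (states[node] + msg))
    return new_states

def item_representations(item_feats, user_feats, G_ii, G_iu, K=2):
    """S31-S34: K layers on the item-item graph (item level) and the item-user
    graph (user level), then concatenation of the two views per item."""
    item_level = dict(item_feats)                            # S32 view
    user_level = {**item_feats, **user_feats}                # S33 view also needs user states
    for _ in range(K):
        item_level = aggregate_layer(item_level, G_ii)
        user_level = aggregate_layer(user_level, G_iu)
    return {i: np.concatenate([item_level[i], user_level[i]])   # S34
            for i in item_feats}
```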
7. The personalized modeling method according to claim 1, wherein the step S4 specifically comprises:
S41: the static user feature matrix X_U is constructed according to the created interaction graphs;
wherein each row of the user feature matrix X_U represents the feature representation x_u of a user u, which is also the initial vector of user u at the first layer of the graph neural network;
S42: at the fashion item level, the states of the neighbor nodes are aggregated by means of the graph neural network and the feature vector of user u is updated; through state updating, the node representation h_u^(k) of user u at the k-th layer is obtained;
S43: at the user interaction level, the states of the neighbor nodes are aggregated by means of the graph neural network and the feature vector of user u is updated; through state updating, the node representation z_u^(k) of user u at the k-th layer is obtained;
S44: the user representation h_u^(k) obtained at the fashion item level and the user representation z_u^(k) obtained at the user interaction level are concatenated to obtain the final user representation e_u = [h_u^(k); z_u^(k)].
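Steps S41-S44 mirror the item-side procedure: aggregation over the user-fashion item graph (fashion item level) and the user-user graph (user interaction level), followed by concatenation. The sketch below reuses the assumed aggregate_layer helper from the previous example; the names and the number of layers K are again illustrative.

```python
import numpy as np

def user_representations(user_feats, item_feats, G_ui, G_uu, K=2):
    """S41-S44: the user-side counterpart of item_representations above."""
    item_level = {**user_feats, **item_feats}    # S42: aggregate over the user-fashion item graph
    social_level = dict(user_feats)              # S43: aggregate over the user-user graph
    for _ in range(K):
        item_level = aggregate_layer(item_level, G_ui)
        social_level = aggregate_layer(social_level, G_uu)
    return {u: np.concatenate([item_level[u], social_level[u]])   # S44
            for u in user_feats}
```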
8. The personalized modeling method according to claim 1, wherein the step S5 specifically comprises:
S51: the upper garment feature representation e_t obtained in step S3 and the lower garment feature representation e_b obtained in step S3 are sent to the upper garment-lower garment compatibility modeling module;
S52: the upper garment-lower garment compatibility score is calculated as s_tb = e_t · e_b, where · denotes the dot product operation;
the step S6 specifically comprises:
S61: the lower garment feature representation e_b obtained in step S3 and the user feature representation e_u obtained in step S4 are sent to the user personalized modeling module;
S62: the user's preference score s_ub for the given lower garment is calculated from e_u and e_b;
the step S7 specifically comprises:
S71: from the upper garment-lower garment compatibility score s_tb obtained in step S52 and the user's preference score s_ub for the given lower garment obtained in step S62, the compatibility score s_utb integrating the user's personalized preference is calculated as a trade-off between the two scores, where π is a non-negative trade-off parameter in (0, 1);
S72: the loss is calculated using the BPR loss, in which s_utb and s_utb' denote the compatibility, integrated with the personalization factor, of upper garment t with lower garment b and with lower garment b', respectively.
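To make steps S5-S7 concrete, the sketch below scores an outfit with a dot product (as stated in S52), combines compatibility and preference with a convex combination controlled by π (one plausible reading of the trade-off in S71), and evaluates a BPR loss on a positive/negative pair of lower garments (S72). The preference function, the exact fusion formula, and the function names are assumptions for illustration only.

```python
import numpy as np

def compatibility(h_top, h_bottom):
    """S52: upper-lower compatibility as a dot product of the two representations."""
    return float(np.dot(h_top, h_bottom))

def preference(h_user, h_bottom):
    """S62: user's preference for a lower garment; a dot product is assumed here."""
    return float(np.dot(h_user, h_bottom))

def fused_score(h_user, h_top, h_bottom, pi=0.5):
    """S71: trade-off between compatibility and personal preference, assumed to be
    a convex combination controlled by pi in (0, 1)."""
    return pi * compatibility(h_top, h_bottom) + (1.0 - pi) * preference(h_user, h_bottom)

def bpr_loss(h_user, h_top, h_bottom_pos, h_bottom_neg, pi=0.5):
    """S72: BPR loss for one (user, top, positive bottom, negative bottom) sample."""
    s_pos = fused_score(h_user, h_top, h_bottom_pos, pi)
    s_neg = fused_score(h_user, h_top, h_bottom_neg, pi)
    return -np.log(1.0 / (1.0 + np.exp(-(s_pos - s_neg))))   # -ln sigmoid(s_pos - s_neg)
```

For a given user and upper garment, candidate lower garments can then be ranked by fused_score to generate the recommendation described in the recommendation module.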
9. Application of the personalized modeling method according to claim 1 to the technical fields of fashion recommendation and clothing matching.
CN202210571867.7A 2022-05-25 2022-05-25 Personalized modeling method of graph neural network based on effective neighbor sampling maximization Active CN114707427B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210571867.7A CN114707427B (en) 2022-05-25 2022-05-25 Personalized modeling method of graph neural network based on effective neighbor sampling maximization

Publications (2)

Publication Number Publication Date
CN114707427A CN114707427A (en) 2022-07-05
CN114707427B true CN114707427B (en) 2022-09-06

Family

ID=82176860

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210571867.7A Active CN114707427B (en) 2022-05-25 2022-05-25 Personalized modeling method of graph neural network based on effective neighbor sampling maximization

Country Status (1)

Country Link
CN (1) CN114707427B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113378048A (en) * 2021-06-10 2021-09-10 浙江工业大学 Personalized recommendation method based on multi-view knowledge graph attention network
CN113763300A (en) * 2021-09-08 2021-12-07 湖北工业大学 Multi-focus image fusion method combining depth context and convolution condition random field

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108875910A (en) * 2018-05-23 2018-11-23 山东大学 Garment coordination method, system and the storage medium extracted based on attention knowledge
CN108960959B (en) * 2018-05-23 2020-05-12 山东大学 Multi-mode complementary clothing matching method, system and medium based on neural network
US10824949B2 (en) * 2018-09-27 2020-11-03 Babylon Partners Limited Method and system for extracting information from graphs
CN110825963B (en) * 2019-10-18 2022-03-25 山东大学 Generation-based auxiliary template enhanced clothing matching scheme generation method and system
CN110807477B (en) * 2019-10-18 2022-06-07 山东大学 Attention mechanism-based neural network garment matching scheme generation method and system
CN111382309B (en) * 2020-03-10 2023-04-18 深圳大学 Short video recommendation method based on graph model, intelligent terminal and storage medium
KR20210127464A (en) * 2020-04-14 2021-10-22 주식회사 제이어스 Coodinating and styling methods and systems through deep learning
CN111881342A (en) * 2020-06-23 2020-11-03 北京工业大学 Recommendation method based on graph twin network
CN113158088A (en) * 2021-04-16 2021-07-23 桂林电子科技大学 Position recommendation method based on graph neural network
CN113592609B (en) * 2021-08-17 2024-06-04 中山大学 Personalized clothing collocation recommendation method and system utilizing time factors
CN114444369A (en) * 2021-09-29 2022-05-06 中南大学 Fashion suit compatibility modeling method based on heterogeneous graph neural network
CN114090902B (en) * 2021-11-22 2022-09-09 中国人民解放军国防科技大学 Social network influence prediction method and device based on heterogeneous network



Similar Documents

Publication Publication Date Title
CN109299396A (en) Merge the convolutional neural networks collaborative filtering recommending method and system of attention model
CN112364976B (en) User preference prediction method based on session recommendation system
CN111932336A (en) Commodity list recommendation method based on long-term and short-term interest preference
CN107563841A (en) A kind of commending system decomposed that scored based on user
CN110263257B (en) Deep learning based recommendation method for processing multi-source heterogeneous data
CN111310063A (en) Neural network-based article recommendation method for memory perception gated factorization machine
CN113590900A (en) Sequence recommendation method fusing dynamic knowledge maps
CN110008377B (en) Method for recommending movies by using user attributes
CN112115377A (en) Graph neural network link prediction recommendation method based on social relationship
CN115438732A (en) Cross-domain recommendation method for cold start user based on classification preference migration
CN115082147A (en) Sequence recommendation method and device based on hypergraph neural network
CN113592609A (en) Personalized clothing matching recommendation method and system using time factors
CN106897776A (en) A kind of continuous type latent structure method based on nominal attribute
CN115221413B (en) Sequence recommendation method and system based on interactive graph attention network
CN107391577A (en) A kind of works label recommendation method and system based on expression vector
CN115269977A (en) Recommendation method for fusion knowledge and collaborative information based on graph neural network
CN111414555A (en) Personalized recommendation method based on collaborative filtering
Ahamed et al. A recommender system based on deep neural network and matrix factorization for collaborative filtering
CN115374288A (en) Recommendation method based on knowledge graph and attention mechanism
CN114997476A (en) Commodity prediction method fusing commodity incidence relation
CN114742564A (en) False reviewer group detection method fusing complex relationships
CN114707427B (en) Personalized modeling method of graph neural network based on effective neighbor sampling maximization
CN116342228B (en) Related recommendation method based on directed graph neural network
CN111429175B (en) Method for predicting click conversion under sparse characteristic scene
CN110570226A (en) scoring prediction method combining topic model and heterogeneous information network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant