CN117391816A - Heterogeneous graph neural network recommendation method, device and equipment - Google Patents

Heterogeneous graph neural network recommendation method, device and equipment

Info

Publication number
CN117391816A
CN117391816A (application CN202311478823.0A)
Authority
CN
China
Prior art keywords
feature
cooperative
heterogeneous
user
article
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311478823.0A
Other languages
Chinese (zh)
Inventor
刘广聪
林源升
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong University of Technology
Original Assignee
Guangdong University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong University of Technology filed Critical Guangdong University of Technology
Priority to CN202311478823.0A priority Critical patent/CN117391816A/en
Publication of CN117391816A publication Critical patent/CN117391816A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0631Item recommendations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/042Knowledge-based neural networks; Logical representations of neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • G06N3/0455Auto-encoder networks; Encoder-decoder networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/0464Convolutional networks [CNN, ConvNet]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/082Learning methods modifying the architecture, e.g. adding, deleting or silencing nodes or connections

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Molecular Biology (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Business, Economics & Management (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • Marketing (AREA)
  • Development Economics (AREA)
  • General Business, Economics & Management (AREA)
  • Economics (AREA)
  • Strategic Management (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention discloses a heterogeneous graph neural network recommendation method, device and equipment. The method comprises: acquiring a to-be-detected article user set and inputting it into a preset heterogeneous graph neural network model; performing an encoding operation on the to-be-detected article user set with an embedding layer to generate a heterogeneous graph, and sub-sampling the heterogeneous graph with a small-batch graph sampling algorithm to generate node sub-graph data; inputting the node sub-graph data into a global view module for adaptive enhancement, and outputting a first article cooperative feature and a first user cooperative feature; performing structural data enhancement on the heterogeneous graph through a local view module to generate a second article cooperative feature and a second user cooperative feature; and performing feature stitching on the first article cooperative feature, the first user cooperative feature, the second article cooperative feature and the second user cooperative feature, and outputting an article recommendation score result. This solves the technical problem of poor article recommendation quality in the prior art.

Description

Heterogeneous graph neural network recommendation method, device and equipment
Technical Field
The present invention relates to the field of digital information processing technologies, and in particular, to a heterogeneous graph neural network recommendation method, apparatus, and device.
Background
In e-commerce transaction scenarios, the relationships among users, among commodities, and between users and commodities are complex and varied. Using a knowledge graph as auxiliary information in the recommendation service, the heterogeneity of contextual auxiliary information can enhance information characterization, and handling the complex heterogeneous relationships with graph representation learning methods offers clear advantages.
In a real e-commerce market, user-item interaction data are very sparse: a small number of hot-list items account for most of the interaction records, while the majority of non-hot items see few interactions. This popularity bias gives the data a long-tailed, skewed distribution. Previous studies have demonstrated that introducing self-supervised learning can effectively alleviate the data sparsity problem. Self-supervised learning methods divide into contrastive learning and generative learning: generative learning includes generative adversarial networks (Generative Adversarial Network, GAN) and autoencoders (AutoEncoder), while contrastive learning operates at the structure level, the feature level and the model level.
In the prior art, a homogeneous graph neural network is combined with a self-supervised learning method based on graph structure augmentation: a contrastive loss is constructed to adjust the distance between the original sample and its positive and negative samples, pulling positive samples closer and pushing negative samples farther apart, so that commodities with higher similarity are recommended to the user.
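A contrastive loss of this kind (pulling positives closer, pushing negatives apart) is commonly realized as an InfoNCE-style objective. The sketch below is illustrative only — the patent does not specify its exact loss; the cosine similarity, temperature value and toy embeddings are assumptions:

```python
import math

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def info_nce(anchor, positive, negatives, tau=0.2):
    # Contrastive loss: the numerator rewards closeness to the positive
    # sample, the denominator penalizes closeness to the negatives.
    pos = math.exp(cosine(anchor, positive) / tau)
    neg = sum(math.exp(cosine(anchor, n) / tau) for n in negatives)
    return -math.log(pos / (pos + neg))

# A well-aligned positive pair yields a much lower loss than a poor one.
loss_good = info_nce([1.0, 0.0], [0.9, 0.1], [[-1.0, 0.2], [0.0, 1.0]])
loss_bad = info_nce([1.0, 0.0], [0.0, 1.0], [[-1.0, 0.2], [0.9, 0.1]])
```

Minimizing this loss moves the anchor toward its positive view and away from the negatives, which is the distance adjustment described above.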
Disclosure of Invention
The invention provides a heterogeneous graph neural network recommendation method, device and equipment, which solve the technical problem that the quality of article recommendation is poor due to the fact that only single node characteristics and node relations are considered in the prior art.
The heterogeneous graph neural network recommendation method provided by the first aspect of the invention comprises the following steps:
acquiring an article user set to be detected, and inputting the article user set to be detected into a preset heterogeneous graph neural network model, wherein the preset heterogeneous graph neural network model comprises an embedding layer, a global view module and a local view module;
performing coding operation on the user set of the object to be detected by adopting the embedded layer to generate a heterogeneous graph, and sub-sampling the heterogeneous graph by using a small-batch graph sampling algorithm to generate node sub-graph data;
inputting the node sub-graph data to the global view module for self-adaptive enhancement, and outputting a first article cooperative feature and a first user cooperative feature;
performing structural data enhancement on the heterogeneous graph through the local view module to generate a second article cooperative feature and a second user cooperative feature;
and performing feature stitching on the first article cooperative feature, the first user cooperative feature, the second article cooperative feature and the second user cooperative feature, and outputting an article recommendation score result.
Optionally, the node sub-graph data includes a heterogeneous sub-graph, a source node and a target node; the global view module comprises a path perception network layer, a heterogeneous mutual attention layer, a heterogeneous message transmission layer and a target message aggregation layer; the step of inputting the node sub-graph data to the global view module for self-adaptive enhancement and outputting the first article cooperative characteristic and the first user cooperative characteristic comprises the following steps:
executing space conversion operation on the heterogeneous subgraphs through the path perception network layer to generate a path space;
inputting the path space, the source node and the target node into a heterogeneous mutual attention layer for attention splicing, and outputting multi-head attention;
information splicing is carried out on the target node by adopting a heterogeneous message transfer layer, and multi-head node information is generated;
the multi-head attention and the multi-head node information are aggregated and enhanced through a target message aggregation layer, and a first article feature set and a first user feature set are generated;
and respectively carrying out nonlinear processing on the first article feature set and the first user feature set by adopting a Leaky ReLU nonlinear function to generate a first article cooperative feature and a first user cooperative feature.
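The mutual-attention, message-passing and aggregation steps above can be sketched as a minimal multi-head attention aggregation followed by a Leaky ReLU. This is not the patent's exact heterogeneous mutual attention layer (which works on typed nodes and paths); the head count, scaled dot-product scoring and toy vectors are illustrative assumptions:

```python
import math

def leaky_relu(x, slope=0.01):
    # Leaky ReLU keeps a small gradient for negative inputs.
    return x if x >= 0 else slope * x

def softmax(xs):
    m = max(xs)
    exps = [math.exp(v - m) for v in xs]
    total = sum(exps)
    return [e / total for e in exps]

def mutual_attention(target, sources, num_heads=2):
    # Split each embedding into heads; per head, score every source node
    # against the target with a scaled dot product (mutual attention),
    # aggregate the attention-weighted source messages (message passing),
    # and apply a Leaky ReLU to the result (target message aggregation).
    d = len(target) // num_heads
    out = []
    for h in range(num_heads):
        t = target[h * d:(h + 1) * d]
        heads = [s[h * d:(h + 1) * d] for s in sources]
        scores = softmax([sum(a * b for a, b in zip(t, s)) / math.sqrt(d)
                          for s in heads])
        for i in range(d):
            out.append(sum(w * s[i] for w, s in zip(scores, heads)))
    return [leaky_relu(x) for x in out]

feat = mutual_attention([1.0, 0.0, 0.5, 0.5],
                        [[0.2, 0.1, 0.3, 0.4], [0.9, 0.0, 0.1, 0.2]])
```

The multi-head outputs are concatenated head by head, mirroring the "attention splicing" and "information splicing" wording of the steps above.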
Optionally, the local view module includes a node discarding layer, an edge perturbation layer, a first graph encoder, a second graph encoder, and a graph aggregator; the step of performing structural data enhancement on the heterogeneous graph through the local view module to generate the second article cooperative feature and the second user cooperative feature includes:
adopting the node discarding layer to perform node discarding on the node feature matrix in the heterogeneous graph to generate a heterogeneous discard graph;
performing edge perturbation on the edge adjacency matrix in the heterogeneous graph through the edge perturbation layer to generate a heterogeneous perturbation graph;
inputting the heterogeneous discard graph to the first graph encoder for view enhancement, and outputting a node enhancement feature;
performing view enhancement on the heterogeneous perturbation graph by adopting the second graph encoder, and outputting an edge enhancement feature;
respectively performing a convolution operation on the node enhancement feature and the edge enhancement feature through the graph aggregator, and determining a node convolution feature and an edge convolution feature;
respectively performing nonlinear processing on the node convolution feature and the edge convolution feature by adopting a Leaky ReLU nonlinear function to generate a first target feature and a second target feature;
performing a summation operation on a first article target feature in the first target features and a second article target feature in the second target features to generate the second article cooperative feature;
and performing a summation operation on a first user target feature in the first target features and a second user target feature in the second target features to generate the second user cooperative feature.
Optionally, the step of performing feature stitching on the first article cooperative feature, the first user cooperative feature, the second article cooperative feature, and the second user cooperative feature, and outputting an article recommendation score result includes:
performing feature stitching on the first article cooperative features and the second article cooperative features, and outputting target article cooperative features;
performing feature stitching on the first user cooperative feature and the second user cooperative feature, and outputting a target user cooperative feature;
and executing inner product operation on the target object cooperative characteristic and the target user cooperative characteristic, and outputting an object recommendation score result.
Optionally, before the step of acquiring the to-be-detected article user set and inputting the to-be-detected article user set into the preset heterogeneous graph neural network model, the method comprises the following steps:
acquiring a to-be-trained article user set, inputting the to-be-trained article user set into an initial heterogeneous graph neural network model, and outputting a first to-be-trained article cooperative feature, a first to-be-trained user cooperative feature, a second to-be-trained article cooperative feature and a second to-be-trained user cooperative feature;
calculating a target loss value by adopting the first to-be-trained article cooperative feature, the first to-be-trained user cooperative feature, the second to-be-trained article cooperative feature and the second to-be-trained user cooperative feature;
and if the target loss value is converged, taking the trained initial heterogeneous graph neural network model as the preset heterogeneous graph neural network model.
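The convergence test above can be sketched as a training loop that stops once the loss change falls below a tolerance. The tolerance, epoch cap and toy loss function are illustrative assumptions, not values from the patent:

```python
def train_until_converged(loss_step, tol=1e-4, max_epochs=100):
    # Run training steps until the change in loss drops below `tol`,
    # which is taken here as the convergence criterion.
    prev = float("inf")
    for epoch in range(max_epochs):
        loss = loss_step(epoch)
        if abs(prev - loss) < tol:
            return epoch, loss
        prev = loss
    return max_epochs, prev

# Toy loss that decays geometrically, standing in for the target loss value.
epoch, final = train_until_converged(lambda e: 0.5 ** e)
```

When the loop exits early, the model at that epoch would be taken as the preset (trained) model.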
The heterogeneous graph neural network recommendation device provided in the second aspect of the present invention includes:
the acquisition module is used for acquiring a user set of the article to be detected and inputting the user set of the article to be detected into a preset heterogeneous graph neural network model, wherein the preset heterogeneous graph neural network model comprises an embedding layer, a global view module and a local view module;
the adoption module is used for executing coding operation on the user set of the object to be detected by adopting the embedded layer to generate a heterogeneous graph, and sub-sampling the heterogeneous graph by a small-batch graph sampling algorithm to generate node sub-graph data;
the self-adaptive enhancement module is used for inputting the node sub-graph data to the global view module for self-adaptive enhancement and outputting a first article cooperative characteristic and a first user cooperative characteristic;
The structure data enhancement module is used for enhancing the structure data of the heterogeneous graph through the local view module and generating a second article cooperative feature and a second user cooperative feature;
and the output result module is used for carrying out characteristic splicing on the first article cooperative characteristic, the first user cooperative characteristic, the second article cooperative characteristic and the second user cooperative characteristic and outputting an article recommendation score result.
Optionally, the node sub-graph data includes a heterogeneous sub-graph, a source node and a target node; the global view module comprises a path perception network layer, a heterogeneous mutual attention layer, a heterogeneous message transmission layer and a target message aggregation layer; the adaptive enhancement module comprises:
the first generation sub-module is used for executing space conversion operation on the heterogeneous subgraphs through the path perception network layer to generate a path space;
the first output sub-module is used for inputting the path space, the source node and the target node into the heterogeneous mutual attention layer for attention splicing and outputting multi-head attention;
the second generation submodule is used for carrying out information splicing on the target node by adopting a heterogeneous message transfer layer to generate multi-head node information;
The third generation submodule is used for carrying out aggregation enhancement on the multi-head attention and the multi-head node information through a target message aggregation layer to generate a first object feature set and a first user feature set;
and the fourth generation submodule is used for respectively performing nonlinear processing on the first article feature set and the first user feature set by adopting a Leaky ReLU nonlinear function to generate the first article cooperative feature and the first user cooperative feature.
Optionally, the local view module includes a node drop layer, an edge perturbation layer, a first graph encoder, a second graph encoder, and a graph aggregator; the structural data enhancement module comprises:
a fifth generation sub-module, configured to perform node discarding on the node feature matrix in the heterogeneous graph by using a node discarding layer, to generate a heterogeneous discarding graph;
a sixth generation submodule, configured to perform edge perturbation on an edge adjacency matrix in the heterogeneous graph through an edge perturbation layer, to generate a heterogeneous perturbation graph;
the second output sub-module is used for inputting the heterogeneous discard graph to the first graph encoder for view enhancement and outputting the node enhancement feature;
a third output sub-module, configured to perform view enhancement on the heterogeneous perturbation graph by using the second graph encoder, and output the edge enhancement feature;
The first determining submodule is used for respectively carrying out convolution operation on the node enhancement feature and the edge enhancement feature through the graph aggregator and determining the node convolution feature and the edge convolution feature;
a seventh generating sub-module, configured to perform nonlinear processing on the node convolution feature and the edge convolution feature by using a Leaky ReLU nonlinear function, to generate the first target feature and the second target feature;
an eighth generation sub-module, configured to perform a summation operation on a first article target feature of the first target features and a second article target feature of the second target features, to generate the second article cooperative feature;
and a ninth generation sub-module, configured to perform a summation operation on a first user target feature of the first target features and a second user target feature of the second target features, to generate the second user cooperative feature.
Optionally, the output result module includes:
the fourth output submodule is used for carrying out feature stitching on the first article cooperative feature and the second article cooperative feature and outputting a target article cooperative feature;
the fifth output sub-module is used for carrying out feature stitching on the first user cooperative feature and the second user cooperative feature and outputting a target user cooperative feature;
And the sixth output sub-module is used for executing inner product operation on the target article cooperative characteristic and the target user cooperative characteristic and outputting an article recommendation score result.
An electronic device according to a third aspect of the present invention includes a memory and a processor, where the memory stores a computer program, and the computer program when executed by the processor causes the processor to execute the steps of the heterogeneous graph neural network recommendation method according to any one of the above.
From the above technical scheme, the invention has the following advantages:
The technical scheme of the invention provides a heterogeneous graph neural network recommendation method. First, a to-be-detected article user set is acquired and input into a preset heterogeneous graph neural network model, where the preset heterogeneous graph neural network model comprises an embedding layer, a global view module and a local view module. Then, the embedding layer performs an encoding operation on the to-be-detected article user set to generate a heterogeneous graph, and the heterogeneous graph is sub-sampled through a small-batch graph sampling algorithm to generate node sub-graph data. The node sub-graph data are input into the global view module for adaptive enhancement, outputting a first article cooperative feature and a first user cooperative feature; structural data enhancement is performed on the heterogeneous graph through the local view module, generating a second article cooperative feature and a second user cooperative feature. Finally, feature stitching is performed on the first article cooperative feature, the first user cooperative feature, the second article cooperative feature and the second user cooperative feature, and an article recommendation score result is output. By adopting the heterogeneous graph as auxiliary information, realizing adaptive enhancement of the node sub-graph data through the global view module and structural data enhancement of the heterogeneous graph through the local view module, and stitching the output cooperative features to produce the article recommendation score result, the technical scheme solves the technical problem that the quality of article recommendation is poor because only single node features and node relations are considered in the prior art.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions of the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings described below are only some embodiments of the invention, and that a person skilled in the art can obtain other drawings from these drawings without inventive effort.
Fig. 1 is a flowchart of steps of a heterogeneous neural network recommendation method according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of a neural network model with a preset heterogeneous diagram according to an embodiment of the present invention;
FIG. 3 is a flowchart illustrating steps of another heterogeneous neural network recommendation method according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a heterogeneous graph convolution module composed of a heterogeneous mutual attention layer, a heterogeneous message transfer layer and a target message aggregation layer according to an embodiment of the present invention;
fig. 5 is a block diagram of a heterogeneous neural network recommendation device according to an embodiment of the present invention.
Detailed Description
The embodiment of the invention provides a heterogeneous graph neural network recommendation method, device and equipment, which are used for solving the technical problem that the quality of article recommendation is poor due to the fact that only single node characteristics and node relations are considered in the prior art.
In order to make the objects, features and advantages of the present invention more comprehensible, the technical solutions in the embodiments of the present invention are described in detail below with reference to the accompanying drawings, and it is apparent that the embodiments described below are only some embodiments of the present invention, but not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to fig. 1, fig. 1 is a flowchart illustrating the steps of a heterogeneous graph neural network recommendation method according to an embodiment of the present invention.
The invention provides a heterogeneous graph neural network recommendation method, which comprises the following steps:
step 101, acquiring a user set of an object to be detected, and inputting the user set of the object to be detected into a preset heterogeneous graph neural network model, wherein the preset heterogeneous graph neural network model comprises an embedding layer, a global view module and a local view module.
It should be noted that, referring to fig. 2, the preset heterogeneous graph neural network model provided in the present application is composed of an embedding layer, a global view module and a local view module. The embedding layer takes as input the user-article semantic relation graph (heterogeneous graph) encoded from the user set and the article set; the global view module and the local view module process the graph data through the graph structure enhancement method and the adaptive enhancement method, augmenting the sample pairs.
Further, when the initial heterogeneous graph neural network model is trained, a contrastive loss is constructed through the loss adjustment layer to adjust the distance relationship between positive and negative samples; that is, in the model training stage, the loss value corresponding to the model is calculated, and whether the loss value has converged is judged, so as to further adjust the hyper-parameters of the model. After the model is trained, the prediction layer aggregates the embedded features of the different propagation layers and outputs the user's affinity score for each article; that is, feature stitching is performed on the first article cooperative feature, the first user cooperative feature, the second article cooperative feature and the second user cooperative feature output by the heterogeneous graph neural network model, and the article recommendation score result (the user's affinity score for the article) is output.
In this embodiment, a set of users of an object to be detected is obtained, and the set of users of the object to be detected is input into a preset heterogeneous graph neural network model, where the preset heterogeneous graph neural network model includes an embedding layer, a global view module and a local view module.
Step 102: performing an encoding operation on the to-be-detected article user set by adopting the embedding layer to generate a heterogeneous graph, and sub-sampling the heterogeneous graph through a small-batch graph sampling algorithm to generate node sub-graph data.
The to-be-detected article user set comprises the to-be-detected user set and the article set.
The node sub-graph data includes a heterogeneous sub-graph, a source node, and a target node.
It should be noted that the user set and the article set are encoded into the structure of a heterogeneous graph, i.e. the heterogeneous graph may be represented as G = (N, E), where N is the node set and E is the set of linked edges. The mapping of node types is τ(n): N → A, and the mapping of edge types is φ(e): E → R; a heterogeneous network graph (heterogeneous graph) should satisfy the relationship |A| + |R| > 2. After the heterogeneous graph is obtained, it is sub-sampled through a small-batch graph sampling algorithm (mini-batch graph sampling algorithm) to generate node sub-graph data.
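As a small illustration of this definition and of mini-batch sub-sampling, the sketch below builds a typed graph and samples an incidence-based sub-graph. The concrete node/edge types and the sampling rule are illustrative assumptions, not the patent's exact algorithm:

```python
import random

# Toy heterogeneous graph G = (N, E): a node-type map (tau) and an
# edge-type map (phi). The concrete types here are hypothetical.
nodes = {"u1": "user", "u2": "user", "v1": "item", "v2": "item"}
edges = {("u1", "v1"): "clicks", ("u1", "v2"): "buys", ("u2", "v1"): "buys"}

def is_heterogeneous(node_types, edge_types):
    # |A| + |R| > 2: more than one node/edge type in total.
    return len(set(node_types.values())) + len(set(edge_types.values())) > 2

def sample_subgraph(node_types, edge_types, batch_size, seed=0):
    # Mini-batch sub-sampling: draw seed (target) nodes, keep every edge
    # incident to a seed, and include all endpoints as sub-graph nodes.
    rng = random.Random(seed)
    seeds = set(rng.sample(sorted(node_types), batch_size))
    sub_edges = {e: t for e, t in edge_types.items()
                 if e[0] in seeds or e[1] in seeds}
    sub_nodes = {n for e in sub_edges for n in e} | seeds
    return sub_nodes, sub_edges
```

The sampled sub-graph keeps the node and edge types, so the downstream modules can still distinguish source and target nodes by type.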
Further, a meta-path of the heterogeneous graph may be represented as A_1 →(R_1) A_2 →(R_2) ⋯ →(R_l) A_{l+1}, where A_l denotes the object node of the l-th hop and R_l denotes the node relationship of the l-th hop. In fact, the meta-path can be seen as an aggregation of multiple layers of relationship edges, and the composite relationship from A_1 to A_{l+1} can be expressed as:
R = R_1 ∘ R_2 ∘ ⋯ ∘ R_l,
where ∘ denotes the composite operator on relationships.
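The composite operator on relationships can be illustrated as a relational join that collapses a two-hop meta-path into a single relation. The relation names (buys, belongs_to) are hypothetical, chosen only to make the example concrete:

```python
def compose(r1, r2):
    # Composite operator on relations: (a, c) is in r1 ∘ r2 iff some middle
    # node b satisfies (a, b) in r1 and (b, c) in r2 — a two-hop meta-path
    # collapsed into a single relation edge.
    return {(a, c) for (a, b1) in r1 for (b2, c) in r2 if b1 == b2}

# Hypothetical relations: user -buys-> item -belongs_to-> category.
buys = {("u1", "v1"), ("u2", "v2")}
belongs_to = {("v1", "c1"), ("v2", "c1")}
user_category = compose(buys, belongs_to)
```

Chaining `compose` over R_1 … R_l yields the full meta-path relation from A_1 to A_{l+1}.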
In this embodiment, the embedding layer is adopted to perform an encoding operation on the to-be-detected article user set to generate a heterogeneous graph, and the heterogeneous graph is sub-sampled through a small-batch graph sampling algorithm to generate node sub-graph data.
Step 103: inputting the node sub-graph data into the global view module for adaptive enhancement, and outputting the first article cooperative feature and the first user cooperative feature.
In this embodiment, the node sub-graph data is input to the global view module for adaptive enhancement, and the first item coordination feature and the first user coordination feature are output.
Step 104: performing structural data enhancement on the heterogeneous graph through the local view module to generate the second article cooperative feature and the second user cooperative feature.
Note that the input heterogeneous graph G = (N, E) may also be expressed as G = (D, X), where D ∈ [0, 1]^{N×N} is the adjacency matrix of edges and X = [x_1, x_2, ..., x_N] is the feature matrix of nodes; the second article cooperative feature and the second user cooperative feature are generated by performing structural data enhancement on the edge adjacency matrix and the node feature matrix in the heterogeneous graph.
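The two structural augmentations on G = (D, X) — node discarding on X and edge perturbation on D — can be sketched as below. The drop/flip probabilities and the exact randomization scheme are illustrative assumptions:

```python
import random

def drop_nodes(features, rate, seed=0):
    # Node discarding: zero out each node's feature row with probability
    # `rate`, producing the heterogeneous discard graph's feature matrix.
    rng = random.Random(seed)
    return [row if rng.random() > rate else [0.0] * len(row)
            for row in features]

def perturb_edges(adj, rate, seed=0):
    # Edge perturbation: flip each off-diagonal adjacency entry with
    # probability `rate`, producing the heterogeneous perturbation graph.
    rng = random.Random(seed)
    n = len(adj)
    return [[adj[i][j] if i == j or rng.random() > rate else 1 - adj[i][j]
             for j in range(n)] for i in range(n)]
```

Each augmentation yields a different view of the same graph; the two graph encoders then embed these views before aggregation.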
In this embodiment, the local view module is used to enhance structural data of the heterogeneous graph, so as to generate the second article cooperative feature and the second user cooperative feature.
Step 105: performing feature stitching on the first article cooperative feature, the first user cooperative feature, the second article cooperative feature and the second user cooperative feature, and outputting the article recommendation score result.
It should be noted that by optimizing graph neural network recommendation through contrastive learning, user features from different views can be obtained, namely the first user cooperative feature z_u and the second user cooperative feature s_u, as well as article features, namely the first article cooperative feature z_v and the second article cooperative feature s_v. The features obtained by the structural data enhancement of the local view module are spliced with the features obtained by the adaptive enhancement of the global view module: the first article cooperative feature is spliced with the second article cooperative feature, and the first user cooperative feature is spliced with the second user cooperative feature; an inner product operation is then performed on the spliced features, and the article recommendation score result is output, thereby predicting the user's preference score for the article. The splicing and inner product are specifically:
ŷ_{u,v} = (z_u ∥ s_u)^T (z_v ∥ s_v),
where ŷ_{u,v} is the article recommendation score result; z_u is the first user cooperative feature; s_u is the second user cooperative feature; z_v is the first article cooperative feature; s_v is the second article cooperative feature; ∥ denotes feature concatenation; and T denotes the transpose.
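A minimal sketch of this stitching-and-inner-product prediction, with toy feature vectors as assumptions:

```python
def recommend_score(z_u, s_u, z_v, s_v):
    # Feature stitching: concatenate the global-view and local-view
    # features for the user and for the article, then take the inner
    # product of the stitched vectors as the preference score.
    user = z_u + s_u  # list concatenation plays the role of z_u || s_u
    item = z_v + s_v
    return sum(a * b for a, b in zip(user, item))

score = recommend_score([1.0, 0.5], [0.2, 0.1], [0.8, 0.4], [0.3, 0.6])
```

A higher score indicates a stronger predicted affinity of the user for the article, so items can be ranked by this value.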
In this embodiment, feature stitching is performed on the first article cooperative feature, the first user cooperative feature, the second article cooperative feature, and the second user cooperative feature, and an article recommendation score result is output.
In the embodiment of the invention, the application provides a heterogeneous graph neural network recommendation method. First, a user set of articles to be detected is acquired and input into a preset heterogeneous graph neural network model, where the preset model comprises an embedding layer, a global view module and a local view module. Then the embedding layer performs an encoding operation on the user set of articles to be detected to generate a heterogeneous graph, and the heterogeneous graph is sub-sampled through a small-batch graph sampling algorithm to generate node sub-graph data. The node sub-graph data is input to the global view module for self-adaptive enhancement, outputting the first article cooperative feature and the first user cooperative feature; the local view module performs structural data enhancement on the heterogeneous graph, generating the second article cooperative feature and the second user cooperative feature. Finally, feature stitching is performed on the first article cooperative feature, the first user cooperative feature, the second article cooperative feature and the second user cooperative feature, and an article recommendation score result is output. In this technical scheme, the heterogeneous graph is adopted as auxiliary information: adaptive enhancement is applied to the node sub-graph data through the global view module, structural data enhancement is applied to the heterogeneous graph through the local view module, and the output cooperative features are stitched to produce the article recommendation score result.
The method better realizes path aggregation and target node prediction by improving on the traditional node attention distribution of the graph attention mechanism, thereby achieving more robust node sample enhancement. For the source node and target node on each path, heterogeneous mutual attention is employed to evaluate the importance of each source node.
Referring to fig. 3, fig. 3 is a flowchart illustrating the steps of another heterogeneous graph neural network recommendation method according to an embodiment of the present invention.
Step 301, acquiring a user set of articles to be trained, inputting the user set of articles to be trained into an initial heterogeneous graph neural network model, and outputting a first to-be-trained article cooperative feature, a first to-be-trained user cooperative feature, a second to-be-trained article cooperative feature and a second to-be-trained user cooperative feature.
The user set of the articles to be trained comprises the article set to be trained and the user set.
First, the user set of articles to be trained undergoes an encoding operation through the initial embedding layer of the initial heterogeneous graph neural network model, outputting a heterogeneous training graph; the heterogeneous training graph is sub-sampled through a small-batch graph sampling algorithm to generate node sub-graph training data; the node sub-graph training data is input into the initial global view module of the initial heterogeneous graph neural network model for self-adaptive enhancement, outputting the first to-be-trained article cooperative feature and the first to-be-trained user cooperative feature; structural data enhancement is then applied to the heterogeneous training graph through the initial local view module of the initial heterogeneous graph neural network model, generating the second to-be-trained article cooperative feature and the second to-be-trained user cooperative feature.
In this embodiment, a set of users of the to-be-trained object is obtained, and the set of users of the to-be-trained object is input to an initial heterogeneous neural network model, and a first to-be-trained object cooperative feature, a first to-be-trained user cooperative feature, a second to-be-trained object cooperative feature, and a second to-be-trained user cooperative feature are output.
Step 302, calculating a target loss value by adopting the first to-be-trained article cooperative feature, the first to-be-trained user cooperative feature, the second to-be-trained article cooperative feature and the second to-be-trained user cooperative feature.
In order to guide the multiple views (heterogeneous graphs) toward effective data augmentation, a self-supervised contrastive learning framework is combined: the InfoNCE function (a loss function based on information theory) is introduced as the loss function to guide the self-supervision signal, an enhanced view with stronger robustness is constructed, and the global collaboration relationship is used to enhance the representation of the local view, building a contrast loss between the global view features (z_u, z_v) and the local view features (s_u, s_v). Specifically, the target loss value is calculated from the first to-be-trained article cooperative feature, the first to-be-trained user cooperative feature, the second to-be-trained article cooperative feature and the second to-be-trained user cooperative feature as follows:

𝓛_u = Σ_{i=0}^{I} −log [ exp(sim(z_{u_i}, s_{u_i})/τ) / Σ_{i'=0}^{I} exp(sim(z_{u_i}, s_{u_{i'}})/τ) ]

𝓛_v = Σ_{i=0}^{I} −log [ exp(sim(z_{v_i}, s_{v_i})/τ) / Σ_{i'=0}^{I} exp(sim(z_{v_i}, s_{v_{i'}})/τ) ]

𝓛 = α·𝓛_u + β·𝓛_v + λ·‖Θ‖²₂

where 𝓛_u is the first loss value; 𝓛_v is the second loss value; z_u is the first to-be-trained user cooperative feature; s_u is the second to-be-trained user cooperative feature; z_v is the first to-be-trained article cooperative feature; s_v is the second to-be-trained article cooperative feature; τ is the temperature coefficient; I is the number of node sample pairs in the user set of articles to be trained; sim(·) is a similarity function computing sample feature similarity with a cosine function; L is the number of layers of the initial heterogeneous graph neural network model; i' is the index of the negative samples; s_{u_{i'}} is a negative sample of the second to-be-trained user cooperative feature; s_{v_{i'}} is a negative sample of the second to-be-trained article cooperative feature; 𝓛 is the target loss value; α is the first super-parameter of the initial heterogeneous graph neural network model, representing a contrast loss rate; β is the second super-parameter, representing a contrast loss rate; λ is the overfitting-prevention parameter, controlling the L2 regularization term to prevent overfitting; ‖Θ‖²₂ is the L2 regularization term.
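A compact NumPy sketch of the InfoNCE contrast between aligned view features (cosine similarity with a temperature; batch size and τ value are illustrative assumptions):

```python
import numpy as np

def info_nce(z, s, tau=0.2):
    # z[i] and s[i] are the two views of the same node (positive pair);
    # every other row of s serves as a negative, as in the loss above.
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    s = s / np.linalg.norm(s, axis=1, keepdims=True)
    logits = (z @ s.T) / tau              # pairwise cosine similarity / tau
    pos = np.diag(logits)                 # sim(z_i, s_i) / tau
    return float(np.mean(np.log(np.exp(logits).sum(axis=1)) - pos))
```

Aligned view pairs should yield a lower loss than mismatched ones, which is what drives the two views to agree.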
In this embodiment, the target loss value is calculated by using the first to-be-trained article cooperative feature, the first to-be-trained user cooperative feature, the second to-be-trained article cooperative feature, and the second to-be-trained user cooperative feature.
Step 303, if the target loss value has converged, using the trained initial heterogeneous graph neural network model as the preset heterogeneous graph neural network model.
If the target loss value has converged, training of the initial heterogeneous graph neural network model is complete, and the trained initial model is used as the preset heterogeneous graph neural network model; if the target loss value has not converged, training of the initial model continues with an Adam optimizer (an adaptive-learning-rate optimizer) over a preset parameter range until the target loss value converges. The preset parameter range includes the first super-parameter α searched over {0.001, 0.01, 0.1, 1}, the second super-parameter β searched over {0.001, 0.01, 0.1, 1}, the overfitting-prevention parameter λ searched over {10^-2, 10^-3, 10^-4, 10^-5, 10^-6}, and the number of layers L of the global view module and the enhanced view module in the heterogeneous graph neural network model searched over {1, 2, 3, 4}. The initial heterogeneous graph neural network model is trained with the Adam optimizer over this preset super-parameter range, and the model with the best convergence effect is selected as the preset heterogeneous graph neural network model.
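The hyper-parameter selection described above can be sketched as a plain grid search; the `evaluate` callback standing in for one Adam training run is a hypothetical placeholder, not the patent's code:

```python
from itertools import product

alpha_grid = [0.001, 0.01, 0.1, 1]
beta_grid = [0.001, 0.01, 0.1, 1]
lambda_grid = [10**-k for k in range(2, 7)]   # 1e-2 ... 1e-6
layer_grid = [1, 2, 3, 4]

def grid_search(evaluate):
    # `evaluate` trains the model under one configuration (e.g. with Adam)
    # and returns a validation loss; the lowest-loss configuration wins.
    best_cfg, best_loss = None, float("inf")
    for cfg in product(alpha_grid, beta_grid, lambda_grid, layer_grid):
        loss = evaluate(*cfg)
        if loss < best_loss:
            best_cfg, best_loss = cfg, loss
    return best_cfg, best_loss
```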
In this embodiment, if the target loss value has converged, the trained initial heterogeneous graph neural network model is used as the preset heterogeneous graph neural network model.
Step 304, acquiring a user set of the object to be detected, and inputting the user set of the object to be detected into a preset heterogeneous graph neural network model, wherein the preset heterogeneous graph neural network model comprises an embedding layer, a global view module and a local view module.
In this embodiment, a set of users of an object to be detected is obtained, and the set of users of the object to be detected is input into a preset heterogeneous graph neural network model, where the preset heterogeneous graph neural network model includes an embedding layer, a global view module and a local view module.
Step 305, executing a coding operation on the user set of the object to be detected by adopting the embedding layer to generate a heterogeneous graph, and sub-sampling the heterogeneous graph through a small-batch graph sampling algorithm to generate node sub-graph data.
In this embodiment, an embedded layer is adopted to perform coding operation on a user set of an object to be detected, a heterogeneous graph is generated, sub-sampling is performed on the heterogeneous graph through a small-batch graph sampling algorithm, and node sub-graph data is generated.
Step 306, inputting the node sub-graph data to the global view module for self-adaptive enhancement, and outputting the first article cooperative feature and the first user cooperative feature.
The node sub-graph data includes a heterogeneous sub-graph, a source node, and a target node.
The global view module includes a path aware network layer, a heterogeneous mutual attention layer, a heterogeneous messaging layer, and a target message aggregation layer.
It should be noted that, referring to fig. 4, the heterogeneous mutual attention layer, the heterogeneous message transfer layer and the target message aggregation layer form a heterogeneous graph convolution module (Transformer layer). First, a path space is output through the path perception network layer; then the first article feature set and the first user feature set are output through the heterogeneous mutual attention layer, the heterogeneous message transfer layer and the target message aggregation layer; finally, the Leaky ReLU nonlinear function is applied to the first article feature set and the first user feature set respectively to generate the first article cooperative feature and the first user cooperative feature.
Further, step 306 may include the sub-steps of:
S61, performing a space conversion operation on the heterogeneous subgraph through the path perception network layer to generate a path space;
It should be noted that the heterogeneous subgraph is input to the path-aware network layer (a path-aware graph neural network) to generate a path space, where an edge from source node s to target node t may be represented as e = (s, t) and is then expressed on the path space in triplet form as <φ(s), ψ(e), φ(t)>.
Further, the aggregation of information from a source node to a target node can be regarded as an aggregation from the (l−1)-th layer to the l-th layer. We use H^l[t] to denote the representation of node t at layer l in the heterogeneous graph space. The aggregation from the (l−1)-th layer to the l-th layer can then be expressed as:

H^l[t] = Aggregate_{s∈N(t)} ( Extract( H^{l−1}[s]; H^{l−1}[t], e ) )

where Extract(·) denotes the neighbor information extractor, which uses H^{l−1}[t] and the edge e as query parameters to extract useful information from the source node H^{l−1}[s], and Aggregate(·) collects the neighbor features of the source nodes through an aggregation operation such as mean, sum, or max.
S62, inputting a path space, a source node and a target node into a heterogeneous mutual attention layer for attention splicing, and outputting multi-head attention;
The processing procedure of outputting the multi-head attention is as follows:

K_i(s) = K-Linear_i^{φ(s)}( H^{(l−1)}[s] )

Q_i(t) = Q-Linear_i^{φ(t)}( H^{(l−1)}[t] )

ATT-head^i(s,e,t) = ( K_i(s) · W^{ATT}_{ψ(e)} · Q_i(t)^T ) · μ_{<φ(s),ψ(e),φ(t)>} / √d

Attention_HGT(s,e,t) = softmax_{∀s∈N(t)} ( ∥_{i∈[1,h]} ATT-head^i(s,e,t) )

where K_i(s) is the first (Key) vector; Q_i(t) is the second (Query) vector; K-Linear_i^{φ(s)} is the i-th preset Key linear projection; Q-Linear_i^{φ(t)} is the i-th preset Query linear projection; W^{ATT}_{ψ(e)} is a learnable parameter matrix; H^{(l−1)}[s] is the source node at layer l−1; H^{(l−1)}[t] is the target node at layer l−1; ATT-head^i(s,e,t) is the i-th attention head; <φ(s), ψ(e), φ(t)> is the node relation triplet in the path space; μ is the triplet importance parameter; √d is the scaling factor; Attention_HGT(s,e,t) is the multi-head attention; softmax(·) is the normalization function; ∥_{i∈[1,h]} denotes the splicing operation over the h attention heads.
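A toy NumPy sketch of one heterogeneous mutual-attention head: key/query projections, an edge-type matrix between them, scaling by μ/√d, and softmax over all source nodes of the target. All projection matrices are illustrative placeholders, not the patent's learned parameters:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_weights(H_s, h_t, W_k, W_q, W_att, mu, d):
    # K_i(s): one key per source node; Q_i(t): query for the target node.
    K = H_s @ W_k
    q = h_t @ W_q
    # Score each source against the target through the edge-type matrix,
    # scale by the triplet importance mu over sqrt(d), then normalise.
    scores = (K @ W_att @ q) * mu / np.sqrt(d)
    return softmax(scores)   # one attention weight per source node
```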
S63, performing information splicing on the target node by adopting a heterogeneous message transfer layer to generate multi-head node information;
It should be noted that a multi-head splicing approach is also needed for the node information (target node); specifically, the heterogeneous message transfer layer performs information splicing on the target node to generate the multi-head node information:

MSG-head^i(s,e,t) = V-Linear_i^{φ(s)}( H^{(l−1)}[s] ) · W^{MSG}_{ψ(e)}

Message(s,e,t) = ∥_{i∈[1,h]} MSG-head^i(s,e,t)

where Message(s,e,t) is the multi-head node information; MSG-head^i(s,e,t) is the i-th node information head; V-Linear is the preset Value linear projection; H^{(l−1)}[s] is the source node at layer l−1; W^{MSG}_{ψ(e)} is the fused edge weight matrix.
S64, aggregating and enhancing the multi-head attention and multi-head node information through a target message aggregation layer to generate a first object feature set and a first user feature set;
It should be noted that all source nodes with different feature distributions are aggregated to obtain the representation of the target node at the next layer; specifically, the target message aggregation layer aggregates and enhances the multi-head attention and the multi-head node information to generate the target set, namely the first article feature set and the first user feature set:

H^l[t] = S-Linear( Attention_HGT(s,e,t) · Message(s,e,t) )

where H^l[t] is the target set, comprising the first article feature set and the first user feature set; S-Linear is a linear mapping function involving matrix multiplication; Attention_HGT(s,e,t) is the multi-head attention; Message(s,e,t) is the multi-head node information.
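Putting the three sub-layers together, a minimal single-head sketch of the Transformer-style update H^l[t] = S-Linear(Attention · Message); every projection matrix here is an illustrative placeholder:

```python
import numpy as np

def hgt_update(H_src, h_tgt, W_k, W_q, W_att, W_v, W_msg, W_s, mu=1.0):
    d = h_tgt.shape[0]
    K = H_src @ W_k                        # keys for each source node
    q = h_tgt @ W_q                        # query for the target node
    att = np.exp((K @ W_att @ q) * mu / np.sqrt(d))
    att = att / att.sum()                  # Attention_HGT over the sources
    msg = (H_src @ W_v) @ W_msg            # Message(s, e, t) per source
    return (att @ msg) @ W_s               # S-Linear(Attention . Message)
```

With identity projections and identical source rows, the attention is uniform and the update simply reproduces the shared source representation, which is an easy sanity check.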
S65, respectively applying the Leaky ReLU nonlinear function to the first article feature set and the first user feature set to generate the first article cooperative feature and the first user cooperative feature.
After the adaptive enhancement, the feature representations of the user and item nodes are obtained, namely the first user feature set Emb_u and the first article feature set Emb_v. Then the Leaky ReLU nonlinear function is applied to the first article feature set and the first user feature set respectively to generate the first article cooperative feature and the first user cooperative feature:

z_u = σ(Emb_u),  z_v = σ(Emb_v)

where z_u is the first user cooperative feature; σ(·) is the Leaky ReLU nonlinear function; Emb_u is the first user feature set; Emb_v is the first article feature set; z_v is the first article cooperative feature.
In this embodiment, the node sub-graph data is input to the global view module for adaptive enhancement, and the first item coordination feature and the first user coordination feature are output.
Step 307, carrying out structural data enhancement on the heterogeneous graph through the local view module to generate the second article cooperative feature and the second user cooperative feature.
The partial view module includes a node drop layer, an edge perturbation layer, a first graph encoder, a second graph encoder, and a graph aggregator.
Further, step 307 may include the sub-steps of:
S71, adopting the node discarding layer to perform node discarding on the node feature matrix in the heterogeneous graph to generate the heterogeneous discard graph;
Note that the input heterogeneous graph g = (N, E) may also be expressed as g = (D, X), where D ∈ [0,1]^{N×N} is the edge adjacency matrix and X = [x_1, x_2, ..., x_N] is the node feature matrix; different enhancement transformations then yield:

g^{(m)} = t^{(m)}(D, X) = (D_m, X_m),  m = 1, ..., M

where M is the number of different enhancement strategies; D_m is the edge adjacency matrix of the m-th enhanced view; X_m is the node feature matrix of the m-th enhanced view; t^{(m)} is the m-th data enhancement mode; D is the edge adjacency matrix; X is the node feature matrix.
Specifically, a new view (the heterogeneous discard graph) is constructed by randomly deleting nodes in the node feature matrix of the heterogeneous graph: g^{(1)} = t^{(1)}(D, X) = (D^{(1)}, X^{(1)}), where g^{(1)} is the heterogeneous discard graph; D^{(1)} is the edge adjacency matrix obtained after node-drop enhancement; X^{(1)} is the node feature matrix obtained after node-drop enhancement; t^{(1)} is the node-drop enhancement operation.
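A hedged NumPy sketch of this node-dropping enhancement (the drop rate and RNG seed are assumptions): dropped nodes have their feature rows zeroed and all incident edges removed.

```python
import numpy as np

def node_drop(D, X, drop_rate=0.2, seed=0):
    # Randomly discard nodes: a dropped node loses its feature row and
    # every edge touching it in the adjacency matrix.
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    keep = rng.random(n) >= drop_rate          # True = node survives
    X1 = X * keep[:, None]                     # zero dropped feature rows
    D1 = D * keep[:, None] * keep[None, :]     # remove incident edges
    return D1, X1
```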
S72, performing edge disturbance on an edge adjacent matrix in the heterogeneous map through an edge disturbance layer to generate a heterogeneous disturbance map;
It should be noted that the edge adjacency matrix of the heterogeneous graph is edge-perturbed by the edge perturbation layer, i.e. a new view (the heterogeneous disturbance map) is constructed by deleting some edges or adding a small portion of edges: g^{(2)} = t^{(2)}(D, X) = (D^{(2)}, X^{(2)}), where g^{(2)} is the heterogeneous disturbance map; D^{(2)} is the edge adjacency matrix obtained after edge-perturbation enhancement; X^{(2)} is the node feature matrix obtained after edge-perturbation enhancement; t^{(2)} is the edge-perturbation enhancement operation. The heterogeneous disturbance map g^{(2)} obtained by perturbing the edges of the original view g can also be expressed as:

D^{(2)} = D ⊙ (1 − L)

where ⊙ denotes element-wise matrix multiplication and L is the perturbation matrix: L_{i,j} = 1 indicates that the edge formed by node i and node j is perturbed, and L_{i,j} = L_{j,i} = 0 indicates that no edge perturbation operation is performed.
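One plausible reading of the edge perturbation — deleting the edges marked by a symmetric 0/1 perturbation matrix L — can be sketched as follows (perturbation probability and seed are assumptions):

```python
import numpy as np

def edge_perturb(D, p=0.1, seed=0):
    # Build a symmetric 0/1 matrix L: L[i, j] = 1 marks the edge (i, j)
    # for perturbation, so multiplying D element-wise by (1 - L) deletes
    # that small random fraction of edges.
    rng = np.random.default_rng(seed)
    upper = np.triu((rng.random(D.shape) < p).astype(float), 1)
    L = upper + upper.T                   # symmetric, no self-loops
    return D * (1.0 - L)
```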
S73, inputting the heterogeneous discard image to a first image encoder for view enhancement, and outputting node enhancement features;
s74, performing view enhancement on the heterogeneous disturbance map by adopting a second map encoder, and outputting edge enhancement features;
It should be noted that enhancement of the new views (the heterogeneous discard graph and the heterogeneous disturbance map) can be regarded as acting on the user nodes H_u and item nodes H_v in the view structure; the graph encoder (node encoder) performs view enhancement on the heterogeneous discard graph and the heterogeneous disturbance map, obtaining the feature representation H^{(k)} of the corresponding nodes, where the node enhancement features include node user enhancement features and node article enhancement features, and the edge enhancement features include edge user enhancement features and edge article enhancement features. The view enhancement process is specifically:

H^{(k)} = f( g^{(m)} ) = f( t^{(m)}(D, X) )

where H^{(k)} is the node feature at layer k; g^{(m)} is the subgraph under the m-th data enhancement mode; f(·) is the graph encoder; t^{(m)} is the m-th data enhancement mode; D is the edge adjacency matrix of the original graph; X is the node feature matrix of the original graph.
S75, respectively carrying out convolution operation on the node enhancement features and the edge enhancement features through a graph aggregator, and determining the node convolution features and the edge convolution features;
It should be noted that after the original view undergoes the different data enhancements, the new views g^{(1)} and g^{(2)} are obtained and an aggregation operation is performed on their node features; the lightweight graph convolutional network LightGCN is introduced as the graph aggregator, the graph convolution operation is performed iteratively, and new representations of the target user nodes and target item nodes are obtained through neighbor feature aggregation. Specifically, the graph aggregator performs a convolution operation on the node enhancement features and the edge enhancement features respectively to determine the node convolution features and the edge convolution features; the node convolution features include node article convolution features and node user convolution features, and the edge convolution features include edge user convolution features and edge article convolution features. One step of the convolution operation can be expressed as:

H_u^{(k+1)} = Σ_{v∈N_u} ( 1 / (√|N_u|·√|N_v|) ) H_v^{(k)}

H_v^{(k+1)} = Σ_{u∈N_v} ( 1 / (√|N_v|·√|N_u|) ) H_u^{(k)}

where H_v^{(k)} is the item node representation at layer k; H_u^{(k+1)} is the user node representation at layer k+1; |N_u| is the number of item nodes adjacent to user u; |N_v| is the number of user nodes adjacent to item v; H_u^{(k)} is the user node representation at layer k; H_v^{(k+1)} is the item node representation at layer k+1; Σ is the summation operation.
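A minimal NumPy sketch of one LightGCN propagation step over a user-item interaction matrix, following the standard symmetric-normalised neighbor averaging (no feature transform, no nonlinearity); the matrix shapes are illustrative:

```python
import numpy as np

def lightgcn_step(A, H_u, H_v):
    # A: (num_users x num_items) interaction matrix; H_u, H_v: current
    # user / item embeddings. Each side averages its neighbours' features
    # with 1 / (sqrt(|N_u|) * sqrt(|N_v|)) normalisation.
    du = A.sum(axis=1, keepdims=True).clip(min=1)   # user degrees
    dv = A.sum(axis=0, keepdims=True).clip(min=1)   # item degrees
    norm = A / np.sqrt(du) / np.sqrt(dv)
    return norm @ H_v, norm.T @ H_u                  # new user, item reps
```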
S76, respectively applying the Leaky ReLU nonlinear function to the node convolution feature and the edge convolution feature to generate a first target feature and a second target feature;

The Leaky ReLU nonlinear function is applied to the node convolution feature and the edge convolution feature respectively, generating the first target feature and the second target feature, where the first target feature includes the first article target feature and the first user target feature, and the second target feature includes the second article target feature and the second user target feature.
S77, summing the first object target characteristics in the first target characteristics and the second object target characteristics in the second target characteristics to generate second object cooperative characteristics;
The summation of the first article target feature and the second article target feature is specifically:

s_v = s_v^{(1)} + s_v^{(2)}

where s_v is the second article cooperative feature; s_v^{(1)} is the first article target feature; s_v^{(2)} is the second article target feature.
And S78, performing summation operation on the first user target feature in the first target features and the second user target feature in the second target features to generate second user cooperative features.
It should be noted that the summation of the first user target feature and the second user target feature is specifically:

s_u = s_u^{(1)} + s_u^{(2)}

where s_u is the second user cooperative feature; s_u^{(1)} is the first user target feature; s_u^{(2)} is the second user target feature.
In this embodiment, the local view module is used to enhance structural data of the heterogeneous graph, so as to generate the second article cooperative feature and the second user cooperative feature.
Step 308, performing feature stitching on the first article cooperative feature, the first user cooperative feature, the second article cooperative feature and the second user cooperative feature, and outputting an article recommendation score result.
Further, step 308 may include the sub-steps of:
s81, performing feature stitching on the first article cooperative feature and the second article cooperative feature, and outputting a target article cooperative feature;
s82, performing feature stitching on the first user cooperative feature and the second user cooperative feature, and outputting a target user cooperative feature;
s83, performing inner product operation on the target article cooperative feature and the target user cooperative feature, and outputting an article recommendation score result.
In this embodiment, feature stitching is performed on the first article cooperative feature, the first user cooperative feature, the second article cooperative feature, and the second user cooperative feature, and an article recommendation score result is output.
For comparison of technical effects, reference can be made to the prior art. First, the existing main modules of contrastive self-supervised learning can be divided into three parts: data enhancement, proxy task design, and contrast target construction. The data enhancement methods mainly include feature data enhancement based on randomly masking edge and node features, graph structure data enhancement based on edge perturbation and node insertion, data enhancement based on sampling strategies, and adaptive enhancement strategies; graph structure enhancement also alleviates the data sparsity problem and enriches the sample pairs. Second, existing methods mainly use homogeneous graph neural networks and self-supervised learning enhanced by graph structure to mitigate data sparsity, but due to the randomness of the enhancement, noise data is easily introduced when deleting nodes and perturbing edges; only single node features and node relations are considered, and the semantic relations between users and items cannot be effectively modeled. Third, existing graph neural network models suffer from data sparsity and biased distribution: homogeneous graph neural networks are mainly used, the complex and diverse relations among different node types are not considered, and the various relations among users and among articles are ignored; meanwhile, graph neural network algorithms based on self-supervised learning design their enhancement strategies from the internal structure of the subgraph and change the local node relations in the graph structure through data enhancement.
Aiming at these problems, the application provides a heterogeneous graph neural network recommendation method: the heterogeneous graph is used to adjust the weight representation and express the relation information between users and items at a finer granularity; subgraph enhancement is realized with a heterogeneous graph Transformer architecture; meta-path-assisted feature extraction is introduced to supervise the data enhancement stage; the local view and the global view are considered together in the feature representation, and a more robust heterogeneous subgraph structure is modeled for feature extraction, improving recommendation performance and quality. At the local view level, the structure enhancement method can be changed, and other subgraph structure enhancement methods such as attribute masking can be adopted for data enhancement. At the global view level, a better path attention weighting method can be substituted to realize adaptive enhancement. The heterogeneous graph is adopted as auxiliary information, the rich graph representation learns the data feature representation, and supervision signals are generated in the heterogeneous graph neural network in a self-supervised manner, improving recommendation accuracy. In the self-supervised contrastive learning method, graph data enhancement strategies of structure enhancement and adaptive attention enhancement are mixed to realize the heterogeneous graph neural network recommendation.
Referring to fig. 5, fig. 5 is a block diagram illustrating a heterogeneous neural network recommendation device according to an embodiment of the present invention.
The acquiring module 501 is configured to acquire a set of users of an object to be detected, and input the set of users of the object to be detected into a preset heterogeneous graph neural network model, where the preset heterogeneous graph neural network model includes an embedding layer, a global view module and a local view module;
The module 502 is configured to perform an encoding operation on the user set of the article to be detected using the embedding layer, generate a heterogeneous graph, and sub-sample the heterogeneous graph through a small-batch graph sampling algorithm to generate node sub-graph data;
the self-adaptation enhancement module 503 is configured to input the node sub-graph data to the global view module for self-adaptation enhancement, and output a first article cooperative feature and a first user cooperative feature;
the structural data enhancement module 504 is configured to perform structural data enhancement on the heterogeneous graph through the local view module, so as to generate a second article cooperative feature and a second user cooperative feature;
and the output result module 505 is configured to perform feature stitching on the first article cooperative feature, the first user cooperative feature, the second article cooperative feature, and the second user cooperative feature, and output an article recommendation score result.
Further, the adaptive enhancement module 503 includes:
the first generation sub-module is used for executing space conversion operation on the heterogeneous subgraphs through the path perception network layer to generate a path space;
the first output sub-module is used for inputting the path space, the source node and the target node into the heterogeneous mutual attention layer for attention splicing and outputting multi-head attention;
the second generation submodule is used for carrying out information splicing on the target node by adopting the heterogeneous message transfer layer to generate multi-head node information;
the third generation submodule is used for carrying out aggregation enhancement on the multi-head attention and the multi-head node information through the target message aggregation layer to generate a first article feature set and a first user feature set;
and the fourth generation sub-module is used for respectively performing nonlinear processing on the first article feature set and the first user feature set by adopting a Leaky ReLU nonlinear function to generate a first article cooperative feature and a first user cooperative feature.
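The adaptive enhancement path can be illustrated with a single head of mutual attention. In this sketch the projection matrices `W_q`, `W_k`, `W_v` stand in for the path-aware space conversion, source importance is scored by scaled dot product, and the head outputs are concatenated and passed through a Leaky ReLU as in the fourth generation sub-module. The scaled-dot-product form and all shapes are assumptions, since the patent does not give exact formulas.

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def hetero_attention_head(src, tgt, W_q, W_k, W_v):
    """One head of heterogeneous mutual attention (illustrative only).

    src: (n, d) source-node features on a path; tgt: (d,) target node.
    W_q/W_k/W_v play the role of the path-aware space conversion.
    """
    q = tgt @ W_q                           # target query
    k = src @ W_k                           # source keys
    v = src @ W_v                           # source messages
    att = softmax(k @ q / np.sqrt(len(q)))  # importance of each source node
    return att @ v                          # aggregated message for the target

rng = np.random.default_rng(1)
d = 8
src = rng.normal(size=(5, d))
tgt = rng.normal(size=d)
# Two heads, concatenated as in multi-head attention, then Leaky ReLU.
heads = [hetero_attention_head(src, tgt,
                               *(rng.normal(size=(d, d)) for _ in range(3)))
         for _ in range(2)]
enhanced = leaky_relu(np.concatenate(heads))
```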
Further, the structural data enhancement module 504 includes:
a fifth generation submodule, configured to perform node discarding on a node feature matrix in the heterogeneous graph by using a node discarding layer, to generate a heterogeneous discarding graph;
a sixth generation submodule, configured to perform edge perturbation on an edge adjacency matrix in the heterogeneous graph through an edge perturbation layer, to generate a heterogeneous perturbation graph;
The second output sub-module is used for inputting the heterogeneous discard graph to the first graph encoder for view enhancement and outputting node enhancement features;
the third output sub-module is used for performing view enhancement on the heterogeneous perturbation graph by adopting a second graph encoder and outputting edge enhancement features;
the first determining submodule is used for respectively carrying out convolution operation on the node enhancement features and the edge enhancement features through the graph aggregator and determining the node convolution features and the edge convolution features;
a seventh generation sub-module, configured to perform nonlinear processing on the node convolution feature and the edge convolution feature by using a Leaky ReLU nonlinear function, to generate a first target feature and a second target feature;
an eighth generation sub-module, configured to perform a summation operation on a first article target feature of the first target features and a second article target feature of the second target features, to generate a second article cooperative feature;
and a ninth generation sub-module, configured to perform a summation operation on the first user target feature in the first target features and the second user target feature in the second target features, and generate a second user cooperative feature.
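The two structural augmentations of the local view, node discarding on the node feature matrix and edge perturbation on the edge adjacency matrix, reduce to simple random masking. A minimal sketch follows; the drop probabilities are arbitrary choices for illustration, not values specified by the patent.

```python
import numpy as np

rng = np.random.default_rng(2)

def drop_nodes(x, p=0.2):
    """Node discarding layer: zero out a random subset of node feature rows."""
    mask = rng.random(x.shape[0]) >= p
    return x * mask[:, None]

def perturb_edges(adj, p=0.2):
    """Edge perturbation layer: randomly delete existing edges."""
    keep = rng.random(adj.shape) >= p
    return adj * keep

x = rng.normal(size=(6, 4))        # node feature matrix
adj = np.ones((6, 6))              # edge adjacency matrix
x_aug = drop_nodes(x)              # heterogeneous discard view
adj_aug = perturb_edges(adj)       # heterogeneous perturbation view
```

Each augmented view would then be fed to its own graph encoder, as in the second and third output sub-modules.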
Further, the output result module 505 includes:
the fourth output submodule is used for carrying out feature stitching on the first article cooperative feature and the second article cooperative feature and outputting a target article cooperative feature;
The fifth output sub-module is used for carrying out feature stitching on the first user cooperative feature and the second user cooperative feature and outputting a target user cooperative feature;
and the sixth output sub-module is used for executing inner product operation on the target article cooperative characteristic and the target user cooperative characteristic and outputting an article recommendation score result.
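The final scoring step is concrete enough to compute by hand: stitch the two cooperative features on each side, then take the inner product. A small numeric example (toy values, chosen only for illustration):

```python
import numpy as np

def score(item1, user1, item2, user2):
    """Stitch per-view cooperative features, then inner-product score."""
    item = np.concatenate([item1, item2], axis=-1)  # target article feature
    user = np.concatenate([user1, user2], axis=-1)  # target user feature
    return (user * item).sum(axis=-1)               # recommendation score

# One user-item pair with two 2-d cooperative features per side:
# item = [1, 0, 0.5, 0.5], user = [1, 1, 2, 0], inner product = 2.0.
s = score(np.array([[1.0, 0.0]]), np.array([[1.0, 1.0]]),
          np.array([[0.5, 0.5]]), np.array([[2.0, 0.0]]))
```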
In an alternative embodiment, the apparatus further comprises:
the training set acquisition module is used for acquiring a to-be-trained article user set, inputting the to-be-trained article user set into the initial heterogeneous graph neural network model, and outputting a first to-be-trained article cooperative feature, a first to-be-trained user cooperative feature, a second to-be-trained article cooperative feature and a second to-be-trained user cooperative feature;
the calculating module is used for calculating a target loss value by adopting the first to-be-trained article cooperative characteristic, the first to-be-trained user cooperative characteristic, the second to-be-trained article cooperative characteristic and the second to-be-trained user cooperative characteristic;
and the convergence module is used for taking the trained initial heterogeneous graph neural network model as a preset heterogeneous graph neural network model if the target loss value is converged.
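The patent does not state the form of the target loss, only that it is computed from the four to-be-trained cooperative features and trained until convergence. One plausible (assumed) choice, given the dual-view design, is a contrastive alignment between global-view and local-view features, paired with a simple loss-plateau convergence test:

```python
import numpy as np

def info_nce(a, b, tau=0.2):
    """Contrastive alignment of two views (an assumption: the patent does
    not specify the loss, only which features it is computed from)."""
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    logits = a @ b.T / tau                       # row i should match column i
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))

def converged(losses, eps=1e-3):
    """Treat the target loss as converged once it stops changing."""
    return len(losses) >= 2 and abs(losses[-1] - losses[-2]) < eps

rng = np.random.default_rng(3)
u1, u2 = rng.normal(size=(4, 8)), rng.normal(size=(4, 8))  # user features, two views
i1, i2 = rng.normal(size=(4, 8)), rng.normal(size=(4, 8))  # item features, two views
loss = info_nce(u1, u2) + info_nce(i1, i2)                 # user term + item term
```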
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, specific working procedures of the apparatus, modules and sub-modules described above may refer to corresponding procedures in the foregoing method embodiments, which are not described herein again.
The embodiment of the invention also provides electronic equipment, which comprises a processor and a memory:
the memory is used for storing the program codes and transmitting the program codes to the processor;
the processor is configured to execute the heterogeneous graph neural network recommendation method according to the above embodiment of the present invention according to instructions in the program code.
In the several embodiments provided in this application, it should be understood that the disclosed systems, apparatuses, and methods may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The above embodiments are only for illustrating the technical solution of the present invention, and not for limiting the same; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. The heterogeneous graph neural network recommendation method is characterized by comprising the following steps of:
acquiring an article user set to be detected, and inputting the article user set to be detected into a preset heterogeneous graph neural network model, wherein the preset heterogeneous graph neural network model comprises an embedding layer, a global view module and a local view module;
performing an encoding operation on the user set of the article to be detected by adopting the embedding layer to generate a heterogeneous graph, and sub-sampling the heterogeneous graph by a mini-batch graph sampling algorithm to generate node sub-graph data;
inputting the node sub-graph data to the global view module for self-adaptive enhancement, and outputting a first article cooperative feature and a first user cooperative feature;
carrying out structural data enhancement on the heterogeneous graph through the local view module to generate a second article cooperative feature and a second user cooperative feature;
and performing feature stitching on the first article cooperative feature, the first user cooperative feature, the second article cooperative feature and the second user cooperative feature, and outputting an article recommendation score result.
2. The heterogeneous graph neural network recommendation method according to claim 1, wherein the node sub-graph data comprises a heterogeneous sub-graph, a source node and a target node; the global view module comprises a path perception network layer, a heterogeneous mutual attention layer, a heterogeneous message transmission layer and a target message aggregation layer; the step of inputting the node sub-graph data to the global view module for self-adaptive enhancement and outputting the first article cooperative characteristic and the first user cooperative characteristic comprises the following steps:
executing space conversion operation on the heterogeneous subgraphs through the path perception network layer to generate a path space;
inputting the path space, the source node and the target node into a heterogeneous mutual attention layer for attention splicing, and outputting multi-head attention;
information splicing is carried out on the target node by adopting a heterogeneous message transfer layer, and multi-head node information is generated;
carrying out aggregation enhancement on the multi-head attention and the multi-head node information through a target message aggregation layer to generate a first article feature set and a first user feature set;
and respectively carrying out nonlinear processing on the first article feature set and the first user feature set by adopting a Leaky ReLU nonlinear function to generate a first article cooperative feature and a first user cooperative feature.
3. The heterogeneous graph neural network recommendation method of claim 1, wherein the local view module comprises a node discard layer, an edge perturbation layer, a first graph encoder, a second graph encoder, and a graph aggregator; the step of generating the second article cooperative feature and the second user cooperative feature by performing structural data enhancement on the heterogeneous graph through the local view module includes:
adopting a node discarding layer to perform node discarding on the node feature matrix in the heterogeneous graph to generate a heterogeneous discarding graph;
performing edge perturbation on an edge adjacency matrix in the heterogeneous graph through an edge perturbation layer to generate a heterogeneous perturbation graph;
inputting the heterogeneous discard graph to a first graph encoder for view enhancement, and outputting node enhancement features;
performing view enhancement on the heterogeneous perturbation graph by adopting a second graph encoder, and outputting edge enhancement features;
respectively carrying out convolution operation on the node enhancement features and the edge enhancement features through a graph aggregator, and determining node convolution features and edge convolution features;
respectively carrying out nonlinear processing on the node convolution feature and the edge convolution feature by adopting a Leaky ReLU nonlinear function to generate a first target feature and a second target feature;
performing a summation operation on a first article target feature of the first target features and a second article target feature of the second target features to generate a second article cooperative feature;
and executing summation operation on the first user target feature in the first target features and the second user target feature in the second target features to generate a second user cooperative feature.
4. The heterogeneous graph neural network recommendation method according to claim 1, wherein the step of performing feature stitching on the first article cooperative feature, the first user cooperative feature, the second article cooperative feature, and the second user cooperative feature, and outputting an article recommendation score result includes:
performing feature stitching on the first article cooperative features and the second article cooperative features, and outputting target article cooperative features;
performing feature stitching on the first user cooperative feature and the second user cooperative feature, and outputting a target user cooperative feature;
and executing inner product operation on the target object cooperative characteristic and the target user cooperative characteristic, and outputting an object recommendation score result.
5. The heterogeneous graph neural network recommendation method according to claim 1, wherein before the step of acquiring a user set of an article to be detected and inputting the user set of the article to be detected into a preset heterogeneous graph neural network model, the method comprises:
acquiring a to-be-trained article user set, inputting the to-be-trained article user set into an initial heterogeneous graph neural network model, and outputting a first to-be-trained article cooperative feature, a first to-be-trained user cooperative feature, a second to-be-trained article cooperative feature and a second to-be-trained user cooperative feature;
calculating a target loss value by adopting the first to-be-trained article cooperative characteristic, the first to-be-trained user cooperative characteristic, the second to-be-trained article cooperative characteristic and the second to-be-trained user cooperative characteristic;
and if the target loss value is converged, taking the trained initial heterogeneous graph neural network model as the preset heterogeneous graph neural network model.
6. A heterogeneous graph neural network recommendation device, comprising:
the acquisition module is used for acquiring a user set of the article to be detected and inputting the user set of the article to be detected into a preset heterogeneous graph neural network model, wherein the preset heterogeneous graph neural network model comprises an embedding layer, a global view module and a local view module;
the encoding module is used for performing an encoding operation on the user set of the article to be detected by adopting the embedding layer to generate a heterogeneous graph, and sub-sampling the heterogeneous graph through a mini-batch graph sampling algorithm to generate node sub-graph data;
the self-adaptive enhancement module is used for inputting the node sub-graph data to the global view module for self-adaptive enhancement and outputting a first article cooperative characteristic and a first user cooperative characteristic;
the structure data enhancement module is used for enhancing the structure data of the heterogeneous graph through the local view module and generating a second article cooperative feature and a second user cooperative feature;
and the output result module is used for carrying out characteristic splicing on the first article cooperative characteristic, the first user cooperative characteristic, the second article cooperative characteristic and the second user cooperative characteristic and outputting an article recommendation score result.
7. The heterogeneous graph neural network recommendation device of claim 6, wherein the node sub-graph data comprises a heterogeneous sub-graph, a source node, and a target node; the global view module comprises a path perception network layer, a heterogeneous mutual attention layer, a heterogeneous message transmission layer and a target message aggregation layer; the adaptive enhancement module comprises:
the first generation sub-module is used for executing space conversion operation on the heterogeneous subgraphs through the path perception network layer to generate a path space;
the first output sub-module is used for inputting the path space, the source node and the target node into the heterogeneous mutual attention layer for attention splicing and outputting multi-head attention;
the second generation submodule is used for carrying out information splicing on the target node by adopting a heterogeneous message transfer layer to generate multi-head node information;
the third generation sub-module is used for carrying out aggregation enhancement on the multi-head attention and the multi-head node information through a target message aggregation layer to generate a first article feature set and a first user feature set;
and the fourth generation sub-module is used for respectively performing nonlinear processing on the first article feature set and the first user feature set by adopting a Leaky ReLU nonlinear function to generate a first article cooperative feature and a first user cooperative feature.
8. The heterogeneous graph neural network recommendation device of claim 6, wherein the local view module includes a node drop layer, an edge perturbation layer, a first graph encoder, a second graph encoder, and a graph aggregator; the structural data enhancement module comprises:
a fifth generation sub-module, configured to perform node discarding on the node feature matrix in the heterogeneous graph by using a node discarding layer, to generate a heterogeneous discarding graph;
a sixth generation submodule, configured to perform edge perturbation on an edge adjacency matrix in the heterogeneous graph through an edge perturbation layer, to generate a heterogeneous perturbation graph;
the second output sub-module is used for inputting the heterogeneous discard graph to the first graph encoder for view enhancement and outputting node enhancement features;
a third output sub-module, configured to perform view enhancement on the heterogeneous perturbation graph by using a second graph encoder, and output an edge enhancement feature;
the first determining submodule is used for respectively carrying out convolution operation on the node enhancement feature and the edge enhancement feature through the graph aggregator and determining the node convolution feature and the edge convolution feature;
a seventh generation sub-module, configured to perform nonlinear processing on the node convolution feature and the edge convolution feature by using a Leaky ReLU nonlinear function, to generate a first target feature and a second target feature;
an eighth generation sub-module, configured to perform a summation operation on a first article target feature of the first target features and a second article target feature of the second target features, to generate a second article cooperative feature;
and a ninth generation sub-module, configured to perform a summation operation on a first user target feature of the first target features and a second user target feature of the second target features, to generate a second user cooperative feature.
9. The heterogeneous graph neural network recommendation device of claim 6, wherein the output result module comprises:
the fourth output submodule is used for carrying out feature stitching on the first article cooperative feature and the second article cooperative feature and outputting a target article cooperative feature;
the fifth output sub-module is used for carrying out feature stitching on the first user cooperative feature and the second user cooperative feature and outputting a target user cooperative feature;
and the sixth output sub-module is used for executing inner product operation on the target article cooperative characteristic and the target user cooperative characteristic and outputting an article recommendation score result.
10. An electronic device comprising a memory and a processor, wherein the memory stores a computer program that, when executed by the processor, causes the processor to perform the steps of the heterogeneous graph neural network recommendation method of any one of claims 1-5.
CN202311478823.0A 2023-11-08 2023-11-08 Heterogeneous graph neural network recommendation method, device and equipment Pending CN117391816A (en)

Publications (1)

Publication Number Publication Date
CN117391816A true CN117391816A (en) 2024-01-12

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117788122A * 2024-02-23 2024-03-29 山东科技大学 Goods recommendation method based on heterogeneous graph neural network
CN117788122B * 2024-02-23 2024-05-10 山东科技大学 Goods recommendation method based on heterogeneous graph neural network


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination