CN116128575A - Item recommendation method, device, computer apparatus, storage medium, and program product - Google Patents

Info

Publication number
CN116128575A
CN116128575A (application CN202310027099.3A)
Authority
CN
China
Prior art keywords
graph
initial
nodes
undirected
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310027099.3A
Other languages
Chinese (zh)
Inventor
陈亮
廖婕
周克涌
郑子彬
张文锋
邓文强
Current Assignee
Merchants Union Consumer Finance Co Ltd
Sun Yat Sen University
Original Assignee
Merchants Union Consumer Finance Co Ltd
Sun Yat Sen University
Priority date
Filing date
Publication date
Application filed by Merchants Union Consumer Finance Co Ltd, Sun Yat Sen University
Priority to CN202310027099.3A
Publication of CN116128575A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00: Commerce
    • G06Q30/02: Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241: Advertisements
    • G06Q30/0251: Targeted advertisements
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q40/00: Finance; Insurance; Tax strategies; Processing of corporate or income taxes
    • G06Q40/06: Asset management; Financial planning or analysis

Landscapes

  • Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • Accounting & Taxation (AREA)
  • Development Economics (AREA)
  • Finance (AREA)
  • Strategic Management (AREA)
  • Economics (AREA)
  • Game Theory and Decision Science (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Marketing (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Resources & Organizations (AREA)
  • Operations Research (AREA)
  • Technology Law (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The application relates to an item recommendation method, an item recommendation device, a computer device, a storage medium, and a computer program product. The method comprises the following steps: constructing an initial undirected unweighted graph from user nodes, item nodes corresponding to the user nodes, relationships among the user nodes, and relationships between the user nodes and the item nodes, where the user nodes comprise head nodes corresponding to head users and long-tail nodes corresponding to long-tail users; inputting the initial undirected unweighted graph into a preset graph autoencoder model and updating the relationships between the long-tail nodes and other nodes to generate a target undirected unweighted graph; and recommending items to the long-tail users according to the initial undirected unweighted graph and the target undirected unweighted graph to generate item recommendation results. With this method, items can be recommended to long-tail users more accurately according to the constructed initial undirected unweighted graph and a target undirected unweighted graph containing more comprehensive feedback information, so that more accurate item recommendation results are generated.

Description

Item recommendation method, device, computer apparatus, storage medium, and program product
Technical Field
The present application relates to the field of artificial intelligence technology, and in particular, to an article recommendation method, apparatus, computer device, storage medium, and computer program product.
Background
When recommending items, financial institutions generally need to make personalized recommendations for different users, so as to improve the accuracy of item recommendation and the users' satisfaction with the service.
Typically, the users targeted by item recommendation may include head users as well as long-tail users. A head user is a user to be recommended for whom sufficient feedback information is available. A long-tail user is a user to be recommended for whom only little or imperfect feedback information is available. Because long-tail users lack sufficient feedback information, such as browsing-history information and tag information, item recommendation performed for them on the basis of such scarce or imperfect feedback information is inaccurate.
Disclosure of Invention
In view of the foregoing, it is desirable to provide an item recommendation method, apparatus, computer device, computer-readable storage medium, and computer program product that can improve item recommendation accuracy.
In a first aspect, the present application provides a method of item recommendation. The method comprises the following steps:
constructing an initial undirected unweighted graph according to user nodes, item nodes corresponding to the user nodes, relationships among the user nodes, and relationships between the user nodes and the item nodes; the user nodes comprise head nodes corresponding to head users and long-tail nodes corresponding to long-tail users;
inputting the initial undirected unweighted graph into a preset graph autoencoder model, and updating the relationships between the long-tail nodes and other nodes to generate a target undirected unweighted graph;
and recommending items to the long-tail users according to the initial undirected unweighted graph and the target undirected unweighted graph to generate item recommendation results.
In one embodiment, the constructing an initial undirected unweighted graph according to user nodes, item nodes corresponding to the user nodes, relationships among the user nodes, and relationships between the user nodes and the item nodes comprises:
constructing an initial adjacency matrix by taking the user nodes and the item nodes corresponding to the user nodes as nodes, and taking the relationships among the user nodes and the relationships between the user nodes and the item nodes as edges;
obtaining feature vectors of the user nodes and feature vectors of the item nodes, and generating a feature matrix according to the feature vectors of the user nodes and the feature vectors of the item nodes;
and constructing the initial undirected unweighted graph according to the initial adjacency matrix and the feature matrix.
In one embodiment, the method further comprises:
deleting connecting edges of the head nodes in the initial undirected unweighted graph to generate a new undirected unweighted graph;
inputting the new undirected unweighted graph into an initial graph autoencoder model for processing to generate a new adjacency matrix;
and calculating the value of a loss function of the initial graph autoencoder model according to the new adjacency matrix, and updating model parameters of the initial graph autoencoder model according to the value of the loss function to generate the preset graph autoencoder model.
In one embodiment, the inputting the initial undirected unweighted graph into a preset graph autoencoder model, updating the relationships between the long-tail nodes and other nodes, and generating the target undirected unweighted graph comprises:
inputting the initial undirected unweighted graph into the preset graph autoencoder model, and predicting the connecting edges to be added for the long-tail nodes to generate a target adjacency matrix;
and generating the target undirected unweighted graph according to the target adjacency matrix and the feature matrix.
In one embodiment, the method further comprises:
inputting the initial undirected unweighted graph into an initial graph neural network model for graph convolution processing to generate a first embedding representation matrix corresponding to the initial undirected unweighted graph;
inputting the target undirected unweighted graph into the initial graph neural network model for graph convolution processing to generate a second embedding representation matrix corresponding to the target undirected unweighted graph;
and calculating the value of a loss function of the initial graph neural network model according to the first embedding representation matrix and the second embedding representation matrix, and updating model parameters of the initial graph neural network model according to the value of the loss function to generate the preset graph neural network model.
In one embodiment, the recommending items to the long-tail users according to the initial undirected unweighted graph and the target undirected unweighted graph and generating item recommendation results comprises:
inputting the initial undirected unweighted graph into a preset graph neural network model for graph convolution processing to generate a target embedding representation matrix; the preset graph neural network model is trained based on the initial undirected unweighted graph and the target undirected unweighted graph;
and recommending items to the long-tail users according to the target embedding representation matrix to generate the item recommendation results.
In a second aspect, the present application further provides an item recommendation device. The device comprises:
an initial undirected unweighted graph construction module, configured to construct an initial undirected unweighted graph according to user nodes, item nodes corresponding to the user nodes, relationships among the user nodes, and relationships between the user nodes and the item nodes; the user nodes comprise head nodes corresponding to head users and long-tail nodes corresponding to long-tail users;
a target undirected unweighted graph generation module, configured to input the initial undirected unweighted graph into a preset graph autoencoder model and update the relationships between the long-tail nodes and other nodes to generate a target undirected unweighted graph;
and an item recommendation result generation module, configured to recommend items to the long-tail users according to the initial undirected unweighted graph and the target undirected unweighted graph to generate item recommendation results.
In a third aspect, the present application also provides a computer device. The computer device comprises a memory storing a computer program and a processor implementing the steps of the method in any of the embodiments of the first aspect described above when the processor executes the computer program.
In a fourth aspect, the present application also provides a computer-readable storage medium. The computer-readable storage medium has stored thereon a computer program which, when executed by a processor, implements the steps of the method in any of the embodiments of the first aspect described above.
In a fifth aspect, the present application also provides a computer program product. The computer program product comprising a computer program which, when executed by a processor, implements the steps of the method in any of the embodiments of the first aspect described above.
With the item recommendation method, device, computer device, storage medium and computer program product, an initial undirected unweighted graph is constructed according to the user nodes, the item nodes corresponding to the user nodes, the relationships among the user nodes, and the relationships between the user nodes and the item nodes, where the user nodes comprise head nodes corresponding to head users and long-tail nodes corresponding to long-tail users; the initial undirected unweighted graph is input into a preset graph autoencoder model, and the relationships between the long-tail nodes and other nodes are updated to generate a target undirected unweighted graph; and items are recommended to the long-tail users according to the initial undirected unweighted graph and the target undirected unweighted graph to generate item recommendation results. By inputting the constructed initial undirected unweighted graph into the preset graph autoencoder model and updating the relationships between the long-tail nodes and other nodes, the imperfect feedback information of the long-tail users can be expanded, and a target undirected unweighted graph containing more comprehensive feedback information is generated. Items can then be recommended to the long-tail users more accurately according to the constructed initial undirected unweighted graph and the more comprehensive target undirected unweighted graph, so that more accurate item recommendation results are generated.
Drawings
FIG. 1 is an application environment diagram of an item recommendation method in one embodiment;
FIG. 2 is a flow chart of an item recommendation method in one embodiment;
FIG. 3 is a flow chart of the initial undirected unweighted graph construction steps in one embodiment;
FIG. 4 is a schematic diagram of the structure of an initial undirected unweighted graph in one embodiment;
FIG. 5 is a flowchart illustrating a preset graph autoencoder model generation step in another embodiment;
FIG. 6 is a training schematic of a preset graph autoencoder model in one embodiment;
FIG. 7 is a flow chart of a target undirected unweighted graph generation step in one embodiment;
FIG. 8 is a flowchart illustrating a preset graph neural network model generation step in another embodiment;
FIG. 9 is a training schematic diagram of a preset graph neural network model in one embodiment;
FIG. 10 is a flow chart of an item recommendation result generation step in one embodiment;
FIG. 11 is a flow chart of an alternative embodiment item recommendation method;
FIG. 12 is a schematic diagram of an item recommendation system in one embodiment;
FIG. 13 is a block diagram of an item recommendation device in one embodiment;
fig. 14 is an internal structural diagram of a computer device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
When recommending items, financial institutions generally need to make personalized recommendations for different users, so as to improve the accuracy of item recommendation and the users' satisfaction with the service.
Typically, the users targeted by item recommendation may include head users as well as long-tail users. A head user is a user to be recommended for whom sufficient feedback information is available. A long-tail user is a user to be recommended for whom only little or imperfect feedback information is available. Because long-tail users lack sufficient feedback information, such as browsing-history information and tag information, item recommendation performed for them on the basis of such scarce or imperfect feedback information is inaccurate.
The item recommendation method provided by the embodiments of the present application can be applied in the application environment shown in fig. 1, in which the terminal 102 communicates with the server 104 via a network. A data storage system may store the data that the server 104 needs to process; it may be integrated on the server 104 or located on a cloud or other network server. The server 104 constructs an initial undirected unweighted graph according to user nodes, item nodes corresponding to the user nodes, relationships among the user nodes, and relationships between the user nodes and the item nodes, where the user nodes comprise head nodes corresponding to head users and long-tail nodes corresponding to long-tail users; the server 104 inputs the initial undirected unweighted graph into a preset graph autoencoder model and updates the relationships between the long-tail nodes and other nodes to generate a target undirected unweighted graph; and the server 104 recommends items to the long-tail users according to the initial undirected unweighted graph and the target undirected unweighted graph to generate item recommendation results. The terminal 102 may be, but is not limited to, a personal computer, notebook computer, smartphone, tablet computer, Internet-of-Things device or portable wearable device; the Internet-of-Things device may be a smart speaker, smart television, smart air conditioner, smart in-vehicle device, and the like, and the portable wearable device may be a smart watch, smart bracelet, headset, and the like. The server 104 may be implemented as a stand-alone server or as a server cluster composed of multiple servers.
In one embodiment, as shown in fig. 2, a method for recommending items is provided, and the method is applied to the server 104 in fig. 1 for illustration, and includes the following steps:
step 220, constructing an initial undirected weight graph according to the user nodes, the object nodes corresponding to the user nodes, the relation among the user nodes and the object nodes; the user nodes comprise head nodes corresponding to head users and long tail nodes corresponding to long tail users.
The nodes refer to vertexes on the graph and can comprise user nodes and article nodes, wherein the user nodes comprise head nodes corresponding to head users and long tail nodes corresponding to long tail users. The head user refers to a user to be recommended that contains sufficient feedback information. The long-tail user refers to a user to be recommended which only contains a few feedback information or has imperfect feedback information. The feedback information is collected historical behavior information of the user, and the feedback information comprises browsing, praying and other behaviors when the user browses the video software, clicking, collecting, adding shopping carts and other behaviors when the user browses the commodity software. The undirected and unowned graph refers to a graph which has no direction and no weight in the relationship between user nodes and object nodes in the graph.
Optionally, the server 104 may obtain the historical behavior information of the user from the terminal 102, and determine the relationship among the user, the articles used by the user when performing the historical behavior, the relationships among different users, and the relationships among the users and the articles according to the historical behavior information of the user, so as to determine the user node, the article node corresponding to the user node, the relationship among the user nodes, and the relationship among the user node and the article node. Server 104 may then construct an initial undirected graph based on the user nodes, the item nodes corresponding to the user nodes, the relationships between the user nodes, and the relationships between the user nodes and the item nodes.
Step 240, inputting the initial undirected unweighted graph into a preset graph autoencoder model, and updating the relationships between the long-tail nodes and other nodes to generate a target undirected unweighted graph.
Optionally, the server 104 may input the constructed initial undirected unweighted graph into the preset graph autoencoder model for processing and update the relationships between the long-tail nodes and other user nodes or other item nodes, so as to expand the relationship information of the long-tail nodes and thereby generate the target undirected unweighted graph. A preset graph autoencoder (GAE) model is a model that encodes the input graph and then reconstructs the original graph from the encoding. The target undirected unweighted graph is the undirected unweighted graph obtained after the information of the long-tail nodes in the initial undirected unweighted graph has been expanded.
Step 260, recommending items to the long-tail users according to the initial undirected unweighted graph and the target undirected unweighted graph, and generating item recommendation results.
Optionally, based on the initial undirected unweighted graph and the target undirected unweighted graph, the server 104 may recommend items to the long-tail users and generate item recommendation results. For example, the server 104 may input the initial undirected unweighted graph and the target undirected unweighted graph together into a preset graph neural network model for processing to generate a processing result, and then recommend items according to the processing result to generate item recommendation results. Alternatively, the server 104 may train a preset graph neural network model on the initial undirected unweighted graph and the target undirected unweighted graph, input the initial undirected unweighted graph into the trained model for processing to generate a processing result, and then recommend items according to the processing result to generate item recommendation results. A graph neural network (GNN) model is a model that uses a neural network to learn from graph-structured data, so as to support graph learning tasks such as clustering, classification, prediction, segmentation and generation. Optionally, a graph convolutional neural network model may also be used for the processing in the embodiments of the present application, which is not limited herein.
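The patent does not fix a particular graph-convolution layer, so as a minimal sketch of the kind of graph convolution such a preset graph neural network model could apply, the following NumPy function implements a standard symmetrically normalized layer; the normalization scheme, toy graph, features and weights are all illustrative assumptions, not taken from the patent:

```python
import numpy as np

def gcn_layer(A, X, W):
    # One graph-convolution layer with symmetric normalization:
    # H = relu(D^(-1/2) (A + I) D^(-1/2) X W)
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    d = A_hat.sum(axis=1)                   # node degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))  # D^(-1/2)
    return np.maximum(0.0, D_inv_sqrt @ A_hat @ D_inv_sqrt @ X @ W)

# Tiny hypothetical graph: 3 nodes in a path, one-hot features, 2 output dims.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
X = np.eye(3)          # one-hot node features
W = np.ones((3, 2))    # toy weight matrix
H = gcn_layer(A, X, W)
```

Stacking two or three such layers over the graph would yield the node embeddings from which recommendations are scored.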
In the above item recommendation method, an initial undirected unweighted graph is constructed according to the user nodes, the item nodes corresponding to the user nodes, the relationships among the user nodes, and the relationships between the user nodes and the item nodes, where the user nodes comprise head nodes corresponding to head users and long-tail nodes corresponding to long-tail users; the initial undirected unweighted graph is input into a preset graph autoencoder model, and the relationships between the long-tail nodes and other nodes are updated to generate a target undirected unweighted graph; and items are recommended to the long-tail users according to the initial undirected unweighted graph and the target undirected unweighted graph to generate item recommendation results. By inputting the constructed initial undirected unweighted graph into the preset graph autoencoder model and updating the relationships between the long-tail nodes and other nodes, the imperfect feedback information of the long-tail users can be expanded, and a target undirected unweighted graph containing more comprehensive feedback information is generated. Personalized item recommendation can then be performed more accurately for the long-tail users according to the constructed initial undirected unweighted graph and the more comprehensive target undirected unweighted graph, so that more accurate item recommendation results are generated and, ultimately, user satisfaction is improved.
In one embodiment, as shown in fig. 3, constructing an initial undirected unweighted graph according to user nodes, item nodes corresponding to the user nodes, relationships among the user nodes, and relationships between the user nodes and the item nodes includes:
Step 320, constructing an initial adjacency matrix by taking the user nodes and the item nodes corresponding to the user nodes as nodes, and taking the relationships among the user nodes and the relationships between the user nodes and the item nodes as edges.
Optionally, the server 104 may first obtain the historical behavior information of the users and determine, according to that information, the items used by the users when performing the historical behaviors, the relationships between different users, and the relationships between the users and the items. Then, the server 104 may take each user as a user node and each item used in the historical behaviors as an item node, take the user nodes and the item nodes as nodes and the relationships among them as edges, and construct the initial adjacency matrix from these nodes and edges. The initial adjacency matrix may be denoted A, with A ∈ {0,1}^(n×n), where n represents the total number of nodes. Each element A_ij of the initial adjacency matrix A indicates whether there is a connecting edge between node i and node j: if there is an edge between node i and node j, then A_ij = 1; if there is no edge between node i and node j, then A_ij = 0.
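A minimal sketch of this adjacency-matrix construction in NumPy; the node count and edge list below are hypothetical, not taken from the patent:

```python
import numpy as np

def build_adjacency(n, edges):
    # Build a symmetric {0,1} adjacency matrix A for an undirected,
    # unweighted graph with n nodes from a list of (i, j) edges.
    A = np.zeros((n, n), dtype=int)
    for i, j in edges:
        A[i, j] = 1  # edge between node i and node j
        A[j, i] = 1  # undirected graph: the matrix is symmetric
    return A

# Hypothetical example: 6 nodes (one head user, one long-tail user,
# four item nodes), with all edges incident to the head node 0.
edges = [(0, 1), (0, 2), (0, 3), (0, 4), (0, 5)]
A = build_adjacency(6, edges)
```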
Step 340, obtaining the feature vectors of the user nodes and the feature vectors of the item nodes, and generating a feature matrix according to the feature vectors of the user nodes and the feature vectors of the item nodes.
Optionally, the server 104 may obtain the historical behavior information of the users and, according to that information and the determined user nodes and item nodes, obtain the feature information of each user node and each item node. Then, the server 104 may one-hot encode the feature information of each user node and each item node into feature vectors, and generate the feature matrix X from the feature vectors of all the nodes. The feature matrix may be denoted X, with X ∈ R^(n×f), where n represents the total number of nodes, f represents the dimension of the feature vectors, and each row x_i of the matrix represents the f-dimensional feature vector of node i. One-hot encoding digitizes discrete feature information to facilitate server operations. For example, a gender feature may be one-hot encoded with a two-digit binary code: if the first digit indicates "whether female" and the second digit indicates "whether male", then female may be encoded as "10" and male as "01".
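A minimal sketch of the one-hot feature encoding described above, using the gender example; the feature values and category order are illustrative assumptions:

```python
import numpy as np

def one_hot_features(values, categories):
    # One-hot encode one discrete feature column.
    # values: per-node category labels; categories: fixed category order.
    X = np.zeros((len(values), len(categories)), dtype=int)
    for row, v in enumerate(values):
        X[row, categories.index(v)] = 1
    return X

# Hypothetical gender feature for 3 nodes, matching the "10"/"01" example:
X = one_hot_features(["female", "male", "female"], ["female", "male"])
```

In practice, the per-feature encodings of every feature column would be concatenated column-wise to form the full n×f feature matrix.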
Step 360, constructing the initial undirected unweighted graph according to the initial adjacency matrix and the feature matrix.
Optionally, the server 104 may construct the initial undirected unweighted graph based on the initial adjacency matrix A and the feature matrix X. An undirected unweighted graph is a graph in which the relationships among the user nodes and between the user nodes and the item nodes, that is, the edges of the graph, have neither direction nor weight. The initial undirected unweighted graph may be denoted G, with G = (A, X), where A represents the initial adjacency matrix and X represents the feature matrix. Illustratively, as shown in fig. 4, fig. 4 is a schematic structural diagram of an initial undirected unweighted graph in one embodiment. The nodes include a head node v0, a long-tail node v1, and item nodes v2, v3, v4 and v5, each node carrying its corresponding feature vector. The edges include a follow relationship between the head node v0 and the long-tail node v1, a sharing relationship between the head node v0 and the item node v2, and like relationships between the head node v0 and the item nodes v3, v4 and v5.
In this embodiment, the initial adjacency matrix is constructed by taking the user nodes and the item nodes corresponding to the user nodes as nodes and taking the relationships among the user nodes and the relationships between the user nodes and the item nodes as edges; the feature vectors of the user nodes and the item nodes are obtained, and the feature matrix is generated from them; and the initial undirected unweighted graph is constructed from the initial adjacency matrix and the feature matrix. Constructing the initial undirected unweighted graph from the initial adjacency matrix and the feature matrix makes it convenient and accurate to capture the relationships between users and items.
In one embodiment, as shown in fig. 5, there is provided an item recommendation method, further comprising:
Step 520, deleting connecting edges of the head nodes in the initial undirected unweighted graph to generate a new undirected unweighted graph.
Illustratively, as shown in fig. 6, fig. 6 is a training schematic of the preset graph autoencoder model in one embodiment. Optionally, the server 104 may delete connecting edges of the head nodes in the initial undirected unweighted graph to obtain a new adjacency matrix A1 after edge deletion, and thereby generate a new undirected unweighted graph G1 after edge deletion, where G1 = (A1, X). When deleting the connecting edges of a head node in the initial undirected unweighted graph, any one or more of the edges connected to that head node may be deleted at random. Of course, the present application is not limited thereto.
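A minimal sketch of this random edge deletion for head nodes; the drop fraction, random seed and toy graph are illustrative assumptions, not values from the patent:

```python
import numpy as np

def drop_head_edges(A, head_nodes, drop_frac=0.5, rng=None):
    # Randomly delete a fraction of the edges incident to each head node,
    # producing the corrupted adjacency matrix A1 used for training.
    rng = rng if rng is not None else np.random.default_rng(0)
    A1 = A.copy()
    for u in head_nodes:
        neighbors = np.flatnonzero(A1[u])
        k = int(len(neighbors) * drop_frac)
        if k == 0:
            continue
        for v in rng.choice(neighbors, size=k, replace=False):
            A1[u, v] = 0
            A1[v, u] = 0  # keep symmetry: the graph is undirected
    return A1

# Toy example: head node 0 with four incident edges; half are dropped.
A = np.zeros((5, 5), dtype=int)
for j in (1, 2, 3, 4):
    A[0, j] = A[j, 0] = 1
A1 = drop_head_edges(A, head_nodes=[0], drop_frac=0.5)
```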
Step 540, inputting the new undirected unweighted graph into an initial graph autoencoder model for processing to generate a new adjacency matrix.
Optionally, as shown in fig. 6, the server 104 may input the new undirected unweighted graph G1 obtained after edge deletion into the initial graph autoencoder model for processing; that is, the server 104 may input G1 into the initial graph autoencoder model, predict the connecting edges to be added back for the head nodes whose edges were deleted, and generate a new adjacency matrix A2 after the predicted edges are added.
Step 560, calculating the value of the loss function of the initial graph autoencoder model according to the new adjacency matrix, and updating the model parameters of the initial graph autoencoder model according to the value of the loss function to generate the preset graph autoencoder model.
Alternatively, as shown in connection with FIG. 6, the server 104 may border the new adjacency matrix A based on the prediction 2 Computing an initial graph self-encoder modelLoss function Loss of (2) p And from the Loss function Loss of the encoder model according to the initial map p Updating model parameters Θ of the initial graph self-encoder model until the Loss function Loss of the initial graph self-encoder model p When the value of (a) is minimum, i.e. until the new adjacency matrix a after predictive addition 2 When the difference from the initial adjacency matrix A is minimum, the Loss function Loss of the initial graph self-encoder model at the moment is used p The corresponding model parameters are used as target model parameters of the initial graph self-encoder model, so that a trained preset graph self-encoder model is generated. Wherein Θ= (L) 1 ,L 2 ) Model parameters for the initial graph self-encoder model. Loss function Loss of initial graph self-encoder model p The calculation formulas of (a) are shown as formula (1), formula (2), and formula (3).
H = A1 relu(A1 X L1) L2    (1)

A2 = sigmoid(H H^T)    (2)

Loss_p = -(1/n^2) Σ_i Σ_j [A_ij log(A2_ij) + (1 - A_ij) log(1 - A2_ij)]    (3)
Wherein H represents the intermediate model parameters of the initial graph self-encoder model; A1 represents the new adjacency matrix after edge deletion; X represents the feature matrix; relu() represents the first nonlinear activation function, defined as h(x) = max(0, x); L1 represents the first model parameter of the initial graph self-encoder model; L2 represents the second model parameter of the initial graph self-encoder model; A2 represents the new adjacency matrix after the predicted edges are added; sigmoid() represents the second nonlinear activation function, defined as h(x) = 1/(1 + e^(-x)); H^T represents the transpose of the intermediate model parameters of the initial graph self-encoder model; Loss_p represents the loss function of the initial graph self-encoder model; A represents the initial adjacency matrix; n represents the total number of nodes; and i and j represent different nodes. In the embodiment of the present application, the relu activation function may be replaced by an ELU activation function or a LeakyReLU activation function.
In this embodiment, the connecting edges of the head nodes in the initial undirected unweighted graph are deleted to generate a new undirected unweighted graph; the new undirected unweighted graph is input into the initial graph self-encoder model for processing to generate a new adjacency matrix; and the value of the loss function of the initial graph self-encoder model is calculated according to the new adjacency matrix, and the model parameters of the initial graph self-encoder model are updated according to the value of the loss function to generate the preset graph self-encoder model. In this embodiment, the preset graph self-encoder model is trained using the initial adjacency matrix, the new adjacency matrix after edge deletion, and the new adjacency matrix after the predicted edges are added, so that an accurate preset graph self-encoder model can be generated.
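As an illustration, formulas (1) to (3) can be sketched in NumPy as below. This is a minimal sketch, not the patented implementation: the matrix shapes (A1 of size n×n, X of size n×f, L1 of size f×d, L2 of size d×c), the helper names, and the per-pair binary cross-entropy form of Loss_p are assumptions made for illustration.

```python
import numpy as np

def relu(x):
    # first nonlinear activation function, h(x) = max(0, x)
    return np.maximum(0.0, x)

def sigmoid(x):
    # second nonlinear activation function, h(x) = 1 / (1 + e^-x)
    return 1.0 / (1.0 + np.exp(-x))

def gae_forward(A1, X, L1, L2):
    """Two-layer encoder (formula (1)) and inner-product decoder (formula (2))."""
    H = A1 @ relu(A1 @ X @ L1) @ L2   # formula (1)
    A2 = sigmoid(H @ H.T)             # formula (2)
    return A2

def reconstruction_loss(A, A2, eps=1e-9):
    """Mean per-pair binary cross-entropy between the initial adjacency A
    and the predicted adjacency A2 (formula (3))."""
    return -np.mean(A * np.log(A2 + eps) + (1.0 - A) * np.log(1.0 - A2 + eps))
```

Minimizing this loss drives A2 toward the initial adjacency matrix A, which is exactly the "difference is minimal" stopping condition described above.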
In one embodiment, as shown in fig. 7, inputting the initial undirected unweighted graph into the preset graph self-encoder model, updating the relationships between the long-tail nodes and other nodes, and generating the target undirected unweighted graph includes:

Step 720, inputting the initial undirected unweighted graph into the preset graph self-encoder model, and predicting the connecting edges to be added for the long-tail nodes to generate a target adjacency matrix.

Optionally, after the training of the preset graph self-encoder model is completed, the server 104 may input the initial undirected unweighted graph G into the preset graph self-encoder model to predict the connecting edges to be added for the long-tail nodes, so as to update the relationships between the long-tail nodes and other user nodes or between the long-tail nodes and other object nodes, that is, to expand the relationship information between the long-tail nodes and other nodes, thereby generating the target adjacency matrix. The preset graph self-encoder model (Graph Autoencoder, GAE) is a model that uses the input information as an encoder to reconstruct the original graph. The target adjacency matrix may be denoted as A′. The target adjacency matrix A′ is calculated as shown in formula (4), formula (5), formula (6) and formula (7).
H = A relu(A X L1) L2    (4)

AP = softmax(H H^T)    (5)

AM = Bernoulli(AP)    (6)

A′ = Clamp(AM + A)    (7)
Wherein AP represents the first intermediate variable of the target adjacency matrix; softmax() represents the normalized exponential function; AM represents the second intermediate variable of the target adjacency matrix; Bernoulli() represents Bernoulli sampling; Clamp() means truncating the matrix element values to the interval [0, 1]; and A′ represents the target adjacency matrix.
Step 740, generating the target undirected unweighted graph according to the target adjacency matrix and the feature matrix.

Optionally, the server 104 may generate the target undirected unweighted graph G′ from the target adjacency matrix A′ and the feature matrix X, where G′ = (A′, X). The target adjacency matrix refers to the adjacency matrix obtained by expanding the relationship information between the long-tail nodes and other nodes. The target undirected unweighted graph refers to the undirected unweighted graph obtained after the relationship information between the long-tail nodes and other nodes in the initial undirected unweighted graph has been expanded.
In this embodiment, the initial undirected unweighted graph is input into the preset graph self-encoder model, and the connecting edges to be added for the long-tail nodes are predicted to generate the target adjacency matrix; the target undirected unweighted graph is then generated according to the target adjacency matrix and the feature matrix. In this way, the incomplete feedback information of long-tail users can be expanded, so that a target undirected unweighted graph containing more comprehensive feedback information is generated.
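Formulas (4) to (7) can be sketched as follows. This is an illustrative sketch rather than the patented implementation: the row-wise form of the softmax and the use of NumPy's binomial sampler for the Bernoulli step are assumptions.

```python
import numpy as np

def predict_target_adjacency(A, X, L1, L2, seed=0):
    """Predict connecting edges to add for long-tail nodes and merge them
    into the initial adjacency matrix A (formulas (4)-(7))."""
    rng = np.random.default_rng(seed)
    H = A @ np.maximum(0.0, A @ X @ L1) @ L2   # formula (4): encoder output
    S = H @ H.T
    E = np.exp(S - S.max(axis=1, keepdims=True))
    AP = E / E.sum(axis=1, keepdims=True)      # formula (5): softmax edge probabilities
    AM = rng.binomial(1, AP)                   # formula (6): Bernoulli sampling
    return np.clip(AM + A, 0, 1)               # formula (7): Clamp to [0, 1]
```

Because formula (7) only clamps AM + A, every existing edge in A is preserved and the step can only add edges: A′ ≥ A element-wise.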
In one embodiment, as shown in fig. 8, there is provided an item recommendation method, further comprising:
and step 820, inputting the initial undirected graph into the initial graph neural network model to perform graph convolution processing, and generating a first embedded characterization matrix corresponding to the initial undirected graph.
Exemplary, as shown in fig. 9, fig. 9 is a training schematic diagram of a preset map neural network model in one embodiment. Alternatively, the server 104 may input the initial undirected graph Z into the initial graph neural network model to perform the graph convolution processing, and generate the first embedded token matrix corresponding to the initial undirected graph G. Wherein the embedded token matrix may be represented as
Figure BDA0004045547950000101
And n represents the total number of nodes, c represents the embedded characterization dimension, each row z of the matrix i The c-dimensional embedded representation of node i is represented, and c is much smaller than f. Phi= (W) 1 ,W 2 ) Model parameters representing an initial graph neural network model. First embedded characterization matrix Z 1 The calculation formula of (2) is shown as formula (8).
Z 1 =Arelu(AXW 1 )W 2 (8)
Wherein Z is 1 Representing a first embedded token matrix; a represents an initial adjacency matrix; relu () represents a first nonlinear activation function, defined as h (x) =max (0, x); w (W) 1 A first model parameter representing an initial graph neural network model; w (W) 2 And second model parameters representing the initial graph neural network model.
Step 840, inputting the target undirected unweighted graph into the initial graph neural network model to perform graph convolution processing, and generating a second embedded characterization matrix corresponding to the target undirected unweighted graph.

Optionally, as shown in conjunction with fig. 9, the server 104 may input the target undirected unweighted graph G′ into the initial graph neural network model to perform the graph convolution processing, so as to generate the second embedded characterization matrix corresponding to the target undirected unweighted graph G′. The second embedded characterization matrix Z2 is calculated as shown in formula (9).

Z2 = A′ relu(A′ X W1) W2    (9)

Wherein Z2 represents the second embedded characterization matrix; A′ represents the target adjacency matrix; relu() represents the first nonlinear activation function, defined as h(x) = max(0, x); W1 represents the first model parameter of the initial graph neural network model; and W2 represents the second model parameter of the initial graph neural network model.
Step 860, calculating the value of the loss function of the initial graph neural network model according to the first embedded characterization matrix and the second embedded characterization matrix, and updating the model parameters of the initial graph neural network model according to the value of the loss function to generate the preset graph neural network model.

Optionally, as shown in connection with fig. 9, the server 104 may calculate the loss function Loss_q of the initial graph neural network model according to the first embedded characterization matrix Z1 and the second embedded characterization matrix Z2, and update the model parameters Φ of the initial graph neural network model according to Loss_q until the value of Loss_q is minimal. The model parameters corresponding to the minimal Loss_q are then used as the target model parameters of the initial graph neural network model, thereby generating the trained preset graph neural network model. The loss function Loss_q of the initial graph neural network model is calculated as shown in formula (10).

Loss_q = CrossEntropy(Z1, Y) + CrossEntropy(Z2, Y)    (10)

Wherein Loss_q represents the loss function of the initial graph neural network model; CrossEntropy() represents the cross-entropy loss function, a commonly used deep learning loss function for measuring the similarity between two probability distributions; Z1 represents the first embedded characterization matrix; Y represents the known and fixed node labels; and Z2 represents the second embedded characterization matrix.
In this embodiment, the initial undirected unweighted graph is input into the initial graph neural network model for graph convolution processing to generate the first embedded characterization matrix corresponding to the initial undirected unweighted graph; the target undirected unweighted graph is input into the initial graph neural network model for graph convolution processing to generate the second embedded characterization matrix corresponding to the target undirected unweighted graph; and the value of the loss function of the initial graph neural network model is calculated according to the first and second embedded characterization matrices, and the model parameters of the initial graph neural network model are updated according to the value of the loss function to generate the preset graph neural network model. Because the preset graph neural network model is trained on both the initial undirected unweighted graph G and the target undirected unweighted graph G′ in which the long-tail node information has been expanded, the training in this embodiment is more accurate than the traditional method of training only on the initial undirected unweighted graph G, so a more accurate preset graph neural network model can be generated.
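Formulas (8) to (10) can be sketched as follows, assuming Y is a one-hot label matrix and CrossEntropy() is a row-wise softmax cross-entropy over the node logits. Both are assumptions for illustration, since the patent does not fix these details.

```python
import numpy as np

def gcn_embed(A, X, W1, W2):
    """Two-layer graph convolution (formulas (8) and (9))."""
    return A @ np.maximum(0.0, A @ X @ W1) @ W2

def cross_entropy(Z, Y, eps=1e-9):
    """Row-wise softmax cross-entropy between node logits Z and one-hot labels Y."""
    E = np.exp(Z - Z.max(axis=1, keepdims=True))
    P = E / E.sum(axis=1, keepdims=True)
    return -np.mean(np.sum(Y * np.log(P + eps), axis=1))

def joint_loss(A, A_target, X, W1, W2, Y):
    """Loss_q over both graphs (formula (10)), sharing the parameters W1, W2."""
    Z1 = gcn_embed(A, X, W1, W2)          # formula (8): initial graph
    Z2 = gcn_embed(A_target, X, W1, W2)   # formula (9): target graph
    return cross_entropy(Z1, Y) + cross_entropy(Z2, Y)
```

Note that the two graph convolutions share the same parameters Φ = (W1, W2), so minimizing the summed loss fits one model to both the initial and the expanded graph.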
In one embodiment, as shown in fig. 10, according to the initial undirected graph and the target undirected graph, recommending the article to the long-tail user, and generating an article recommendation result includes:
Step 1020, inputting the initial undirected unweighted graph into the preset graph neural network model to perform graph convolution processing, and generating a target embedded characterization matrix; the preset graph neural network model is trained based on the initial undirected unweighted graph and the target undirected unweighted graph.

Optionally, after the training of the preset graph neural network model is completed, the server 104 may input the initial undirected unweighted graph G into the preset graph neural network model to perform graph convolution processing, so as to generate the target embedded characterization matrix Z. The preset graph neural network model is the graph neural network model obtained by training based on the initial undirected unweighted graph and the target undirected unweighted graph. A graph neural network (GNN) refers to a model that uses a neural network to learn from graph-structured data, thereby serving graph learning tasks such as clustering, classification, prediction, segmentation and generation. Optionally, a graph convolutional neural network model may also be used for the processing in the embodiments of the present application, which is not limited in this application. Each row z_i of the target embedded characterization matrix Z represents the embedded characterization vector of node i.
Step 1040, recommending articles for the long-tail user according to the target embedded characterization matrix, and generating an article recommendation result.

Optionally, first, the server 104 may obtain the number i of the long-tail user to be recommended, that is, determine the long-tail user i to be recommended. Second, based on the target embedded characterization matrix Z, the server 104 may calculate the cosine similarity between the embedded characterization z_i of the long-tail user i and the embedded characterization z_j of each item to be recommended (item 1, item 2, ..., item m), and generate a cosine similarity list (s_1, s_2, s_3, ..., s_m), where m represents the length of the preset list of items to be recommended as well as the length of the cosine similarity list, and s_j represents the cosine similarity between the embedded characterization of long-tail user i and the embedded characterization of item j. Cosine similarity estimates the similarity of two vectors by calculating the cosine of the angle between them. In the embodiment of the present application, the cosine similarity may be replaced by the Pearson correlation coefficient or the Jaccard similarity coefficient. The cosine similarities in the list (s_1, s_2, s_3, ..., s_m) are then sorted to generate a cosine similarity ranking result. A preset number (k) of items with the highest cosine similarity for the long-tail user i are then selected from the ranking result and determined as the target items. The target items corresponding to the long-tail user i are then output, that is, the article recommendation result for the long-tail user i is output. The preset number (k) is not limited in the embodiment of the present application.
In this embodiment, since a more accurate preset graph neural network model is trained based on the initial undirected unweighted graph and the target undirected unweighted graph, inputting the initial undirected unweighted graph into this model for graph convolution processing generates a more accurate target embedded characterization matrix. Recommending articles for long-tail users according to this more accurate target embedded characterization matrix therefore generates more accurate article recommendation results.
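The ranking step of Step 1040 can be sketched as below; the function and argument names are illustrative, not taken from the patent.

```python
import numpy as np

def recommend_top_k(Z, user_idx, item_indices, k):
    """Rank candidate items for one user by cosine similarity of embedded
    characterizations and return the top-k item indices (the target items)."""
    z_u = Z[user_idx]
    sims = []
    for j in item_indices:
        z_j = Z[j]
        denom = np.linalg.norm(z_u) * np.linalg.norm(z_j) + 1e-12
        sims.append(float(z_u @ z_j) / denom)   # cosine similarity s_j
    order = np.argsort(sims)[::-1][:k]          # sort descending, keep top k
    return [item_indices[i] for i in order]
```

For example, with a user embedding of [1, 0], an item whose embedding points in nearly the same direction ranks above an orthogonal or opposite one.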
In an alternative embodiment, as shown in fig. 11, there is provided an item recommendation method, which is described by taking the server 104 in fig. 1 as an example, and includes the following steps:
step 1102, constructing an initial adjacency matrix by taking user nodes and object nodes corresponding to the user nodes as nodes and taking the relation between the user nodes and the object nodes as edges;
step 1104, obtaining feature vectors of user nodes and feature vectors of object nodes, and generating a feature matrix according to the feature vectors of the user nodes and the feature vectors of the object nodes;
step 1106, constructing an initial undirected weight graph according to the initial adjacency matrix and the feature matrix; the user nodes comprise head nodes corresponding to head users and long-tail nodes corresponding to long-tail users;
Step 1108, deleting the connection edge of the head node in the initial undirected graph to generate a new undirected graph;
step 1110, inputting the new undirected weight map into the initial map self-encoder model for processing, and generating a new adjacency matrix;
step 1112, calculating a value of a loss function of the initial graph self-encoder model according to the new adjacency matrix, and updating model parameters of the initial graph self-encoder model according to the value of the loss function to generate a preset graph self-encoder model;
step 1114, inputting the initial undirected weight-free graph into a preset graph self-encoder model, and predicting the connecting edges of the long tail nodes to be added to generate a target adjacency matrix;
step 1116, generating a target undirected weight graph according to the target adjacency matrix and the feature matrix;
step 1118, inputting the initial undirected graph into the initial graph neural network model to perform graph convolution processing, and generating a first embedded characterization matrix corresponding to the initial undirected graph;
step 1120, inputting the target undirected graph into the initial graph neural network model for graph convolution processing, and generating a second embedded characterization matrix corresponding to the target undirected graph;
step 1122, calculating the value of the loss function of the initial graph neural network model according to the first embedded characterization matrix and the second embedded characterization matrix, and updating the model parameters of the initial graph neural network model according to the value of the loss function to generate a preset graph neural network model;
Step 1124, inputting the initial undirected unweighted graph into the preset graph neural network model to perform graph convolution processing, and generating the target embedded characterization matrix; the preset graph neural network model is trained based on the initial undirected unweighted graph and the target undirected unweighted graph;

Step 1126, recommending articles for the long-tail user according to the target embedded characterization matrix, and generating an article recommendation result.
According to the above article recommendation method, an initial undirected unweighted graph is constructed according to the user nodes, the article nodes corresponding to the user nodes, the relationships among the user nodes, and the relationships between the user nodes and the article nodes, where the user nodes include head nodes corresponding to head users and long-tail nodes corresponding to long-tail users; the initial undirected unweighted graph is input into the preset graph self-encoder model, and the relationships between the long-tail nodes and other nodes are updated to generate the target undirected unweighted graph; and articles are recommended for the long-tail user according to the initial undirected unweighted graph and the target undirected unweighted graph to generate article recommendation results. By inputting the constructed initial undirected unweighted graph into the preset graph self-encoder model and updating the relationships between the long-tail nodes and other nodes, the incomplete feedback information of the long-tail user can be expanded, so that a target undirected unweighted graph containing more comprehensive feedback information is generated. Then, according to the constructed initial undirected unweighted graph and the target undirected unweighted graph containing more comprehensive feedback information, articles can be recommended for the long-tail user more accurately, thereby generating more accurate article recommendation results.
It should be understood that, although the steps in the flowcharts related to the above embodiments are shown sequentially as indicated by the arrows, these steps are not necessarily performed in the order indicated. Unless explicitly stated herein, the steps are not strictly limited to this order of execution and may be performed in other orders. Moreover, at least some of the steps in the flowcharts described in the above embodiments may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments, and the order of their execution is not necessarily sequential; they may be performed in turn or alternately with at least some of the other steps or sub-steps.
In an alternative embodiment, as shown in fig. 12, an item recommendation system 1200 is provided, and the server 104 in fig. 1 is taken as an example for operation of the system, where the item recommendation system 1200 includes a data processing module 1220, a long tail node enhancement module 1240, a graph neural network module 1260, and a recommendation module 1280.
The data processing module 1220 is configured to generate the initial undirected unweighted graph according to the information of each user and each article. Specifically, the data processing module 1220 is configured to: model the users and the articles as nodes, and model the feedback behavior of users on articles and the interaction behavior between users as edges, to obtain the initial adjacency matrix A; perform one-hot encoding on the feature information of the users and the articles to obtain f-dimensional feature vectors, thereby obtaining the feature matrix X; and construct the initial undirected unweighted graph G according to the adjacency matrix A and the feature matrix X.
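The construction performed by the data processing module can be sketched as follows. The function name, the convention of numbering user nodes before item nodes, and the use of a single categorical feature per node for one-hot encoding are illustrative assumptions.

```python
import numpy as np

def build_initial_graph(n_users, n_items, interactions, categories):
    """Build the initial adjacency matrix A (user-item feedback edges,
    symmetric because the graph is undirected and unweighted) and the
    one-hot feature matrix X."""
    n = n_users + n_items                 # users first, then items
    A = np.zeros((n, n), dtype=int)
    for u, i in interactions:             # feedback of user u on item i
        A[u, n_users + i] = 1
        A[n_users + i, u] = 1             # undirected: mirror the edge
    f = max(categories) + 1               # one-hot feature dimension
    X = np.zeros((n, f), dtype=int)
    for node, c in enumerate(categories):
        X[node, c] = 1                    # one-hot encode the node's category
    return A, X
```

Here `interactions` is a list of (user, item) feedback pairs and `categories` gives one hypothetical categorical attribute per node; real feature information would typically concatenate several one-hot blocks into the f-dimensional vectors.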
The long-tail node enhancement module 1240 is configured to perform long-tail node enhancement based on the initial undirected unweighted graph and generate the enhanced target undirected unweighted graph. Specifically, the long-tail node enhancement module 1240 is configured to: perform random edge deletion on the head nodes in the initial undirected unweighted graph G; train and update the preset graph self-encoder model according to the undirected unweighted graph after random edge deletion; and predict the connecting edges to be added for the long-tail nodes using the preset graph self-encoder model, and output the target undirected unweighted graph G′.
The graph neural network module 1260 is configured to predict the embedded characterization matrix according to the initial undirected unweighted graph and the target undirected unweighted graph, and generate the target embedded characterization matrix. Specifically, the graph neural network module 1260 is configured to: train and update the preset graph neural network model according to the initial undirected unweighted graph G and the target undirected unweighted graph G′; and input the initial undirected unweighted graph G into the preset graph neural network model for processing, and output the target embedded characterization matrix Z.

The recommendation module 1280 is configured to recommend personalized articles to the long-tail user according to the user number i corresponding to the long-tail user and the target embedded characterization matrix. Specifically, the recommendation module 1280 is configured to: calculate the cosine similarity between the embedded characterization of the specified long-tail user i and the embedded characterizations of all articles; and sort the articles according to the cosine similarity, and select the top k articles for output as the personalized recommendation result.
Based on the same inventive concept, the embodiment of the application also provides an article recommending device for realizing the article recommending method. The implementation of the solution provided by the device is similar to the implementation described in the above method, so the specific limitation of the embodiment of the one or more article recommendation devices provided below may refer to the limitation of the article recommendation method hereinabove, and will not be repeated herein.
In one embodiment, as shown in fig. 13, there is provided an item recommendation device 1300 comprising: an initial undirected weight graph construction module 1320, a target undirected weight graph generation module 1340, and an item recommendation result generation module 1360, wherein:
an initial undirected weight graph construction module 1320, configured to construct an initial undirected weight graph according to user nodes, object nodes corresponding to the user nodes, relationships between the user nodes, and relationships between the user nodes and the object nodes; the user nodes comprise head nodes corresponding to head users and long tail nodes corresponding to long tail users.
The target undirected graph generating module 1340 is configured to input the initial undirected graph into a preset graph self-encoder model, update the relationship between the long tail node and other nodes, and generate the target undirected graph.
And an article recommendation result generating module 1360, configured to recommend articles to the long-tail user according to the initial undirected graph and the target undirected graph, and generate article recommendation results.
In one embodiment, the initial undirected weight graph construction module 1320 includes:
the initial adjacency matrix construction unit is used for constructing an initial adjacency matrix by taking user nodes and object nodes corresponding to the user nodes as nodes and taking the relation between the user nodes and the object nodes as edges;
the feature matrix generation unit is used for acquiring the feature vector of the user node and the feature vector of the object node, and generating a feature matrix according to the feature vector of the user node and the feature vector of the object node;
the initial undirected graph construction unit is used for constructing the initial undirected graph according to the initial adjacency matrix and the feature matrix.
In one embodiment, the item recommendation device 1300 further includes:
the new undirected graph generating module is used for deleting the connecting edges of the head nodes in the initial undirected graph to generate a new undirected graph;
The new adjacency matrix generation module is used for inputting the new undirected weight graph into the initial graph self-encoder model for processing to generate a new adjacency matrix;
the preset graph self-encoder model generation module is used for calculating the value of the loss function of the initial graph self-encoder model according to the new adjacency matrix, updating the model parameters of the initial graph self-encoder model according to the value of the loss function, and generating the preset graph self-encoder model.
In one embodiment, the target undirected weight graph generation module 1340 includes:
the target adjacent matrix generation unit is used for inputting the initial undirected graph into a preset graph self-encoder model, predicting the connecting edges to be added of the long tail nodes, and generating a target adjacent matrix;
and the target undirected graph generating unit is used for generating a target undirected graph according to the target adjacency matrix and the feature matrix.
In one embodiment, the item recommendation device 1300 further includes:
the first embedded characterization matrix generation module is used for inputting the initial undirected graph into the initial graph neural network model to carry out graph convolution processing and generating a first embedded characterization matrix corresponding to the initial undirected graph;
the second embedded characterization matrix generation module is used for inputting the target undirected unauthorized graph into the initial graph neural network model to perform graph convolution processing and generating a second embedded characterization matrix corresponding to the target undirected unauthorized graph;
The preset graph neural network model generation module is used for calculating the value of a loss function of the initial graph neural network model according to the first embedded characterization matrix and the second embedded characterization matrix, updating model parameters of the initial graph neural network model according to the value of the loss function, and generating the preset graph neural network model.
In one embodiment, item recommendation result generation module 1360 includes:
the target embedded characterization matrix generation unit is used for inputting the initial undirected unauthorized graph into a preset graph neural network model to perform graph convolution processing, so as to generate a target embedded characterization matrix; the preset map neural network model is trained based on the initial undirected graph and the target undirected graph;
and the article recommendation result generation unit is used for recommending articles for the long-tail users according to the target embedded characterization matrix and generating article recommendation results.
The respective modules in the above-described item recommendation apparatus may be implemented in whole or in part by software, hardware, and combinations thereof. The above modules may be embedded in hardware or may be independent of a processor in the computer device, or may be stored in software in a memory in the computer device, so that the processor may call and execute operations corresponding to the above modules.
In one embodiment, a computer device is provided, which may be a server, and the internal structure of which may be as shown in fig. 14. The computer device includes a processor, a memory, an Input/Output interface (I/O) and a communication interface. The processor, the memory and the input/output interface are connected through a system bus, and the communication interface is connected to the system bus through the input/output interface. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, computer programs, and a database. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The database of the computer device is for storing item recommendation data. The input/output interface of the computer device is used to exchange information between the processor and the external device. The communication interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement an item recommendation method.
It will be appreciated by those skilled in the art that the structure shown in fig. 14 is merely a block diagram of a portion of the structure associated with the present application and is not limiting of the computer device to which the present application applies, and that a particular computer device may include more or fewer components than shown, or may combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided comprising a memory and a processor, the memory having stored therein a computer program, the processor when executing the computer program performing the steps of:
constructing an initial undirected weight graph according to the user nodes, the object nodes corresponding to the user nodes, the relation among the user nodes and the object nodes; the user nodes comprise head nodes corresponding to head users and long-tail nodes corresponding to long-tail users;
inputting the initial undirected graph into a preset graph self-encoder model, and updating the relation between long tail nodes and other nodes to generate a target undirected graph;
and recommending articles for the long-tail user according to the initial undirected unweighted graph and the target undirected unweighted graph, and generating article recommendation results.
In one embodiment, the initial undirected graph is constructed according to the user nodes, the object nodes corresponding to the user nodes, the relation among the user nodes and the object nodes, and the following steps are realized when the processor executes the computer program:
taking user nodes and object nodes corresponding to the user nodes as nodes, and taking the relation between the user nodes and the object nodes as edges to construct an initial adjacency matrix;
acquiring feature vectors of the user nodes and feature vectors of the item nodes, and generating a feature matrix according to the feature vectors of the user nodes and the feature vectors of the item nodes;
and constructing the initial undirected unweighted graph according to the initial adjacency matrix and the feature matrix.
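The graph construction above can be sketched as follows. This is an illustrative Python sketch, not part of the claimed embodiment: the node counts, the interaction list, and the random feature vectors are hypothetical placeholders.

```python
import numpy as np

def build_initial_graph(num_users, num_items, interactions, user_feats, item_feats):
    """Build the initial adjacency matrix A and feature matrix X of an
    undirected unweighted user-item graph. `interactions` is a list of
    (user_index, item_index) pairs, i.e. the observed relationships
    that become edges."""
    n = num_users + num_items                # users occupy rows 0..num_users-1
    A = np.zeros((n, n), dtype=np.float32)
    for u, i in interactions:
        A[u, num_users + i] = 1.0            # user -> item edge
        A[num_users + i, u] = 1.0            # mirrored entry: the graph is undirected
    X = np.vstack([user_feats, item_feats])  # stack node feature vectors row-wise
    return A, X

# toy example: 3 users, 2 items, 4 observed user-item interactions
A, X = build_initial_graph(
    3, 2,
    interactions=[(0, 0), (0, 1), (1, 0), (2, 1)],
    user_feats=np.random.rand(3, 4),
    item_feats=np.random.rand(2, 4),
)
```

A long-tail user simply contributes few nonzero entries to its row of the adjacency matrix; the later steps add predicted edges to exactly those rows.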
In one embodiment, the processor when executing the computer program further performs the steps of:
deleting the connecting edges of the head nodes in the initial undirected unweighted graph to generate a new undirected unweighted graph;
inputting the new undirected unweighted graph into an initial graph self-encoder model for processing to generate a new adjacency matrix;
and calculating the value of a loss function of the initial graph self-encoder model according to the new adjacency matrix, and updating the model parameters of the initial graph self-encoder model according to the value of the loss function to generate a preset graph self-encoder model.
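A minimal sketch of this training procedure, assuming a one-layer inner-product graph autoencoder: the encoder sees the graph with the head-node edges deleted, and the reconstruction loss targets the original adjacency matrix, so the model learns to restore edges for sparsely connected nodes. The layer structure, learning rate, and binary cross-entropy loss are illustrative choices, not mandated by the embodiment.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def normalize(A):
    """Symmetrically normalized adjacency with self-loops: D^-1/2 (A+I) D^-1/2."""
    A_hat = A + np.eye(len(A))
    d = A_hat.sum(1)
    D = np.diag(d ** -0.5)
    return D @ A_hat @ D

def train_gae(A, X, head_nodes, dim=8, lr=0.1, epochs=200, seed=0):
    """Train a one-layer inner-product graph autoencoder and return the
    learned weights plus the reconstructed adjacency probabilities."""
    rng = np.random.default_rng(seed)
    A_drop = A.copy()
    A_drop[head_nodes, :] = 0.0            # delete the connecting edges of head nodes
    A_drop[:, head_nodes] = 0.0
    H = normalize(A_drop) @ X              # graph convolution on the perturbed graph
    W = rng.normal(scale=0.1, size=(X.shape[1], dim))
    for _ in range(epochs):
        Z = H @ W                          # node embeddings
        S = sigmoid(Z @ Z.T)               # reconstructed adjacency probabilities
        G = (S - A) / A.size               # gradient of the mean BCE loss w.r.t. logits
        W -= lr * (H.T @ ((G + G.T) @ Z))  # analytic gradient step
    Z = H @ W
    return W, sigmoid(Z @ Z.T)

# toy graph: 6 nodes, node 0 is a densely connected head node
A = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (0, 3), (0, 4), (1, 5), (2, 5)]:
    A[i, j] = A[j, i] = 1.0
X = np.random.default_rng(1).random((6, 4))
W, S = train_gae(A, X, head_nodes=[0])
```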
In one embodiment, the initial undirected unweighted graph is input into the preset graph self-encoder model, the relationships between the long-tail nodes and other nodes are updated, and the target undirected unweighted graph is generated, and the following steps are further implemented when the processor executes the computer program:
inputting the initial undirected unweighted graph into the preset graph self-encoder model, and predicting the connecting edges to be added for the long-tail nodes to generate a target adjacency matrix;
and generating the target undirected unweighted graph according to the target adjacency matrix and the feature matrix.
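Generating the target adjacency matrix from the reconstructed edge probabilities can then be sketched as follows; the probability threshold and the restriction of new edges to rows and columns of long-tail nodes are illustrative assumptions (`S` stands for the edge-probability matrix produced by the graph self-encoder).

```python
import numpy as np

def predict_long_tail_edges(A, S, long_tail_nodes, threshold=0.5):
    """Build the target adjacency matrix: keep every original edge and add
    predicted edges (probability >= threshold) that touch a long-tail node."""
    A_target = A.copy()
    mask = np.zeros_like(A, dtype=bool)
    mask[long_tail_nodes, :] = True          # only rows of long-tail nodes ...
    mask[:, long_tail_nodes] = True          # ... and their mirrored columns
    A_target[(S >= threshold) & mask] = 1.0
    np.fill_diagonal(A_target, 0.0)          # no self-loops
    return np.maximum(A_target, A_target.T)  # keep the graph undirected

# toy example: one original edge (0, 1); the model predicts edges (2, 3) and (0, 3)
A = np.zeros((4, 4)); A[0, 1] = A[1, 0] = 1.0
S = np.zeros((4, 4)); S[2, 3] = S[3, 2] = 0.9; S[0, 3] = S[3, 0] = 0.9
A_target = predict_long_tail_edges(A, S, long_tail_nodes=[2])
```

Only the edge (2, 3) is added: node 2 is a long-tail node, while the equally probable edge (0, 3) touches no long-tail node and is left out.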
In one embodiment, the processor when executing the computer program further performs the steps of:
inputting the initial undirected unweighted graph into an initial graph neural network model to perform graph convolution processing, and generating a first embedded characterization matrix corresponding to the initial undirected unweighted graph;
inputting the target undirected unweighted graph into the initial graph neural network model for graph convolution processing, and generating a second embedded characterization matrix corresponding to the target undirected unweighted graph;
calculating the value of a loss function of the initial graph neural network model according to the first embedded characterization matrix and the second embedded characterization matrix, and updating model parameters of the initial graph neural network model according to the value of the loss function to generate a preset graph neural network model.
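This two-branch training step can be sketched as follows, with a shared one-layer graph convolution applied to both graphs. The embodiment does not fix the form of the loss; the alignment loss below, which pulls each node's two normalized embeddings together, is an illustrative stand-in.

```python
import numpy as np

def gcn_embed(A, X, W):
    """One graph-convolution layer: symmetric normalization, then a linear map."""
    A_hat = A + np.eye(len(A))
    d = A_hat.sum(1)
    D = np.diag(d ** -0.5)
    return (D @ A_hat @ D) @ X @ W

def alignment_loss(Z1, Z2):
    """Illustrative loss comparing the first and second embedded
    characterization matrices: mean squared distance between each node's
    two L2-normalized embeddings."""
    n1 = Z1 / np.linalg.norm(Z1, axis=1, keepdims=True)
    n2 = Z2 / np.linalg.norm(Z2, axis=1, keepdims=True)
    return float(((n1 - n2) ** 2).sum(axis=1).mean())

# toy graphs: the target graph adds one predicted edge for long-tail node 2
rng = np.random.default_rng(0)
A_init = np.zeros((4, 4)); A_init[0, 1] = A_init[1, 0] = 1.0
A_target = A_init.copy(); A_target[2, 3] = A_target[3, 2] = 1.0
X = rng.random((4, 3)); W = rng.normal(size=(3, 2))
Z1 = gcn_embed(A_init, X, W)     # first embedded characterization matrix
Z2 = gcn_embed(A_target, X, W)   # second embedded characterization matrix
loss = alignment_loss(Z1, Z2)    # would drive the gradient update of W
```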
In one embodiment, items are recommended to the long-tail users according to the initial undirected unweighted graph and the target undirected unweighted graph, and an item recommendation result is generated, and the following steps are further implemented when the processor executes the computer program:
inputting the initial undirected unweighted graph into a preset graph neural network model to perform graph convolution processing to generate a target embedded characterization matrix; the preset graph neural network model is trained based on the initial undirected unweighted graph and the target undirected unweighted graph;
and recommending items to the long-tail users according to the target embedded characterization matrix, and generating the item recommendation result.
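Producing the recommendation result from the target embedded characterization matrix can be sketched as inner-product scoring followed by a top-k cut; the scoring function, the `k` value, and the toy embeddings are illustrative assumptions, not the claimed method.

```python
import numpy as np

def recommend(Z, num_users, user, seen_items, k=3):
    """Score every item for one user by the inner product of their
    embeddings and return the top-k items the user has not interacted with."""
    item_vecs = Z[num_users:]            # item nodes are stored after user nodes
    scores = item_vecs @ Z[user]
    scores[list(seen_items)] = -np.inf   # never re-recommend already-seen items
    k = min(k, int(np.isfinite(scores).sum()))
    return np.argsort(-scores)[:k].tolist()

# toy target embeddings: 2 users followed by 3 items
Z = np.array([[1.0, 0.0],    # user 0
              [0.0, 1.0],    # user 1
              [1.0, 0.0],    # item 0
              [0.0, 1.0],    # item 1
              [0.5, 0.5]])   # item 2
top = recommend(Z, num_users=2, user=0, seen_items={0}, k=2)  # → [2, 1]
```

User 0 has already interacted with item 0, so the highest-scoring unseen item (item 2) is ranked first.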
In one embodiment, a computer readable storage medium is provided having a computer program stored thereon, which when executed by a processor, performs the steps of:
constructing an initial undirected unweighted graph according to user nodes, item nodes corresponding to the user nodes, relationships among the user nodes, and relationships between the user nodes and the item nodes; the user nodes comprise head nodes corresponding to head users and long-tail nodes corresponding to long-tail users;
inputting the initial undirected unweighted graph into a preset graph self-encoder model, and updating the relationships between the long-tail nodes and other nodes to generate a target undirected unweighted graph;
and recommending items to the long-tail users according to the initial undirected unweighted graph and the target undirected unweighted graph, and generating an item recommendation result.
In one embodiment, the initial undirected unweighted graph is constructed according to the user nodes, the item nodes corresponding to the user nodes, the relationships among the user nodes, and the relationships between the user nodes and the item nodes, and the computer program, when executed by the processor, further implements the following steps:
taking the user nodes and the item nodes corresponding to the user nodes as nodes, and taking the relationships among the user nodes and between the user nodes and the item nodes as edges, to construct an initial adjacency matrix;
acquiring feature vectors of the user nodes and feature vectors of the item nodes, and generating a feature matrix according to the feature vectors of the user nodes and the feature vectors of the item nodes;
and constructing the initial undirected unweighted graph according to the initial adjacency matrix and the feature matrix.
In one embodiment, the computer program when executed by the processor further performs the steps of:
deleting the connecting edges of the head nodes in the initial undirected unweighted graph to generate a new undirected unweighted graph;
inputting the new undirected unweighted graph into an initial graph self-encoder model for processing to generate a new adjacency matrix;
and calculating the value of a loss function of the initial graph self-encoder model according to the new adjacency matrix, and updating the model parameters of the initial graph self-encoder model according to the value of the loss function to generate a preset graph self-encoder model.
In one embodiment, the initial undirected unweighted graph is input into the preset graph self-encoder model, the relationships between the long-tail nodes and other nodes are updated, and the target undirected unweighted graph is generated, and the computer program, when executed by the processor, further implements the following steps:
inputting the initial undirected unweighted graph into the preset graph self-encoder model, and predicting the connecting edges to be added for the long-tail nodes to generate a target adjacency matrix;
and generating the target undirected unweighted graph according to the target adjacency matrix and the feature matrix.
In one embodiment, the computer program when executed by the processor further performs the steps of:
inputting the initial undirected unweighted graph into an initial graph neural network model to perform graph convolution processing, and generating a first embedded characterization matrix corresponding to the initial undirected unweighted graph;
inputting the target undirected unweighted graph into the initial graph neural network model for graph convolution processing, and generating a second embedded characterization matrix corresponding to the target undirected unweighted graph;
and calculating the value of a loss function of the initial graph neural network model according to the first embedded characterization matrix and the second embedded characterization matrix, and updating the model parameters of the initial graph neural network model according to the value of the loss function to generate a preset graph neural network model.
In one embodiment, items are recommended to the long-tail users according to the initial undirected unweighted graph and the target undirected unweighted graph, and an item recommendation result is generated, and the computer program, when executed by the processor, further implements the following steps:
inputting the initial undirected unweighted graph into a preset graph neural network model to perform graph convolution processing to generate a target embedded characterization matrix; the preset graph neural network model is trained based on the initial undirected unweighted graph and the target undirected unweighted graph;
and recommending items to the long-tail users according to the target embedded characterization matrix, and generating the item recommendation result.
In one embodiment, a computer program product is provided comprising a computer program which, when executed by a processor, performs the steps of:
constructing an initial undirected unweighted graph according to user nodes, item nodes corresponding to the user nodes, relationships among the user nodes, and relationships between the user nodes and the item nodes; the user nodes comprise head nodes corresponding to head users and long-tail nodes corresponding to long-tail users;
inputting the initial undirected unweighted graph into a preset graph self-encoder model, and updating the relationships between the long-tail nodes and other nodes to generate a target undirected unweighted graph;
and recommending items to the long-tail users according to the initial undirected unweighted graph and the target undirected unweighted graph, and generating an item recommendation result.
In one embodiment, the initial undirected unweighted graph is constructed according to the user nodes, the item nodes corresponding to the user nodes, the relationships among the user nodes, and the relationships between the user nodes and the item nodes, and the computer program, when executed by the processor, further implements the following steps:
taking the user nodes and the item nodes corresponding to the user nodes as nodes, and taking the relationships among the user nodes and between the user nodes and the item nodes as edges, to construct an initial adjacency matrix;
acquiring feature vectors of the user nodes and feature vectors of the item nodes, and generating a feature matrix according to the feature vectors of the user nodes and the feature vectors of the item nodes;
and constructing the initial undirected unweighted graph according to the initial adjacency matrix and the feature matrix.
In one embodiment, the computer program when executed by the processor further performs the steps of:
deleting the connecting edges of the head nodes in the initial undirected unweighted graph to generate a new undirected unweighted graph;
inputting the new undirected unweighted graph into an initial graph self-encoder model for processing to generate a new adjacency matrix;
and calculating the value of a loss function of the initial graph self-encoder model according to the new adjacency matrix, and updating the model parameters of the initial graph self-encoder model according to the value of the loss function to generate a preset graph self-encoder model.
In one embodiment, the initial undirected unweighted graph is input into the preset graph self-encoder model, the relationships between the long-tail nodes and other nodes are updated, and the target undirected unweighted graph is generated, and the computer program, when executed by the processor, further implements the following steps:
inputting the initial undirected unweighted graph into the preset graph self-encoder model, and predicting the connecting edges to be added for the long-tail nodes to generate a target adjacency matrix;
and generating the target undirected unweighted graph according to the target adjacency matrix and the feature matrix.
In one embodiment, the computer program when executed by the processor further performs the steps of:
inputting the initial undirected unweighted graph into an initial graph neural network model to perform graph convolution processing, and generating a first embedded characterization matrix corresponding to the initial undirected unweighted graph;
inputting the target undirected unweighted graph into the initial graph neural network model for graph convolution processing, and generating a second embedded characterization matrix corresponding to the target undirected unweighted graph;
and calculating the value of a loss function of the initial graph neural network model according to the first embedded characterization matrix and the second embedded characterization matrix, and updating the model parameters of the initial graph neural network model according to the value of the loss function to generate a preset graph neural network model.
In one embodiment, items are recommended to the long-tail users according to the initial undirected unweighted graph and the target undirected unweighted graph, and an item recommendation result is generated, and the computer program, when executed by the processor, further implements the following steps:
inputting the initial undirected unweighted graph into a preset graph neural network model to perform graph convolution processing to generate a target embedded characterization matrix; the preset graph neural network model is trained based on the initial undirected unweighted graph and the target undirected unweighted graph;
and recommending items to the long-tail users according to the target embedded characterization matrix, and generating the item recommendation result.
It should be noted that the user information (including, but not limited to, user equipment information, user personal information, etc.) and data (including, but not limited to, data for analysis, stored data, presented data, etc.) referred to in the present application are information and data authorized by the user or fully authorized by all parties, and the collection, use, and processing of the related data must comply with the relevant laws, regulations, and standards of the relevant countries and regions.
Those skilled in the art will appreciate that all or part of the processes of the methods of the above embodiments may be implemented by a computer program; the computer program may be stored on a non-transitory computer-readable storage medium and, when executed, may include the processes of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. The non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, Resistive Random Access Memory (ReRAM), Magnetoresistive Random Access Memory (MRAM), Ferroelectric Random Access Memory (FRAM), Phase Change Memory (PCM), graphene memory, and the like. The volatile memory may include Random Access Memory (RAM), external cache memory, and the like. By way of illustration and not limitation, RAM is available in a variety of forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM). The databases referred to in the embodiments provided herein may include at least one of a relational database and a non-relational database. The non-relational database may include, but is not limited to, a blockchain-based distributed database and the like. The processors referred to in the embodiments provided herein may be, but are not limited to, general-purpose processors, central processing units, graphics processors, digital signal processors, programmable logic devices, and data processing logic devices based on quantum computing.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction in a combination of these technical features, the combination should be considered to be within the scope of this specification.
The foregoing embodiments represent only a few implementations of the present application, and the descriptions thereof are specific and detailed, but they are not therefore to be construed as limiting the scope of the patent application. It should be noted that various modifications and improvements can be made by those of ordinary skill in the art without departing from the concept of the present application, and these all fall within the scope of protection of the present application. Accordingly, the scope of protection of the present application shall be subject to the appended claims.

Claims (10)

1. A method of recommending items, the method comprising:
constructing an initial undirected unweighted graph according to user nodes, item nodes corresponding to the user nodes, relationships among the user nodes, and relationships between the user nodes and the item nodes; the user nodes comprise head nodes corresponding to head users and long-tail nodes corresponding to long-tail users;
inputting the initial undirected unweighted graph into a preset graph self-encoder model, and updating the relationships between the long-tail nodes and other nodes to generate a target undirected unweighted graph;
and recommending items to the long-tail users according to the initial undirected unweighted graph and the target undirected unweighted graph to generate an item recommendation result.
2. The method of claim 1, wherein constructing an initial undirected unweighted graph according to user nodes, item nodes corresponding to the user nodes, relationships among the user nodes, and relationships between the user nodes and the item nodes comprises:
taking the user nodes and the item nodes corresponding to the user nodes as nodes, and taking the relationships among the user nodes and between the user nodes and the item nodes as edges, to construct an initial adjacency matrix;
acquiring feature vectors of the user nodes and feature vectors of the item nodes, and generating a feature matrix according to the feature vectors of the user nodes and the feature vectors of the item nodes;
and constructing the initial undirected unweighted graph according to the initial adjacency matrix and the feature matrix.
3. The method according to claim 2, wherein the method further comprises:
deleting the connecting edges of the head nodes in the initial undirected unweighted graph to generate a new undirected unweighted graph;
inputting the new undirected unweighted graph into an initial graph self-encoder model for processing to generate a new adjacency matrix;
and calculating the value of a loss function of the initial graph self-encoder model according to the new adjacency matrix, and updating the model parameters of the initial graph self-encoder model according to the value of the loss function to generate a preset graph self-encoder model.
4. The method according to claim 2 or 3, wherein inputting the initial undirected unweighted graph into a preset graph self-encoder model and updating the relationships between the long-tail nodes and other nodes to generate a target undirected unweighted graph comprises:
inputting the initial undirected unweighted graph into the preset graph self-encoder model, and predicting the connecting edges to be added for the long-tail nodes to generate a target adjacency matrix;
and generating the target undirected unweighted graph according to the target adjacency matrix and the feature matrix.
5. A method according to any one of claims 1-3, characterized in that the method further comprises:
inputting the initial undirected unweighted graph into an initial graph neural network model to perform graph convolution processing, and generating a first embedded characterization matrix corresponding to the initial undirected unweighted graph;
inputting the target undirected unweighted graph into the initial graph neural network model to perform graph convolution processing, and generating a second embedded characterization matrix corresponding to the target undirected unweighted graph;
calculating the value of a loss function of the initial graph neural network model according to the first embedded characterization matrix and the second embedded characterization matrix, and updating model parameters of the initial graph neural network model according to the value of the loss function to generate a preset graph neural network model.
6. The method of claim 5, wherein the recommending items to the long-tail users according to the initial undirected unweighted graph and the target undirected unweighted graph to generate an item recommendation result comprises:
inputting the initial undirected unweighted graph into a preset graph neural network model to perform graph convolution processing to generate a target embedded characterization matrix; the preset graph neural network model is trained based on the initial undirected unweighted graph and the target undirected unweighted graph;
and recommending items to the long-tail users according to the target embedded characterization matrix to generate the item recommendation result.
7. An item recommendation device, the device comprising:
the initial undirected unweighted graph construction module is used for constructing an initial undirected unweighted graph according to user nodes, item nodes corresponding to the user nodes, relationships among the user nodes, and relationships between the user nodes and the item nodes; the user nodes comprise head nodes corresponding to head users and long-tail nodes corresponding to long-tail users;
the target undirected unweighted graph generation module is used for inputting the initial undirected unweighted graph into a preset graph self-encoder model and updating the relationships between the long-tail nodes and other nodes to generate a target undirected unweighted graph;
and the item recommendation result generation module is used for recommending items to the long-tail users according to the initial undirected unweighted graph and the target undirected unweighted graph to generate an item recommendation result.
8. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of any of claims 1 to 6 when the computer program is executed.
9. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 6.
10. A computer program product comprising a computer program, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 6.
CN202310027099.3A 2023-01-09 2023-01-09 Item recommendation method, device, computer apparatus, storage medium, and program product Pending CN116128575A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310027099.3A CN116128575A (en) 2023-01-09 2023-01-09 Item recommendation method, device, computer apparatus, storage medium, and program product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310027099.3A CN116128575A (en) 2023-01-09 2023-01-09 Item recommendation method, device, computer apparatus, storage medium, and program product

Publications (1)

Publication Number Publication Date
CN116128575A true CN116128575A (en) 2023-05-16

Family

ID=86307651

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310027099.3A Pending CN116128575A (en) 2023-01-09 2023-01-09 Item recommendation method, device, computer apparatus, storage medium, and program product

Country Status (1)

Country Link
CN (1) CN116128575A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116955836A (en) * 2023-09-21 2023-10-27 腾讯科技(深圳)有限公司 Recommendation method, recommendation device, recommendation apparatus, recommendation computer readable storage medium, and recommendation program product
CN116955836B (en) * 2023-09-21 2024-01-02 腾讯科技(深圳)有限公司 Recommendation method, recommendation device, recommendation apparatus, recommendation computer readable storage medium, and recommendation program product

Similar Documents

Publication Publication Date Title
CN110796190B (en) Exponential modeling with deep learning features
Qu et al. Product-based neural networks for user response prediction
CN110717098B (en) Meta-path-based context-aware user modeling method and sequence recommendation method
CN113705772A (en) Model training method, device and equipment and readable storage medium
CN112418292B (en) Image quality evaluation method, device, computer equipment and storage medium
CN114372573B (en) User portrait information recognition method and device, computer equipment and storage medium
CN107786943A (en) A kind of tenant group method and computing device
CN116128575A (en) Item recommendation method, device, computer apparatus, storage medium, and program product
Abada et al. An overview on deep leaning application of big data
CN114742210A (en) Hybrid neural network training method, traffic flow prediction method, apparatus, and medium
CN110674181A (en) Information recommendation method and device, electronic equipment and computer-readable storage medium
CN114417161A (en) Virtual article time sequence recommendation method, device, medium and equipment based on special-purpose map
CN112819154B (en) Method and device for generating pre-training model applied to graph learning field
US20240037133A1 (en) Method and apparatus for recommending cold start object, computer device, and storage medium
CN113779380A (en) Cross-domain recommendation method, device and equipment, and content recommendation method, device and equipment
Mishra et al. Unsupervised functional link artificial neural networks for cluster Analysis
CN116383441A (en) Community detection method, device, computer equipment and storage medium
CN114219184A (en) Product transaction data prediction method, device, equipment, medium and program product
CN114529399A (en) User data processing method, device, computer equipment and storage medium
Ko et al. Deep Gaussian process models for integrating multifidelity experiments with nonstationary relationships
Pankaja et al. A hybrid approach combining CUR matrix decomposition and weighted kernel sparse representation for plant leaf recognition
Hong et al. TimeKit: A time-series forecasting-based upgrade kit for collaborative filtering
CN117252665B (en) Service recommendation method and device, electronic equipment and storage medium
CN115455306B (en) Push model training method, information push device and storage medium
CN115658899A (en) Text classification method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Country or region after: China

Address after: 518000 Room 201, building A, No. 1, Qian Wan Road, Qianhai Shenzhen Hong Kong cooperation zone, Shenzhen, Guangdong (Shenzhen Qianhai business secretary Co., Ltd.)

Applicant after: Zhaolian Consumer Finance Co.,Ltd.

Applicant after: SUN YAT-SEN University

Address before: 518000 Room 201, building A, No. 1, Qian Wan Road, Qianhai Shenzhen Hong Kong cooperation zone, Shenzhen, Guangdong (Shenzhen Qianhai business secretary Co., Ltd.)

Applicant before: MERCHANTS UNION CONSUMER FINANCE Co.,Ltd.

Country or region before: China

Applicant before: SUN YAT-SEN University