WO2020114122A1 - Neural network system and method for analyzing a relational network graph - Google Patents

Neural network system and method for analyzing a relational network graph

Info

Publication number
WO2020114122A1
WO2020114122A1 (PCT application PCT/CN2019/112564; priority CN2019112564W)
Authority
WO
WIPO (PCT)
Prior art keywords
output
graph
neural network
weighting factor
network
Prior art date
Application number
PCT/CN2019/112564
Other languages
English (en)
Chinese (zh)
Inventor
常晓夫 (CHANG Xiaofu)
宋乐 (SONG Le)
Original Assignee
Alibaba Group Holding Limited (阿里巴巴集团控股有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Limited
Publication of WO2020114122A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F 16/28 Databases characterised by their database models, e.g. relational or object models
    • G06F 16/284 Relational databases
    • G06F 16/288 Entity relationship models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G06N 3/084 Backpropagation, e.g. using gradient descent

Definitions

  • One or more embodiments of this specification relate to a neural network system executed by a computer, and particularly to a neural network system and method for analyzing a relational network graph.
  • Graphs are a powerful tool for modeling relational data. Therefore, data in which relationships exist is now often expressed and modeled in the form of a graph.
  • Graph neural networks (Graph NN or GNN), graph-based neural networks using deep learning methods, have been proposed to learn graph information.
  • The graph neural network GNN can effectively use information transfer on the graph and fuse the feature information of nodes or edges to complete machine learning tasks such as classification or regression of the nodes or edges on the graph.
  • One or more embodiments of this specification describe a neural network system and method executed by a computer for analyzing a relationship network graph, which can learn, analyze, and predict the relationship network graph more effectively.
  • a neural network system for analyzing a relationship network graph executed by a computer including:
  • a feature extraction layer, used to extract the feature vectors of the nodes in the relationship network graph;
  • a deep neural network, used to perform first processing on the feature vector to obtain a first output;
  • a graph neural network, used to perform second processing on the feature vector in combination with the adjacency information of the relational network graph to obtain a second output, wherein the adjacency information represents the connection relationships between the nodes included in the relational network graph;
  • a fusion layer, used to fuse the first output and the second output and to output a prediction result for the node based on the fusion result.
  • Each node included in the relationship network graph corresponds to a user, and the connection relationships between the nodes include one or more of the following: social relationships, media relationships, and fund relationships between users.
  • the relationship network graph is a directed graph
  • the adjacency information includes an adjacency list or a cross-linked list corresponding to the directed graph.
  • the adjacency information includes the adjacency matrix of the relationship network graph.
  • the graph neural network is a graph convolutional network
  • The graph convolutional network includes multiple network layers to perform the second processing, and the second processing includes at least using the elements of the adjacency matrix as weighting factors to perform a weighted-sum operation on the feature vectors of the node and its neighbor nodes.
  • The above fusion layer is specifically used for weighted summation of the first output and the second output, where the first output corresponds to a first weighting factor and the second output corresponds to a second weighting factor.
  • the first weighting factor is a function of the first output
  • the second weighting factor is a function of the second output
  • the sum of the first weighting factor and the second weighting factor is 1, and:
  • the first weighting factor is a function of the first output
  • the second weighting factor is a function of the second output.
  • the sum of the first weighting factor and the second weighting factor is 1, and:
  • the first weighting factor is a function of the first output and the second output; or,
  • the second weighting factor is a function of the first output and the second output.
  • the neural network system is trained in an end-to-end manner.
  • a computer-implemented method for analyzing a relationship network graph including:
  • a deep neural network is used to perform the first processing on the feature vector to obtain a first output
  • a graph neural network is used, in combination with the adjacency information of the relational network graph, to perform second processing on the feature vector to obtain a second output, wherein the adjacency information represents the connection relationships between the nodes included in the relational network graph;
  • the first output and the second output are fused, and a prediction result for the node is output based on the fusion result.
  • a computer-readable storage medium on which a computer program is stored, and when the computer program is executed in a computer, the computer is caused to perform the method of the second aspect.
  • a computing device including a memory and a processor, wherein the memory stores executable code, and when the processor executes the executable code, the method of the second aspect is implemented.
  • A deep neural network DNN and a graph neural network GNN are combined, fusing the single-node feature processing capability of the DNN with the relational feature processing capability of the GNN, so that the combined neural network system can effectively analyze and learn a variety of relationship network graphs.
  • When the relationship features are complete and effective, the graph neural network GNN plays the major role while the deep neural network DNN supplements the single-node analysis; and if the relationship features are missing or of limited effect, the DNN branch can still effectively analyze and process the nodes in the graph to give a satisfactory prediction result.
  • FIG. 1 is a schematic diagram of a relationship network diagram disclosed in this specification
  • FIG. 2 shows a schematic diagram of a neural network system according to an embodiment
  • FIG. 3 shows a schematic diagram of a deep neural network DNN according to an embodiment
  • FIG. 4 shows a schematic diagram of a graph convolutional network GCN according to an embodiment
  • FIG. 5 shows a flowchart of a method for analyzing a relationship network diagram according to an embodiment.
  • a neural network system for processing relational data is proposed.
  • the neural network system can be used for learning and predicting relational network graphs.
  • FIG. 1 is a schematic diagram of a relationship network diagram disclosed in this specification. It can be seen that the relational network graph includes multiple nodes, and the nodes with associated relationships are connected to each other by connecting edges. Nodes that are not associated with other nodes form isolated nodes, such as nodes A, B, and C in Figure 1.
  • the type and/or strength of the association relationship can also be distinguished, so as to assign certain attributes or weights to the connected edges. For example, in FIG. 1, thick lines indicate strong connections, and thin lines indicate weak connections. However, this is not necessary.
  • the relationship network graph can reflect the association relationship between various entities.
  • the nodes may represent sellers or buyers, and the edges between the nodes may represent that a transaction has occurred, thereby reflecting the transaction relationship between entities through a relationship network graph.
  • the nodes represent individual users, and the edges between the nodes represent the association between users. More specifically, in different examples, a connection relationship may be established for nodes based on different types of association relationships between users.
  • the association relationship between users may include a social relationship between users.
  • In a relationship network formed based on social relationships, if two users have common followees (such as Weibo accounts following the same person), have had previous contact, have joined a common group (such as a QQ group or WeChat group), or have interacted in activities such as red envelopes and lotteries, then a social relationship can be considered to exist between the two nodes, and an edge can be established to connect them.
  • the association relationship between users may include a media relationship between users.
  • In a relationship network formed based on media relationships, if two users have used the same medium, such as an encrypted bank card, ID card, mailbox, account number, mobile phone number, physical address (such as a MAC address), or terminal device number (such as UMID, TID, or UTDID), then a media relationship exists between these two users, and an edge can be established to connect them.
  • the association relationship between users may include the financial relationship between users.
  • Fund transactions can include charging, bar code collection, bar code payment, AA collection, C2C face-to-face mobile phone payment, gift money, rent, red envelopes, credit card repayment, purchases, intimate payment, subscription services, and so on.
  • the nodes in the relationship network graph may also represent other entities, and the connections between the nodes may be based on various types of association relationships.
  • the graph neural network GNN can generally be used for learning and prediction.
  • the learning process generally corresponds to the training process of graph neural network GNN.
  • When training the graph neural network GNN, labels need to be added to at least some nodes in FIG. 1 according to the purpose of the prediction business, so as to perform supervised learning. For example, assuming that the nodes in the relational network graph in FIG. 1 correspond to individual users, if one wants to learn and predict the credit risk of each user, at least some nodes need to be labeled as "high-risk users" (users suspected of fraud) or "normal users". These labeled nodes, together with their connection relationships in the graph, are provided to the graph neural network GNN for training and learning. After the training is completed, an unknown user can be input to the graph neural network GNN, which uses the trained network parameters to make a prediction for that user.
  • the relationship network graph is constructed based on the association relationship.
  • There are various association relationships between nodes. If the association relationship selected when constructing the relationship network graph is not sufficiently relevant to the business purpose, then learning a relationship network graph constructed on such an association relationship often yields unsatisfactory results. For example, where the business purpose is to learn a user's purchase intention for a certain type of product, if the selected association relationship is whether users share a social relationship with a common object of interest, such relationship data may not be effective for determining purchase intention.
  • a combined neural network system which can learn the relationship network graph more effectively.
  • FIG. 2 shows a schematic diagram of a neural network system according to an embodiment.
  • the neural network system is executed by a computer and is used to learn a relationship network graph and process relationship data.
  • The neural network system includes: a feature extraction layer 21, used to extract feature vectors of nodes in the relational network graph; a deep neural network 22, used to perform first processing on the feature vectors to obtain a first output; a graph neural network 23, used to perform second processing on the feature vectors in combination with the adjacency information of the relational network graph to obtain a second output; and a fusion layer 24, used to fuse the first output and the second output and to output a prediction result for the node based on the fusion result.
  • the following specifically describes the implementation of each of the above network parts.
  • the feature extraction layer 21 is used to extract feature vectors of nodes in the relationship network graph.
  • the relationship network diagram is, for example, the relationship network diagram shown in FIG. 1, which includes multiple nodes, and there is a connection relationship between nodes having an association relationship.
  • the feature extraction layer 21 performs feature extraction on the nodes in the relational network graph, and the extracted features are the features of the relevant nodes themselves, so the extracted features constitute a feature vector.
  • each node in the above relational network diagram corresponds to each user, for example, Alipay user.
  • the feature extraction layer 21 may extract the user's basic attribute features to form a feature vector.
  • the basic attribute features include, for example, user ID, registration duration, gender, age, and so on.
  • The feature extraction layer 21 also extracts features related to the business purpose. For example, where the business purpose is to predict the user's purchase intention, the feature extraction layer 21 also obtains the user's purchase records and performs feature extraction based on them, extracting features such as the number of purchases, purchase categories, and purchase amount. For another example, where the business purpose is to predict the borrowing risk of the user, the feature extraction layer 21 also obtains the user's borrowing records and extracts features based on them, such as the number of loans, loan amount, repayment amount, number of on-time repayments, and number of overdue repayments.
  • the feature extraction layer 21 extracts the feature vectors of the nodes, the feature vectors are input to the deep neural network 22 and the graph neural network 23 in parallel for processing.
  • Deep Neural Network is a multi-layer fully connected forward-structured artificial neural network.
  • Figure 3 shows a schematic diagram of a deep neural network according to one embodiment.
  • The deep neural network DNN contains multiple network layers, which can be divided into an input layer, hidden layers, and an output layer; the hidden layers sit between the input layer and the output layer. Generally, when a neural network has many hidden layers, it is called a deep neural network.
  • Each network layer in the DNN contains several neurons, and all neurons except those in the input layer operate on the input data through an activation function.
  • Adjacent network layers are fully connected, that is, any neuron in layer i is connected to any neuron in layer i+1.
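  • The fully connected forward pass described above can be sketched as follows. This is a minimal illustration in numpy, not the patent's implementation; the layer sizes and the choice of ReLU as the activation are assumptions for the example.

```python
import numpy as np

def relu(x):
    # Elementwise nonlinear activation applied by each non-input neuron.
    return np.maximum(0.0, x)

def dnn_forward(x, weights, biases):
    """Forward pass of a fully connected DNN: every neuron in layer i
    feeds every neuron in layer i+1 via a dense weight matrix."""
    h = x
    for W, b in zip(weights, biases):
        h = relu(W @ h + b)  # linear transform, then activation
    return h

# Toy example: a 4-dim node feature vector through two hidden layers.
rng = np.random.default_rng(0)
weights = [rng.normal(size=(8, 4)), rng.normal(size=(3, 8))]
biases = [np.zeros(8), np.zeros(3)]
out = dnn_forward(np.ones(4), weights, biases)
print(out.shape)  # (3,)
```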
  • Deep neural network DNN can be designed and trained for analysis and prediction of various business scenarios.
  • the deep neural network DNN is taken as a branch part of the neural network system of an example.
  • the feature extraction layer 21 extracts the feature vector of the node
  • the feature vector is provided to the input layer of the deep neural network DNN 22, which is processed through the hidden layer, and the processing result is output through the output layer of the DNN.
  • the process of processing the feature vectors by the hidden layer of the DNN is called first processing, and the processing result output by the DNN output layer is called the first output.
  • the node feature vector extracted by the feature extraction layer 21 is also input to the graph neural network GNN23.
  • the graph neural network GNN is used to analyze and learn the relational network graph. Similar to the conventional neural network, the graph neural network GNN also has a multi-layer network structure, which operates and processes input data through the function mapping of neurons.
  • GNN will process the feature vectors of nodes in the process of combining the connection information between the nodes in the relational network graph.
  • the connection relationship information between the nodes in the above relationship network graph is also called adjacency information.
  • the process of processing the feature vectors of the nodes in the GNN in conjunction with the adjacency information is called second processing, and the result of the GNN processing is called the second output.
  • the adjacency information of the relationship network graph is obtained in advance and provided to the graph neural network GNN23.
  • The connection relationships between nodes in the relational network graph can be represented by an adjacency matrix.
  • The adjacency matrix is an n*n-dimensional matrix, where n is the number of nodes in the graph.
  • the above relational network graph is a directed graph, that is, the connection between nodes is directed.
  • An adjacency list may also be used to record the adjacency information of the relationship network graph, where the adjacency list may further include a forward adjacency list and a reverse adjacency list.
  • A cross-linked list may be generated based on the forward adjacency list and the reverse adjacency list, and the cross-linked list is used to record the connection relationships between the nodes in the directed graph.
  • the adjacency information may also be recorded in other forms.
  • The graph neural network GNN 23 can determine, based on the adjacency information, the neighbor nodes of the current node and the connecting-edge information between the current node and these neighbor nodes, and extract the node information of these neighbor nodes and the edge features of the connecting edges, integrating them with the feature vector of the current node input from the feature extraction layer 21 to obtain the second output.
  • the graph neural network GNN 23 is implemented using a graph convolutional network GCN (Graph Convolutional Network).
  • Fig. 4 shows a schematic diagram of a graph convolution network GCN according to one embodiment.
  • The graph convolutional network GCN contains multiple network layers, and each network layer defines a neural network model f(X, A) through its neurons, where X is the input feature vector, that is, the feature vector of the current node input to the GCN by the aforementioned feature extraction layer 21, and A is the adjacency matrix of the relational network graph. The neural network model f(X, A) can be more specifically expressed layer by layer as:

    H^(l+1) = σ( D^(-1/2) (A + λI) D^(-1/2) H^(l) W^(l) + b )

  • where I is the identity matrix and D is the degree matrix of A + λI;
  • λ is a hyperparameter used to control the weight of a node relative to its neighbor nodes, and is set to 1 in the original model;
  • H^(l+1) represents the output of each network layer;
  • H^(0) = X, that is, the input layer receives the feature vector X of the node;
  • the d*d-dimensional matrix W^(l) and the d*1-dimensional parameter b are both trainable network-layer parameters, and σ is a nonlinear function;
  • the σ function may be a ReLU function, a sigmoid function, a tanh function, a softmax function, and so on.
  • The first network layer uses the elements in the adjacency matrix as weighting factors to sum the feature vectors of the current node and its neighbor nodes (labeled or unlabeled), then performs a linear transformation using W^(l) and b, and then applies the nonlinear activation function σ.
  • The operations of each subsequent network layer likewise include at least: using the elements in the adjacency matrix as weighting factors to perform a weighted-sum operation on the node vector output by the previous network layer and its neighbor node vectors; a linear transformation using W^(l) and b; and application of the nonlinear activation function σ.
  • W^(l) and b^(l) are the trainable network-layer parameters of layer l, and nhood(i) denotes the set of neighbor nodes of node i.
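  • One graph-convolution layer as described above can be sketched as follows: adjacency elements act as weighting factors in a weighted sum over each node and its neighbors, followed by a trainable linear transform and a nonlinearity. This is an illustrative numpy sketch, not the patent's code; tanh is an arbitrary choice of σ and the toy graph and parameter values are assumptions.

```python
import numpy as np

def sigma(x):
    return np.tanh(x)  # any nonlinear activation would do here

def gcn_layer(H, A_tilde, W, b):
    """One GCN layer: weighted sum of each node's and its neighbors'
    vectors using adjacency elements as weights, then W, b and sigma."""
    agg = A_tilde @ H          # weighted sum over node + neighbors
    return sigma(agg @ W + b)  # linear transform, then activation

# 3-node toy graph: nodes 0 and 1 connected, node 2 isolated.
A = np.array([[0., 1., 0.],
              [1., 0., 0.],
              [0., 0., 0.]])
A_tilde = A + np.eye(3)        # A + lambda*I with lambda = 1
H0 = np.eye(3)                 # one-hot input features X
W = np.full((3, 2), 0.1)       # trainable layer parameters (3x2)
b = np.zeros(2)
H1 = gcn_layer(H0, A_tilde, W, b)
print(H1.shape)  # (3, 2)
```

  • Note how the isolated node 2 still produces an output, but it aggregates only its own features via the self-loop term λI.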
  • the feature vectors of the nodes are processed by combining the adjacency information expressed by the adjacency matrix.
  • The above adjacency matrix may be a normalized matrix, so as to avoid element distributions in the adjacency matrix that vary too widely.
  • On the one hand, some relational network graphs contain super nodes that are connected to almost every node in the graph; on the other hand, some nodes are very isolated and have few connections. This causes the number of connecting edges corresponding to different nodes in the adjacency matrix (for example, the sum of the elements in a row or column of the matrix) to differ greatly.
  • To balance this, the adjacency matrix can be normalized.
  • Using the normalized adjacency matrix for the weighted summation in the GCN is then equivalent to an average pooling operation over the current node and its neighbor nodes.
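  • The normalization and its average-pooling effect can be sketched as follows. Row normalization is shown here as one simple choice that makes each weighted sum an exact mean over a node and its neighbors; the function name and the toy graph are assumptions for the example.

```python
import numpy as np

def normalize_adjacency(A, lam=1.0):
    """Row-normalize A + lam*I so each row sums to 1. A weighted sum
    with these elements then equals averaging a node with its
    neighbors (mean pooling), which tames the huge degree gap between
    super nodes and near-isolated nodes."""
    A_tilde = A + lam * np.eye(A.shape[0])   # add self-loops
    deg = A_tilde.sum(axis=1, keepdims=True)  # per-node degree
    return A_tilde / deg

A = np.array([[0., 1., 1.],
              [1., 0., 0.],
              [1., 0., 0.]])
A_hat = normalize_adjacency(A)
print(A_hat.sum(axis=1))  # each row sums to 1.0
```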
  • the graph neural network GNN 23 may also use other network structures and hidden layer algorithms.
  • the common point is that the second processing performed by the graph neural network GNN 23 needs to combine the adjacency information of the relational network graph to comprehensively process the feature vector of the current node to obtain the second output.
  • The first processing performed by the deep neural network DNN 22 is only for the feature vector of the current node, focusing on analysis of the attribute characteristics of the node itself, that is, the single-point features;
  • the second processing performed by the graph neural network GNN 23 needs to incorporate the adjacency information of the relational network graph and introduce the relationship characteristics between the current node and other nodes.
  • the first output of the deep neural network DNN 22 and the second output of the graph neural network 23 are fused through the fusion layer 24, and the prediction result for the current node is output based on the fusion result.
  • Let the first output be H1 and the second output be H2.
  • the fusion layer 24 may fuse H1 and H2 in various ways to obtain a fusion result H.
  • The fusion layer 24 fuses the first output H1 and the second output H2 through a fusion function F: H = F(H1, H2).
  • the fusion function F can be various linear or nonlinear functions.
  • The fusion layer 24 performs weighted summation on the first output H1 and the second output H2 (corresponding to the case where the fusion function is a linear summation), that is: H = w1*H1 + w2*H2,
  • where w1 is the first weighting factor corresponding to the first output,
  • and w2 is the second weighting factor corresponding to the second output.
  • the first output H1 and the second output H2 are both in the form of output vectors; and the weighting factors w1 and w2 may be scalars, vectors, or even matrices.
  • the values of the weighting factors w1 and w2 are optimized and determined through the training process of the neural network system.
  • The above weighting factors may be set as functions of the corresponding outputs; for example, the first weighting factor w1 is set as a function of the first output H1, and the second weighting factor w2 as a function of the second output H2: w1 = g(u1*H1 + b1), w2 = g(u2*H2 + b2).
  • The function g is preferably a nonlinear function, such as a sigmoid function or a tanh function.
  • The weighting factors w1 and w2 are then determined by training, that is, by training and determining the parameters u1, b1, u2, b2.
  • If the weighting factors w1 and w2 are trained independently of each other, the value range of the final result H is not guaranteed.
  • Therefore, the sum of the first weighting factor and the second weighting factor can first be set to 1, and then only one of them is set and adjusted.
  • For example, set the fusion result H as: H = α*H1 + (1-α)*H2.
  • The first weighting factor α may be set as a function of the first output, or as a function of the first output and the second output, namely: α = g(u*H1 + b), or α = g(u1*H1 + u2*H2 + b).
  • Alternatively, the second weighting factor β can be set and adjusted so that the first weighting factor is (1-β), that is: H = (1-β)*H1 + β*H2.
  • The second weighting factor may likewise be set as a function of the second output, or as a function of the first output and the second output, namely: β = g(u*H2 + b), or β = g(u1*H1 + u2*H2 + b).
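  • The constrained fusion above, with the first factor computed from the first output and the second factor fixed to 1 minus it, can be sketched as follows. The toy values for H1, H2, u, and b are assumptions; with a scalar sigmoid gate, the fused result is guaranteed to lie between the two branch outputs.

```python
import numpy as np

def g(x):
    return 1.0 / (1.0 + np.exp(-x))  # sigmoid keeps the gate in (0, 1)

def fuse(H1, H2, u, b):
    """Fusion layer: alpha is a function of the first output, and the
    second weighting factor is constrained to 1 - alpha, so the two
    factors always sum to 1."""
    alpha = g(u @ H1 + b)          # scalar gate computed from H1
    return alpha * H1 + (1.0 - alpha) * H2

H1 = np.array([0.2, -0.4])   # DNN branch output (toy values)
H2 = np.array([0.9, 0.1])    # GNN branch output (toy values)
u = np.zeros(2)              # trainable gate parameters
b = 0.0
H = fuse(H1, H2, u, b)
print(H)  # with u = 0 and b = 0 the gate is 0.5: a plain average
```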
  • the fusion layer 24 obtains a fusion result H, and outputs a prediction result for the current node based on the fusion result H.
  • In the training phase, the prediction result is the predicted value for a labeled node; in the use phase, it is the final classification prediction for an unknown node.
  • the following describes the execution process of the neural network system shown in FIG. 2 in the training phase and the use phase.
  • End-to-end training can be used.
  • Labeled node information is input on the input side of the entire neural network system; that is, the feature extraction layer 21 extracts the feature vectors of several labeled nodes.
  • The labels may be of various types, for example labels representing risk levels, such as 1 for high-risk users and 0 for ordinary users. The prediction results for each node are then obtained on the output side of the entire neural network system.
  • The prediction result is output by the fusion layer 24 according to the fusion result, and may be embodied as a prediction value for each node. The predicted value of each node is compared with its label; according to the comparison result and a preset loss function, the error for this batch of samples is obtained, and then, through error back propagation, the network parameters of the entire neural network system are adjusted until the parameters that minimize the error are found. Once the optimal network parameters are determined, the training of the neural network system is considered complete, and the system can be used for the prediction of unknown nodes.
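  • The training loop above can be sketched in miniature by learning only the fusion gate with gradient descent on a squared-error loss. The branch outputs, labels, learning rate, and the restriction to a single scalar parameter α are all assumptions made to keep the example tiny; a real end-to-end run would back-propagate through every layer.

```python
import numpy as np

# Toy setup: per-node scalar branch outputs and binary labels.
H1 = np.array([0.9, 0.1, 0.8])   # DNN branch predictions (hypothetical)
H2 = np.array([0.2, 0.2, 0.9])   # GNN branch predictions (hypothetical)
y  = np.array([1.0, 0.0, 1.0])   # node labels

alpha, lr = 0.5, 0.1
for _ in range(200):
    H = alpha * H1 + (1 - alpha) * H2        # fused prediction
    grad = np.sum(2 * (H - y) * (H1 - H2))   # dL/dalpha, squared error
    alpha -= lr * grad                        # gradient-descent step

H = alpha * H1 + (1 - alpha) * H2
print(float(np.mean((H - y) ** 2)))  # final mean squared error
```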
  • the node information of the unknown node is input to the input side of the neural network system, that is, the feature extraction layer 21 extracts the feature vector of the unknown node.
  • The network parameters of the neural network system determined in the training stage are used, and the feature vector is processed by the deep neural network DNN and the graph neural network GNN in parallel.
  • the fusion layer 24 outputs a prediction result based on the fusion result.
  • The prediction result is the output of the business prediction for the unknown node.
  • The neural network system of FIG. 2 combines the deep neural network DNN and the graph neural network GNN, fusing the single-node feature processing ability of the DNN with the relational feature processing ability of the GNN, so that the combined neural network system can effectively analyze and learn all kinds of relational network graphs.
  • Where the relationship features in the relationship network graph are complete and effective, the graph neural network GNN can play the major role, with the deep neural network DNN supplementing the analysis of single nodes. And if the relationship features are missing or of limited effect, for example when the relationship network graph contains a large number of isolated nodes, or when the relationship on which the graph was constructed is not very effective for the business, the nodes in the graph can still be effectively analyzed and processed through the deep neural network DNN branch to give satisfactory prediction results.
  • FIG. 5 shows a flowchart of a method for analyzing a relationship network diagram according to an embodiment. It can be understood that this method can be performed by any device, device, computing platform, or computing cluster with computing and processing capabilities. As shown in Figure 5, the method includes:
  • Step 51: extract the feature vectors of the nodes in the relationship network graph;
  • Step 52: use a deep neural network to perform first processing on the feature vector to obtain a first output;
  • Step 53: use a graph neural network, in combination with the adjacency information of the relational network graph, to perform second processing on the feature vector to obtain a second output, wherein the adjacency information represents the connection relationships between the nodes included in the relational network graph;
  • Step 54: fuse the first output and the second output, and output a prediction result for the node based on the fusion result.
  • step 52 and step 53 may be executed in any order, or in parallel, and are not limited herein.
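  • Steps 51 through 54 can be strung together in a minimal end-to-end sketch. This is an illustrative numpy toy, not the patent's implementation: the graph, random feature vectors, single-layer branches, fixed gate α = 0.5, and the final sigmoid scoring head are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
n, d = 4, 3                       # 4 nodes, 3 features per node

# Step 51: extract node feature vectors (random stand-ins here).
X = rng.normal(size=(n, d))

# Step 52: first processing by a one-layer DNN branch.
W_dnn = rng.normal(size=(d, d))
H1 = np.maximum(0.0, X @ W_dnn)   # ReLU(X W), single-node features only

# Step 53: second processing by a one-layer GNN branch using adjacency info.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 0],
              [0, 0, 0, 0]], dtype=float)   # node 3 is isolated
A_tilde = A + np.eye(n)                      # add self-loops (A + I)
A_hat = A_tilde / A_tilde.sum(axis=1, keepdims=True)  # row-normalize
W_gnn = rng.normal(size=(d, d))
H2 = np.tanh(A_hat @ X @ W_gnn)

# Step 54: fuse the two outputs and emit a per-node prediction score.
alpha = 0.5
H = alpha * H1 + (1 - alpha) * H2
scores = 1.0 / (1.0 + np.exp(-H.sum(axis=1)))  # one score per node
print(scores.shape)  # (4,)
```

  • Even the isolated node 3 gets a usable score here, because the DNN branch contributes its single-node features regardless of the missing relationship information.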
  • Each node included in the relationship network graph corresponds to a user, and the connection relationships between the nodes include one or more of the following: social relationships, media relationships, and fund relationships between users.
  • the relationship network graph is a directed graph
  • the adjacency information includes an adjacency list or a cross-linked list corresponding to the directed graph.
  • the adjacency information includes the adjacency matrix of the relationship network graph.
  • the graph neural network described above is a graph convolutional network.
  • the graph convolutional network includes a plurality of network layers to perform the second processing.
  • The second processing includes at least using the elements of the adjacency matrix as weighting factors to perform a weighted-sum operation on the feature vectors of the node and its neighbor nodes.
  • Fusing the first output and the second output in step 54 specifically includes performing a weighted summation of the first output and the second output, wherein the first output corresponds to the first weighting factor and the second output corresponds to the second weighting factor.
  • In one embodiment, the first weighting factor is a function of the first output, and the second weighting factor is a function of the second output.
  • In a further embodiment, the sum of the first weighting factor and the second weighting factor is 1, and the first weighting factor is a function of the first output, or the second weighting factor is a function of the second output.
  • In another further embodiment, the sum of the first weighting factor and the second weighting factor is 1, and the first weighting factor is a function of the first output and the second output, or the second weighting factor is a function of the first output and the second output.
  • In this way, the relational network graph is analyzed comprehensively, combining the node's own features with the structure of the graph.
  • Also provided is a computer-readable storage medium storing a computer program which, when executed in a computer, causes the computer to perform the method described in conjunction with FIG. 5.
  • Also provided is a computing device including a memory and a processor, the memory storing executable code which, when executed by the processor, implements the method described in conjunction with FIG. 5.
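The second processing in the steps above — a graph convolutional network that uses adjacency-matrix elements as weighting factors in a weighted sum over a node's own and its neighbors' feature vectors — can be sketched as follows. This is an illustrative reconstruction only: the self-loops, row normalization, ReLU activation, and dimensions are assumptions, not details taken from the application.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution layer: each node's new representation is a
    weighted sum of its own and its neighbors' feature vectors, with weights
    taken from the row-normalized adjacency matrix, followed by a linear map
    and a ReLU."""
    A_hat = A + np.eye(A.shape[0])                      # self-loop: the node keeps its own features
    A_norm = A_hat / A_hat.sum(axis=1, keepdims=True)   # weighting factors for each node sum to 1
    return np.maximum(A_norm @ H @ W, 0.0)

# Toy relational network graph: 4 user nodes, edges 0-1, 1-2, 2-3.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 8))    # feature vectors from the feature-extraction layer
W1 = rng.normal(size=(8, 8))
W2 = rng.normal(size=(8, 8))
# A plurality of network layers performs the second processing.
second_output = gcn_layer(A, gcn_layer(A, H, W1), W2)
```

Step 52 (the deep neural network branch) would process `H` without using `A`; the two branch outputs are then fused as in step 54.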

Abstract

The present invention relates to a neural network system and a computer-implemented method for analyzing a relational network graph. The neural network system comprises: a feature extraction layer (21) for extracting a feature vector of a node in a relational network graph; a deep neural network (22) for performing first processing on the feature vector to obtain a first output; a graph neural network (23) for performing second processing on the feature vector, in combination with adjacency information of the relational network graph, to obtain a second output, the adjacency information representing connection relationships between the nodes included in the relational network graph; and a fusion layer (24) for fusing the first output and the second output and outputting, based on the fusion result, a prediction result for the node.
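One of the claimed fusion-layer variants — a weighted sum of the two branch outputs whose weighting factors sum to 1 and are a function of both outputs — could be sketched as below. The sigmoid gate and its parameter vector `w` are illustrative assumptions, not the patented implementation:

```python
import numpy as np

def fuse(first_output, second_output, w):
    """Weighted sum of the DNN branch output and the GNN branch output.
    The first weighting factor alpha is a function of both outputs; the
    second factor is (1 - alpha), so the two factors sum to 1."""
    gate_in = np.concatenate([first_output, second_output])
    alpha = 1.0 / (1.0 + np.exp(-gate_in @ w))   # sigmoid keeps alpha in (0, 1)
    return alpha * first_output + (1.0 - alpha) * second_output

rng = np.random.default_rng(1)
o1 = rng.normal(size=4)   # first output (deep neural network branch)
o2 = rng.normal(size=4)   # second output (graph neural network branch)
w = rng.normal(size=8)    # assumed learned gate parameters
prediction_basis = fuse(o1, o2, w)
```

Because the two factors sum to 1, the fused result is a convex combination of the branch outputs, lying elementwise between them.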
PCT/CN2019/112564 2018-12-07 2019-10-22 Neural network system and method for analyzing a relational network graph WO2020114122A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811497595.0A CN110009093B (zh) 2018-12-07 2018-12-07 Neural network system and method for analyzing relational network graphs
CN201811497595.0 2018-12-07

Publications (1)

Publication Number Publication Date
WO2020114122A1 true WO2020114122A1 (fr) 2020-06-11

Family

ID=67165074

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/112564 WO2020114122A1 (fr) 2019-10-22 2018-12-07 Neural network system and method for analyzing a relational network graph

Country Status (3)

Country Link
CN (1) CN110009093B (fr)
TW (1) TWI709086B (fr)
WO (1) WO2020114122A1 (fr)

Families Citing this family (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110009093B (zh) * 2018-12-07 2020-08-07 阿里巴巴集团控股有限公司 Neural network system and method for analyzing relational network graphs
CN110555469B (zh) * 2019-08-15 2020-07-24 阿里巴巴集团控股有限公司 Method and apparatus for processing interaction sequence data
US10936950B1 (en) 2019-08-15 2021-03-02 Advanced New Technologies Co., Ltd. Processing sequential interaction data
CN110659723B (zh) * 2019-09-03 2023-09-19 腾讯科技(深圳)有限公司 Artificial-intelligence-based data processing method, apparatus, medium, and electronic device
CN110705613B (zh) * 2019-09-19 2021-06-11 创新奇智(青岛)科技有限公司 Object classification method
CN110705709B (zh) * 2019-10-14 2021-03-23 支付宝(杭州)信息技术有限公司 Method and apparatus for training a graph neural network model
CN110490274B (zh) * 2019-10-17 2020-02-07 支付宝(杭州)信息技术有限公司 Method and apparatus for evaluating interaction events
CN111079780B (zh) * 2019-11-06 2023-06-23 中国科学院深圳先进技术研究院 Training method for a spatial graph convolutional network, electronic device, and storage medium
CN111027610B (zh) * 2019-12-03 2022-02-25 腾讯医疗健康(深圳)有限公司 Image feature fusion method, device, and medium
CN110991914B (zh) * 2019-12-09 2024-04-16 朱递 Facility location selection method based on a graph convolutional neural network
CN111210279B (zh) * 2020-01-09 2022-08-16 支付宝(杭州)信息技术有限公司 Target user prediction method, apparatus, and electronic device
CN115081589A (zh) * 2020-01-09 2022-09-20 支付宝(杭州)信息技术有限公司 Method and apparatus for processing interaction data using an LSTM neural network model
CN111277433B (zh) * 2020-01-15 2021-02-12 同济大学 Network service anomaly detection method and apparatus based on attributed network representation learning
CN111414989B (zh) * 2020-02-13 2023-11-07 山东师范大学 User trust relationship network link prediction method and system based on a gating mechanism
CN113283921A (zh) * 2020-02-19 2021-08-20 华为技术有限公司 Service data processing method, apparatus, and cloud server
CN111340611B (zh) * 2020-02-20 2024-03-08 中国建设银行股份有限公司 Risk early-warning method and apparatus
CN111581488B (zh) * 2020-05-14 2023-08-04 上海商汤智能科技有限公司 Data processing method and apparatus, electronic device, and storage medium
CN111798934B (zh) * 2020-06-23 2023-11-14 苏州浦意智能医疗科技有限公司 Molecular property prediction method based on a graph neural network
CN112036418A (zh) * 2020-09-04 2020-12-04 京东数字科技控股股份有限公司 Method and apparatus for extracting user features
CN112464292B (zh) * 2021-01-27 2021-08-20 支付宝(杭州)信息技术有限公司 Method and apparatus for training a graph neural network with privacy protection
CN112801288A (zh) * 2021-02-05 2021-05-14 厦门市美亚柏科信息股份有限公司 Vector representation method and apparatus for a graph network
CN112766500B (zh) * 2021-02-07 2022-05-17 支付宝(杭州)信息技术有限公司 Graph neural network training method and apparatus
CN113470828A (zh) * 2021-06-30 2021-10-01 上海商汤智能科技有限公司 Classification method and apparatus, electronic device, and storage medium
CN113408706B (zh) * 2021-07-01 2022-04-12 支付宝(杭州)信息技术有限公司 Method and apparatus for training a user interest mining model and for mining user interests
CN113835899B (zh) * 2021-11-25 2022-02-22 支付宝(杭州)信息技术有限公司 Data fusion method and apparatus for distributed graph learning
CN114154716A (zh) * 2021-12-03 2022-03-08 北京航天创智科技有限公司 Enterprise energy consumption prediction method and apparatus based on a graph neural network
CN114677234B (zh) * 2022-04-26 2024-04-30 河南大学 Graph convolutional neural network social recommendation method and system incorporating a multi-channel attention mechanism
CN115545189B (zh) * 2022-11-29 2023-04-18 支付宝(杭州)信息技术有限公司 Method and apparatus for training a graph generation network and training a graph neural network
CN115994373B (zh) * 2023-03-22 2023-05-30 山东中联翰元教育科技有限公司 Data encryption method for a college entrance examination application system based on big data

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106708922A (zh) * 2016-10-21 2017-05-24 天津海量信息技术股份有限公司 Person relationship graph analysis method based on massive data
CN107145977A (zh) * 2017-04-28 2017-09-08 电子科技大学 Method for inferring structured attributes of online social network users
CN107943967A (zh) * 2017-11-28 2018-04-20 华南理工大学 Text classification algorithm based on multi-view convolutional neural networks and recurrent neural networks
CN108733792A (zh) * 2018-05-14 2018-11-02 北京大学深圳研究生院 Entity relation extraction method
US20180341719A1 (en) * 2017-05-24 2018-11-29 International Business Machines Corporation Neural Bit Embeddings for Graphs
CN110009093A (zh) * 2018-12-07 2019-07-12 阿里巴巴集团控股有限公司 Neural network system and method for analyzing relational network graphs

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10536357B2 (en) * 2015-06-05 2020-01-14 Cisco Technology, Inc. Late data detection in data center
JP6719727B2 (ja) * 2016-03-23 2020-07-08 富士ゼロックス株式会社 Purchasing behavior analysis apparatus and program
US10489887B2 (en) * 2017-04-10 2019-11-26 Samsung Electronics Co., Ltd. System and method for deep learning image super resolution
CN107808168B (zh) * 2017-10-31 2023-08-01 北京科技大学 Social network user behavior prediction method based on strong and weak ties
CN108921566B (zh) * 2018-05-03 2021-11-05 创新先进技术有限公司 Method and apparatus for identifying false transactions based on a graph structure model
CN108874914B (zh) * 2018-05-29 2021-11-02 吉林大学 Information recommendation method based on graph convolution and neural collaborative filtering
CN108805203A (zh) * 2018-06-11 2018-11-13 腾讯科技(深圳)有限公司 Image processing and object re-identification method, apparatus, device, and storage medium

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112819152A (zh) * 2020-08-14 2021-05-18 腾讯科技(深圳)有限公司 Neural network training method and apparatus
CN112819152B (zh) * 2020-08-14 2024-03-01 腾讯科技(深圳)有限公司 Neural network training method and apparatus
CN114169515A (zh) * 2020-08-20 2022-03-11 四川大学 Social relationship recognition method based on a higher-order graph neural network
CN114169515B (zh) * 2020-08-20 2023-04-28 四川大学 Social relationship recognition method based on a higher-order graph neural network
CN112085104B (zh) * 2020-09-10 2024-04-12 杭州中奥科技有限公司 Event feature extraction method, apparatus, storage medium, and electronic device
CN112085104A (zh) * 2020-09-10 2020-12-15 杭州中奥科技有限公司 Event feature extraction method, apparatus, storage medium, and electronic device
US11711391B2 (en) * 2020-10-16 2023-07-25 Visa International Service Association System, method, and computer program product for user network activity anomaly detection
US20220407879A1 (en) * 2020-10-16 2022-12-22 Visa International Service Association System, method, and computer program product for user network activity anomaly detection
CN113377656B (zh) * 2021-06-16 2023-06-23 南京大学 Crowdsourced-testing recommendation method based on a graph neural network
CN113377656A (zh) * 2021-06-16 2021-09-10 南京大学 Crowdsourced-testing recommendation method based on a graph neural network
CN114818973A (zh) * 2021-07-15 2022-07-29 支付宝(杭州)信息技术有限公司 Privacy-preserving graph model training method, apparatus, and device
CN113792089B (zh) * 2021-09-16 2024-03-22 平安银行股份有限公司 Artificial-intelligence-based illegal behavior detection method, apparatus, device, and medium
CN113792089A (zh) * 2021-09-16 2021-12-14 平安银行股份有限公司 Artificial-intelligence-based illegal behavior detection method, apparatus, device, and medium
WO2023165352A1 (fr) * 2022-03-03 2023-09-07 百果园技术(新加坡)有限公司 Service object classification method and apparatus, device, and storage medium
WO2024021738A1 (fr) * 2022-07-29 2024-02-01 腾讯科技(深圳)有限公司 Data network graph embedding method and apparatus, computer device, and storage medium
CN116258504A (zh) * 2023-03-16 2023-06-13 广州信瑞泰信息科技有限公司 Bank customer relationship management system and method

Also Published As

Publication number Publication date
CN110009093A (zh) 2019-07-12
TW202032422A (zh) 2020-09-01
TWI709086B (zh) 2020-11-01
CN110009093B (zh) 2020-08-07

Similar Documents

Publication Publication Date Title
TWI709086B (zh) Neural network system and method for analyzing relational network graphs
Bellotti et al. Forecasting recovery rates on non-performing loans with machine learning
Zhou et al. Stock market prediction on high‐frequency data using generative adversarial nets
CN109102393B (zh) 训练和使用关系网络嵌入模型的方法及装置
Byanjankar et al. Predicting credit risk in peer-to-peer lending: A neural network approach
CN109918454B (zh) 对关系网络图进行节点嵌入的方法及装置
WO2019196546A1 (fr) Procédé et appareil de détermination de probabilité de risque d'un événement de demande de service
CN107194723B (zh) 网络小额贷款中借款项目与出借人的双向匹配推荐方法
Dahiya et al. A feature selection enabled hybrid‐bagging algorithm for credit risk evaluation
US20100169137A1 (en) Methods and systems to analyze data using a graph
Suryanarayana et al. Machine learning approaches for credit card fraud detection
Yotsawat et al. A novel method for credit scoring based on cost-sensitive neural network ensemble
Anowar et al. Detection of auction fraud in commercial sites
CN112528110A (zh) 确定实体业务属性的方法及装置
Aphale et al. Predict loan approval in banking system machine learning approach for cooperative banks loan approval
CN114187112A (zh) 账户风险模型的训练方法和风险用户群体的确定方法
US11823026B2 (en) Artificial intelligence system employing graph convolutional networks for analyzing multi-entity-type multi-relational data
Eddy et al. Credit scoring models: Techniques and issues
CN112541575A (zh) 图神经网络的训练方法及装置
Byanjankar et al. Data‐driven optimization of peer‐to‐peer lending portfolios based on the expected value framework
CN114240659A (zh) 一种基于动态图卷积神经网络的区块链异常节点识别方法
Motwani et al. Predictive modelling for credit risk detection using ensemble method
Zhou et al. FinBrain 2.0: when finance meets trustworthy AI
Rahman et al. Nearest neighbor classifier method for making loan decision in commercial bank
Zhou et al. Forecasting credit default risk with graph attention networks

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19893494

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19893494

Country of ref document: EP

Kind code of ref document: A1