CN109614975A - Graph embedding method, device and storage medium - Google Patents
Graph embedding method, device and storage medium Download PDF Info
- Publication number
- CN109614975A (application CN201811258705.8A)
- Authority
- CN
- China
- Prior art keywords
- node
- destination node
- order
- neighbors
- neighborhood
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/42—Global feature extraction by analysis of the whole pattern, e.g. using frequency domain transformations or autocorrelation
- G06V10/422—Global feature extraction by analysis of the whole pattern, e.g. using frequency domain transformations or autocorrelation for representing the structure of the pattern or shape of an object therefor
- G06V10/426—Graphical representations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/048—Activation functions
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- General Engineering & Computer Science (AREA)
- Biophysics (AREA)
- Computational Linguistics (AREA)
- Data Mining & Analysis (AREA)
- Evolutionary Computation (AREA)
- Artificial Intelligence (AREA)
- Molecular Biology (AREA)
- Computing Systems (AREA)
- Biomedical Technology (AREA)
- Life Sciences & Earth Sciences (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Health & Medical Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
The present invention provides a graph embedding method, device and storage medium. The method comprises: reading graph structure data and node feature values of a target graph, and constructing a graph structure model; treating each node in the graph structure model as a target node and sampling the first-order neighbor nodes of each target node with a non-uniform neighbor node sampling function to obtain the first-order neighborhood of each target node; constructing the second-order neighborhood of each target node from its first-order neighborhood, aggregating the features of the second-order neighborhood onto the corresponding first-order neighborhood of the target node, and feeding the aggregated second-order-neighborhood features into a fully connected neural network to obtain new features for the first-order neighborhood of each target node; and aggregating these new features onto each corresponding target node and feeding the aggregated first-order-neighborhood features into the fully connected neural network to obtain the output feature of each target node. The method can flexibly and effectively construct a neighborhood for each node in the graph and perform feature aggregation quickly, thereby improving the effectiveness of graph embedding based on graph neural networks.
Description
Technical field
The present invention mainly relates to the field of graph embedding technology, and in particular to a graph embedding method, device and storage medium.
Background art
Graph embedding, also known as network embedding or network representation learning, aims to project each node of a graph into a low-dimensional vector space, thereby learning effective representations or encodings of graph-structured data; these representations or encodings are the "embeddings" of the graph. Graph embedding methods are varied; among them, methods based on graph neural networks have gradually risen in recent years.
Graph embedding can be understood as placing the many nodes of a graph in an embedding space, so that tasks such as community detection and intelligent recommendation can be accomplished by classifying these nodes or making predictions on them. Graph embedding technology greatly improves the effectiveness of many data mining tasks in the field of complex networks and has a wide range of applications.
At present, graph embedding methods usually construct a neighborhood for each node with a uniform sampling function, i.e., a subset of each node's neighbor nodes is selected at random. Although this sampling approach has low computational complexity, it ignores the intrinsic differences between neighbor nodes, which is unfavorable to the generation of node embeddings.
Summary of the invention
The technical problem to be solved by the present invention is, in view of the deficiencies of the prior art, to provide a graph embedding method, device and storage medium.
The technical scheme adopted to solve the above technical problem is as follows: a graph embedding method, comprising the following steps:
reading graph structure data and node feature values of a target graph, and constructing a graph structure model from the graph structure data and node feature values;
treating each node in the graph structure model as a target node, and sampling the first-order neighbor nodes of each target node according to a non-uniform neighbor node sampling function to obtain the first-order neighborhood of each target node;
constructing the second-order neighborhood of each target node from the first-order neighborhood of each target node;
aggregating the features of the second-order neighborhood of each target node onto the corresponding first-order neighborhood of the target node, and feeding the aggregated second-order-neighborhood features into a pre-constructed fully connected neural network to obtain the new features of the first-order neighborhood of each target node;
aggregating the new features of the first-order neighborhood of each target node onto the corresponding target node, and feeding the aggregated first-order-neighborhood features into the fully connected neural network to obtain the output feature of each target node.
Another technical solution of the present invention to the above technical problem is as follows: a graph embedding device, comprising:
a model construction module for reading graph structure data and node feature values of a target graph and constructing a graph structure model from the graph structure data and node feature values;
a first-order neighborhood construction module for treating each node in the graph structure model as a target node and sampling the first-order neighbor nodes of each target node according to a non-uniform neighbor node sampling function to obtain the first-order neighborhood of each target node;
a second-order neighborhood construction module for constructing the second-order neighborhood of each target node from the first-order neighborhood of each target node;
a second-order neighborhood aggregation module for aggregating the features of the second-order neighborhood of each target node onto the corresponding first-order neighborhood of the target node and feeding the aggregated second-order-neighborhood features into a pre-constructed fully connected neural network to obtain the new features of the first-order neighborhood of each target node;
a first-order neighborhood aggregation module for aggregating the new features of the first-order neighborhood of each target node onto the corresponding target node and feeding the aggregated first-order-neighborhood features into the fully connected neural network to obtain the output feature of each target node.
Another technical solution of the present invention to the above technical problem is as follows: a graph embedding device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the above method when executing the computer program.
Another technical solution of the present invention to the above technical problem is as follows: a computer-readable storage medium storing a computer program, wherein the steps of the above method are implemented when the computer program is executed by a processor.
The beneficial effects of the present invention are: the graph structure model is conducive to the generation of node embeddings; the proposed non-uniform neighbor node sampling function can flexibly and effectively construct a neighborhood for each node in the graph; feature aggregation can be performed quickly; and the feature of each target node is output by the fully connected neural network, thereby improving the effectiveness of graph embedding based on graph neural networks.
Description of the drawings
Fig. 1 is a flow chart of the graph embedding method provided by an embodiment of the present invention;
Fig. 2 is a module block diagram of the graph embedding device provided by an embodiment of the present invention.
Detailed description of the embodiments
The principles and features of the present invention are described below with reference to the accompanying drawings; the given examples serve only to explain the present invention and are not intended to limit its scope.
Fig. 1 is a flow chart of the graph embedding method provided by an embodiment of the present invention.
As shown in Fig. 1, a graph embedding method comprises the following steps:
reading graph structure data and node feature values of a target graph, and constructing a graph structure model from the graph structure data and node feature values;
treating each node in the graph structure model as a target node, and sampling the first-order neighbor nodes of each target node according to a non-uniform neighbor node sampling function to obtain the first-order neighborhood of each target node;
constructing the second-order neighborhood of each target node from the first-order neighborhood of each target node;
aggregating the features of the second-order neighborhood of each target node onto the corresponding first-order neighborhood of the target node, and feeding the aggregated second-order-neighborhood features into a pre-constructed fully connected neural network to obtain the new features of the first-order neighborhood of each target node;
aggregating the new features of the first-order neighborhood of each target node onto the corresponding target node, and feeding the aggregated first-order-neighborhood features into the fully connected neural network to obtain the output feature of each target node.
Specifically, the graph structure model is G=(V, E), where V denotes the set of nodes and E denotes the set of edges connecting nodes.
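For illustration, the graph structure model G=(V, E) with node feature values can be held as an adjacency list plus a feature table. A minimal Python sketch, not part of the patent; the edge list and feature values below are hypothetical placeholder data:

```python
# Illustrative graph structure data and node feature values (hypothetical).
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
features = {0: [1.0, 0.0], 1: [0.5, 0.5], 2: [0.0, 1.0], 3: [0.3, 0.7]}

def build_graph_model(edges, features):
    """Return (V, adjacency) where adjacency maps each node to its neighbor set."""
    V = set(features)
    adj = {v: set() for v in V}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)  # undirected edge
    return V, adj

V, adj = build_graph_model(edges, features)
print(sorted(adj[2]))  # → [0, 1, 3]
```

From `adj` one can then read off the first-order neighbors of any target node; the second-order neighborhood is the union of the neighbors of those neighbors.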
In the above embodiment, the graph structure model is conducive to the generation of node embeddings; the proposed non-uniform neighbor node sampling function can flexibly and effectively construct a neighborhood for each node in the graph; feature aggregation can be performed quickly; and the feature of each target node is output by the fully connected neural network, thereby improving the effectiveness of graph embedding based on graph neural networks.
Preferably, in an embodiment of the present invention, after the new features of the first-order neighborhood of each target node are obtained, the method further comprises a step of optimizing the fully connected neural network:
optimizing the fully connected neural network by stochastic gradient descent to obtain the optimized fully connected neural network.
In the above embodiment, the target-node features output by the optimized fully connected neural network are more accurate.
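Stochastic gradient descent here amounts to repeatedly updating the network weight matrices against the gradient of a loss. A minimal sketch of one such update, assuming a precomputed gradient; the weight and gradient values are hypothetical and the loss/backpropagation machinery is omitted:

```python
def sgd_step(W, grad, lr=0.1):
    """One stochastic-gradient-descent update: W <- W - lr * grad, element-wise."""
    return [[w - lr * g for w, g in zip(w_row, g_row)]
            for w_row, g_row in zip(W, grad)]

W1 = [[0.5, -0.2], [0.1, 0.4]]    # illustrative weight matrix of one layer
grad = [[1.0, 0.0], [0.0, -1.0]]  # illustrative gradient of the loss w.r.t. W1
W1 = sgd_step(W1, grad)
print(W1)  # → [[0.4, -0.2], [0.1, 0.5]]
```

In practice the gradient would come from backpropagating a task loss (e.g., node classification) through the two aggregation layers.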
Preferably, in an embodiment of the present invention, constructing the first-order neighborhood of each target node comprises:
sorting the first-order neighbor nodes of a target node v by degree value in ascending order to obtain the sorted neighbor node set [v(1), v(2), ..., v(D)], where D is the total number of first-order neighbor nodes of the target node v;
preferentially sampling the P nodes with the largest degree value in the neighbor node set [v(1), v(2), ..., v(D)], obtaining the remaining node set [v(1), v(2), ..., v(D-P)];
hiding the H nodes with the smallest degree value in the node set [v(1), v(2), ..., v(D-P)], obtaining the node set [v(H+1), ..., v(D-P)];
randomly sampling S-P nodes from the node set [v(H+1), ..., v(D-P)], and combining the P nodes preferentially sampled from [v(1), v(2), ..., v(D)] with the S-P nodes randomly sampled from [v(H+1), ..., v(D-P)] to form the first-order neighborhood of the target node.
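The steps above can be sketched as follows: sort the neighbors by degree, take the P highest-degree nodes deterministically, hide the H lowest-degree nodes, and draw the remaining S-P nodes uniformly at random. A Python sketch; the function name, parameters, and example degrees are our own illustration, not taken from the specification:

```python
import random

def sample_neighborhood(neighbors, degree, P, H, S, rng=random):
    """Non-uniform neighbor sampling by degree value.

    Returns the P highest-degree neighbors plus S-P neighbors drawn uniformly
    from the remainder after hiding the H lowest-degree neighbors.
    """
    ordered = sorted(neighbors, key=lambda n: degree[n])  # ascending degree
    top = ordered[len(ordered) - P:]      # P nodes with the largest degree
    rest = ordered[H:len(ordered) - P]    # hide H lowest, exclude the P taken
    return top + rng.sample(rest, S - P)

degree = {v: v + 1 for v in range(8)}     # illustrative degree values
nbrs = list(range(8))                     # D = 8 first-order neighbors
hood = sample_neighborhood(nbrs, degree, P=2, H=2, S=5, rng=random.Random(0))
print(sorted(hood))
```

With these parameters the neighborhood always contains the two highest-degree neighbors (6 and 7), never the two hidden lowest-degree ones (0 and 1), and three nodes drawn at random from the middle.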
Preferably, in an embodiment of the present invention, aggregating the features of the second-order neighborhood of each target node onto the corresponding first-order neighborhood of the target node comprises:
aggregating the feature information of the first-order neighborhood N(u) of any node u in the first-order neighborhood of the target node:
hN(u)←MEAN({hq, ∀q∈N(u)})
where q is any node in the first-order neighborhood N(u) of node u, hq is the feature vector of node q, hN(u) is the feature vector obtained after aggregating the first-order neighborhood N(u) of u, and MEAN is the mean aggregator: the feature vectors of the nodes in N(u) are averaged element-wise to obtain the aggregated information;
concatenating the feature vector obtained after aggregating N(u) with the feature vector of node u, and passing the result to the K=1 layer of the neural network to obtain the new feature of node u:
hu←σ(W1·CONCAT(hN(u),hu))
where W1 is the weight matrix of the K=1 layer of the graph neural network, CONCAT is the concatenation operation on feature vectors, and σ is the nonlinear activation function of the graph neural network.
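This first aggregation step can be sketched for a single node u: mean-aggregate the neighbor features, concatenate with hu, and apply one fully connected layer. The weight matrix and feature values are hypothetical, and σ is taken here to be ReLU as one possible choice of activation:

```python
def mean_agg(vectors):
    """Element-wise mean of equal-length feature vectors (the MEAN aggregator)."""
    return [sum(col) / len(vectors) for col in zip(*vectors)]

def relu(x):
    return [max(0.0, v) for v in x]

def fc_layer(W, x):
    """Fully connected layer: each row of W dotted with x."""
    return [sum(w * v for w, v in zip(row, x)) for row in W]

# hu <- sigma(W1 . CONCAT(hN(u), hu)) for one node u
nbr_feats = [[1.0, 3.0], [3.0, 1.0]]       # features hq of u's neighbors N(u)
h_u = [2.0, 2.0]                            # current feature of u
h_Nu = mean_agg(nbr_feats)                  # hN(u) = [2.0, 2.0]
W1 = [[0.5, 0.0, 0.0, 0.5],                 # illustrative 2x4 weight matrix
      [0.0, -0.5, 0.5, 0.0]]
h_u_new = relu(fc_layer(W1, h_Nu + h_u))    # concatenation, then dense layer
print(h_u_new)  # → [2.0, 0.0]
```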
In the above embodiment, the degree value is used as the basis for non-uniform sampling, and preferential sampling is combined with random sampling after node hiding, so that a neighborhood can be constructed flexibly and effectively for each node in the graph.
Preferably, in an embodiment of the present invention, aggregating the new features of the first-order neighborhood of each target node onto the corresponding target node comprises:
aggregating the new features of the first-order neighborhood N(v) of each target node v onto the corresponding target node:
hN(v)←MEAN({hu, ∀u∈N(v)})
where u is any node in the first-order neighborhood N(v) of the target node v, hu is the new feature of node u obtained in step D, hN(v) is the feature vector obtained after aggregating the first-order neighborhood N(v) of the target node, and MEAN is the mean aggregator;
concatenating the feature vector obtained after aggregating N(v) with the feature vector of node v, and passing the result to the K=2 layer of the neural network to obtain the output feature of node v:
hv←σ(W2·CONCAT(hN(v),hv))
where W2 is the weight matrix of the K=2 layer of the graph neural network, CONCAT is the concatenation operation on feature vectors, and σ is the nonlinear activation function of the graph neural network.
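The second aggregation mirrors the first: the new neighbor features from the K=1 step are mean-aggregated onto the target node v, concatenated with hv, and passed through the K=2 layer. A self-contained Python sketch with hypothetical weights and features, again using ReLU as one possible σ:

```python
def mean_agg(vectors):
    """Element-wise mean of equal-length feature vectors (the MEAN aggregator)."""
    return [sum(col) / len(vectors) for col in zip(*vectors)]

def relu(x):
    return [max(0.0, v) for v in x]

def fc_layer(W, x):
    """Fully connected layer: each row of W dotted with x."""
    return [sum(w * v for w, v in zip(row, x)) for row in W]

# hv <- sigma(W2 . CONCAT(hN(v), hv)): aggregate the NEW first-order-neighborhood
# features (outputs of the K=1 step) onto the target node v itself.
new_nbr_feats = [[2.0, 0.0], [0.0, 2.0], [1.0, 1.0]]  # K=1 outputs hu for N(v)
h_v = [1.0, 1.0]                             # current feature of target node v
h_Nv = mean_agg(new_nbr_feats)               # hN(v) = [1.0, 1.0]
W2 = [[1.0, 0.0, 0.5, 0.0],                  # illustrative 2x4 weight matrix
      [0.0, 1.0, 0.0, 0.5]]
h_v_out = relu(fc_layer(W2, h_Nv + h_v))     # output feature of target node v
print(h_v_out)  # → [1.5, 1.5]
```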
Fig. 2 is a module block diagram of the graph embedding device provided by an embodiment of the present invention.
Preferably, as another embodiment of the present invention, a graph embedding device comprises:
a model construction module for reading graph structure data and node feature values of a target graph and constructing a graph structure model G=(V, E) from the graph structure data and node feature values, where V denotes the set of nodes and E denotes the set of edges connecting nodes;
a first-order neighborhood construction module for treating each node in the graph structure model G=(V, E) as a target node and sampling the first-order neighbor nodes of each target node according to a non-uniform neighbor node sampling function to obtain the first-order neighborhood of each target node;
a second-order neighborhood construction module for constructing the second-order neighborhood of each target node from the first-order neighborhood of each target node;
a second-order neighborhood aggregation module for aggregating the features of the second-order neighborhood of each target node onto the corresponding first-order neighborhood of the target node and feeding the aggregated second-order-neighborhood features into a pre-constructed fully connected neural network to obtain the new features of the first-order neighborhood of each target node;
a first-order neighborhood aggregation module for aggregating the new features of the first-order neighborhood of each target node onto the corresponding target node and feeding the aggregated first-order-neighborhood features into the fully connected neural network to obtain the output feature of each target node.
Preferably, in an embodiment of the present invention, the device further comprises an optimization module for: after the new features of the first-order neighborhood of each target node are obtained, optimizing the fully connected neural network by stochastic gradient descent to obtain the optimized fully connected neural network.
Preferably, in an embodiment of the present invention, the first-order neighborhood construction module is specifically configured to:
sort the first-order neighbor nodes of a target node v by degree value in ascending order to obtain the sorted neighbor node set [v(1), v(2), ..., v(D)], where D is the total number of first-order neighbor nodes of the target node v;
preferentially sample the P nodes with the largest degree value in the neighbor node set [v(1), v(2), ..., v(D)], obtaining the remaining node set [v(1), v(2), ..., v(D-P)];
hide the H nodes with the smallest degree value in the node set [v(1), v(2), ..., v(D-P)], obtaining the node set [v(H+1), ..., v(D-P)];
randomly sample S-P nodes from the node set [v(H+1), ..., v(D-P)], and combine the P preferentially sampled nodes with the S-P randomly sampled nodes to form the first-order neighborhood of the target node.
Preferably, as another embodiment of the present invention, a graph embedding device comprises a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the above method when executing the computer program.
Preferably, as another embodiment of the present invention, a computer-readable storage medium stores a computer program, and the steps of the above method are implemented when the computer program is executed by a processor.
It is apparent to those skilled in the art that, for convenience and brevity of description, the specific working processes of the devices and units described above may refer to the corresponding processes in the foregoing method embodiments, and details are not repeated here.
In the several embodiments provided in this application, it should be understood that the disclosed device and method may be implemented in other ways. For example, the device embodiments described above are merely exemplary; the division into units is only a division by logical function, and other divisions are possible in actual implementation. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed.
Units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the embodiments of the present invention.
In addition, the functional units in the various embodiments of the present invention may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods of the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash disk, a removable hard disk, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), a magnetic disk, or an optical disk.
The above is only a specific embodiment of the present invention, but the protection scope of the present invention is not limited thereto. Any person skilled in the art can readily conceive of various equivalent modifications or substitutions within the technical scope disclosed by the present invention, and such modifications or substitutions shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (10)
1. A graph embedding method, characterized by comprising the following steps:
reading graph structure data and node feature values of a target graph, and constructing a graph structure model from the graph structure data and node feature values;
treating each node in the graph structure model as a target node, and sampling the first-order neighbor nodes of each target node according to a non-uniform neighbor node sampling function to obtain the first-order neighborhood of each target node;
constructing the second-order neighborhood of each target node from the first-order neighborhood of each target node;
aggregating the features of the second-order neighborhood of each target node onto the corresponding first-order neighborhood of the target node, and feeding the aggregated second-order-neighborhood features into a pre-constructed fully connected neural network to obtain the new features of the first-order neighborhood of each target node;
aggregating the new features of the first-order neighborhood of each target node onto the corresponding target node, and feeding the aggregated first-order-neighborhood features into the fully connected neural network to obtain the output feature of each target node.
2. The graph embedding method according to claim 1, characterized in that, after the new features of the first-order neighborhood of each target node are obtained, the method further comprises a step of optimizing the fully connected neural network:
optimizing the fully connected neural network by stochastic gradient descent to obtain the optimized fully connected neural network.
3. The graph embedding method according to claim 1, characterized in that constructing the first-order neighborhood of each target node comprises:
sorting the first-order neighbor nodes of a target node v by degree value in ascending order to obtain the sorted neighbor node set [v(1), v(2), ..., v(D)], where D is the total number of first-order neighbor nodes of the target node v;
preferentially sampling the P nodes with the largest degree value in the neighbor node set [v(1), v(2), ..., v(D)], obtaining the remaining node set [v(1), v(2), ..., v(D-P)];
hiding the H nodes with the smallest degree value in the node set [v(1), v(2), ..., v(D-P)], obtaining the node set [v(H+1), ..., v(D-P)];
randomly sampling S-P nodes from the node set [v(H+1), ..., v(D-P)], and combining the P preferentially sampled nodes with the S-P randomly sampled nodes to form the first-order neighborhood of the target node.
4. The graph embedding method according to claim 1, characterized in that aggregating the features of the second-order neighborhood of each target node onto the corresponding first-order neighborhood of the target node comprises:
aggregating the feature information of the first-order neighborhood N(u) of any node u in the first-order neighborhood of the target node:
hN(u)←MEAN({hq, ∀q∈N(u)})
where q is any node in the first-order neighborhood N(u) of node u, hq is the feature vector of node q, hN(u) is the feature vector obtained after aggregating the first-order neighborhood N(u) of u, and MEAN is the mean aggregator: the feature vectors of the nodes in N(u) are averaged element-wise to obtain the aggregated information;
concatenating the feature vector obtained after aggregating N(u) with the feature vector of node u, and passing the result to the K=1 layer of the neural network to obtain the new feature of node u:
hu←σ(W1·CONCAT(hN(u),hu))
where W1 is the weight matrix of the K=1 layer of the graph neural network, CONCAT is the concatenation operation on feature vectors, and σ is the nonlinear activation function of the graph neural network.
5. The graph embedding method according to claim 4, characterized in that aggregating the new features of the first-order neighborhood of each target node onto the corresponding target node comprises:
aggregating the new features of the first-order neighborhood N(v) of each target node v onto the corresponding target node:
hN(v)←MEAN({hu, ∀u∈N(v)})
where u is any node in the first-order neighborhood N(v) of the target node v, hu is the new feature of node u obtained in step D, hN(v) is the feature vector obtained after aggregating the first-order neighborhood N(v) of the target node, and MEAN is the mean aggregator;
concatenating the feature vector obtained after aggregating N(v) with the feature vector of node v, and passing the result to the K=2 layer of the neural network to obtain the output feature of node v:
hv←σ(W2·CONCAT(hN(v),hv))
where W2 is the weight matrix of the K=2 layer of the graph neural network, CONCAT is the concatenation operation on feature vectors, and σ is the nonlinear activation function of the graph neural network.
6. A graph embedding device, characterized by comprising:
a model construction module for reading graph structure data and node feature values of a target graph and constructing a graph structure model from the graph structure data and node feature values;
a first-order neighborhood construction module for treating each node in the graph structure model as a target node and sampling the first-order neighbor nodes of each target node according to a non-uniform neighbor node sampling function to obtain the first-order neighborhood of each target node;
a second-order neighborhood construction module for constructing the second-order neighborhood of each target node from the first-order neighborhood of each target node;
a second-order neighborhood aggregation module for aggregating the features of the second-order neighborhood of each target node onto the corresponding first-order neighborhood of the target node and feeding the aggregated second-order-neighborhood features into a pre-constructed fully connected neural network to obtain the new features of the first-order neighborhood of each target node;
a first-order neighborhood aggregation module for aggregating the new features of the first-order neighborhood of each target node onto the corresponding target node and feeding the aggregated first-order-neighborhood features into the fully connected neural network to obtain the output feature of each target node.
7. The graph embedding device according to claim 6, characterized by further comprising an optimization module for: after the new features of the first-order neighborhood of each target node are obtained, optimizing the fully connected neural network by stochastic gradient descent to obtain the optimized fully connected neural network.
8. The graph embedding device according to claim 6, characterized in that the first-order neighborhood construction module is specifically configured to:
sort the first-order neighbor nodes of a target node v by degree value in ascending order to obtain the sorted neighbor node set [v(1), v(2), ..., v(D)], where D is the total number of first-order neighbor nodes of the target node v;
preferentially sample the P nodes with the largest degree value in the neighbor node set [v(1), v(2), ..., v(D)], obtaining the remaining node set [v(1), v(2), ..., v(D-P)];
hide the H nodes with the smallest degree value in the node set [v(1), v(2), ..., v(D-P)], obtaining the node set [v(H+1), ..., v(D-P)];
randomly sample S-P nodes from the node set [v(H+1), ..., v(D-P)], and combine the P preferentially sampled nodes with the S-P randomly sampled nodes to form the first-order neighborhood of the target node.
9. A graph embedding device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any one of claims 1 to 5 when executing the computer program.
10. A computer-readable storage medium storing a computer program, characterized in that the steps of the method according to any one of claims 1 to 5 are implemented when the computer program is executed by a processor.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811258705.8A CN109614975A (en) | 2018-10-26 | 2018-10-26 | Graph embedding method, device and storage medium
Publications (1)
Publication Number | Publication Date |
---|---|
CN109614975A true CN109614975A (en) | 2019-04-12 |
Family
ID=66002836
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811258705.8A Pending CN109614975A (en) | 2018-10-26 | 2018-10-26 | A graph embedding method, device and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109614975A (en) |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111488400A (en) * | 2019-04-28 | 2020-08-04 | Beijing Jingdong Shangke Information Technology Co., Ltd. | Data classification method, device and computer readable storage medium |
CN111488400B (en) * | 2019-04-28 | 2021-03-30 | Beijing Jingdong Shangke Information Technology Co., Ltd. | Data classification method, device and computer readable storage medium |
CN110378413A (en) * | 2019-07-17 | 2019-10-25 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | Neural network model processing method, device and electronic equipment |
CN110659723B (en) * | 2019-09-03 | 2023-09-19 | Tencent Technology (Shenzhen) Co., Ltd. | Data processing method and device based on artificial intelligence, medium and electronic equipment |
CN110659723A (en) * | 2019-09-03 | 2020-01-07 | Tencent Technology (Shenzhen) Co., Ltd. | Data processing method, device, medium and electronic equipment based on artificial intelligence |
CN110705709A (en) * | 2019-10-14 | 2020-01-17 | Alipay (Hangzhou) Information Technology Co., Ltd. | Method and device for training neural network model of graph |
CN110705709B (en) * | 2019-10-14 | 2021-03-23 | Alipay (Hangzhou) Information Technology Co., Ltd. | Method and device for training neural network model of graph |
CN110826700A (en) * | 2019-11-13 | 2020-02-21 | University of Science and Technology of China | Method for realizing and classifying bilinear graph neural network model for modeling neighbor interaction |
CN110866190A (en) * | 2019-11-18 | 2020-03-06 | Alipay (Hangzhou) Information Technology Co., Ltd. | Method and device for training neural network model for representing knowledge graph |
CN112085172A (en) * | 2020-09-16 | 2020-12-15 | Alipay (Hangzhou) Information Technology Co., Ltd. | Method and device for training graph neural network |
CN112085172B (en) * | 2020-09-16 | 2022-09-16 | Alipay (Hangzhou) Information Technology Co., Ltd. | Method and device for training graph neural network |
CN113536383A (en) * | 2021-01-27 | 2021-10-22 | Alipay (Hangzhou) Information Technology Co., Ltd. | Method and device for training neural network based on privacy protection |
CN113536383B (en) * | 2021-01-27 | 2023-10-27 | Alipay (Hangzhou) Information Technology Co., Ltd. | Method and device for training graph neural network based on privacy protection |
WO2023279674A1 (en) * | 2021-07-08 | 2023-01-12 | Huawei Technologies Co., Ltd. | Memory-augmented graph convolutional neural networks |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109614975A (en) | A graph embedding method, device and storage medium | |
Tejani et al. | Multiobjective adaptive symbiotic organisms search for truss optimization problems | |
De Bacco et al. | Community detection, link prediction, and layer interdependence in multilayer networks | |
Ma et al. | Nonnegative matrix factorization algorithms for link prediction in temporal networks using graph communicability | |
CN109840154B (en) | Task dependency-based computing migration method in mobile cloud environment | |
CN107273412B (en) | A text data clustering method, device and system | |
CN111414987A (en) | Training method and training device for neural network and electronic equipment | |
Wang et al. | TRC‐YOLO: A real‐time detection method for lightweight targets based on mobile devices | |
CN104391879B (en) | The method and device of hierarchical clustering | |
CN106056211A (en) | Neuron computing unit, neuron computing module and artificial neural network computing core | |
CN104750780A (en) | Hadoop configuration parameter optimization method based on statistic analysis | |
Cucuringu et al. | Localization on low-order eigenvectors of data matrices | |
Chan et al. | A convex formulation of modularity maximization for community detection | |
Li et al. | Research on QoS service composition based on coevolutionary genetic algorithm | |
CN104361462B (en) | Social network influence maximization approach based on cultural gene algorithm | |
CN108805174A (en) | clustering method and device | |
CN107679553A (en) | Clustering method and device based on density peaks | |
Mozejko et al. | Superkernel neural architecture search for image denoising | |
CN111198897A (en) | Scientific research hotspot topic analysis method and device and electronic equipment | |
Cannavo et al. | Variational method for image denoising by distributed genetic algorithms on grid environment | |
CN114625477A (en) | Service node capacity adjusting method, equipment and computer readable storage medium | |
CN107908696A (en) | GRIDEN: an efficient parallel multidimensional spatial data clustering algorithm based on grids and density | |
CN110472272B (en) | Structural damage identification method based on multi-parameter and convolutional neural network | |
CN110298406A (en) | A data balancing method, system and device | |
CN106559290B (en) | The method and system of link prediction based on community structure |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20190412 ||