CN108920712A - Representation method and device for network nodes - Google Patents
Representation method and device for network nodes
- Publication number
- CN108920712A (application CN201810828402.9A)
- Authority
- CN
- China
- Prior art keywords
- node
- degree value
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L41/00—Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
- H04L41/14—Network analysis or design
- H04L41/145—Network analysis or design involving simulating, designing, planning or modelling of a network
Landscapes
- Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Abstract
The invention discloses a representation method and device for network nodes. The method includes: obtaining the first degree value of a first node in a network, the second degree value of a second node, and the weight value of the edge between the first node and the second node; determining similarity information between the first node and the second node according to the first degree value, the second degree value, and the weight value; sampling the nodes in the network based on the similarity information to obtain a sample sequence; and determining a target sequence corresponding to the sample sequence using a long short-term memory (LSTM) model, where a mapping model is arranged between the input layer and the hidden layer of the LSTM model and is used to learn node vectors while the LSTM model determines the target sequence corresponding to the sample sequence. The node vectors are further optimized by a feature-mapping-space optimization technique. The invention solves the technical problem that node vectors learned by prior-art representation methods for network nodes are of low quality.
Description
Technical field
The present invention relates to the field of computer technology, and in particular to a representation method and device for network nodes.
Background technique
With the rapid development of Internet technology, people's daily activities and communication increasingly depend on the Internet and its products. The rise of social platforms such as WeChat, Weibo, QQ, and Facebook has greatly facilitated communication between people. More and more people have joined social networks, so network data keeps growing. Such network data mainly includes the relationships between users, users' interests, and so on; how to mine and use these data poses a huge challenge to current data-mining technology.
The random-walk node-sampling scheme used by the prior-art DeepWalk is not well suited to weighted networks, especially dense weighted networks. In addition, its random walk treats all nodes in the network equally, and the Skip-Gram model used in DeepWalk cannot effectively handle long-term dependencies. Therefore, prior-art node-sampling schemes cannot capture rich global information.
For the above problem that node vectors learned by prior-art representation methods for network nodes are of low quality, no effective solution has yet been proposed.
Summary of the invention
Embodiments of the invention provide a representation method and device for network nodes, at least to solve the technical problem that node vectors learned by prior-art representation methods for network nodes are of low quality.
According to one aspect of the embodiments of the invention, a representation method for network nodes is provided, including: obtaining the first degree value of a first node in a network, the second degree value of a second node, and the weight value of the edge between the first node and the second node, where the second node is a node adjacent to the first node; determining similarity information between the first node and the second node according to the first degree value, the second degree value, and the weight value; sampling the nodes in the network based on the similarity information to obtain a sample sequence; and determining a target sequence corresponding to the sample sequence using a long short-term memory (LSTM) model, where a mapping model is arranged between the input layer and the hidden layer of the LSTM model and is used to learn the node vectors of the nodes while the LSTM model determines the target sequence corresponding to the sample sequence.
Further, determining the target sequence corresponding to the sample sequence using the LSTM model includes: inputting the sample sequence into the LSTM model; and determining the target sequence corresponding to the sample sequence according to the LSTM model.
Further, after learning the node vectors, the method further includes: optimizing the node vectors by a feature-mapping-space optimization technique to obtain optimized node vectors.
Further, after optimizing the node vectors by the feature-mapping-space optimization technique to obtain the optimized node vectors, the method further includes: classifying the nodes according to the node vectors to obtain classification results; or clustering the nodes according to the node vectors to obtain clustering results.
Further, the similarity information between the first node and the second node is calculated by the following formula, where S_{i,j} is the similarity information, W_{i,j} is the weight value, i is the first node, j is the second node, d_i is the first degree value, and d_j is the second degree value.
According to another aspect of the embodiments of the invention, a representation device for network nodes is further provided, including: an obtaining module, configured to obtain the first degree value of a first node in a network, the second degree value of a second node, and the weight value of the edge between the first node and the second node, where the second node is a node adjacent to the first node; a first determining module, configured to determine similarity information between the first node and the second node according to the first degree value, the second degree value, and the weight value; a sampling module, configured to sample the nodes in the network based on the similarity information to obtain a sample sequence; and a second determining module, configured to determine a target sequence corresponding to the sample sequence using an LSTM model, where a mapping model is arranged between the input layer and the hidden layer of the LSTM model and is used to learn the node vectors of the nodes while the LSTM model determines the target sequence corresponding to the sample sequence.
Further, the second determining module includes: an input unit, configured to input the sample sequence into the LSTM model; and a determination unit, configured to determine the target sequence corresponding to the sample sequence according to the LSTM model.
Further, the device further includes: an optimization module, configured to optimize the node vectors by a feature-mapping-space optimization technique to obtain optimized node vectors.
According to another aspect of the embodiments of the invention, a storage medium is further provided. The storage medium includes a stored program, and when the program runs, the device on which the storage medium resides is controlled to perform the following steps: obtaining the first degree value of a first node in a network, the second degree value of a second node, and the weight value of the edge between the first node and the second node, where the second node is a node adjacent to the first node; determining similarity information between the first node and the second node according to the first degree value, the second degree value, and the weight value; sampling the nodes in the network based on the similarity information to obtain a sample sequence; and determining a target sequence corresponding to the sample sequence using an LSTM model, where a mapping model is arranged between the input layer and the hidden layer of the LSTM model and is used to learn the node vectors of the nodes while the LSTM model determines the target sequence corresponding to the sample sequence.
According to another aspect of the embodiments of the invention, a processor is further provided. The processor is configured to run a program, and the following steps are performed when the program runs: obtaining the first degree value of a first node in a network, the second degree value of a second node, and the weight value of the edge between the first node and the second node, where the second node is a node adjacent to the first node; determining similarity information between the first node and the second node according to the first degree value, the second degree value, and the weight value; sampling the nodes in the network based on the similarity information to obtain a sample sequence; and determining a target sequence corresponding to the sample sequence using an LSTM model, where a mapping model is arranged between the input layer and the hidden layer of the LSTM model and is used to learn the node vectors of the nodes while the LSTM model determines the target sequence corresponding to the sample sequence.
In the embodiments of the invention, the first degree value of a first node in a network, the second degree value of a second node, and the weight value of the edge between the first node and the second node are obtained, where the second node is a node adjacent to the first node; similarity information between the first node and the second node is determined according to the first degree value, the second degree value, and the weight value; the nodes in the network are sampled based on the similarity information to obtain a sample sequence; and a target sequence corresponding to the sample sequence is determined using an LSTM model, where a mapping model arranged between the input layer and the hidden layer of the LSTM model learns the node vectors while the LSTM model determines the target sequence. By considering node degrees and the weights of the edges between nodes, and by learning node representations with an LSTM model, the quality of both the sample sequences and the node vectors is improved, thereby solving the technical problem that node vectors learned by prior-art representation methods for network nodes are of low quality.
Detailed description of the invention
The drawings described herein are provided for a further understanding of the invention and constitute a part of this application. The illustrative embodiments of the invention and their descriptions are used to explain the invention and do not constitute an improper limitation of the invention. In the drawings:
Fig. 1 is a flowchart of a representation method for network nodes according to an embodiment of the invention;
Fig. 2 is a schematic diagram of an optional long short-term memory model according to an embodiment of the invention;
Fig. 3 is a schematic frame diagram of an optional network representation learning model according to an embodiment of the invention; and
Fig. 4 is a structural schematic diagram of a representation device for network nodes according to an embodiment of the invention.
Specific embodiment
To enable those skilled in the art to better understand the solutions of the invention, the technical solutions in the embodiments of the invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only a part, rather than all, of the embodiments of the invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the invention without creative effort shall fall within the scope of protection of the invention.
It should be noted that the terms "first", "second", and so on in the description, the claims, and the above drawings are used to distinguish similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that data used in this way are interchangeable where appropriate, so that the embodiments of the invention described herein can be implemented in orders other than those illustrated or described herein. In addition, the terms "include" and "have" and any variations thereof are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device that contains a series of steps or units is not necessarily limited to the steps or units explicitly listed, but may include other steps or units that are not explicitly listed or that are inherent to the process, method, product, or device.
Embodiment 1
According to an embodiment of the invention, an embodiment of a representation method for network nodes is provided. It should be noted that the steps shown in the flowchart of the accompanying drawings can be executed in a computer system, such as a set of computer-executable instructions, and although a logical order is shown in the flowchart, in some cases the steps shown or described may be executed in an order different from that given here.
Fig. 1 is a flowchart of a representation method for network nodes according to an embodiment of the invention. As shown in Fig. 1, the method includes the following steps:
Step S102: obtain the first degree value of a first node in a network, the second degree value of a second node, and the weight value of the edge between the first node and the second node, where the second node is a node adjacent to the first node.
Optionally, the first node and the second node may each be a node in the network (a network node), for example, a workstation, a client, a network user, or a personal computer, or a server, a printer, or another network-connected device. Every workstation, server, terminal device, or network device (that is, every device with its own unique network address) is a network node, and network nodes are connected by communication lines.
Specifically, the degree value of a node is the number of edges associated with that node. For a directed graph, it is divided into in-degree and out-degree, where the in-degree is the number of edges entering the node and the out-degree is the number of edges leaving the node.
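As an illustrative aside (a minimal sketch, not part of the patent text), the degree, in-degree, and out-degree values described above can be computed from a plain edge list:

```python
from collections import defaultdict

def degrees(edges, directed=False):
    """Compute node degree values from an edge list.

    For a directed graph, returns (in_degree, out_degree);
    for an undirected graph, returns a single degree dict.
    """
    if directed:
        indeg, outdeg = defaultdict(int), defaultdict(int)
        for u, v in edges:
            outdeg[u] += 1   # the edge leaves u
            indeg[v] += 1    # the edge enters v
        return dict(indeg), dict(outdeg)
    deg = defaultdict(int)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return dict(deg)
```

For example, on the undirected triangle {(1, 2), (2, 3), (1, 3)} every node has degree 2.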
It should be noted that the degree value is a very important attribute: in general, the larger the degree value of a node, the more important its status in the network, for example, a well-known public figure on Weibo.
Since many real-world networks are weighted networks, the weight attribute reflects the degree of association between two nodes. For example, if a Weibo user often contacts a friend, the weight between that user and the friend should be larger; conversely, if the user rarely contacts the friend, the weight between them is smaller.
Step S104: determine similarity information between the first node and the second node according to the first degree value, the second degree value, and the weight value.
In the embodiments of this application, in order to take the degree values of nodes and the weight values of edges into account during the random walk, the first degree value of the first node, the second degree value of the second node, and the weight value of the edge between them are obtained, and similarity information S_{i,j} between the first node and the second node is defined to measure the status of the nodes in the network.
Step S106: sample the nodes in the network based on the similarity information to obtain a sample sequence.
In the embodiments of this application, the nodes in the network are sampled by an improved random-walk method based on the similarity information, generating a corpus, i.e., the sample sequence. In an optional embodiment, the sample sequence can be fed into a long short-term memory (LSTM) model for training to obtain the representations of the network nodes.
The embodiments of this application propose a new random-walk method that, when sampling, considers not only the weights between nodes but also the roles the nodes play in the network. This new random-walk method is not only applicable to weighted networks but can also capture the status and roles of network nodes, thereby preserving richer global information of the network.
Optionally, the sampling method in the embodiments of this application may be, but is not limited to, a truncated random walk or a biased random walk.
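The patent does not spell out the exact transition rule of the improved walk; the sketch below assumes, purely for illustration, that the walker moves to a neighbor with probability proportional to the similarity score S_{i,j}:

```python
import random

def biased_walk(adj, sim, start, length, rng=random):
    """One truncated random walk of at most `length` nodes.

    Assumption (not stated in the patent): the transition probability
    to each neighbor is proportional to its similarity score sim[(u, v)].
    """
    walk = [start]
    for _ in range(length - 1):
        nbrs = adj.get(walk[-1], [])
        if not nbrs:        # dead end: truncate the walk early
            break
        weights = [sim[(walk[-1], v)] for v in nbrs]
        walk.append(rng.choices(nbrs, weights=weights, k=1)[0])
    return walk
```

Repeating this from every node yields the corpus of sample sequences mentioned above.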
Step S108: determine a target sequence corresponding to the sample sequence using a long short-term memory model, where a mapping model is arranged between the input layer and the hidden layer of the LSTM model and is used to learn the node vectors of the nodes while the LSTM model determines the target sequence corresponding to the sample sequence.
In an optional embodiment, with the rapid development of representation learning in the image and natural-language-processing fields, many representation methods have appeared that express the nodes of a network as low-dimensional dense vectors. The various properties that nodes exhibit in the topology of the original network, for example that two nodes are connected or similar, or that two nodes belong to the same community structure, can all be reflected by Euclidean distance or cosine distance in the low-dimensional space. These low-dimensional vector representations can then be applied to downstream tasks such as node classification, link prediction, and interest recommendation.
In an optional embodiment, the similarity information between the first node and the second node is calculated by the following formula:
where S_{i,j} is the similarity information, W_{i,j} is the weight value, i is the first node, j is the second node, d_i is the first degree value, and d_j is the second degree value.
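Since the formula image itself is not reproduced in this text, the exact form of S_{i,j} is unknown. The function below is a hypothetical stand-in that uses exactly the three quantities named above (edge weight normalized by the two endpoint degrees); it illustrates the interface, not the patent's actual formula:

```python
def similarity(w_ij, d_i, d_j):
    """Hypothetical similarity score. The patent's exact formula is not
    reproduced here; this sketch assumes the edge weight normalized by
    the endpoint degrees: S_ij = w_ij / sqrt(d_i * d_j)."""
    return w_ij / (d_i * d_j) ** 0.5
```

Under this assumption, a heavy edge between two low-degree nodes scores higher than the same edge between two hubs.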
As an optional embodiment, determining the target sequence corresponding to the sample sequence using the LSTM model includes the following steps:
Step S202: input the sample sequence into the LSTM model;
Step S204: determine the target sequence corresponding to the sample sequence according to the LSTM model.
In an optional embodiment, a mapping model is arranged between the input layer and the hidden layer of the LSTM model and is used to learn the node vectors of the nodes while the LSTM model determines the target sequence corresponding to the sample sequence.
Optionally, the LSTM model may be replaced by an RNN model or a GRU model.
Fig. 2 is a schematic diagram of an optional long short-term memory model according to an embodiment of the invention. As shown in Fig. 2, the LSTM model includes: an input layer, an embedding layer (i.e., the above mapping model, used to obtain the embedding representations of nodes; it is essentially a matrix in which each row corresponds to one node), LSTM units (the hidden layer), and an output layer.
Here, the sample sequence may be v1, v2, v3, specifically a sequence sampled by the new random-walk method of the embodiments of this application, which is essentially a time-ordered sequence.
In the embodiments of this application, the sample sequence v1, v2, v3 is fed into the input layer of the LSTM model, and the LSTM model determines the corresponding target sequence v2, v3, v4.
However, in the embodiments of this application, what is of interest is not the final output of the output layer; rather, a mapping model is added between the input layer and the hidden layer to obtain the representations of the nodes. That is, the node vectors are learned while the LSTM model determines the target sequence corresponding to the sample sequence: by iteratively optimizing the prediction process of the LSTM model, the representation in the embedding layer is continually updated, yielding the representation of the network.
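The described architecture (input layer, embedding/mapping layer, recurrent hidden layer, output layer predicting the next node) can be sketched as a toy forward pass. A plain tanh RNN cell stands in for the LSTM cell to keep the sketch short; this is a structural illustration under those assumptions, not the patent's training code:

```python
import numpy as np

def forward(seq, E, Wh, Wx, Wo):
    """Toy forward pass: node ids -> embedding lookup (the mapping model
    between input and hidden layer) -> recurrent hidden state -> softmax
    scores over next-node candidates. E's rows are the node vectors that
    training would update."""
    h = np.zeros(Wh.shape[0])
    outs = []
    for v in seq:
        x = E[v]                       # learned node vector (embedding row)
        h = np.tanh(Wh @ h + Wx @ x)   # hidden-state update (RNN stand-in)
        logits = Wo @ h
        p = np.exp(logits - logits.max())
        outs.append(p / p.sum())       # distribution over the next node
    return np.array(outs)
```

Training would compare each output distribution against the shifted target sequence (v2, v3, v4 for input v1, v2, v3) and backpropagate into E, so the embedding rows end up as the node representations.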
Fig. 3 is a schematic frame diagram of an optional network representation learning model according to an embodiment of the invention. As shown in Fig. 3, the network representation learning model may include: the nodes in the network (for example, i, j, k), the mapping model in the LSTM model, the representation of the network, an adjacency matrix, an optimization processing module, a Laplacian-eigenmaps space-optimization module, and a classifier; the connections between the modules and models are as shown in Fig. 3.
In an optional embodiment, the node-encoding method of the embodiments of this application, based on a long short-term memory (LSTM) model, adds an embedding layer (the mapping model) to a conventional LSTM to learn the node vectors of the network, replacing the common Skip-Gram model of existing related work. The improved LSTM method continually trains the low-dimensional node vectors by predicting the next node from the current node; it can handle not only temporal correlation but also long-term dependencies. Based on these characteristics of the LSTM, the embodiments of this application can learn rich global information of the network.
In an optional embodiment, after learning the node vectors, the method further includes the following step:
Step S302: optimize the node vectors by a feature-mapping-space optimization technique to obtain optimized node vectors.
Optionally, the above feature-mapping-space optimization technique may be, but is not limited to, the Laplacian-eigenmaps space-optimization technique.
As an optional embodiment, since there are a large number of edges in the network and these edges store local information about the network, in order to extract local information more effectively, similar nodes in the network should remain related after being mapped into the low-dimensional space. For example, two nodes connected by an edge should be closer after optimization; in other words, the Euclidean distance between two nodes connected by an edge should be smaller.
The embodiments of this application may, but are not limited to, use the Laplacian-eigenmaps space-optimization technique to optimize the node vectors learned by the LSTM, applying a Laplacian constraint to the representations obtained by the LSTM so that similar nodes in the network are closer in the low-dimensional space. This preserves the local information of the network and makes the obtained representations of the network nodes more robust.
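The Laplacian constraint mentioned above is, in its standard form, the smoothness penalty sum_{i,j} W_{i,j} ||z_i - z_j||^2, which equals 2 tr(Z^T L Z) for the unnormalized Laplacian L = D - W. A minimal sketch, assuming this standard form is what the text refers to:

```python
import numpy as np

def laplacian_penalty(Z, W):
    """Standard Laplacian smoothness term: sum_ij W_ij ||z_i - z_j||^2.

    Z: (n, d) node vectors, W: (n, n) symmetric weight matrix.
    Small values mean strongly connected nodes sit close together
    in the low-dimensional space.
    """
    D = np.diag(W.sum(axis=1))
    L = D - W                         # unnormalized graph Laplacian
    return 2.0 * np.trace(Z.T @ L @ Z)
```

Adding this term to the LSTM's prediction loss pulls the embedding rows of connected nodes together, which is the constraint described above.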
The representation method for network nodes provided by the embodiments of this application overcomes the shortcoming that DeepWalk cannot handle weighted networks; moreover, when sampling the nodes in the network, the embodiments of this application consider not only the weights of the edges but also the degrees of the nodes.
In an optional embodiment, after optimizing the node vectors by the feature-mapping-space optimization technique to obtain the optimized node vectors, the method further includes the following steps:
Step S402: classify the nodes according to the node vectors to obtain classification results; or
Step S404: cluster the nodes according to the node vectors to obtain clustering results.
In the embodiments of this application, after the node vectors are optimized by the feature-mapping-space optimization technique, the optimized node vectors can be applied to downstream tasks such as node classification, node clustering, link prediction, and interest recommendation.
The node-encoding method in the embodiments of this application can obtain robust node-vector representations, which are then used to classify nodes. For example, a part of the nodes in an existing network can be manually labeled, a classifier can be trained with these labeled nodes, and the trained classifier can then be used to predict the remaining unlabeled nodes as well as newly generated nodes.
In an optional embodiment, the classifier used in the embodiments of this application may be, but is not limited to, any one of the following: logistic regression, a support vector machine (SVM), a decision tree, a neural network, k-nearest neighbors, naive Bayes, and so on.
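As a sketch of the classification step, a self-contained k-nearest-neighbors classifier (one of the options listed above) over node vectors; in practice an existing library implementation would be used:

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    """Classify node vectors: each unlabeled vector takes the majority
    label among its k closest labeled vectors (Euclidean distance)."""
    preds = []
    for x in X_test:
        d = np.linalg.norm(X_train - x, axis=1)       # distances to all labeled vectors
        nearest = np.argsort(d)[:k]                    # indices of the k nearest
        labels, counts = np.unique(y_train[nearest], return_counts=True)
        preds.append(labels[np.argmax(counts)])        # majority vote
    return np.array(preds)
```

Here X_train would hold the optimized node vectors of the manually labeled nodes, and X_test the vectors of unlabeled or newly generated nodes.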
In an optional embodiment, if there are too few labeled nodes to train a classifier, or the nodes have no label information at all, the embodiments of this application can instead cluster the obtained node vectors: the node vectors are fed into a clustering tool to obtain clustering results. In an optional embodiment, the clustering tool in the embodiments of this application may be any one of the following: k-means clustering, spectral clustering, Gaussian mixture clustering, and so on.
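A minimal k-means sketch over node vectors, illustrating the unsupervised path when labels are unavailable (an illustration only; production use would rely on an existing library implementation):

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Minimal k-means: assign each vector to the nearest centroid,
    then recompute centroids, repeated `iters` times."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]  # random initial centroids
    for _ in range(iters):
        labels = np.argmin(
            np.linalg.norm(X[:, None] - centers[None], axis=2), axis=1)
        for c in range(k):
            if np.any(labels == c):                    # skip empty clusters
                centers[c] = X[labels == c].mean(axis=0)
    return labels, centers
```

Applied to the optimized node vectors, the resulting labels group structurally similar nodes without any manual annotation.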
In the embodiments of the invention, the first degree value of a first node in a network, the second degree value of a second node, and the weight value of the edge between the first node and the second node are obtained, where the second node is a node adjacent to the first node; similarity information between the first node and the second node is determined according to the first degree value, the second degree value, and the weight value; the nodes in the network are sampled based on the similarity information to obtain a sample sequence; and a target sequence corresponding to the sample sequence is determined using an LSTM model, where a mapping model arranged between the input layer and the hidden layer of the LSTM model learns the node vectors while the LSTM model determines the target sequence. By considering node degrees and the weights of the edges between nodes, and by learning node representations with an LSTM model, the quality of both the sample sequences and the node vectors is improved, thereby solving the technical problem that node vectors learned by prior-art representation methods for network nodes are of low quality.
Embodiment 2
According to an embodiment of the invention, a device embodiment for implementing the above representation method for network nodes is further provided. Fig. 4 is a structural schematic diagram of a representation device for network nodes according to an embodiment of the invention. As shown in Fig. 4, the representation device for network nodes includes: an obtaining module 40, a first determining module 42, a sampling module 44, and a second determining module 46, where:
Module 40 is obtained, for obtaining the first angle value of the first node in network and the second angle value of second node, with
And the weighted value on the side between above-mentioned first node and above-mentioned second node, wherein above-mentioned second node be and above-mentioned first segment
The adjacent node of point;First determining module 42 is used for according to above-mentioned first angle value, above-mentioned second angle value and above-mentioned weighted value, really
Analog information between fixed above-mentioned first node and above-mentioned second node;Sampling module 44, for based on institute's analog information to
The node stated in network is sampled, and sample sequence is obtained;Second determining module 46, for being determined using long memory models in short-term
Target sequence corresponding with above-mentioned sample sequence, wherein above-mentioned length is provided between the input layer and hidden layer of memory models in short-term
Mapping model, above-mentioned mapping model is in above-mentioned length, memory models to determine target sequence corresponding with above-mentioned sample sequence in short-term
During, learn the knot vector of above-mentioned node.
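The mapping model between the LSTM's input layer and hidden layer can be pictured as a trainable lookup table: sampled node IDs are mapped to dense vectors that feed the LSTM cell, and those vectors are what the method ultimately learns as node representations. The sketch below shows only this structural role; training and the LSTM cell itself are omitted, and all names are hypothetical.

```python
import random

class MappingModel:
    """Toy mapping layer sitting between an LSTM's input and hidden layers.

    Each node ID is mapped to a trainable dense vector; after training,
    row i of the table would be taken as the learned vector of node i.
    The LSTM itself is omitted -- this only illustrates where the node
    vectors live in the architecture.
    """
    def __init__(self, num_nodes, dim, rng=random.Random(0)):
        self.table = [[rng.uniform(-0.5, 0.5) for _ in range(dim)]
                      for _ in range(num_nodes)]

    def forward(self, sample_sequence):
        # The sampled node IDs become the dense inputs fed to the LSTM cell.
        return [self.table[node] for node in sample_sequence]

    def node_vector(self, node):
        return self.table[node]

m = MappingModel(num_nodes=4, dim=8)
dense_inputs = m.forward([0, 2, 1])
```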
It should be noted here that the obtaining module 40, the first determining module 42, the sampling module 44, and the second determining module 46 correspond to steps S102 to S108 in Embodiment 1; the examples and application scenarios implemented by these modules are the same as those of the corresponding steps, but are not limited to what is disclosed in Embodiment 1. It should also be noted that the above modules, as part of the device, may run in a terminal.
In an alternative embodiment, the second determining module includes: an input unit, configured to input the sample sequence into the LSTM model; and a determination unit, configured to determine the target sequence corresponding to the sample sequence according to the LSTM model.
In an alternative embodiment, the device further includes an optimization module, configured to optimize the node vectors through a feature-mapping-space optimization technique to obtain optimized node vectors.
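The patent does not name a concrete feature-mapping-space optimization technique. Purely as a stand-in, the sketch below post-processes a node vector by L2 normalization; this is an assumption for illustration, not the patent's method.

```python
def l2_normalize(vec):
    """Project a node vector onto the unit sphere.

    The feature-mapping-space optimization technique is unspecified in the
    patent; L2 normalization is used here only as an illustrative
    post-processing step on a learned node vector.
    """
    norm = sum(x * x for x in vec) ** 0.5
    return [x / norm for x in vec] if norm else vec

opt = l2_normalize([3.0, 4.0])  # -> [0.6, 0.8]
```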
It should be noted that each of the above modules may be implemented in software or in hardware. In the latter case, this may be achieved as follows: the above modules may all be located in the same processor, or the above modules may be located in different processors in any combination.
It should be noted that for optional or preferred implementations of this embodiment, reference may be made to the related description in Embodiment 1, which is not repeated here.
The network node representation device may further include a processor and a memory. The obtaining module 40, the first determining module 42, the sampling module 44, the second determining module 46, and so on are stored in the memory as program units, and the processor executes these program units stored in the memory to realize the corresponding functions.
The processor contains a kernel, and the kernel fetches the corresponding program units from the memory; one or more kernels may be provided. The memory may include forms such as non-persistent memory in a computer-readable medium, random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM); the memory includes at least one memory chip.
According to an embodiment of the present application, a storage medium embodiment is also provided. Optionally, in this embodiment, the storage medium includes a stored program, wherein, when the program runs, the device on which the storage medium is located is controlled to execute any one of the above network node representation methods.
Optionally, in this embodiment, the storage medium may be located in any computer terminal of a computer terminal group in a computer network, or in any mobile terminal of a mobile terminal group, and the storage medium includes a stored program.
Optionally, when the program runs, the device on which the storage medium is located is controlled to perform the following functions: obtaining a first degree value of a first node and a second degree value of a second node in a network, and a weight value of the edge between the first node and the second node, wherein the second node is a node adjacent to the first node; determining similarity information between the first node and the second node according to the first degree value, the second degree value, and the weight value; sampling the nodes in the network based on the similarity information to obtain a sample sequence; and determining, using a long short-term memory (LSTM) model, a target sequence corresponding to the sample sequence, wherein a mapping model is arranged between the input layer and the hidden layer of the LSTM model, and the mapping model is used to learn the node vectors of the nodes while the LSTM model determines the target sequence corresponding to the sample sequence.
Optionally, when the program runs, the device on which the storage medium is located is controlled to perform the following functions: inputting the sample sequence into the LSTM model; and determining the target sequence corresponding to the sample sequence according to the LSTM model.
Optionally, when the program runs, the device on which the storage medium is located is controlled to perform the following functions: optimizing the node vectors through a feature-mapping-space optimization technique to obtain optimized node vectors.
Optionally, when the program runs, the device on which the storage medium is located is controlled to perform the following functions: classifying the nodes according to the node vectors to obtain a classification result; or clustering the nodes according to the node vectors to obtain a clustering result.
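The patent leaves the downstream classifier and clustering method unspecified. As one simple possibility, a node could be classified by cosine similarity of its learned vector to a few labeled reference vectors; the sketch below (all names hypothetical) illustrates that idea.

```python
def cosine(u, v):
    """Cosine similarity between two node vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = sum(a * a for a in u) ** 0.5
    nv = sum(b * b for b in v) ** 0.5
    return dot / (nu * nv)

def classify(node_vec, labeled):
    """Assign a node the label of its most similar labeled reference vector."""
    return max(labeled, key=lambda lab: cosine(node_vec, labeled[lab]))

# Two labeled reference vectors standing in for learned node vectors.
labeled = {"group_a": [1.0, 0.0], "group_b": [0.0, 1.0]}
label = classify([0.9, 0.1], labeled)  # closer to group_a
```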
Optionally, when the program runs, the device on which the storage medium is located is controlled to perform the following functions: calculating the similarity information between the first node and the second node by a formula in which Si,j is the similarity information, Wi,j is the weight value, i is the first node, j is the second node, di is the first degree value, and dj is the second degree value.
According to an embodiment of the present application, a processor embodiment is also provided. Optionally, in this embodiment, the processor is configured to run a program, wherein the program, when running, executes any one of the above network node representation methods.
An embodiment of the present application provides a device that includes a processor, a memory, and a program stored in the memory and runnable on the processor. When executing the program, the processor implements the following steps: obtaining a first degree value of a first node and a second degree value of a second node in a network, and a weight value of the edge between the first node and the second node, wherein the second node is a node adjacent to the first node; determining similarity information between the first node and the second node according to the first degree value, the second degree value, and the weight value; sampling the nodes in the network based on the similarity information to obtain a sample sequence; and determining, using a long short-term memory (LSTM) model, a target sequence corresponding to the sample sequence, wherein a mapping model is arranged between the input layer and the hidden layer of the LSTM model, and the mapping model is used to learn the node vectors of the nodes while the LSTM model determines the target sequence corresponding to the sample sequence.
Optionally, when executing the program, the processor may also input the sample sequence into the LSTM model and determine the target sequence corresponding to the sample sequence according to the LSTM model.
Optionally, when executing the program, the processor may also optimize the node vectors through a feature-mapping-space optimization technique to obtain optimized node vectors.
Optionally, when executing the program, the processor may also classify the nodes according to the node vectors to obtain a classification result, or cluster the nodes according to the node vectors to obtain a clustering result.
Optionally, when executing the program, the processor may also calculate the similarity information between the first node and the second node by a formula in which Si,j is the similarity information, Wi,j is the weight value, i is the first node, j is the second node, di is the first degree value, and dj is the second degree value.
The present invention also provides a computer program product which, when executed on a data processing device, is adapted to execute a program initialized with the following method steps: obtaining a first degree value of a first node and a second degree value of a second node in a network, and a weight value of the edge between the first node and the second node, wherein the second node is a node adjacent to the first node; determining similarity information between the first node and the second node according to the first degree value, the second degree value, and the weight value; sampling the nodes in the network based on the similarity information to obtain a sample sequence; and determining, using a long short-term memory (LSTM) model, a target sequence corresponding to the sample sequence, wherein a mapping model is arranged between the input layer and the hidden layer of the LSTM model, and the mapping model is used to learn the node vectors of the nodes while the LSTM model determines the target sequence corresponding to the sample sequence.
Optionally, when the computer program product executes the program, the sample sequence may also be input into the LSTM model, and the target sequence corresponding to the sample sequence determined according to the LSTM model.
Optionally, when the computer program product executes the program, the node vectors may also be optimized through a feature-mapping-space optimization technique to obtain optimized node vectors.
Optionally, when the computer program product executes the program, the nodes may also be classified according to the node vectors to obtain a classification result, or clustered according to the node vectors to obtain a clustering result.
Optionally, when the computer program product executes the program, the similarity information between the first node and the second node may also be calculated by a formula in which Si,j is the similarity information, Wi,j is the weight value, i is the first node, j is the second node, di is the first degree value, and dj is the second degree value.
The serial numbers of the above embodiments of the present invention are for description only and do not represent the relative merits of the embodiments.
In the above embodiments of the present invention, the description of each embodiment has its own emphasis; for a part not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided in this application, it should be understood that the disclosed technical content may be implemented in other ways. The device embodiments described above are merely exemplary. For example, the division into units may be a division by logical function, and other divisions are possible in actual implementation; for instance, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. Furthermore, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, units, or modules, and may be electrical or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the various embodiments of the present invention may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods of the various embodiments of the present invention. The aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), a removable hard disk, a magnetic disk, or an optical disk.
The above are only preferred embodiments of the present invention. It should be noted that those of ordinary skill in the art may make various improvements and modifications without departing from the principle of the present invention, and these improvements and modifications should also be regarded as falling within the protection scope of the present invention.
Claims (10)
1. A network node representation method, characterized by comprising:
obtaining a first degree value of a first node and a second degree value of a second node in a network, and a weight value of the edge between the first node and the second node, wherein the second node is a node adjacent to the first node;
determining similarity information between the first node and the second node according to the first degree value, the second degree value, and the weight value;
sampling the nodes in the network based on the similarity information to obtain a sample sequence; and
determining, using a long short-term memory (LSTM) model, a target sequence corresponding to the sample sequence, wherein a mapping model is arranged between the input layer and the hidden layer of the LSTM model, and the mapping model is used to learn the node vectors of the nodes while the LSTM model determines the target sequence corresponding to the sample sequence.
2. The method according to claim 1, characterized in that determining, using the LSTM model, the target sequence corresponding to the sample sequence comprises:
inputting the sample sequence into the LSTM model; and
determining the target sequence corresponding to the sample sequence according to the LSTM model.
3. The method according to claim 2, characterized in that, after learning the node vectors of the nodes, the method further comprises:
optimizing the node vectors through a feature-mapping-space optimization technique to obtain optimized node vectors.
4. The method according to claim 3, characterized in that, after optimizing the node vectors through the feature-mapping-space optimization technique to obtain the optimized node vectors, the method further comprises:
classifying the nodes according to the node vectors to obtain a classification result; or
clustering the nodes according to the node vectors to obtain a clustering result.
5. The method according to any one of claims 1 to 4, characterized in that the similarity information between the first node and the second node is calculated by a formula in which Si,j is the similarity information, Wi,j is the weight value, i is the first node, j is the second node, di is the first degree value, and dj is the second degree value.
6. A network node representation device, characterized by comprising:
an obtaining module, configured to obtain a first degree value of a first node and a second degree value of a second node in a network, and a weight value of the edge between the first node and the second node, wherein the second node is a node adjacent to the first node;
a first determining module, configured to determine similarity information between the first node and the second node according to the first degree value, the second degree value, and the weight value;
a sampling module, configured to sample the nodes in the network based on the similarity information to obtain a sample sequence; and
a second determining module, configured to determine, using a long short-term memory (LSTM) model, a target sequence corresponding to the sample sequence, wherein a mapping model is arranged between the input layer and the hidden layer of the LSTM model, and the mapping model is used to learn the node vectors of the nodes while the LSTM model determines the target sequence corresponding to the sample sequence.
7. The device according to claim 6, characterized in that the second determining module comprises:
an input unit, configured to input the sample sequence into the LSTM model; and
a determination unit, configured to determine the target sequence corresponding to the sample sequence according to the LSTM model.
8. The device according to claim 7, characterized in that the device further comprises:
an optimization module, configured to optimize the node vectors through a feature-mapping-space optimization technique to obtain optimized node vectors.
9. A storage medium, characterized in that the storage medium includes a stored program, wherein, when the program runs, the device on which the storage medium is located is controlled to execute the following steps: obtaining a first degree value of a first node and a second degree value of a second node in a network, and a weight value of the edge between the first node and the second node, wherein the second node is a node adjacent to the first node; determining similarity information between the first node and the second node according to the first degree value, the second degree value, and the weight value;
sampling the nodes in the network based on the similarity information to obtain a sample sequence; and determining, using a long short-term memory (LSTM) model, a target sequence corresponding to the sample sequence, wherein a mapping model is arranged between the input layer and the hidden layer of the LSTM model, and the mapping model is used to learn the node vectors of the nodes while the LSTM model determines the target sequence corresponding to the sample sequence.
10. A processor, characterized in that the processor is configured to run a program, wherein the program, when running, executes the following steps: obtaining a first degree value of a first node and a second degree value of a second node in a network, and a weight value of the edge between the first node and the second node, wherein the second node is a node adjacent to the first node; determining similarity information between the first node and the second node according to the first degree value, the second degree value, and the weight value; sampling the nodes in the network based on the similarity information to obtain a sample sequence; and determining, using a long short-term memory (LSTM) model, a target sequence corresponding to the sample sequence, wherein a mapping model is arranged between the input layer and the hidden layer of the LSTM model, and the mapping model is used to learn the node vectors of the nodes while the LSTM model determines the target sequence corresponding to the sample sequence.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810828402.9A CN108920712A (en) | 2018-07-25 | 2018-07-25 | The representation method and device of nodes |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108920712A true CN108920712A (en) | 2018-11-30 |
Family
ID=64417430
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810828402.9A Pending CN108920712A (en) | 2018-07-25 | 2018-07-25 | The representation method and device of nodes |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108920712A (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---|
CN109829500A (en) * | 2019-01-31 | 2019-05-31 | 华南理工大学 | A kind of position composition and automatic clustering method |
CN109829500B (en) * | 2019-01-31 | 2023-05-02 | 华南理工大学 | Position composition and automatic clustering method |
CN109918584A (en) * | 2019-03-25 | 2019-06-21 | 中国科学院自动化研究所 | Bit coin exchange Address Recognition method, system, device |
CN111064603A (en) * | 2019-12-04 | 2020-04-24 | 深圳大学 | Network link determination method, device and equipment |
CN112286996A (en) * | 2020-11-23 | 2021-01-29 | 天津大学 | Node embedding method based on network link and node attribute information |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Shi et al. | From semantic communication to semantic-aware networking: Model, architecture, and open problems | |
CN109685116B (en) | Image description information generation method and device and electronic device | |
CN108920712A (en) | The representation method and device of nodes | |
CN110263280B (en) | Multi-view-based dynamic link prediction depth model and application | |
US20180225549A1 (en) | Media content analysis system and method | |
CN109344884A (en) | The method and device of media information classification method, training picture classification model | |
US10204090B2 (en) | Visual recognition using social links | |
CN106649434A (en) | Cross-domain knowledge transfer tag embedding method and apparatus | |
Picard et al. | An application of swarm intelligence to distributed image retrieval | |
CN108595533B (en) | Article recommendation method based on collaborative filtering, storage medium and server | |
CN113326377B (en) | Name disambiguation method and system based on enterprise association relationship | |
CN111881350A (en) | Recommendation method and system based on mixed graph structured modeling | |
CN107391542A (en) | A kind of open source software community expert recommendation method based on document knowledge collection of illustrative plates | |
CN108449209A (en) | The social networks friend recommendation method merged based on routing information and nodal information | |
Barman et al. | Shape: A novel graph theoretic algorithm for making consensus-based decisions in person re-identification systems | |
CN110008999A (en) | Determination method, apparatus, storage medium and the electronic device of target account number | |
CN111143705A (en) | Recommendation method based on graph convolution network | |
CN108345661A (en) | A kind of Wi-Fi clustering methods and system based on extensive Embedding technologies | |
CN112559764A (en) | Content recommendation method based on domain knowledge graph | |
CN112381179A (en) | Heterogeneous graph classification method based on double-layer attention mechanism | |
CN112748941A (en) | Feedback information-based target application program updating method and device | |
CN111159242B (en) | Client reordering method and system based on edge calculation | |
CN111949885A (en) | Personalized recommendation method for scenic spots | |
CN110502701B (en) | Friend recommendation method, system and storage medium introducing attention mechanism | |
CN110147414B (en) | Entity characterization method and device of knowledge graph |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | | Application publication date: 20181130 |