CN114050975A - Heterogeneous multi-node interconnection topology generation method and storage medium - Google Patents


Info

Publication number
CN114050975A
Authority
CN
China
Prior art keywords
heterogeneous multi-node interconnection, node, topological structure, graph
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210024578.5A
Other languages
Chinese (zh)
Other versions
CN114050975B (en)
Inventor
杨宏斌
金良
胡克坤
赵雅倩
董刚
刘海威
蒋东东
晁银银
Current Assignee
Suzhou Inspur Intelligent Technology Co Ltd
Original Assignee
Suzhou Inspur Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Suzhou Inspur Intelligent Technology Co Ltd filed Critical Suzhou Inspur Intelligent Technology Co Ltd
Priority to CN202210024578.5A priority Critical patent/CN114050975B/en
Publication of CN114050975A publication Critical patent/CN114050975A/en
Application granted granted Critical
Publication of CN114050975B publication Critical patent/CN114050975B/en
Priority to PCT/CN2022/096236 priority patent/WO2023130656A1/en
Legal status: Active (granted)

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04L — TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 41/00 — Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
    • H04L 41/12 — Discovery or management of network topologies
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06N — COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 — Computing arrangements based on biological models
    • G06N 3/02 — Neural networks
    • G06N 3/04 — Architecture, e.g. interconnection topology
    • G06N 3/045 — Combinations of networks


Abstract

The invention relates to a heterogeneous multi-node interconnection topology generation method and a storage medium. The method comprises the following steps: extracting features of node information and a topological structure based on a graph convolution network model to obtain a low-dimensional vector representation of the node information and the topological structure; inputting performance parameters and the low-dimensional vector representation to a fully connected layer to generate feature integration information; inputting the feature integration information into a preset generation network to generate a heterogeneous multi-node interconnection topological structure; and acquiring a characteristic value of the heterogeneous multi-node interconnection topological structure to ensure that it meets a preset accuracy requirement. With this method, a heterogeneous multi-node interconnection topology that meets the usage requirements can be generated from the input performance parameters and the low-dimensional vector representation, cost can be minimized on the premise that the performance requirements are met, and resource utilization is improved.

Description

Heterogeneous multi-node interconnection topology generation method and storage medium
Technical Field
The invention relates to the technical field of computer networks, in particular to a heterogeneous multi-node interconnection topology generation method and a storage medium.
Background
Currently, in the heterogeneous computing field, there are many types of computing devices, such as CPUs, GPUs, FPGAs, application-specific ICs, and the like. The computing power of a single computing node varies widely: a node may be as large as a server containing multiple CPUs and multiple GPU computing cards, or as small as a single special-purpose computing chip containing hundreds of PEs (processing elements). Interfaces also differ, such as the QPI interface of Intel CPUs, NVLink of Nvidia GPUs, and SRIO of FPGAs. Multi-node interconnection application scenarios differ as well, including supercomputers, distributed computing, super-heterogeneous platforms, networks on chip, many-core CPU multi-core architectures, and multi-PE interconnection in heterogeneous acceleration chips. Supercomputers, distributed computing, and super-heterogeneous platforms use device nodes such as CPUs, GPUs, FPGAs, and application-specific ICs; networks on chip, many-core CPU multi-core architectures, and heterogeneous acceleration chips use core nodes such as CPU cores, CUDA cores, and PE arrays.
The interconnection modes between computing nodes are also diverse. Typical switching structures include the shared bus, the Crossbar (as shown in fig. 1 and fig. 2), the Ring (fig. 3 shows a topology diagram of a bus and a Ring), the star connection, and the Mesh and Torus (fig. 4 is a schematic diagram of 2D Mesh and 2D Torus distributed switch matrices, and fig. 5 is a schematic diagram of a 2D Torus), among others. Networks connecting multiple independent computers are commonly referred to as networks, intra-chip or inter-chip interconnects are referred to as fabrics, and the large-scale interconnect embedded within a single chip to connect its internal modules is referred to as a NoC (network on chip). Unlike computer networks, more specialized techniques such as QoS (quality of service) can be implemented in a NoC, deciding which request is forwarded first and which later; queue buffering can be added to the receive and transmit ports of each node's Crossbar to realize QoS priority control; more advanced flow-control strategies make fuller use of the queues; and more advanced routing and congestion-judgment algorithms compute which path to the target node is less congested. Commonly used fabric topologies include the HyperCube, which is also the topology used by Intel QPI; the Fat Tree, used by the Tianhe-2 supercomputer to connect its large number of computer nodes; the Pyramid topology; the Butterfly topology; the chordal ring and the triple ring, which Intel uses to connect the 12 cores of its Ivy Bridge CPU microarchitecture (fig. 6 shows a topology diagram of a chordal ring and a triple ring); and the Clos network topology. Fig. 7 is a schematic diagram of the internal architecture of an image-processing dedicated chip, in which 6 Crossbar switches form a star network while 4 Crossbar switches of 13 × 13 are connected in series to form a Ring; overall, a hybrid topology is adopted.
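As a concrete illustration of these topologies (not part of the patent), the NumPy sketch below builds adjacency matrices for a Ring and a 2D Mesh; a Torus would simply add the wrap-around links.

```python
import numpy as np

def ring(n):
    """Adjacency matrix of an n-node Ring: each node links to its two neighbors."""
    a = np.zeros((n, n), dtype=int)
    for i in range(n):
        a[i, (i + 1) % n] = a[(i + 1) % n, i] = 1
    return a

def mesh2d(rows, cols):
    """Adjacency matrix of a 2D Mesh: each node links to its right and down neighbors."""
    n = rows * cols
    a = np.zeros((n, n), dtype=int)
    for r in range(rows):
        for c in range(cols):
            i = r * cols + c
            if c + 1 < cols:                    # horizontal link
                a[i, i + 1] = a[i + 1, i] = 1
            if r + 1 < rows:                    # vertical link
                a[i, i + cols] = a[i + cols, i] = 1
    return a

# A Ring on n nodes has exactly n links; a rows×cols Mesh has
# rows*(cols-1) + cols*(rows-1) links.
print(ring(4).sum() // 2)       # 4
print(mesh2d(3, 3).sum() // 2)  # 12
```

The link counts make the cost trade-off discussed below concrete: richer topologies buy bandwidth with more interconnection lines.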
Because different computing devices or cores provide different computing power, and the types, numbers, and bandwidths of their interfaces differ, the interconnection topologies among computing nodes are various. High computational complexity, many interfaces, and many interconnection lines mean correspondingly high power consumption and cost; if the computing nodes and the interconnections are mismatched, either the nodes or the interconnection lines sit idle. Therefore, when deploying a computing task or designing a heterogeneous multi-node hardware circuit, choosing the type and number of nodes and the topology with which to interconnect them, so as to achieve low power consumption and low cost while meeting the required computing performance, is an optimization problem.
That is, unlike the chip placement-and-routing task, where the units and connection relations are fixed by the netlist, heterogeneous multi-node topology generation selects the computing units and connection relations according to the computing task. Each computing node differs in performance and cost. More interconnection lines between two nodes give higher communication bandwidth but also higher cost; conversely, if a cascade approach is adopted, more stages mean greater delay and a higher probability of channel congestion.
Therefore, there is an urgent need for a heterogeneous multi-node interconnection topology generation method and storage medium that can satisfy the performance requirements and the topological-relation requirements simultaneously.
Disclosure of Invention
In order to solve the above technical problems, the invention provides a heterogeneous multi-node interconnection topology generation method and a storage medium, which allow flexible combination of topology structures based on computing nodes, control cost to the maximum extent on the premise of meeting the topology's performance requirements, and finally generate a computing architecture that satisfies both the performance requirements and the topology structure.
In order to achieve the above object, the present application proposes a first technical solution:
a heterogeneous multi-node interconnection topology generation method comprises the following steps: extracting node information and a topological structure according to the characteristics based on the graph convolution network model so as to obtain low-dimensional vector representation of the node information and the topological structure; inputting performance parameters and the low-dimensional vector representation to a full-connection layer to generate feature integration information; inputting the feature integration information into a preset generation network to generate a heterogeneous multi-node interconnection topological structure; and acquiring a characteristic value of the heterogeneous multi-node interconnection topological structure, and ensuring that the heterogeneous multi-node interconnection topological structure meets a preset accuracy requirement.
In an embodiment of the present invention, the generation network specifically includes: an upsampling layer, a convolutional layer, a fully connected layer, a batch normalization layer, a rectified linear unit (ReLU), and a sigmoid function.
In one embodiment of the invention, the low-dimensional vector representation comprises a node embedding vector and a connection embedding vector. The node embedding vector of the graph convolution network model is obtained based on the following formula:

v_i = mean({ e_ij : v_j ∈ N(v_i) })

where v_i represents a node, N(v_i) represents the neighbor nodes of v_i, e_ij represents the connection between v_i and v_j, and mean represents a mean function. The connection embedding vector of the graph convolution network model is obtained based on the following formula:

e_ij = f_c1( concat( f_c0(v_i), f_c0(v_j), w_ij^e ) )

where f_c0 and f_c1 are two feed-forward networks of different sizes, w_ij^e is a learnable 1 × 1 weight corresponding to the adjacent edge, concat is a splicing (concatenation) function that creates a node vector from the node features, and v_i and v_j each represent a node.
In an embodiment of the present invention, the upsampling process of the upsampling layer specifically includes: assume the feature integration information is a graph S(V, E) comprising V vertices and E adjacent edges; based on S(V, E), perform the following operations in sequence: map S(V, E) to a graph S'(V', E') comprising V × n vertices and E × m adjacent edges; generate a first adjacency matrix based on S'(V', E') and obtain its initial value; train the first adjacency matrix from that initial value to obtain its optimal value A'*.
In one embodiment of the invention, the vertex features of S'(V', E') are obtained based on the following formula:

f_i^in = Σ_j A'*(i, j) · f_j

where f_i^in denotes a vertex feature of S'(V', E'); k denotes the geodesic distance between vertex j and vertex i in S'(V', E'); A'*, the optimal value of the first adjacency matrix, supplies the aggregation weight A'*(i, j) for each of the V × n vertices (a weight that depends on the geodesic distance k); and f_j denotes a vertex feature of S(V, E).
In one embodiment of the present invention, the convolutional layer generates a global graph and an independent graph based on the upsampling result of the upsampling layer, and performs a convolution operation based on the global graph and the independent graph. The convolution operation specifically comprises: initialize the independent graph as the graph S(V, E), and generate the independent graph based on the following formula:

C_k = SoftMax( (f_in W_θk)(f_in W_ψk)^T )

where C_k denotes the independent graph, f_in denotes the vertex features of S'(V', E'), W_θk and W_ψk are respectively the parameters of the embedding functions θ and ψ, and SoftMax is a normalization function. The normalization function is:

C_k(i, j) = exp( θ(v_i)^T ψ(v_j) ) / Σ_{j=1}^{N} exp( θ(v_i)^T ψ(v_j) )

where N denotes the number of vertices of S'(V', E'), and W_θk and W_ψk are 1 × 1 convolution layers with different initial values.
In one embodiment of the present invention, the characteristic value of the heterogeneous multi-node interconnection topology is obtained based on the following formula:

f_out = Σ_{k=1}^{K_v} W_k f_in ( B_k + α C_k )

where B_k denotes the global graph, C_k denotes the independent graph, α is the parameter adjusting the weight of the independent graph, f_in denotes the vertex features of S'(V', E'), K_v denotes the kernel size of the spatial dimension, and W_k denotes the weight vector of the 1 × 1 convolution operation.
In an embodiment of the present invention, the step of ensuring that the heterogeneous multi-node interconnection topology meets a preset accuracy requirement specifically includes: perform a cross-entropy loss operation on the heterogeneous multi-node interconnection topology and a preset real heterogeneous multi-node interconnection topology based on the following formula:

L_CE = E_{x ~ P_data}[ log D(x) ] + E_{z ~ P_z}[ log(1 − D(G(z))) ]

where E denotes the expected value of the distribution function, P_data denotes the distribution of real topology samples, x is a real sample drawn from P_data, P_z denotes the distribution of the input noise, D(x) denotes the probability that the sample is judged to be real, G(z) denotes a generated heterogeneous multi-node interconnection topology graph, and z denotes the input noise. Obtain the topology reconstruction loss based on the following formula:

L_rec = d( P_t , P̂_t )

where P_t denotes the generated heterogeneous multi-node interconnection topology, P̂_t denotes the real heterogeneous multi-node interconnection topology, and d(·, ·) denotes the topological distance between corresponding nodes of the generated and real topologies. From the topology reconstruction loss and the cross-entropy loss, obtain the final loss of the heterogeneous multi-node interconnection topology based on the following formula:

L = L_CE + λ · L_rec

where λ is the weight of the reconstruction term, L_rec is the topology reconstruction loss, and L_CE is the cross-entropy loss;
and comparing the final loss of the heterogeneous multi-node interconnection topology with a preset loss threshold: if the final loss is greater than the threshold, the upsampling processing and the graph convolution operation are repeated until the final loss is no greater than the threshold, thereby ensuring that the heterogeneous multi-node interconnection topology meets the preset accuracy requirement.
In one embodiment of the invention, the performance parameters include noise signals, performance requirements, power consumption requirements, and cost requirements.
In order to achieve the above object, the present application further provides a second technical solution:
a computer-readable storage medium storing a program which, when executed by a processor, causes the processor to perform the steps of the heterogeneous multi-node interconnect topology generation method.
Compared with the prior art, the technical scheme of the invention has the following advantages:
the invention discloses a heterogeneous multi-node interconnection topology generation method and a storage medium, which are characterized by extracting node information and a topological structure based on a graph convolution network model so as to obtain low-dimensional vector representation of the node information and the topological structure; inputting performance parameters and the low-dimensional vector representation to a full-connection layer to generate feature integration information; inputting the feature integration information into a preset generation network to generate a heterogeneous multi-node interconnection topological structure; and ensuring that the heterogeneous multi-node interconnection topological structure meets the preset accuracy requirement based on the pre-constructed discrimination network. According to the method for generating the heterogeneous multi-node interconnection topology, the heterogeneous multi-node interconnection topology structure which meets the use requirement can be generated based on the input performance parameters and the low-dimensional vector expression, the cost can be reduced to the greatest extent on the premise that the performance requirement is met, and the resource utilization rate is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
FIG. 1 is a schematic representation of a prior art Crossbar;
FIG. 2 is a schematic diagram of a prior art Crossbar cascade;
FIG. 3 is a prior art topology of a bus and Ring;
FIG. 4 is a schematic diagram of a prior art 2D Mesh and 2D Torus distributed switching matrix;
FIG. 5 is a schematic diagram of a prior art 2D Torus configuration;
FIG. 6 is a schematic representation of a prior art chordal ring and triple ring topology;
FIG. 7 is a diagram of the internal architecture of a chip dedicated to image processing in the prior art;
FIG. 8 is a flow chart of a method of the present invention;
FIG. 9 is a schematic diagram of an application environment of the method of the present invention;
FIG. 10 is a schematic diagram of the internal operation structure of the generation network according to the present invention;
FIG. 11 is a diagram illustrating the internal operation of the convolutional layer according to the present invention;
FIG. 12 is a schematic diagram of the internal operation structure of the discrimination network according to the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The first embodiment is as follows:
referring to fig. 8, fig. 8 is a flowchart of a method according to a first embodiment.
The method of the present embodiment is applied to the application environment shown in fig. 9. The method of the embodiment comprises the following steps:
and step S1, extracting node information and a topological structure based on the graph convolution network model by the characteristics to obtain low-dimensional vector representation of the node information and the topological structure.
In one embodiment, the graph convolution network model extracts features of node information and topological structures from a node information library and a topology structure library. The node information library comprises node models built from features such as node computing power, node core count, and node interface bandwidth; the topology structure library comprises topology models built from features such as connection mode, number of connection points, number of cascade stages, connection-line density, and connection-line length. The performance parameters include performance, power consumption, cost, and the like.
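As an illustration only, the two libraries described above could be modeled as simple records; the field names and sample values below are assumptions for the sketch, not the patent's schema.

```python
from dataclasses import dataclass

@dataclass
class NodeModel:
    """Entry in the node information library (field names assumed)."""
    kind: str                  # e.g. CPU / GPU / FPGA / application-specific IC
    compute_tflops: float      # node computing power
    cores: int                 # node core count
    if_bandwidth_gbps: float   # node interface bandwidth

@dataclass
class TopologyModel:
    """Entry in the topology structure library (field names assumed)."""
    connection_mode: str       # e.g. "ring", "mesh", "crossbar"
    num_points: int            # number of connection points
    cascade_stages: int        # number of cascade stages
    line_density: float        # density of connection lines

# Hypothetical library contents used as model inputs.
node_lib = [NodeModel("GPU", 19.5, 6912, 600.0),
            NodeModel("FPGA", 1.5, 1, 100.0)]
topo_lib = [TopologyModel("ring", 8, 1, 0.25)]
print(len(node_lib), len(topo_lib))
```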
In one embodiment, given the performance parameters, the node information, and the low-dimensional vector representation of the topology structure, the node embedding vector of the graph convolution network model is obtained based on the following formula:

v_i = mean({ e_ij : v_j ∈ N(v_i) })

where v_i represents a node, N(v_i) represents the neighbor nodes of v_i, e_ij represents the connection between v_i and v_j, and mean represents a mean function. The connection embedding vector of the graph convolution network model is obtained based on the following formula:

e_ij = f_c1( concat( f_c0(v_i), f_c0(v_j), w_ij^e ) )

where f_c0 and f_c1 are two feed-forward networks of different sizes, w_ij^e is a learnable 1 × 1 weight corresponding to the adjacent edge, concat is a splicing (concatenation) function that creates a node vector from the node features, and v_i and v_j each represent a node.
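A minimal NumPy sketch of the two embedding formulas above; the layer sizes, tanh activations, and random initialization are illustrative assumptions, not values from the patent.

```python
import numpy as np

rng = np.random.default_rng(0)
d_node, d_hid, d_edge = 8, 16, 16       # assumed dimensions

W_c0 = rng.normal(size=(d_node, d_hid))          # f_c0: first feed-forward net
W_c1 = rng.normal(size=(2 * d_hid + 1, d_edge))  # f_c1: second, different size

def edge_embedding(v_i, v_j, w_ij):
    # e_ij = f_c1(concat(f_c0(v_i), f_c0(v_j), w_ij^e))
    h_i, h_j = np.tanh(v_i @ W_c0), np.tanh(v_j @ W_c0)
    return np.tanh(np.concatenate([h_i, h_j, [w_ij]]) @ W_c1)

def node_embedding(v_i, neighbors, weights):
    # v_i = mean of the embeddings of its incident edges
    return np.mean([edge_embedding(v_i, v_j, w)
                    for v_j, w in zip(neighbors, weights)], axis=0)

v = rng.normal(size=(4, d_node))        # four node feature vectors
emb = node_embedding(v[0], [v[1], v[2]], [1.0, 0.5])
print(emb.shape)
```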
Step S2: input the node embedding vector, the connection embedding vector, and the performance parameters to a fully connected layer, where they are fused into feature integration information; then input the feature integration information into a preset generation network to generate the heterogeneous multi-node interconnection topological structure.
In one embodiment, the generation network includes, but is not limited to, an upsampling layer and a convolutional layer; fig. 10 is a schematic diagram of the generation network described in this application. The spatial upsampling layer applies the aggregation function defined by the adjacency matrix A'* to map a graph S(V, E) containing V vertices and E edges to a larger graph S'(V', E'). By assigning different importance to the new set of vertices, the network can learn A'* so as to achieve a good upsampling of the graph. The upsampling process of the upsampling layer specifically includes: map the graph S(V, E) to a graph S'(V', E') comprising V × n vertices and E × m adjacent edges; generate a first adjacency matrix based on S'(V', E') and obtain its initial value; train the first adjacency matrix from that initial value to obtain its optimal value A'*. The vertex features of S'(V', E') are obtained based on the following formula:

f_i^in = Σ_j A'*(i, j) · f_j

where f_i^in denotes a vertex feature of S'(V', E'); k denotes the geodesic distance between vertex j and vertex i in S'(V', E'); A'*, the optimal value of the first adjacency matrix, supplies the aggregation weight A'*(i, j) for each of the V × n vertices (a weight that depends on the geodesic distance k); and f_j denotes a vertex feature of S(V, E). It is to be understood that the values of n and m may be the same or different.
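The upsampling step above amounts to multiplying the original vertex features by an aggregation matrix; in this sketch a random row-stochastic matrix stands in for the trained optimal value A'*, and all dimensions are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
V, d, n = 4, 8, 3                  # original vertices, feature dim, scale factor

f = rng.normal(size=(V, d))        # vertex features f_j of S(V, E)

# Row-stochastic stand-in for the trained aggregation matrix A'*:
# each of the V*n new vertices aggregates the V original vertices.
logits = rng.normal(size=(n * V, V))
A_opt = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

f_in = A_opt @ f                   # f_i^in = sum_j A'*(i, j) * f_j
print(f_in.shape)                  # features for the V*n vertices of S'
```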
In one embodiment, after upsampling at the upsampling layer, the convolutional layer performs a convolution operation based on the global graph and the independent graph. The operation process of the convolutional layer is shown in fig. 11. Specifically, the independent graph is initialized as the graph S(V, E) and generated based on the following formula:

C_k = SoftMax( (f_in W_θk)(f_in W_ψk)^T )

where C_k denotes the independent graph, f_in denotes the vertex features of S'(V', E'), W_θk and W_ψk are respectively the parameters of the embedding functions θ and ψ, and SoftMax is a normalization function. The normalization function is:

C_k(i, j) = exp( θ(v_i)^T ψ(v_j) ) / Σ_{j=1}^{N} exp( θ(v_i)^T ψ(v_j) )

where N denotes the number of vertices of S'(V', E'), and W_θk and W_ψk are 1 × 1 convolution layers with different initial values. The characteristic value of the heterogeneous multi-node interconnection topology is then obtained based on the following formula:

f_out = Σ_{k=1}^{K_v} W_k f_in ( B_k + α C_k )

where B_k denotes the global graph, C_k denotes the independent graph, α is the parameter adjusting the weight of the independent graph, f_in denotes the vertex features of S'(V', E'), K_v denotes the kernel size of the spatial dimension, and W_k denotes the weight vector of the 1 × 1 convolution operation.
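A NumPy sketch of one spatial kernel (one k) of the convolution above, combining a learned global graph B_k with the data-dependent independent graph C_k; the dimensions, α, and random parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
N, d, d_emb = 12, 8, 4                  # vertices of S', feature dim, embed dim

f_in = rng.normal(size=(N, d))          # vertex features of S'
W_theta = rng.normal(size=(d, d_emb))   # embedding function theta (1x1 conv)
W_psi = rng.normal(size=(d, d_emb))     # embedding function psi, different init
B = rng.normal(size=(N, N))             # global graph B_k (learned)
W = rng.normal(size=(d, d))             # 1x1 convolution weight W_k
alpha = 0.5                             # weight of the independent graph

# C_k = SoftMax((f_in W_theta)(f_in W_psi)^T), row-normalized over the N vertices
scores = (f_in @ W_theta) @ (f_in @ W_psi).T
C = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)

# One term of f_out = sum_k W_k f_in (B_k + alpha * C_k)
f_out = (B + alpha * C) @ f_in @ W
print(f_out.shape)
```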
Step S3: based on the pre-constructed discrimination network, ensure that the heterogeneous multi-node interconnection topology meets the preset accuracy requirement. The operation structure of the discrimination network is similar to that of the generation network; fig. 12 shows the internal structure of the discrimination network described in this application. Specifically, the discrimination network uses an aggregation matrix A'' whose trainable weights differ from the weights learned by the generation network. Since this aggregation maps from the larger graph S'(V', E') to a smaller graph S1(V1, E1), the vertex features of S1(V1, E1) are obtained based on the following formula:

f_i = Σ_j A''(i, j) · f_j

where f_i denotes a vertex feature of S1(V1, E1); f_j denotes a vertex feature of S'(V', E'); k denotes the geodesic distance between vertex j and vertex i in S'(V', E'); and A'' is the aggregation matrix with trainable weights.
In one embodiment, the step of ensuring that the heterogeneous multi-node interconnection topological structure meets the preset accuracy requirement specifically includes: performing a cross entropy loss operation on the heterogeneous multi-node interconnection topological structure and a preset real heterogeneous multi-node interconnection topological structure based on the following formula:

$$\min_G \max_D V(D, G) = \mathbb{E}_{x \sim P_{data}}[\log D(x)] + \mathbb{E}_{z \sim P_z}[\log(1 - D(G(z)))]$$

wherein E represents the expected value of the distribution function; P_data represents the distribution of the actual topology samples; x is a real sample in P_data; P_z represents the distribution of the input noise; D(x) represents the probability of judging the sample to be correct; G(z) represents the generated heterogeneous multi-node interconnection topological graph; and z represents the input noise. A topology reconstruction loss result is obtained based on the following formula:

$$L_{rec} = \sum_{i} d\big(P_t^{(i)}, \hat{P}_t^{(i)}\big)$$

wherein P_t represents the heterogeneous multi-node interconnection topology; \hat{P}_t represents the real heterogeneous multi-node interconnection topology; and d(·,·) represents the topological distance between corresponding nodes of the heterogeneous multi-node interconnection topological structure and the real heterogeneous multi-node interconnection topological structure. The final loss of the heterogeneous multi-node interconnection topological structure is then obtained from the topology reconstruction loss and the cross entropy loss based on the following formula:

$$L = L_{CE} + \lambda L_{rec}$$

wherein λ is the weight of the reconstruction term, L_rec is the topology reconstruction loss, and L_CE is the cross entropy loss. The final loss of the heterogeneous multi-node interconnection topological structure is compared with the loss of a preset heterogeneous multi-node interconnection topological structure; if the final loss is greater than the preset loss, step S2 is repeatedly executed until the final loss is not greater than the preset loss, thereby ensuring that the heterogeneous multi-node interconnection topological structure meets the preset accuracy requirement.
Example two:
the method of the embodiment comprises the following steps: extracting features of node information and a topological structure based on the graph convolution network model, so as to obtain a low-dimensional vector representation of the node information and the topological structure; inputting performance parameters and the low-dimensional vector representation to a full-connection layer to generate feature integration information; inputting the feature integration information into a preset generation network to generate a heterogeneous multi-node interconnection topological structure; and ensuring, based on the pre-constructed discrimination network, that the heterogeneous multi-node interconnection topological structure meets the preset accuracy requirement.
In one embodiment, the generation network specifically includes an upsampling layer and a convolutional layer; the convolutional layer generates the heterogeneous multi-node interconnection topological structure based on the upsampling result of the upsampling layer.
In one embodiment, the low-dimensional vector representation comprises a node embedding vector and a connection embedding vector. The node embedding vector of the graph convolution network model is acquired based on the following formula:

$$emb(v_i) = mean\big(\{\, e_{ij} \mid v_j \in N(v_i) \,\}\big)$$

wherein v_i represents a node; N(v_i) represents the neighbor nodes of the node v_i; e_ij represents the connection between the nodes v_i and v_j; and mean represents the mean function. The connection embedding vector of the graph convolution network model is obtained based on the following formula:

$$emb(e_{ij}) = f_{c1}\big(w^{e}_{ij} \cdot concat(f_{c0}(v_i), f_{c0}(v_j))\big)$$

wherein f_c0 and f_c1 denote two feed-forward networks of different sizes; w^e_ij is a learnable 1x1 weight for the corresponding adjacent edge; concat represents the splicing function that creates the connection vector from the node features; and v_i and v_j each represent a node.
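A minimal sketch of the two embeddings, with toy functions standing in for the two trained feed-forward networks f_c0 and f_c1 (all concrete functions, weights, and shapes here are assumptions for illustration):

```python
import numpy as np

def connection_embedding(v_i, v_j, f_c0, f_c1, w_ij):
    """emb(e_ij) = f_c1(w_ij * concat(f_c0(v_i), f_c0(v_j)))."""
    z = np.concatenate([f_c0(v_i), f_c0(v_j)])   # splice the node features
    return f_c1(w_ij * z)                        # learnable 1x1 weight w_ij

def node_embedding(i, neighbors, edge_emb):
    """emb(v_i) = mean of the connection embeddings e_ij over N(v_i)."""
    return np.mean([edge_emb[(i, j)] for j in neighbors[i]], axis=0)

# Toy stand-ins for the two feed-forward networks of different sizes.
f_c0 = lambda x: 2.0 * x
f_c1 = lambda x: x + 1.0

e_01 = connection_embedding(np.array([1.0]), np.array([3.0]), f_c0, f_c1, 0.5)
# concat -> (2, 6); * 0.5 -> (1, 3); + 1 -> (2, 4)
emb_v0 = node_embedding(0, {0: [1]}, {(0, 1): e_01})
```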
In one embodiment, the upsampling process of the upsampling layer specifically includes: assuming that the feature integration information is a graph S(V, E) comprising V vertices and E adjacent edges, the following operations are performed in sequence based on the graph S(V, E): mapping the graph S(V, E) to a graph S' comprising N × N vertices and E × m adjacent edges; generating a first adjacency matrix based on the graph S' and obtaining an initial value of the first adjacency matrix; and training the first adjacency matrix based on the initial value of the first adjacency matrix to obtain the optimal value W of the first adjacency matrix.

In one embodiment, the vertex features of the graph S' are obtained based on the following formula:

$$f_{in} = \sum_{j} W(k_{ij})\, f_j$$

wherein f_in denotes a vertex feature of the graph S'; k_ij denotes the geodesic distance between vertex j and vertex i of the graph S'; W is the optimal value of the first adjacency matrix, used for calculating the weight of any vertex among the N × N vertices; and f_j denotes a vertex feature of the graph S(V, E).
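The steps above can be sketched as a toy training loop that moves the aggregation matrix from its initial value toward an optimum and then computes f_in = Σ_j W(k_ij) f_j. The squared-error objective, the 1/(1+k) attenuation, and all names are assumptions; the patent only states that the matrix is trained from its initial value.

```python
import numpy as np

def upsample(f_S, k, W):
    """f_in = sum_j W(k_ij) * f_j : features of the N*N-vertex graph S'."""
    return (W / (1.0 + k)) @ f_S

def train_aggregation(f_S, k, target, W0, lr=0.01, steps=500):
    """Gradient descent from the initial value W0 toward an optimal W
    under an assumed objective 0.5 * ||upsample(W) - target||^2."""
    W = W0.copy()
    for _ in range(steps):
        pred = upsample(f_S, k, W)
        grad = ((pred - target) @ f_S.T) / (1.0 + k)  # dL/dW
        W -= lr * grad
    return W

rng = np.random.default_rng(1)
f_S = rng.standard_normal((4, 2))          # smaller graph S: 4 vertices
k = rng.uniform(0.0, 1.0, size=(9, 4))     # 3x3 = 9 target vertices
W_true = rng.standard_normal((9, 4))
target = upsample(f_S, k, W_true)
W0 = np.zeros((9, 4))                      # initial value of the matrix
W_opt = train_aggregation(f_S, k, target, W0)
err0 = np.linalg.norm(upsample(f_S, k, W0) - target)
err1 = np.linalg.norm(upsample(f_S, k, W_opt) - target)
```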
In one embodiment, the convolutional layer generates a global graph and an independent graph based on the upsampling result of the upsampling layer, and performs a convolution operation based on the global graph and the independent graph. The convolution operation specifically includes: initializing the independent graph as the graph S(V, E), and generating the independent graph based on the following formula:

$$C_k = SoftMax\big(f_{in}^{T} W_{\theta k}^{T} W_{\psi k} f_{in}\big)$$

wherein C_k denotes the independent graph; f_in denotes a vertex feature of the graph S'; W_θk and W_ψk are respectively the parameters of the embedding functions θ and ψ; and SoftMax is the normalization function:

$$SoftMax(x)_i = \frac{\exp(x_i)}{\sum_{j=1}^{N} \exp(x_j)}$$

wherein N represents the number of vertices of the graph S', and W_θk and W_ψk each represent 1x1 convolution layers with different initial values.
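The independent graph can be sketched with plain matrices standing in for the 1x1 convolution embeddings θ and ψ (the shapes and names below are assumptions):

```python
import numpy as np

def softmax_rows(x):
    """Row-wise normalization over the N vertices."""
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def independent_graph(f_in, W_theta, W_psi):
    """C_k = SoftMax(f_in^T W_theta^T W_psi f_in).

    f_in:    (C, N) vertex features (channels x N vertices)
    W_theta: (Ce, C) parameters of the embedding function theta
    W_psi:   (Ce, C) parameters of the embedding function psi
    Returns: (N, N) data-dependent adjacency C_k, each row summing to 1
    """
    theta = W_theta @ f_in            # (Ce, N)
    psi = W_psi @ f_in                # (Ce, N)
    return softmax_rows(theta.T @ psi)

rng = np.random.default_rng(2)
f_in = rng.standard_normal((4, 6))                 # 4 channels, 6 vertices
C_k = independent_graph(f_in,
                        rng.standard_normal((2, 4)),   # theta: one initial value
                        rng.standard_normal((2, 4)))   # psi: a different one
```

Because C_k is computed from the sample's own features, it captures a topology specific to each input rather than one shared across the dataset.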
In one embodiment, the characteristic value of the heterogeneous multi-node interconnection topological structure is obtained based on the following formula:

$$f_{out} = \sum_{k=1}^{K_v} W_k\, f_{in}\,(B_k + \alpha C_k)$$

wherein B_k represents the global graph; C_k represents the independent graph; α is a parameter for adjusting the weight of the independent graph; f_in denotes a vertex feature of the graph S'; K_v represents the kernel size of the spatial dimension, i.e., the number of subgraphs; and W_k represents the weight vector of the 1x1 convolution operation. The specific operation process inside the convolutional layer is shown in fig. 11, in which B_k is the global graph and is unique to each layer; C_k is the independent graph used to learn the specific topology of each sample; θ and ψ are two embedding functions, here 1x1 convolutional layers; ⊕ represents the residual operation; ⊗ represents the matrix multiplication operation; and α is the gate that controls the importance weights of the two graphs. The importance of the independent graph in different layers is adjusted by this gating mechanism: each layer uses a different α value, which is learned and updated through training.
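The gated convolution f_out = Σ_k W_k f_in (B_k + α C_k) can be sketched as follows; the shapes are assumptions, and W_k is realized as a plain matrix in place of a 1x1 convolution.

```python
import numpy as np

def gated_graph_conv(f_in, B, C, W, alpha):
    """f_out = sum_{k=1..K_v} W_k @ f_in @ (B_k + alpha * C_k).

    f_in:  (C_in, N) vertex features
    B:     (K_v, N, N) global graphs (unique to each layer)
    C:     (K_v, N, N) independent graphs learned per sample
    W:     (K_v, C_out, C_in) per-subgraph 1x1-convolution weights
    alpha: gate adjusting the importance weight of the independent graph
    """
    out = np.zeros((W.shape[1], f_in.shape[1]))
    for k in range(B.shape[0]):
        out += W[k] @ f_in @ (B[k] + alpha * C[k])
    return out

# Sanity case: with identity weights, identity global graphs and the gate
# closed (alpha = 0), the layer reduces to K_v copies of f_in.
K_v, C_in, N = 3, 2, 4
f_in = np.arange(8, dtype=float).reshape(C_in, N)
B = np.stack([np.eye(N)] * K_v)
C = np.zeros((K_v, N, N))
W = np.stack([np.eye(C_in)] * K_v)
f_out = gated_graph_conv(f_in, B, C, W, alpha=0.0)
```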
In one embodiment, the step of ensuring that the heterogeneous multi-node interconnection topological structure meets the preset accuracy requirement specifically includes:

performing a cross entropy loss operation on the heterogeneous multi-node interconnection topological structure and a preset real heterogeneous multi-node interconnection topological structure based on the following formula:

$$\min_G \max_D V(D, G) = \mathbb{E}_{x \sim P_{data}}[\log D(x)] + \mathbb{E}_{z \sim P_z}[\log(1 - D(G(z)))]$$

wherein E represents the expected value of the distribution function, P_data represents the distribution of the actual topology samples, x is a real sample in P_data, P_z represents the distribution of the input noise, D(x) represents the probability of judging the sample to be correct, G(z) represents the generated heterogeneous multi-node interconnection topological graph, and z represents the input noise; obtaining a topology reconstruction loss result based on the following formula:

$$L_{rec} = \sum_{i} d\big(P_t^{(i)}, \hat{P}_t^{(i)}\big)$$

wherein P_t represents the heterogeneous multi-node interconnection topology, \hat{P}_t represents the real heterogeneous multi-node interconnection topology, and d(·,·) represents the topological distance between corresponding nodes of the two topologies; obtaining the final loss of the heterogeneous multi-node interconnection topological structure from the topology reconstruction loss and the cross entropy loss operation result based on the following formula:

$$L = L_{CE} + \lambda L_{rec}$$

wherein λ is the weight of the reconstruction term, L_rec is the topology reconstruction loss, and L_CE is the cross entropy loss. The final loss of the heterogeneous multi-node interconnection topological structure is compared with the loss of a preset heterogeneous multi-node interconnection topological structure; if the final loss is greater than the preset loss, the upsampling operation and the graph convolution operation are repeatedly executed until the final loss is not greater than the preset loss, thereby ensuring that the heterogeneous multi-node interconnection topological structure meets the preset accuracy requirement.

The upsampling operation includes: assuming that the feature integration information is a graph S(V, E) comprising V vertices and E adjacent edges, the following operations are performed in sequence based on the graph S(V, E): mapping the graph S(V, E) to a graph S' comprising N × N vertices and E × m adjacent edges;
generating a first adjacency matrix based on the graph S' and obtaining an initial value of the first adjacency matrix; training the first adjacency matrix based on the initial value to obtain the optimal value W of the first adjacency matrix; and obtaining, based on the optimal value W, the vertex features of the graph S' according to the following formula:

$$f_{in} = \sum_{j} W(k_{ij})\, f_j$$

wherein f_in denotes a vertex feature of the graph S', k_ij denotes the geodesic distance between vertex j and vertex i of the graph S', W is used for calculating the weight of any vertex among the N × N vertices, and f_j denotes a vertex feature of the graph S(V, E).

The graph convolution operation includes: the convolutional layer generates a global graph and an independent graph based on the upsampling result of the upsampling layer, and performs a convolution operation based on the global graph and the independent graph; the convolution operation specifically includes: initializing the independent graph as the graph S(V, E), and generating the independent graph based on the following formula:
$$C_k = SoftMax\big(f_{in}^{T} W_{\theta k}^{T} W_{\psi k} f_{in}\big)$$

wherein C_k denotes the independent graph, f_in denotes a vertex feature of the graph S', and W_θk and W_ψk are respectively the parameters of the embedding functions θ and ψ; SoftMax is the normalization function:

$$SoftMax(x)_i = \frac{\exp(x_i)}{\sum_{j=1}^{N} \exp(x_j)}$$

wherein N represents the number of vertices of the graph S', and W_θk and W_ψk each represent 1x1 convolution layers with different initial values. The characteristic value of the heterogeneous multi-node interconnection topological structure is obtained based on the following formula:

$$f_{out} = \sum_{k=1}^{K_v} W_k\, f_{in}\,(B_k + \alpha C_k)$$

wherein B_k represents the global graph, C_k represents the independent graph, α is a parameter for adjusting the weight of the independent graph, f_in denotes a vertex feature of the graph S', K_v represents the kernel size of the spatial dimension, and W_k represents the weight vector of the 1x1 convolution operation.
In one embodiment, the performance parameters include noise signals, performance requirements, power consumption requirements, and cost requirements.
Example three:
the present embodiment provides a computer-readable storage medium, which stores a program, and when the program is executed by a processor, the program causes the processor to execute the steps of the method for generating a heterogeneous multi-node interconnection topology in the first embodiment.
In one embodiment, the program, when executed by the processor, causes the processor to perform the steps of: extracting features of node information and a topological structure based on the graph convolution network model, so as to obtain a low-dimensional vector representation of the node information and the topological structure; inputting performance parameters and the low-dimensional vector representation to a full-connection layer to generate feature integration information; inputting the feature integration information into a preset generation network to generate a heterogeneous multi-node interconnection topological structure; and ensuring, based on the pre-constructed discrimination network, that the heterogeneous multi-node interconnection topological structure meets the preset accuracy requirement; wherein the performance parameters include a noise signal, performance requirements, power consumption requirements, and cost requirements.
In one embodiment, the program, when executed by the processor, causes the processor to perform the steps of: and the convolution layer generates a heterogeneous multi-node interconnection topological structure based on an up-sampling processing result of the up-sampling layer.
In one embodiment, the program, when executed by the processor, causes the processor to perform the steps of: acquiring the node embedding vector of the graph convolution network model based on the following formula:

$$emb(v_i) = mean\big(\{\, e_{ij} \mid v_j \in N(v_i) \,\}\big)$$

wherein v_i represents a node, N(v_i) represents the neighbor nodes of the node v_i, e_ij represents the connection between the nodes v_i and v_j, and mean represents the mean function;

and obtaining the connection embedding vector of the graph convolution network model based on the following formula:

$$emb(e_{ij}) = f_{c1}\big(w^{e}_{ij} \cdot concat(f_{c0}(v_i), f_{c0}(v_j))\big)$$

wherein f_c0 and f_c1 denote two feed-forward networks of different sizes, w^e_ij is a learnable 1x1 weight for the corresponding adjacent edge, concat represents the splicing function that creates the connection vector from the node features, and v_i and v_j each represent a node.
In one embodiment, the program, when executed by the processor, causes the processor to perform the steps of: assuming that the feature integration information is a graph S(V, E) comprising V vertices and E adjacent edges, performing the following operations in sequence based on the graph S(V, E): mapping the graph S(V, E) to a graph S' comprising N × N vertices and E × m adjacent edges; generating a first adjacency matrix based on the graph S' and obtaining an initial value of the first adjacency matrix; and training the first adjacency matrix based on the initial value of the first adjacency matrix to obtain the optimal value W of the first adjacency matrix.

In one embodiment, the program, when executed by the processor, causes the processor to perform the step of obtaining the vertex features of the graph S' based on the following formula:

$$f_{in} = \sum_{j} W(k_{ij})\, f_j$$

wherein f_in denotes a vertex feature of the graph S', k_ij denotes the geodesic distance between vertex j and vertex i of the graph S', W is the optimal value of the first adjacency matrix used for calculating the weight of any vertex among the N × N vertices, and f_j denotes a vertex feature of the graph S(V, E).
In one embodiment, the program, when executed by the processor, causes the processor to perform the steps of: initializing the independent graph as the graph S(V, E), and generating the independent graph based on the following formula:

$$C_k = SoftMax\big(f_{in}^{T} W_{\theta k}^{T} W_{\psi k} f_{in}\big)$$

wherein C_k denotes the independent graph, f_in denotes a vertex feature of the graph S', and W_θk and W_ψk are respectively the parameters of the embedding functions θ and ψ; SoftMax is the normalization function:

$$SoftMax(x)_i = \frac{\exp(x_i)}{\sum_{j=1}^{N} \exp(x_j)}$$

wherein N represents the number of vertices of the graph S', and W_θk and W_ψk each represent 1x1 convolution layers with different initial values.
In one embodiment, the program, when executed by the processor, causes the processor to perform the step of obtaining the characteristic value of the graph convolution network model based on the following formula:

$$f_{out} = \sum_{k=1}^{K_v} W_k\, f_{in}\,(B_k + \alpha C_k)$$

wherein B_k represents the global graph, C_k represents the independent graph, α is a parameter for adjusting the weight of the independent graph, f_in denotes a vertex feature of the graph S', K_v represents the kernel size of the spatial dimension, and W_k represents the weight vector of the 1x1 convolution operation.
In one embodiment, the program, when executed by the processor, causes the processor to perform the steps of: performing a cross entropy loss operation on the heterogeneous multi-node interconnection topological structure and a preset real heterogeneous multi-node interconnection topological structure based on the following formula:

$$\min_G \max_D V(D, G) = \mathbb{E}_{x \sim P_{data}}[\log D(x)] + \mathbb{E}_{z \sim P_z}[\log(1 - D(G(z)))]$$

wherein E represents the expected value of the distribution function, P_data represents the distribution of the actual topology samples, x is a real sample in P_data, P_z represents the distribution of the input noise, D(x) represents the probability of judging the sample to be correct, G(z) represents the generated heterogeneous multi-node interconnection topological graph, and z represents the input noise; obtaining a topology reconstruction loss result based on the following formula:

$$L_{rec} = \sum_{i} d\big(P_t^{(i)}, \hat{P}_t^{(i)}\big)$$

wherein P_t represents the heterogeneous multi-node interconnection topology, \hat{P}_t represents the real heterogeneous multi-node interconnection topology, and d(·,·) represents the topological distance between corresponding nodes of the two topologies; obtaining the final loss of the heterogeneous multi-node interconnection topological structure from the topology reconstruction loss and the cross entropy loss operation result based on the following formula:

$$L = L_{CE} + \lambda L_{rec}$$

wherein λ is the weight of the reconstruction term, L_rec is the topology reconstruction loss, and L_CE is the cross entropy loss; and comparing the final loss of the heterogeneous multi-node interconnection topological structure with the loss of a preset heterogeneous multi-node interconnection topological structure, and if the final loss is greater than the preset loss, repeatedly executing the generation of the heterogeneous multi-node interconnection topological structure until the final loss is not greater than the preset loss, thereby ensuring that the heterogeneous multi-node interconnection topological structure meets the preset accuracy requirement.
As will be appreciated by one of skill in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. A heterogeneous multi-node interconnection topology generation method, characterized by comprising the following steps:
extracting features of node information and a topological structure based on a graph convolution network model, so as to obtain a low-dimensional vector representation of the node information and the topological structure;
inputting performance parameters and the low-dimensional vector representation to a full-connection layer to generate feature integration information;
inputting the feature integration information into a preset generation network to generate a heterogeneous multi-node interconnection topological structure;
and acquiring a characteristic value of the heterogeneous multi-node interconnection topological structure, and ensuring that the heterogeneous multi-node interconnection topological structure meets a preset accuracy requirement.
2. The heterogeneous multi-node interconnection topology generation method according to claim 1, wherein: the generation network specifically includes: an upsampling layer, a convolutional layer, a fully-connected layer, a batch normalization layer, a rectified linear unit, and a sigmoid function.
3. The heterogeneous multi-node interconnection topology generation method according to claim 1, wherein: the low-dimensional vector representation comprises a node embedding vector and a connection embedding vector;

the node embedding vector of the graph convolution network model is acquired based on the following formula:

$$emb(v_i) = mean\big(\{\, e_{ij} \mid v_j \in N(v_i) \,\}\big)$$

wherein v_i represents a node, N(v_i) represents the neighbor nodes of the node v_i, e_ij represents the connection between the nodes v_i and v_j, and mean represents the mean function;

the connection embedding vector of the graph convolution network model is obtained based on the following formula:

$$emb(e_{ij}) = f_{c1}\big(w^{e}_{ij} \cdot concat(f_{c0}(v_i), f_{c0}(v_j))\big)$$

wherein f_c0 and f_c1 denote two feed-forward networks of different sizes, w^e_ij denotes a learnable 1x1 weight for the corresponding adjacent edge, concat represents the splicing function that creates the connection vector from the node features, and v_i and v_j each represent a node.
4. The heterogeneous multi-node interconnection topology generation method according to claim 2, wherein: the upsampling process of the upsampling layer specifically includes: assuming that the feature integration information is a graph S(V, E) comprising V vertices and E adjacent edges, performing the following operations in sequence based on the graph S(V, E):
mapping the graph S(V, E) to a graph S' comprising N × N vertices and E × m adjacent edges;
generating a first adjacency matrix based on the graph S' and obtaining an initial value of the first adjacency matrix;
training the first adjacency matrix based on the initial value of the first adjacency matrix to obtain the optimal value W of the first adjacency matrix.
5. The heterogeneous multi-node interconnection topology generation method according to claim 4, wherein: the vertex features of the graph S' are obtained based on the following formula:

$$f_{in} = \sum_{j} W(k_{ij})\, f_j$$

wherein f_in denotes a vertex feature of the graph S', k_ij denotes the geodesic distance between vertex j and vertex i of the graph S', W is the optimal value of the first adjacency matrix used for calculating the weight of any vertex among the N × N vertices, and f_j denotes a vertex feature of the graph S(V, E).
6. The heterogeneous multi-node interconnection topology generation method according to claim 2, wherein: the convolutional layer generates a global graph and an independent graph based on the upsampling result of the upsampling layer, and performs a convolution operation based on the global graph and the independent graph; the convolution operation specifically includes:

initializing the independent graph as the graph S(V, E), and generating the independent graph based on the following formula:

$$C_k = SoftMax\big(f_{in}^{T} W_{\theta k}^{T} W_{\psi k} f_{in}\big)$$

wherein C_k denotes the independent graph, f_in denotes a vertex feature of the graph S', and W_θk and W_ψk are respectively the parameters of the embedding functions θ and ψ; SoftMax is the normalization function:

$$SoftMax(x)_i = \frac{\exp(x_i)}{\sum_{j=1}^{N} \exp(x_j)}$$

wherein N represents the number of vertices of the graph S', and W_θk and W_ψk each represent 1x1 convolution layers with different initial values.
7. The heterogeneous multi-node interconnection topology generation method according to claim 6, wherein: the characteristic value of the heterogeneous multi-node interconnection topological structure is obtained based on the following formula:

$$f_{out} = \sum_{k=1}^{K_v} W_k\, f_{in}\,(B_k + \alpha C_k)$$

wherein B_k represents the global graph, C_k represents the independent graph, α is a parameter for adjusting the weight of the independent graph, f_in denotes a vertex feature of the graph S', K_v represents the kernel size of the spatial dimension, and W_k represents the weight vector of the 1x1 convolution operation.
8. The heterogeneous multi-node interconnection topology generation method according to claim 7, wherein the step of ensuring that the heterogeneous multi-node interconnection topological structure meets the preset accuracy requirement specifically comprises the following steps:

performing a cross-entropy loss operation on the heterogeneous multi-node interconnection topological structure and a preset real heterogeneous multi-node interconnection topological structure based on the following formula:

$$\min_{G}\max_{D} V(D, G) = \mathbb{E}_{x \sim P_{data}}\left[\log D(x)\right] + \mathbb{E}_{z \sim P_z}\left[\log\left(1 - D(G(z))\right)\right]$$

where $\mathbb{E}$ denotes the expected value over a distribution, $P_{data}$ denotes the distribution of real topology samples, $x$ is a real sample drawn from $P_{data}$, $P_z$ denotes the distribution of the input noise, $z$ denotes the input noise, $D(x)$ denotes the probability that a sample is judged to be real, and $G(z)$ denotes the generated heterogeneous multi-node interconnection topological graph;
obtaining a topology reconstruction loss result based on the following formula:

$$L_{rec} = d\!\left(P_t, \hat{P}_t\right)$$

where $P_t$ denotes the generated heterogeneous multi-node interconnection topological structure, $\hat{P}_t$ denotes the real heterogeneous multi-node interconnection topological structure, and $d(\cdot,\cdot)$ denotes the topological distance between corresponding nodes of the two structures;
obtaining the final loss of the heterogeneous multi-node interconnection topological structure from the topology reconstruction loss result and the cross-entropy loss result based on the following formula:

$$L = L_{ce} + \lambda L_{rec}$$

where $\lambda$ is the weight of the reconstruction term, $L_{rec}$ is the topology reconstruction loss, and $L_{ce}$ is the cross-entropy loss;
comparing the final loss of the heterogeneous multi-node interconnection topological structure with a preset loss threshold; if the final loss is greater than the preset loss threshold, repeating the steps of claims 4-7 until the final loss is not greater than the preset loss threshold, so as to ensure that the heterogeneous multi-node interconnection topological structure meets the preset accuracy requirement.
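The composite objective of claim 8 can be sketched as follows. The node-wise Hamming distance used for the topological distance is an illustrative assumption, since the claim does not fix a particular distance function, and all function names are hypothetical:

```python
import math

def adversarial_loss(d_real, d_fake):
    """Batch estimate of the cross-entropy (GAN) objective:
    E[log D(x)] + E[log(1 - D(G(z)))], where d_real are the
    discriminator's outputs on real topologies and d_fake its
    outputs on generated ones."""
    real = sum(math.log(p) for p in d_real) / len(d_real)
    fake = sum(math.log(1.0 - p) for p in d_fake) / len(d_fake)
    return real + fake

def reconstruction_loss(topo, topo_true):
    """Topology reconstruction loss L_rec: here an illustrative
    node-wise Hamming distance between two adjacency matrices
    (number of disagreeing edges)."""
    return sum(abs(a - b) for ra, rb in zip(topo, topo_true)
               for a, b in zip(ra, rb))

def final_loss(l_ce, l_rec, lam):
    """Combined objective L = L_ce + lambda * L_rec; lam weights
    the reconstruction term against the cross-entropy term."""
    return l_ce + lam * l_rec
```

In a training loop, the final loss would then be compared against the preset threshold to decide whether another generation pass is needed.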
9. The heterogeneous multi-node interconnection topology generation method according to claim 1, wherein: the performance parameters include noise signals, performance requirements, power consumption requirements, and cost requirements.
10. A computer-readable storage medium, wherein the computer-readable storage medium stores a program which, when executed by a processor, causes the processor to perform the steps of the method according to any one of claims 1 to 9.
Application CN202210024578.5A, filed 2022-01-10: Heterogeneous multi-node interconnection topology generation method and storage medium (active; granted as CN114050975B).

Publications (2)

CN114050975A, published 2022-02-15
CN114050975B, published 2022-04-19

Also Published As

WO2023130656A1, published 2023-07-13
CN114050975B, published 2022-04-19
