CN114090238A - Edge node load prediction method and device

Edge node load prediction method and device

Info

Publication number
CN114090238A
Authority
CN
China
Prior art keywords
edge node
edge
load
nodes
node
Prior art date
Legal status
Granted
Application number
CN202111276213.3A
Other languages
Chinese (zh)
Other versions
CN114090238B (en)
Inventor
缪巍巍
曾锃
张明轩
张震
张瑞
滕昌志
李世豪
毕思博
Current Assignee
Information and Telecommunication Branch of State Grid Jiangsu Electric Power Co Ltd
Original Assignee
Information and Telecommunication Branch of State Grid Jiangsu Electric Power Co Ltd
Priority date
Filing date
Publication date
Application filed by Information and Telecommunication Branch of State Grid Jiangsu Electric Power Co Ltd
Priority to CN202111276213.3A
Publication of CN114090238A
Application granted
Publication of CN114090238B
Legal status: Active
Anticipated expiration


Classifications

    • G06F 9/505 — Allocation of resources to service a request, the resource being a machine (e.g. CPUs, servers, terminals), considering the load
    • G06N 3/04 — Neural networks; architecture, e.g. interconnection topology
    • G06N 3/084 — Learning methods; backpropagation, e.g. using gradient descent
    • G06Q 10/04 — Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • H04L 41/145 — Network analysis or design involving simulating, designing, planning or modelling of a network
    • H04L 41/147 — Network analysis or design for predicting network behaviour
    • Y04S 10/50 — Systems or methods supporting the power network operation or management, involving a certain degree of interaction with the load-side end user applications


Abstract

The invention discloses a method and a device for predicting the load of an edge node. The method comprises: determining the neighboring edge nodes of a current edge node according to an edge node topological relation graph, and collecting the features of the current edge node and the features of the neighboring edge nodes; and performing a weighted summation of the features of the current edge node and the features of the neighboring edge nodes with a trained graph neural network model, so as to calculate a predicted load value of the current edge node over a future predetermined time period. Because the method and device jointly consider the features of an edge node and the features of its neighboring edge nodes, they can provide an accurate load prediction result even for a complex topology with many edge nodes.

Description

Edge node load prediction method and device
Technical Field
The invention relates to a method and a device for predicting load of an edge node, and belongs to the technical field of communication.
Background
Existing edge node load prediction techniques mainly model the loads of edge nodes in different regions and different time periods with machine learning and time-series models. They do not account for the fact that neighboring edge nodes influence one another and that edge node loads are therefore spatially correlated, so their load prediction accuracy is low.
Disclosure of Invention
The invention aims to overcome the defects in the prior art and provide a method and a device for predicting the load of an edge node which take the influence of neighboring edge nodes into account and can therefore provide more accurate load prediction results.
To achieve this aim, the invention adopts the following technical solution:
according to an aspect of the present invention, there is provided an edge node load prediction method, including:
determining adjacent edge nodes of the current edge node according to the edge node topological relation graph, and collecting the characteristics of the current edge node and the characteristics of the adjacent edge nodes;
and performing weighted summation on the characteristics of the current edge node and the characteristics of the adjacent edge nodes by adopting the trained graph neural network model to calculate and obtain a predicted load value of the current edge node in a future preset time period.
In an embodiment of the invention, the characteristics include resource characteristics, historical load values and task request characteristics of the edge nodes.
In an embodiment of the invention, the historical load values comprise yesterday's load value, the average load value over the last week, the average load value over the last month, and the average load value over the last year.
In the embodiment of the present invention, the resource characteristics include the number of CPU cores of the edge node, the total amount of memory, the total amount of bandwidth, and the number of servers of the edge node.
In an embodiment of the invention, the task request features comprise the total number of user requests yesterday and the resource characteristics of the user requests.
In an embodiment of the present invention, the training method of the graph neural network model includes:
collecting the characteristics of each edge node in the edge node topological relation graph;
learning, according to the features of each edge node and the features of the neighboring edge nodes of each edge node, the relationship between edge node features and load with a forward propagation algorithm, so as to calculate the load value predicted by the graph neural network model for each edge node over a specified historical time period;
and, for each edge node, calculating the MSE loss between the load value predicted by the graph neural network model and the actual load value of that edge node over the specified historical time period, so as to optimize the parameters of the graph neural network model.
In an embodiment of the present invention, the method for constructing the edge node topological relation graph includes:
constructing, according to the structural information and network delay information among the edge nodes, an undirected graph by connecting with edges the edge nodes that belong to the same region, and extracting the features of each edge node at the same time;
wherein the weight of an edge in an undirected graph is defined as the network delay between edge nodes.
In an embodiment of the invention, when the number of edge nodes exceeds a preset upper limit, the scale of the edge node topological relation graph is limited by deleting long-tail edge nodes and setting an edge weight threshold.
In an embodiment of the invention, the weights of the weighted summation are jointly determined by the edge weights in the edge node topological relation graph and by the coefficient matrix of each layer of the graph neural network model.
According to another aspect of the present invention, there is provided an edge node load prediction apparatus, including:
a feature acquisition unit, configured to determine the neighboring edge nodes of the current edge node according to the edge node topological relation graph, and to collect the features of the current edge node and the features of the neighboring edge nodes;
a prediction unit, configured to perform a weighted summation of the features of the current edge node and the features of the neighboring edge nodes with the trained graph neural network model, so as to calculate a predicted load value of the current edge node over a future predetermined time period.
Compared with the prior art, the invention has the following beneficial effects:
according to the edge node load prediction method and device provided by the invention, the characteristics of the edge node and the characteristics of the adjacent edge node are comprehensively considered, and even for a complex topological structure with multiple edge nodes, a more accurate load prediction result can be provided.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings of the embodiments will be briefly described below, it being understood that the drawings described below relate only to some embodiments of the present invention and are not limiting thereof, wherein:
FIG. 1 is an exemplary flow diagram of a method for edge node load prediction according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of the graph neural network model according to an embodiment of the present invention;
FIG. 3 is a flow diagram of a method of training the graph neural network model according to an embodiment of the present invention;
FIG. 4 is an exemplary block diagram of an edge node topology network according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the embodiments of the present invention will be described clearly and completely with reference to the accompanying drawings. It is to be understood that the embodiments described are only a few embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the described embodiments of the invention without any inventive step, also belong to the scope of protection of the invention.
Embodiment one:
FIG. 1 shows an exemplary flowchart of an edge node load prediction method according to an embodiment of the present invention.
As shown in FIG. 1, in the first step, the neighboring edge nodes of the current edge node are determined according to the edge node topological relation graph, and the features of the current edge node and the features of the neighboring edge nodes are collected.
Referring to FIG. 4, which shows an exemplary structure of an edge node topology network according to an embodiment of the present invention, the edge node topological relation graph is an undirected graph. In some embodiments, it may be constructed as follows:
according to the structural information and the network delay information among the edge nodes, constructing an undirected graph among the edge nodes belonging to the same region through edge connection, and simultaneously extracting the characteristics of each edge node;
wherein the weight of an edge in an undirected graph is defined as the network delay between edge nodes.
In the edge node topological relation graph, edge nodes connected by an edge are neighboring edge nodes of each other; neighboring edge nodes are usually close to each other, for example, belonging to the same geographic area.
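As an illustration of the construction just described, the following Python sketch builds the undirected topological relation graph as a weighted adjacency matrix whose entries are network delays. The node count, the measured delay values and the function name are hypothetical; the patent does not prescribe any particular data structure.

```python
import numpy as np

def build_topology_graph(num_nodes, delays):
    """Build the undirected topological relation graph as a weighted adjacency matrix.

    `delays` is assumed to be a list of (i, j, delay) tuples measured between edge
    nodes of the same region; the edge weight is the network delay, as defined above.
    """
    A = np.zeros((num_nodes, num_nodes), dtype=float)
    for i, j, delay in delays:
        A[i, j] = delay      # undirected graph: store the weight symmetrically
        A[j, i] = delay
    return A

# Hypothetical example: four edge nodes of one region (cf. FIG. 4).
A = build_topology_graph(4, [(0, 1, 2.5), (0, 2, 4.0), (1, 2, 1.5), (2, 3, 3.0)])
```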
In some embodiments, the features of each edge node mainly fall into the following categories (a minimal sketch of assembling them into a feature vector follows the list):
a) resource features: the number of CPU cores, the total amount of memory and the total amount of bandwidth of the edge node; the number of servers of the edge node, etc.;
b) historical load: yesterday's load; the average load over the last week; the average load over the last month; the average load over the last year, etc.;
c) task request features: the total number of user requests yesterday, and the resource characteristics of those requests (such as CPU, memory, IO, bandwidth, etc.).
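The sketch below shows one possible way to assemble the three feature categories a)-c) into a single numeric feature vector for a node. The dictionary keys and example values are illustrative assumptions, not part of the original disclosure.

```python
import numpy as np

def node_feature_vector(node):
    """Concatenate the three feature categories a)-c) into one numeric vector.

    `node` is a hypothetical dict of collected metrics; the key names and units
    are illustrative, not part of the original disclosure.
    """
    resource = [node["cpu_cores"], node["mem_total_gb"],
                node["bandwidth_total_mbps"], node["server_count"]]
    history = [node["load_yesterday"], node["load_avg_week"],
               node["load_avg_month"], node["load_avg_year"]]
    requests = [node["requests_yesterday"], node["req_cpu"],
                node["req_mem"], node["req_io"], node["req_bandwidth"]]
    return np.array(resource + history + requests, dtype=float)

x = node_feature_vector({
    "cpu_cores": 64, "mem_total_gb": 256, "bandwidth_total_mbps": 10000,
    "server_count": 8, "load_yesterday": 0.62, "load_avg_week": 0.55,
    "load_avg_month": 0.51, "load_avg_year": 0.48,
    "requests_yesterday": 120000, "req_cpu": 0.40, "req_mem": 0.30,
    "req_io": 0.20, "req_bandwidth": 0.35,
})
```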
In the second step, the trained graph neural network model is adopted to carry out weighted summation on the features of the current edge node and the features of the adjacent edge nodes so as to calculate and obtain the predicted load value of the current edge node in a future preset time period.
Referring to FIG. 3, the training process of the graph neural network model learns the relationship between edge node features and load from historical loads and thereby determines the parameters of the model. Specifically, it may include the following steps:
s201: collecting the characteristics of each edge node in the edge node topological relation graph;
s202: according to the characteristics of each edge node and the characteristics of the adjacent edge nodes of each edge node, learning the relationship between the characteristics of the edge nodes and the load by adopting a forward propagation algorithm, and thus calculating and acquiring a predicted load value of each edge node in a specified historical time period based on a graph neural network model;
FIG. 2 is a diagram illustrating a neural network model according to an embodiment of the present invention. Each node X1-X4 in the input layer represents an edge node, predicted load information Y1-Y4 of each node can be output through forward propagation of the graph neural network, and the output load information is compared with real load information Z1-Z4, so that the loss function can be calculated as follows:
Loss = (1/N) · Σ_{i=1}^{N} (Z_i − Y_i)²
where Z_i is the real load data of edge node i, Y_i is the load data of edge node i predicted by forward propagation through the graph neural network, and N is the number of edge nodes; finally, the parameters of the network can be updated by gradient back-propagation.
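Because the description names forward propagation, the MSE loss and gradient back-propagation but gives no code, the following PyTorch sketch shows one way such a training step could look. The two-layer architecture, the ReLU activation, the hidden width, the learning rate and all names are assumptions made only for illustration.

```python
import torch
import torch.nn as nn

class GraphLoadPredictor(nn.Module):
    """Two-layer graph neural network of the form H_l = sigma(A · H_{l-1} · W_{l-1})."""

    def __init__(self, in_dim, hidden_dim):
        super().__init__()
        self.w0 = nn.Linear(in_dim, hidden_dim, bias=False)   # W_0
        self.w1 = nn.Linear(hidden_dim, 1, bias=False)         # W_1

    def forward(self, A, X):
        h = torch.relu(self.w0(A @ X))    # H_1 = sigma(A · X · W_0)
        y = self.w1(A @ h)                # H_2 = A · H_1 · W_1, one load per node
        return y.squeeze(-1)

def train_step(model, optimizer, A, X, z):
    """One optimisation step: forward propagation, MSE loss, gradient back-propagation."""
    optimizer.zero_grad()
    y = model(A, X)                            # predicted loads Y_i
    loss = nn.functional.mse_loss(y, z)        # (1/N) * sum_i (Z_i - Y_i)^2
    loss.backward()                            # gradient back-propagation
    optimizer.step()
    return loss.item()

# Hypothetical sizes: N edge nodes, F features per node.
N, F = 4, 13
A = torch.rand(N, N)
A = (A + A.t()) / 2                            # placeholder symmetric adjacency matrix
X = torch.rand(N, F)                           # node feature matrix (initial input)
z = torch.rand(N)                              # real historical loads Z_i
model = GraphLoadPredictor(F, hidden_dim=16)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss = train_step(model, optimizer, A, X, z)
```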
The specific forward propagation process is as follows:
in the calculation process of adopting the forward propagation algorithm, for an edge node, the output of the next layer is obtained by weighting and summing the input characteristics of the current edge node and the current layer of the adjacent edge node of the current edge node, and the weight is determined by the weight of the edge and the coefficient matrix of the graph neural network model of each layer. It is worth noting that, unlike the conventional method, the graph neural network model does not learn the relationship between the edge node features and the load according to its own features, but comprehensively considers the features of resources, loads, requests and the like of the neighboring edge nodes of the current edge node, and then comprehensively gives the load prediction result of each edge node, and the forward propagation calculation method can be represented by the following formula:
H_l = σ(A · H_{l−1} · W_{l−1})
in the formula: hlHidden state output for the l hidden layer; a is an adjacency matrix composed of a topology structure of edge nodes, Hl-1Hidden states output by the (l-1) th hidden layer, wherein the first hidden state is an initial input X, and the last hidden state is an output Z of the neural network; wl-1The parameters of the neural network corresponding to the l-1 th hidden layer are also parameters that need to be updated by inverse gradient propagation.
S203: for each edge node, calculating the MSE (mean squared error) loss between the load value predicted by the graph neural network model and the actual load value of that edge node over the specified historical time period, so as to optimize the parameters of the graph neural network model.
It should be noted that, in some embodiments, the graph neural network model supports graphs with millions of nodes and millions of edges; for computation on an even larger graph, when the number of edge nodes is excessive, the scale of the edge node graph can be limited by deleting long-tail edge nodes (those with a small request volume) and by setting an edge weight threshold.
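The following sketch illustrates one way the graph could be pruned as just described; interpreting the edge weight threshold as an upper bound on network delay, as well as the function and parameter names, are assumptions.

```python
import numpy as np

def limit_graph_scale(A, request_volume, max_nodes, weight_threshold):
    """Limit the scale of the edge-node graph as described above.

    Keeps the `max_nodes` nodes with the largest request volume (dropping long-tail
    nodes) and zeroes out edges whose weight (network delay) exceeds
    `weight_threshold`. Treating the threshold as an upper bound on delay is an
    assumption; the patent only states that an edge weight threshold is set.
    """
    keep = np.arange(A.shape[0])
    if A.shape[0] > max_nodes:
        keep = np.sort(np.argsort(request_volume)[::-1][:max_nodes])   # drop long-tail nodes
        A = A[np.ix_(keep, keep)]
    A = np.where(A <= weight_threshold, A, 0.0)                        # prune high-delay edges
    return A, keep
```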
The edge node load prediction method provided by the embodiment of the invention comprehensively considers the characteristics of the edge node and the characteristics of the adjacent edge nodes, and can still provide a more accurate load prediction result even for a complex topological structure with multiple edge nodes.
Embodiment two:
the present embodiment provides an edge node load prediction apparatus, which may be used to implement the method in the first embodiment, and specifically includes:
a feature acquisition unit, configured to determine the neighboring edge nodes of the current edge node according to the edge node topological relation graph, and to collect the features of the current edge node and the features of the neighboring edge nodes;
a prediction unit, configured to perform a weighted summation of the features of the current edge node and the features of the neighboring edge nodes with the trained graph neural network model, so as to calculate a predicted load value of the current edge node over a future predetermined time period.
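A minimal sketch of how these two units could be organised is given below; the class and method names, and the callables standing in for the monitoring and model interfaces, are assumptions, since the patent does not specify an implementation.

```python
class FeatureAcquisitionUnit:
    """Determines neighbouring edge nodes from the topology graph and collects their features.

    `adjacency` is the weighted adjacency matrix of the edge-node topological relation
    graph; `collect` is a hypothetical callable (node id -> feature vector) standing in
    for the real monitoring interface, which the patent does not specify.
    """

    def __init__(self, adjacency, collect):
        self.adjacency = adjacency
        self.collect = collect

    def acquire(self, node_id):
        row = self.adjacency[node_id]
        neighbours = [j for j, w in enumerate(row) if w > 0 and j != node_id]
        return self.collect(node_id), [self.collect(j) for j in neighbours]


class PredictionUnit:
    """Feeds the collected features through the trained graph neural network model."""

    def __init__(self, model):
        self.model = model   # e.g. the GraphLoadPredictor sketched in Embodiment one

    def predict(self, adjacency, features):
        return self.model(adjacency, features)   # predicted load over the future time period
```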
It should be noted that the way in which the feature acquisition unit and the prediction unit implement the corresponding method steps is described in Embodiment one and is not repeated here.
Embodiment three:
A terminal comprising a processor and a memory coupled to the processor, wherein the memory stores program instructions for performing the method of Embodiment one;
the processor is configured to execute the program instructions stored in the memory to control the execution of the method of embodiment one.
Embodiment four:
a storage medium storing program instructions executable by a processor to perform a method according to an embodiment.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above description is only a preferred embodiment of the present invention, and it should be noted that, for those skilled in the art, several modifications and variations can be made without departing from the technical principle of the present invention, and these modifications and variations should also be regarded as the protection scope of the present invention.

Claims (10)

1. An edge node load prediction method, characterized in that the method comprises:
determining the neighboring edge nodes of a current edge node according to an edge node topological relation graph, and collecting the features of the current edge node and the features of the neighboring edge nodes;
performing a weighted summation of the features of the current edge node and the features of the neighboring edge nodes with a trained graph neural network model, so as to calculate a predicted load value of the current edge node over a future predetermined time period.
2. The edge node load prediction method according to claim 1, characterized in that the features comprise the resource features, historical load values and task request features of the edge nodes.
3. The edge node load prediction method according to claim 2, characterized in that the historical load values comprise yesterday's load value, the average load value over the last week, the average load value over the last month, and the average load value over the last year.
4. The edge node load prediction method according to claim 2, characterized in that the resource features comprise the number of CPU cores, the total amount of memory, the total amount of bandwidth and the number of servers of the edge node.
5. The edge node load prediction method according to claim 2, characterized in that the task request features comprise the total number of user requests yesterday and the resource characteristics of the user requests.
6. The edge node load prediction method according to claim 1, characterized in that the training method of the graph neural network model comprises:
collecting the features of each edge node in the edge node topological relation graph;
learning, according to the features of each edge node and the features of its neighboring edge nodes, the relationship between edge node features and load with a forward propagation algorithm, so as to calculate the load value predicted by the graph neural network model for each edge node over a specified historical time period;
calculating the MSE loss between the load value predicted by the graph neural network model for each edge node over the specified historical time period and the actual load value of the corresponding edge node, so as to optimize the parameters of the graph neural network model.
7. The edge node load prediction method according to claim 1, characterized in that the method for constructing the edge node topological relation graph comprises:
constructing, according to the structural information and network delay information among the edge nodes, an undirected graph by connecting with edges the edge nodes that belong to the same region, and extracting the features of each edge node at the same time;
wherein the weight of an edge in the undirected graph is defined as the network delay between the edge nodes.
8. The edge node load prediction method according to claim 7, characterized in that, when the number of edge nodes exceeds a predetermined upper limit, the scale of the edge node topological relation graph is limited by deleting long-tail edge nodes and setting an edge weight threshold.
9. The edge node load prediction method according to claim 1, characterized in that the weights of the weighted summation are jointly determined by the edge weights in the edge node topological relation graph and the coefficient matrix of each layer of the graph neural network model.
10. An edge node load prediction device, characterized in that the device comprises:
a feature acquisition unit, configured to determine the neighboring edge nodes of the current edge node according to the edge node topological relation graph, and to collect the features of the current edge node and the features of the neighboring edge nodes;
a prediction unit, configured to perform a weighted summation of the features of the current edge node and the features of the neighboring edge nodes with the trained graph neural network model, so as to calculate a predicted load value of the current edge node over a future predetermined time period.
CN202111276213.3A 2021-10-29 2021-10-29 Edge node load prediction method and device Active CN114090238B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111276213.3A CN114090238B (en) 2021-10-29 2021-10-29 Edge node load prediction method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111276213.3A CN114090238B (en) 2021-10-29 2021-10-29 Edge node load prediction method and device

Publications (2)

Publication Number Publication Date
CN114090238A true CN114090238A (en) 2022-02-25
CN114090238B CN114090238B (en) 2024-10-29

Family

ID=80298319

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111276213.3A Active CN114090238B (en) 2021-10-29 2021-10-29 Edge node load prediction method and device

Country Status (1)

Country Link
CN (1) CN114090238B (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180157161A1 (en) * 2016-12-01 2018-06-07 Lam Research Corporation Design layout pattern proximity correction through fast edge placement error prediction
CN110445866A (en) * 2019-08-12 2019-11-12 南京工业大学 Task migration and cooperative load balancing method in mobile edge computing environment
CN111538567A (en) * 2020-04-26 2020-08-14 国网江苏省电力有限公司信息通信分公司 Method and device for deploying virtual network function chain on edge device
CN112732442A (en) * 2021-01-11 2021-04-30 重庆大学 Distributed model for edge computing load balancing and solving method thereof
CN112910710A (en) * 2021-02-08 2021-06-04 清华大学 Network flow space-time prediction method and device, computer equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
郭倩彤: "基于负载预测的边缘数据中心节能管理机制研究", 《中国优秀硕士学位论文全文数据库(电子期刊)》, 15 March 2020 (2020-03-15) *

Also Published As

Publication number Publication date
CN114090238B (en) 2024-10-29


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant