CN115759285A - Network data analysis method, device, system and equipment and storage medium - Google Patents


Info

Publication number
CN115759285A
Authority
CN
China
Prior art keywords
nwdaf
model parameters
node
network data
machine learning
Legal status
Pending
Application number
CN202211384612.6A
Other languages
Chinese (zh)
Inventor
廖星星
吕静雯
陶剑
王春晓
张停
袁玉勇
Current Assignee
Network Communication and Security Zijinshan Laboratory
Original Assignee
Network Communication and Security Zijinshan Laboratory
Application filed by Network Communication and Security Zijinshan Laboratory
Priority to CN202211384612.6A
Publication of CN115759285A

Landscapes

  • Data Exchanges In Wide-Area Networks (AREA)

Abstract

The application discloses a network data analysis method, system and storage medium, wherein the method is applied to an area NWDAF node and comprises the following steps: receiving first target network data, and inputting the first target network data into a first machine learning model constructed in advance, so as to output a corresponding first data analysis result. The construction process of the first machine learning model comprises the following steps: determining a plurality of NFs with learning capability; obtaining a plurality of trained model parameters sent by the plurality of NFs, and aggregating the plurality of model parameters to obtain regional aggregation model parameters, wherein the MTLF in each NF is used for training a machine learning model based on training network data to obtain the trained model parameters; and constructing the first machine learning model locally based on the regional aggregation model parameters. The method and the device realize network data analysis on the premise of guaranteeing the data security of the 5G core network.

Description

Network data analysis method, device, system and equipment and storage medium
Technical Field
The present application relates to the field of communications technologies, and in particular, to a network data analysis method, apparatus, system, electronic device, and computer-readable storage medium.
Background
The Network Data Analytics Function (NWDAF) node proposed in the 3GPP R-16 and R-17 stages provides support for 5G network intelligence. Federated Learning reduces the computational pressure on the central node of the traditional machine learning paradigm by uploading model updates instead of raw data, and can enhance its security. Integrating federated learning into the 5G core network can therefore improve the data service security of the 5G core network, reduce the amount of information transmitted between NFs (Network Functions), and improve communication efficiency.
Under the existing 5G core network service-based architecture, the current NWDAF node can only perform traditional, centralized machine learning. An NF itself has no learning capability, and introducing a learning capability under the original core network architecture requires adding a machine learning function to the NF. In addition, when the NWDAF node acquires data from different NFs to perform model training and analysis or inference, the data of the NF flows out of the network element, which compromises the security of private data; and performing model training directly on the NF reduces the working performance of the NF.
Therefore, how to implement network data analysis on the premise of ensuring the data security of the 5G core network is a technical problem to be solved by those skilled in the art.
Disclosure of Invention
The application aims to provide a network data analysis method, a system and a computer readable storage medium, which realize network data analysis on the premise of ensuring the data security of a 5G core network.
In order to achieve the above object, the present application provides a network data analysis method applied to an area NWDAF node, where the method includes:
receiving first target network data, and inputting the first target network data into a first machine learning model which is constructed in advance so as to output a corresponding first data analysis result;
the first machine learning model building process comprises the following steps:
determining a plurality of NFs with learning capability;
obtaining a plurality of trained model parameters sent by the plurality of NFs, and aggregating the model parameters to obtain a regional aggregation model parameter; wherein the MTLF in each NF is used for training a machine learning model based on training network data to obtain the trained model parameters;
locally building the first machine learning model based on the region aggregation model parameters.
After aggregating the plurality of model parameters to obtain the region aggregation model parameter, the method further includes:
sending the region aggregation model parameters to corresponding operator NWDAF nodes; the operator NWDAF node is used for aggregating all the received regional aggregation model parameters to obtain operator aggregation model parameters and returning the operator aggregation model parameters to the regional NWDAF node;
and receiving the operator aggregation model parameters sent by the operator NWDAF node, and updating the local first machine learning model based on the operator aggregation model parameters.
Wherein the sending of the region aggregation model parameters to the corresponding operator NWDAF node comprises:
when a first model parameter request of a corresponding operator NWDAF node is received, sending the region aggregation model parameter to the corresponding operator NWDAF node; the operator NWDAF node is configured to determine all corresponding area NWDAF nodes through the NRF, and send the first model parameter request to all corresponding area NWDAF nodes.
The operator NWDAF node is further used for building a second machine learning model locally based on the operator aggregation model parameters, and the second machine learning model is used for reasoning based on second target network data to obtain a corresponding second data analysis result.
Wherein the operator NWDAF node is further configured to send the operator aggregation model parameters to a top-layer NWDAF node; the top-layer NWDAF node is used for aggregating all the received operator aggregation model parameters to obtain top-layer aggregation model parameters and returning the top-layer aggregation model parameters to the operator NWDAF node;
the operator NWDAF node is further configured to receive the top-layer aggregation model parameters sent by the top-layer NWDAF node, and update the second machine learning model locally based on the top-layer aggregation model parameters.
Wherein the operator NWDAF node sends the operator aggregation model parameters to the top-layer NWDAF node when receiving a second model parameter request of the top-layer NWDAF node; the top-layer NWDAF node is further used for acquiring the service addresses of all the operator NWDAF nodes through the NEF and sending the second model parameter request to all the operator NWDAF nodes based on the service addresses.
The top-layer NWDAF node is further used for building a third machine learning model locally based on the top-layer aggregation model parameters, and the third machine learning model is used for reasoning based on third target network data to obtain a corresponding third data analysis result.
Wherein the determining a plurality of NFs with learning capabilities comprises:
multiple NFs with learning capabilities were determined by NRF.
Wherein each of the NFs is to: receiving a registration request of MTLF; the registration request comprises information used for indicating that the MTLF has learning capacity; sending the registration request to an NRF to complete the registration of the MTLF; and sending training network data to the MTLF so that the MTLF trains a machine learning model based on the training network data to obtain the trained model parameters.
To achieve the above object, the present application provides a network data analysis apparatus applied to an area NWDAF node, the apparatus including:
the analysis module is used for receiving first target network data, inputting the first target network data into a first machine learning model which is constructed in advance, and outputting a corresponding first data analysis result;
the device further comprises:
a determining module for determining a plurality of NFs with learning capabilities;
the fusion module is used for acquiring a plurality of trained model parameters sent by the NF and aggregating the model parameters to obtain regional aggregation model parameters; training a machine learning model by using MTLF in each NF based on training network data to obtain the trained model parameters;
a building module to build the first machine learning model locally based on the regional aggregate model parameters.
In order to achieve the above object, the present application provides a network data analysis system, including a top-layer NWDAF node, a plurality of operator NWDAF nodes, a plurality of regional NWDAF nodes, and a plurality of NFs, where each operator NWDAF node corresponds to a plurality of the regional NWDAF nodes, and each regional NWDAF node corresponds to a plurality of the NFs;
the MTLF in the NF is used for training a machine learning model based on training network data to obtain trained model parameters;
the regional NWDAF node is used for aggregating a plurality of model parameters obtained by a plurality of corresponding NF training to obtain regional aggregation model parameters, locally constructing a first machine learning model based on the regional aggregation model parameters, and inputting first target network data into the first machine learning model when receiving the first target network data so as to output a corresponding first data analysis result;
the operator NWDAF node is used for aggregating region aggregation model parameters obtained by aggregating a plurality of corresponding region NWDAF nodes to obtain operator aggregation model parameters, building a second machine learning model locally based on the operator aggregation model parameters, and inputting second target network data into the second machine learning model when receiving the second target network data so as to output a corresponding second data analysis result;
the top-layer NWDAF node is used for aggregating operator aggregation model parameters obtained by aggregation of all operator NWDAF nodes to obtain top-layer aggregation model parameters, building a third machine learning model locally based on the top-layer aggregation model parameters, and inputting third target network data into the third machine learning model when receiving the third target network data so as to output a corresponding third data analysis result.
Wherein the operator NWDAF node is further configured to return the operator aggregation model parameters to all corresponding regional NWDAF nodes;
the regional NWDAF node is further configured to update the local first machine learning model based on the received operator aggregation model parameters sent by the corresponding operator NWDAF node.
Wherein the top-layer NWDAF node is further configured to return the top-layer aggregation model parameters to all of the operator NWDAF nodes;
the operator NWDAF node is further configured to update the local second machine learning model based on the received top-layer aggregation model parameters.
To achieve the above object, the present application provides an electronic device including:
a memory for storing a computer program;
a processor for implementing the steps of the network data analysis method as described above when executing the computer program.
To achieve the above object, the present application provides a computer-readable storage medium having stored thereon a computer program, which when executed by a processor, implements the steps of the network data analysis method as described above.
By the above scheme, the network data analysis method provided by the application is applied to an area NWDAF node, and the method includes: receiving first target network data, and inputting the first target network data into a first machine learning model which is constructed in advance so as to output a corresponding first data analysis result; the first machine learning model building process comprises the following steps: determining a plurality of NFs with learning capabilities; obtaining a plurality of trained model parameters sent by a plurality of NF, and aggregating the plurality of model parameters to obtain a regional aggregation model parameter; the MTLF in each NF is used for training a machine learning model based on training network data to obtain the trained model parameters; locally building the first machine learning model based on the region aggregation model parameters.
In the present application, the area NWDAF node is only used for the aggregation of model parameters and for data analysis, not for the training of the machine learning model; that is, the area NWDAF node only stores the model parameters of the machine learning model and does not store any training network data. The training of the machine learning model is completed by the MTLF in the NF, that is, the training network data is stored only in the NF, which ensures that the original training network data of the NFs in the 5G core network cannot be leaked even if the NWDAF node is attacked. Therefore, the network data analysis method provided by the application realizes network data analysis on the premise of ensuring the data security of the 5G core network. The application also discloses a network data analysis apparatus, a system and a computer-readable storage medium, which can achieve the same technical effects.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts. The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description serve to explain the disclosure without limiting the disclosure. In the drawings:
FIG. 1 is a flow diagram illustrating a process for building a machine learning model in accordance with an exemplary embodiment;
FIG. 2 is a flow diagram illustrating a method of network data analysis in accordance with an exemplary embodiment;
FIG. 3 is a block diagram illustrating a network data analysis system in accordance with an exemplary embodiment;
FIG. 4 is a diagram illustrating a multi-layered NWDAF architecture in one embodiment of the present application;
FIG. 5 is a flowchart of training a machine learning model in an embodiment of the present application;
FIG. 6 is a block diagram illustrating a network data analysis device in accordance with an exemplary embodiment;
FIG. 7 is a block diagram illustrating an electronic device in accordance with an exemplary embodiment.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application. In addition, in the embodiments of the present application, "first", "second", and the like are used for distinguishing similar objects, and are not necessarily used for describing a specific order or a sequential order.
The embodiment of the application discloses a network data analysis method, which realizes network data analysis on the premise of ensuring the data security of a 5G core network.
The network data analysis method comprises the following steps:
receiving first target network data, and inputting the first target network data into a first machine learning model which is constructed in advance so as to output a corresponding first data analysis result;
in specific implementation, a user finds an area NWDAF node providing data analysis service for the user through NEF, requests the area NWDAF node for the data analysis service, and carries a user ID for positioning user context and target network data, and the area NWDAF node inputs the target network data into a locally-constructed machine learning model to perform inference service, so as to obtain a corresponding data analysis result. For example, the target network data may be UE (User Equipment) location data in the AMF module and in the OAM module, and the data analysis result may be UE movement data and UE movement location prediction information.
Referring to fig. 1, fig. 1 is a flowchart illustrating a process of building a machine learning model according to an exemplary embodiment, and as shown in fig. 1, the process includes:
s101: determining a plurality of NFs with learning capabilities;
the execution subject of the present embodiment is an area NWDAF node. It can be understood that the 5G network includes a plurality of NFs, for example, AMFs (Access and Mobility Management functions), AFs (Application functions), OAM (Operation Maintenance Management and Maintenance), etc., which all need to register with NRFs (network storage functions), and the NFs of the 5G network may or may not have learning capabilities.
S102: obtaining a plurality of trained model parameters sent by the plurality of NFs, and aggregating the plurality of model parameters to obtain a regional aggregation model parameter; wherein the MTLF (Model Training Logical Function) in each NF is used for training a machine learning model based on training network data to obtain the trained model parameters;
in this step, the regional NWDAF node sends a request for obtaining model parameters to the NF, waits for the NF to respond to the model parameters, and aggregates the received multiple model parameters to obtain regional aggregated model parameters if the number of the received model parameters is greater than or equal to 2 within the waiting time. As a possible implementation, the region aggregation model parameter may be obtained by performing weighted average on a plurality of model parameters.
And the MTLF in each NF trains a machine learning model based on training network data to obtain trained model parameters. As a possible implementation, the NF receives a registration request of the MTLF; wherein, the registration request includes information for indicating that the MTLF has learning ability; the NF sends the registration request to an NRF to complete the registration of the MTLF; and the NF sends training network data to the MTLF so that the MTLF trains a machine learning model based on the training network data to obtain the trained model parameters.
In specific implementation, an NF with learning capability locally deploys an MTLF module for training the machine learning model; the MTLF training process is handled separately from the NF's service processing, so the service processing of the NF is not affected and a drop in the NF's processing capability caused by high CPU occupancy is avoided. The NF registers with, or updates its registration at, the NRF when receiving the registration request, and indicates in the registration request that it has the data learning capability. The MTLF subscribes to training network data from the NF and receives a subscription-success response returned by the NF. When the network data in the NF is updated, the NF notifies the MTLF and transmits the network data to the MTLF as training network data; the MTLF trains the machine learning model based on the received training network data, stores the trained model parameters locally, and sends them only after the regional NWDAF node initiates a model parameter acquisition request. Preferably, the MTLF trains the machine learning model based on the training network data to obtain the trained model parameters only when the amount of received training network data reaches a preset value.
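A minimal sketch of this NF-local training behaviour is given below; the class name, the data-volume threshold and the trivial mean-based training step are illustrative assumptions and are not prescribed by the embodiment.

```python
class MTLF:
    """Sketch of an NF-local model training logical function (MTLF)."""

    def __init__(self, data_threshold=1000):
        self.data_threshold = data_threshold   # preset training-data volume
        self.training_buffer = []              # network data notified by the NF
        self.trained_parameters = None         # kept locally until requested

    def on_network_data_notification(self, samples):
        """Called when the NF notifies the MTLF of updated network data."""
        self.training_buffer.extend(samples)
        if len(self.training_buffer) >= self.data_threshold:
            self._train()

    def _train(self):
        # Placeholder training step: fit a trivial mean model on the buffer.
        features = [x for x, _ in self.training_buffer]
        labels = [y for _, y in self.training_buffer]
        self.trained_parameters = {
            "mean_x": sum(features) / len(features),
            "mean_y": sum(labels) / len(labels),
            "n_samples": len(self.training_buffer),
        }
        self.training_buffer.clear()

    def on_model_parameter_request(self):
        """Answer a model parameter acquisition request from the regional NWDAF node."""
        return self.trained_parameters

# Example: the NF forwards three (feature, label) samples to its MTLF.
mtlf = MTLF(data_threshold=3)
mtlf.on_network_data_notification([(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)])
print(mtlf.on_model_parameter_request())  # {'mean_x': 2.0, 'mean_y': 4.0, 'n_samples': 3}
```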
S103: locally building the first machine learning model based on the region aggregation model parameters.
In this step, the regional NWDAF node locally constructs a first machine learning model based on the regional aggregation model parameters obtained by aggregation, and the first machine learning model is used for implementing data inference.
In the embodiment of the present application, the area NWDAF nodes are only used for aggregation and data analysis of model parameters and are not used for training of the machine learning model, that is, the area NWDAF nodes only store the model parameters of the machine learning model and do not store any training network data. The training of the machine learning model is completed by MTLF in NF, that is, the training network data is only stored in NF, thus ensuring that the original training network data of NF in the 5G core network can not be leaked when the NWDAF node is attacked. Therefore, the network data analysis method provided by the embodiment of the application realizes network data analysis on the premise of ensuring the data security of the 5G core network.
The embodiment of the application discloses a network data analysis method, and compared with the previous embodiment, the embodiment further explains and optimizes the technical scheme. Specifically, according to the area size of the data service, the NWDAF nodes are divided into a top NWDAF node, operator NWDAF nodes and area NWDAF nodes, the top NWDAF node corresponds to a plurality of operator NWDAF nodes, each operator NWDAF node corresponds to a plurality of area NWDAF nodes, and each area NWDAF node corresponds to a plurality of NFs.
Referring to fig. 2, a flow chart of a network data analysis method according to an exemplary embodiment is shown, as shown in fig. 2, including:
s201: determining a plurality of NF with learning capability by the area NWDAF node;
s202: the regional NWDAF node acquires a plurality of trained model parameters sent by the NF, and aggregates the model parameters to obtain regional aggregate model parameters; training a machine learning model by using MTLF in each NF based on training network data to obtain the trained model parameters;
s203: locally constructing a first machine learning model by the regional NWDAF node based on the regional aggregation model parameters; when first target network data are received, inputting the first target network data into the first machine learning model so as to output a corresponding first data analysis result;
s204: the regional NWDAF node sends the regional aggregation model parameters to the corresponding operator NWDAF node;
in this embodiment, the regional NWDAF node sends the aggregated regional model parameters obtained by aggregation to the corresponding previous layer NWDAF node, that is, the operator NWDAF node.
As a possible implementation, when receiving a first model parameter request of a corresponding carrier NWDAF node, the area NWDAF node sends the area aggregation model parameter to the corresponding carrier NWDAF node; the operator NWDAF node determines all corresponding area NWDAF nodes through the NRF and sends the first model parameter request to all corresponding area NWDAF nodes.
In specific implementation, the operator NWDAF node discovers the area NWDAF nodes through the NRF, where the parameters carried in the discovery request may be at slice granularity or TAI (Tracking Area Identity) granularity; it then sends a model parameter acquisition request to the area NWDAF nodes and waits for them to respond with the regional aggregation model parameters.
S205: the operator NWDAF node aggregates all the received regional aggregation model parameters to obtain operator aggregation model parameters;
in this step, if the number of the received regional aggregation model parameters is greater than or equal to 2 within the waiting time, the operator NWDAF node aggregates the received multiple regional aggregation model parameters to obtain the operator aggregation model parameters. As a possible implementation, the operator aggregation model parameter may be obtained by performing weighted average on multiple regional aggregation model parameters.
S206: the operator NWDAF node builds a second machine learning model locally based on the operator aggregation model parameters and returns the operator aggregation model parameters to the regional NWDAF node; when second target network data are received, inputting the second target network data into the second machine learning model so as to output a corresponding second data analysis result;
in this step, the carrier NWDAF node locally builds a machine learning model based on the carrier aggregation model parameters obtained by aggregation, and the machine learning model is used for reasoning based on target network data to obtain a data analysis result. And meanwhile, the operator NWDAF node returns the operator aggregation model parameters obtained by aggregation to all the corresponding area NWDAF nodes.
S207: the regional NWDAF node receives the operator aggregation model parameters sent by the operator NWDAF node, and updates a local first machine learning model based on the operator aggregation model parameters;
in this step, the regional NWDAF node receives operator aggregated model parameters from the previous layer of operator NWDAF nodes, and updates the local machine learning model based on the received operator aggregated model parameters.
S208: the operator NWDAF node sends the operator aggregation model parameters to a top-layer NWDAF node;
in this step, the carrier NWDAF node sends the carrier aggregation model parameters obtained by aggregation to the corresponding previous layer NWDAF node, that is, the top layer NWDAF node.
As a possible implementation, when the carrier NWDAF node receives the second model parameter request of the top-level NWDAF node, the carrier NWDAF node sends the carrier aggregated model parameters to the top-level NWDAF node; the top-layer NWDAF node acquires service addresses of all the operator NWDAF nodes through NEF, and sends the second model parameter request to all the operator NWDAF nodes based on the service addresses.
In specific implementation, the top NWDAF node acquires service addresses of NWDAF nodes of different operators through an NEF (Network Exposure Function), sends a request for acquiring model parameters to the NWDAF nodes of the operators based on the service addresses, and waits for a response to the aggregated model parameters of the operators.
S209: the top-layer NWDAF node aggregates all the received operator aggregation model parameters to obtain top-layer aggregation model parameters;
in this step, if the number of the received operator aggregation model parameters is greater than or equal to 2 within the waiting time, the top NWDAF node aggregates the received operator aggregation model parameters to obtain top aggregation model parameters. As a possible implementation, the top-level aggregation model parameter may be obtained by performing weighted average on multiple operator aggregation model parameters.
S210: the top-layer NWDAF node builds a third machine learning model locally based on the top-layer aggregation model parameters, and returns the top-layer aggregation model parameters to the operator NWDAF node; when third target network data are received, inputting the third target network data into the third machine learning model so as to output a corresponding third data analysis result;
in this step, the top NWDAF node locally builds a machine learning model based on the top aggregation model parameters obtained by aggregation, and the machine learning model is used for reasoning based on target network data to obtain a data analysis result. And meanwhile, the top-layer NWDAF node returns the top-layer aggregation model parameters obtained through aggregation to all the corresponding operator NWDAF nodes.
S211: the operator NWDAF node receives the top-layer aggregation model parameters sent by the top-layer NWDAF node and updates the local second machine learning model based on the top-layer aggregation model parameters.
In this step, the operator NWDAF node receives the top-layer aggregation model parameters from its upper layer, i.e., the top-layer NWDAF node, and updates the local machine learning model based on the received top-layer aggregation model parameters.
That is, in this embodiment, the NFs are responsible for training the model parameters, and each NWDAF node, including the top-layer NWDAF node, the operator NWDAF nodes and the area NWDAF nodes, has a data inference function. In this embodiment, the first target network data, the second target network data, and the third target network data may be the same data or different data, which is not limited herein.
According to this embodiment of the application, the NWDAF nodes are divided into top-layer NWDAF nodes, operator NWDAF nodes and area NWDAF nodes according to the area size of the data service, and no matter which type of NWDAF node it is, it is only used for the aggregation of model parameters and for data analysis, not for the training of the machine learning model; that is, the NWDAF nodes only store the model parameters of the machine learning models and do not store any training network data. The training of the machine learning model is completed by the MTLF in the NF, that is, the training network data is stored only in the NF, which ensures that the original training network data of the NFs in the 5G core network cannot be leaked even if an NWDAF node is attacked. Therefore, the network data analysis method provided by this embodiment of the application realizes network data analysis on the premise of ensuring the data security of the 5G core network.
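For illustration, the sketch below runs one such hierarchical aggregation round, with the regional, operator and top-layer NWDAF nodes each averaging the parameters received from the layer below; the equal-weight averaging and the dictionary-based parameter format are simplifying assumptions (the embodiment describes a weighted average without fixing the weights), and in practice the top-layer result would then be pushed back down so that each layer updates its local model.

```python
def average_parameters(parameter_sets):
    """Equal-weight average of parameter dictionaries."""
    keys = parameter_sets[0].keys()
    n = len(parameter_sets)
    return {k: sum(p[k] for p in parameter_sets) / n for k in keys}

def federated_round(nf_params_by_operator_and_region):
    """One hierarchical aggregation round: regional -> operator -> top layer.

    Input: {operator: {region: [nf_parameter_dict, ...]}}.
    Returns the top-layer aggregation model parameters, which are then
    distributed back down so every NWDAF layer can update its local model.
    """
    operator_params = {}
    for operator, regions in nf_params_by_operator_and_region.items():
        regional_params = [average_parameters(nf_params)                 # regional NWDAF nodes
                           for nf_params in regions.values()]
        operator_params[operator] = average_parameters(regional_params)  # operator NWDAF node

    return average_parameters(list(operator_params.values()))            # top-layer NWDAF node

# Example: two operators, each with two regions of two NFs.
example = {
    "operator_a": {"region_1": [{"w": 1.0}, {"w": 2.0}], "region_2": [{"w": 3.0}, {"w": 4.0}]},
    "operator_b": {"region_1": [{"w": 5.0}, {"w": 6.0}], "region_2": [{"w": 7.0}, {"w": 8.0}]},
}
print(federated_round(example))  # {'w': 4.5}
```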
The embodiment of the application discloses a network data analysis system, which specifically comprises:
referring to fig. 3, a block diagram of a network data analysis system according to an exemplary embodiment is shown, as shown in fig. 3, including a top NWDAF node 100, a plurality of carrier NWDAF nodes 200, a plurality of regional NWDAF nodes 300, and a plurality of NFs 400, where each of the carrier NWDAF nodes 200 corresponds to a plurality of the regional NWDAF nodes 300, and each of the regional NWDAF nodes 300 corresponds to a plurality of the NFs 400;
in this embodiment, the NWDAF nodes are divided into top NWDAF nodes, carrier NWDAF nodes, and area NWDAF nodes according to the area size of the data service. One top-level NWDAF node corresponds to multiple operator NWDAF nodes, one operator NWDAF node 200 corresponds to multiple area NWDAF nodes, and one area NWDAF node 300 corresponds to multiple NFs.
The MTLF in the NF is used for training a machine learning model based on training network data to obtain trained model parameters;
in specific implementation, the NF with learning ability locally deploys an MTLF module for training a machine learning model, the MTLF training process can be bound for processing, the service processing process of the NF is not affected, and the reduction of the processing ability of the NF caused by high occupation of a CPU is avoided. And the NF registers or updates the registration to the NRF when receiving the registration request, and indicates that the NF has the data learning capability in the registration request. And the MTLF subscribes training network data to the NF and receives a response of successful subscription returned by the NF. And when network data in the NF is updated, informing the MTLF, transmitting the network data serving as training network data to the MTLF, training a machine learning model by the MTLF based on the received training network data, storing model parameters after model training in a local area, and transmitting after waiting for a request initiated by a regional NWDAF node for acquiring the model parameters. Preferably, when the data volume of the received training network data reaches a preset value, the MTLF trains the machine learning model based on the training network data to obtain trained model parameters.
The area NWDAF node is used for aggregating a plurality of model parameters obtained by a plurality of corresponding NF training to obtain area aggregation model parameters, locally constructing a first machine learning model based on the area aggregation model parameters, and inputting first target network data into the first machine learning model when the first target network data are received so as to output a corresponding first data analysis result;
in specific implementation, the regional NWDAF node sends a request for obtaining model parameters to the NF, waits for the NF to respond to the model parameters, and aggregates a plurality of received model parameters to obtain regional aggregated model parameters if the number of the received model parameters is greater than or equal to 2 within the waiting time. As a possible implementation, the region aggregation model parameter may be obtained by performing weighted average on a plurality of model parameters.
Further, the regional NWDAF node is further configured to locally build a first machine learning model based on the aggregated regional aggregation model parameters, the first machine learning model being configured to perform data inference based on the first target network data received from the client.
The operator NWDAF node is used for aggregating region aggregation model parameters obtained by aggregating a plurality of corresponding region NWDAF nodes to obtain operator aggregation model parameters, building a second machine learning model locally based on the operator aggregation model parameters, and inputting second target network data into the second machine learning model when receiving the second target network data so as to output a corresponding second data analysis result;
in specific implementation, the carrier NWDAF node finds the Area NWDAF node through the NRF, where the parameters carried by the carrier NWDAF node may be slice granularity and TAI (Tracking Area Identity) granularity, sends a request for obtaining the model parameters to the Area NWDAF node, and waits for the Area NWDAF node to respond to the Area aggregation model parameters. And if the number of the received regional aggregation model parameters is greater than or equal to 2 in the waiting time, aggregating the received regional aggregation model parameters to obtain the operator aggregation model parameters. As a possible implementation, the operator aggregation model parameter may be obtained by performing weighted average on multiple regional aggregation model parameters.
Further, the operator NWDAF node is also configured to build a second machine learning model locally based on the aggregated operator aggregation model parameters, where the second machine learning model is configured to perform data inference based on the second target network data received from the client.
The top-layer NWDAF node is used for aggregating operator aggregation model parameters obtained by aggregation of all operator NWDAF nodes to obtain top-layer aggregation model parameters, building a third machine learning model locally based on the top-layer aggregation model parameters, and inputting third target network data into the third machine learning model when receiving the third target network data so as to output a corresponding third data analysis result.
In a specific implementation, the top-level NWDAF node acquires the service addresses of the NWDAF nodes of different operators through the NEF, sends a model parameter acquisition request to the operator NWDAF nodes based on the service addresses, and waits for the operator NWDAF nodes to respond with the operator aggregation model parameters. If the number of operator aggregation model parameters received within the waiting time is greater than or equal to 2, the received operator aggregation model parameters are aggregated to obtain the top-level aggregation model parameters. As a possible implementation, the top-level aggregation model parameters may be obtained by performing a weighted average over the multiple operator aggregation model parameters. Further, the top-level NWDAF node distributes the top-level aggregation model parameters to all corresponding operator NWDAF nodes, and all the operator NWDAF nodes update their model parameters.
Further, the top-level NWDAF node is further configured to build a third machine learning model locally based on the aggregated top-level aggregation model parameters, where the third machine learning model is configured to perform data inference based on the target network data received from the client.
As a preferred embodiment, the carrier NWDAF node is further configured to return the carrier aggregation model parameters to all corresponding regional NWDAF nodes; the regional NWDAF node is further configured to update the local first machine learning model based on the received operator aggregated model parameters sent by the corresponding operator NWDAF node.
In a specific implementation, the carrier NWDAF node is further configured to return the carrier aggregation model parameters obtained through aggregation to all of the corresponding regional NWDAF nodes, and the regional NWDAF node is further configured to update the local first machine learning model based on the received carrier aggregation model parameters sent by the corresponding carrier NWDAF node.
As a preferred embodiment, the top-level NWDAF node is further configured to return the top-level aggregated model parameters to all of the carrier NWDAF nodes; the carrier NWDAF node is further to update a local second machine learning model based on the received top-level aggregated model parameters.
In a specific implementation, the top-level NWDAF node is further configured to return the top-level aggregation model parameters obtained by aggregation to all the operator NWDAF nodes, and the operator NWDAF node is further configured to update the local second machine learning model based on the received top-level aggregation model parameters.
According to the embodiment of the application, the NWDAF nodes are divided into a top-layer NWDAF node, an operator NWDAF node and an area NWDAF node according to the area size of data service, and no matter which type of NWDAF node is, the NWDAF nodes are only used for aggregation and data analysis of model parameters and are not used for training of a machine learning model, namely, the NWDAF nodes only store the model parameters of the machine learning model and do not store any training network data. The training of the machine learning model is completed by MTLF in NF, that is, the training network data is only stored in NF, thus ensuring that the original training network data of NF in the 5G core network can not be leaked when the NWDAF node is attacked. Therefore, the embodiment of the application realizes network data analysis on the premise of ensuring the data security of the 5G core network.
Referring to an application embodiment provided by the present application, as shown in fig. 4, fig. 4 is a multi-layer NWDAF architecture diagram in the application embodiment provided by the present application, where the multi-layer NWDAF architecture diagram includes a top layer NWDAF, a carrier NWDAF, and a zone NWDAF; regardless of the NWDAF, it includes only an aggregation module and an inference module, and no data training module.
The top-level NWDAF may be a third-party APP application that provides a series of data analysis functions for the core network, and the data it collects covers all the data of China Unicom and China Mobile. The operator NWDAFs include the China Mobile NWDAF and the China Unicom NWDAF, and the SUPI range (the indicated range of user IDs) each registers in the NRF covers all the data of its own network. The area NWDAF may be the NWDAF of a certain area, and the SUPI range it registers in the NRF corresponds only to the user data of that area. The NFs include AMF1, AMF2 and OAM; the area NWDAF can collect UE location data for location services from the AMF modules and the OAM module in its area, and the machine learning model is used for predicting UE movement data and UE movement location prediction information.
Fig. 5 is a flowchart of training the machine learning model in an application embodiment provided by the present application. AMF1, AMF2 and OAM complete the deployment of their local MTLFs through an event subscription procedure; after registering their services with the NRF, AMF1, AMF2 and OAM send event notifications to their local MTLFs, and each local MTLF performs model training after receiving a notification. Taking AMF1 as an example, the specific steps are as follows (a minimal code sketch of this flow is given after the steps):
step S1: registering to AMF1 when MTLF is started, and returning a successful registration response to AMF 1;
step S2: AMF1 initiates a service registration process to NRF, and AMF1 service information is written into NRF, wherein the AMF1 service information carries relevant information for supporting machine learning;
step S3: the MTLF configures the service subscription data; a private interface is adopted between the MTLF and AMF1, and the message data content may refer to the corresponding Namf_EventExposure_Subscribe in clause 5.2.2.3.2 of 3GPP TS 23.502 and the corresponding Subscribe (Input) subscription service information in TS 28.532;
step S4: AMF1 initiates an event notification procedure to the local MTLF; the notification message may refer to the Namf_EventExposure_Notify definition in the protocol, and it notifies the MTLF to perform model training;
step S5: the MTLF performs a model training process.
step S6: the local MTLF uploads the model parameters obtained by training to the aggregation module of the regional NWDAF node through a Nwdaf_AnalyticsInfo_Request/Nwdaf_AnalyticsInfo_Subscribe procedure to complete the aggregation.
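A minimal, self-contained sketch of the above steps S1 to S6 is given below; the classes and method names are illustrative in-memory stand-ins for the network functions and the 3GPP service operations mentioned in the steps, not real APIs.

```python
class NRF:
    def __init__(self):
        self.profiles = []

    def register_nf_profile(self, profile):           # step S2: service registration
        self.profiles.append(profile)


class SimpleMTLF:
    """Tiny MTLF stub: buffers notified data and exposes trained parameters."""

    def __init__(self):
        self.samples = []
        self.trained_parameters = None

    def on_network_data_notification(self, samples):  # steps S4/S5: notify + train
        self.samples.extend(samples)
        self.trained_parameters = {"mean": sum(self.samples) / len(self.samples)}


class AMF:
    def __init__(self, name):
        self.name = name
        self.mtlf = None
        self.subscribers = []

    def register_local_mtlf(self, mtlf):              # step S1: MTLF registration
        self.mtlf = mtlf

    def subscribe_events(self, subscriber):           # step S3: event subscription
        self.subscribers.append(subscriber)

    def notify_subscribers(self, samples):            # step S4: event notification
        for subscriber in self.subscribers:
            subscriber.on_network_data_notification(samples)


class RegionalNWDAFAggregator:
    def __init__(self):
        self.collected = []

    def collect(self, params):                        # step S6: parameter upload
        self.collected.append(params)


# The deployment and training flow of steps S1 to S6, taking AMF1 as an example.
nrf, amf1 = NRF(), AMF("AMF1")
mtlf, aggregator = SimpleMTLF(), RegionalNWDAFAggregator()
amf1.register_local_mtlf(mtlf)
nrf.register_nf_profile({"nf": "AMF1", "supports_machine_learning": True})
amf1.subscribe_events(mtlf)
amf1.notify_subscribers([1.0, 2.0, 3.0])
aggregator.collect(mtlf.trained_parameters)
print(aggregator.collected)  # [{'mean': 2.0}]
```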
The procedure in which the operator NWDAF discovers the area NWDAFs, the operator NWDAF aggregates the model parameters, and the area NWDAFs update their model parameters is as follows (a minimal sketch of the aggregation round is given after the steps):
step S1: the area NWDAF registers with the NRF of its network;
step S2: the operator NWDAF discovers the area NWDAFs through the NRF, where the parameters carried in the discovery request may be at slice granularity or TAI granularity;
step S3: the operator NWDAF sends a model parameter acquisition request to the discovered area NWDAFs and waits for their responses;
step S4: if the operator NWDAF receives two or more responses within the waiting time, it performs a weighted average of the model parameters and then proceeds to step S5; otherwise, it proceeds to step S6;
step S5: the operator NWDAF distributes the aggregated model parameters to the discovered area NWDAFs, and the area NWDAFs update their model parameters;
step S6: the aggregation procedure of the operator NWDAF fails, and it waits for the next aggregation procedure to start.
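A minimal sketch of this aggregation round at the operator NWDAF is given below; the quorum of two responses and the retry-on-failure behaviour follow the steps above, while the function name, the parameter format and the equal-weight averaging are simplifying assumptions.

```python
def operator_aggregation_round(regional_responses, min_responses=2):
    """One aggregation attempt at the operator NWDAF (steps S3 to S6 above).

    `regional_responses` holds the regional aggregation model parameter
    dictionaries that arrived within the waiting time.  Returns the
    operator-level parameters to distribute back to the area NWDAFs, or
    None if the round fails and must wait for the next attempt.
    """
    if len(regional_responses) < min_responses:
        return None  # step S6: aggregation fails, wait for the next round

    keys = regional_responses[0].keys()
    n = len(regional_responses)
    # step S4: average the regional parameters (equal weights for simplicity).
    operator_params = {k: sum(r[k] for r in regional_responses) / n for k in keys}
    return operator_params  # step S5: distributed back to the area NWDAFs

# Two regional responses arrived in time -> the round succeeds.
print(operator_aggregation_round([{"w": 2.0}, {"w": 4.0}]))  # {'w': 3.0}
# Only one response arrived -> the round fails and is retried later.
print(operator_aggregation_round([{"w": 2.0}]))              # None
```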
The top-level NWDAF does not need to access the 5G core network; it obtains the model parameters of the operator NWDAFs through the NEF exposed by each network. The procedure in which the top-level NWDAF discovers the China Mobile NWDAF and the China Unicom NWDAF, aggregates their model parameters, and the operator NWDAFs update their model parameters is as follows:
step S1: china Mobile NWDAF or China Unicom NWDAF registers to the NRF of the network, and indicates the SupIrRange scope as all users of the network in the registration information;
step S2: the top-layer NWDAF requests NWDAF service from NEFs exposed by various networks;
step S3: the NEF exposed by each network inquires, through the NRF of that network, the SUPI ranges of all the operator NWDAFs, and sends them a request for acquiring the model parameters;
step S4: the NEF exposed by each network collects the model parameters returned by the operator NWDAFs and returns the result to the top-layer NWDAF;
step S5: the top-layer NWDAF aggregates the model parameters of the plurality of operators, and the aggregation result is used for the inference service;
step S6: and the top-layer NWDAF distributes the aggregated result to the corresponding operator NWDAF through the NEF function exposed by each network, and the operator NWDAF updates the model parameters.
The flow in which a user obtains the data inference service is as follows (a minimal sketch of the inference handling is given after the steps):
step S1: the user finds the NWDAF node providing data analysis service for the user through the NEF;
step S2: the user requests the data inference service from the NWDAF node, carrying information such as the user ID and the UE location data in the AMF module and the OAM module;
step S3: after receiving the information, the NWDAF node requests the inference service from the inference module; if the inference module has no model parameters, the flow proceeds to step S4; otherwise, the inference module constructs a machine learning model based on the model parameters, inputs the UE location data from the AMF module and the OAM module into the machine learning model, obtains an inference result containing the UE movement data and the UE movement location prediction information, and returns the inference result to the user;
step S4: the inference module notifies the aggregation module to perform the aggregation operation, updates the parameters of the inference module with the aggregation result, and then performs the inference.
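A minimal sketch of the inference handling in steps S3 and S4 is given below; the class, the placeholder model and the stubbed aggregation module are illustrative assumptions only.

```python
class NWDAFInferenceService:
    """Sketch of the inference flow in steps S3/S4; all names are illustrative."""

    def __init__(self, aggregation_module):
        # `aggregation_module` is a callable standing in for the NWDAF
        # aggregation module; it returns aggregated model parameters.
        self.aggregation_module = aggregation_module
        self.model_parameters = None

    def handle_request(self, user_id, ue_location_data):
        # Step S4: no model parameters yet -> trigger aggregation first.
        if self.model_parameters is None:
            self.model_parameters = self.aggregation_module()

        # Step S3: build the (placeholder) model from the stored parameters
        # and run inference on the UE location data.
        drift = self.model_parameters["drift"]
        predicted_locations = [x + drift for x in ue_location_data]
        return {"user_id": user_id,
                "ue_movement_prediction": predicted_locations}


# Example usage with a stubbed aggregation module.
service = NWDAFInferenceService(aggregation_module=lambda: {"drift": 0.1})
print(service.handle_request("ue-001", [10.0, 10.5, 11.0]))
```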
In the following, a network data analysis device provided by an embodiment of the present application is introduced, and a network data analysis device described below and a network data analysis method described above may be referred to each other.
Referring to fig. 6, a block diagram of a network data analysis apparatus according to an exemplary embodiment is shown, as shown in fig. 6, including:
a determining module 601, configured to determine multiple NFs with learning capability;
a fusion module 602, configured to obtain a plurality of trained model parameters sent by the NF, and aggregate the plurality of model parameters to obtain a regional aggregate model parameter; the MTLF in each NF is used for training a machine learning model based on training network data to obtain the trained model parameters;
a building module 603 configured to build a first machine learning model locally based on the region aggregation model parameters;
the analysis module 604 is configured to receive first target network data, input the first target network data into a first machine learning model that is constructed in advance, and output a corresponding first data analysis result.
In the embodiment of the present application, the area NWDAF nodes are only used for aggregation and data analysis of model parameters and not for training of the machine learning model, that is, the area NWDAF nodes only store model parameters of the machine learning model and do not store any training network data. The training of the machine learning model is completed by MTLF in NF, that is, the training network data is only stored in NF, thus ensuring that the original training network data of NF in the 5G core network can not be leaked when the NWDAF node is attacked. Therefore, the network data analysis device provided by the embodiment of the application realizes network data analysis on the premise of ensuring the data security of the 5G core network.
On the basis of the above embodiment, as a preferred embodiment, the method further includes:
a sending module, configured to send the region aggregation model parameter to a corresponding carrier NWDAF node; the operator NWDAF node is used for aggregating all the received regional aggregation model parameters to obtain operator aggregation model parameters and returning the operator aggregation model parameters to the regional NWDAF node;
and the updating module is used for receiving the operator aggregation model parameters sent by the operator NWDAF node and updating the local first machine learning model based on the operator aggregation model parameters.
On the basis of the foregoing embodiment, as a preferred implementation, the sending module is specifically configured to: when a first model parameter request of a corresponding operator NWDAF node is received, sending the region aggregation model parameter to the corresponding operator NWDAF node; the operator NWDAF node is configured to determine all corresponding area NWDAF nodes through the NRF, and send the first model parameter request to all corresponding area NWDAF nodes.
On the basis of the foregoing embodiment, as a preferred implementation manner, the carrier NWDAF node is further configured to build a second machine learning model locally based on the carrier aggregation model parameters, where the second machine learning model is configured to perform inference based on second target network data to obtain a corresponding second data analysis result.
On the basis of the foregoing embodiment, as a preferred implementation, the operator NWDAF node is further configured to send the operator aggregation model parameter to a top-level NWDAF node, where the top-level NWDAF node is configured to aggregate all the received operator aggregation model parameters to obtain a top-level aggregation model parameter, and return the top-level aggregation model parameter to the operator NWDAF node;
the carrier NWDAF node is further configured to receive the top-level aggregated model parameters sent by the top-level NWDAF node, and update the second machine learning model locally based on the top-level aggregated model parameters.
On the basis of the foregoing embodiment, as a preferred implementation manner, the carrier NWDAF node is configured to send the carrier aggregated model parameters to the top-level NWDAF node when receiving a second model parameter request of the top-level NWDAF node, and the top-level NWDAF node is further configured to acquire service addresses of all the carrier NWDAF nodes through NEF, and send the second model parameter request to all the carrier NWDAF nodes based on the service addresses.
On the basis of the foregoing embodiment, as a preferred implementation manner, the top-level NWDAF node is further configured to locally construct a third machine learning model based on the top-level aggregation model parameters, where the third machine learning model is configured to perform inference based on third target network data to obtain a corresponding third data analysis result.
On the basis of the foregoing embodiment, as a preferred implementation manner, the determining module 601 is specifically configured to determine, through the NRF, a plurality of NFs with learning capability.
On the basis of the above examples, as a preferred implementation, each of the NFs is configured to: receiving a registration request of MTLF; wherein, the registration request includes information for indicating that the MTLF has learning ability; sending the registration request to an NRF to complete the registration of the MTLF; and transmitting training network data to the MTLF so that the MTLF trains a machine learning model based on the training network data to obtain the trained model parameters.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Based on the hardware implementation of the program module, and in order to implement the method according to the embodiment of the present application, an embodiment of the present application further provides an electronic device, and fig. 7 is a structural diagram of an electronic device according to an exemplary embodiment, as shown in fig. 7, the electronic device includes:
a communication interface 1 capable of performing information interaction with other devices such as network devices and the like;
and the processor 2 is connected with the communication interface 1 to realize information interaction with other equipment, and is used for executing the network data analysis method provided by one or more technical schemes when running a computer program. And the computer program is stored on the memory 3.
In practice, of course, the various components in the electronic device are coupled together by the bus system 4. It will be appreciated that the bus system 4 is used to enable connection communication between these components. The bus system 4 comprises, in addition to a data bus, a power bus, a control bus and a status signal bus. For the sake of clarity, however, the various buses are labeled as bus system 4 in fig. 7.
The memory 3 in the embodiment of the present application is used to store various types of data to support the operation of the electronic device. Examples of such data include: any computer program for operating on an electronic device.
It will be appreciated that the memory 3 may be volatile memory or non-volatile memory, and may include both volatile and non-volatile memory. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Ferromagnetic Random Access Memory (FRAM), a Flash Memory, a magnetic surface memory, an optical disc, or a Compact Disc Read-Only Memory (CD-ROM); the magnetic surface memory may be disk storage or tape storage. The volatile memory may be a Random Access Memory (RAM), which acts as an external cache. By way of illustration and not limitation, many forms of RAM are available, such as Static Random Access Memory (SRAM), Synchronous Static Random Access Memory (SSRAM), Dynamic Random Access Memory (DRAM), Synchronous Dynamic Random Access Memory (SDRAM), Double Data Rate Synchronous Dynamic Random Access Memory (DDR SDRAM), Enhanced Synchronous Dynamic Random Access Memory (ESDRAM), SyncLink Dynamic Random Access Memory (SLDRAM), and Direct Rambus Random Access Memory (DRRAM). The memory 3 described in the embodiments of the present application is intended to comprise, without being limited to, these and any other suitable types of memory.
The method disclosed in the above embodiment of the present application may be applied to the processor 2, or implemented by the processor 2. The processor 2 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware or instructions in the form of software in the processor 2. The processor 2 described above may be a general purpose processor, a DSP, or other programmable logic device, discrete gate or transistor logic device, discrete hardware components, or the like. The processor 2 may implement or perform the methods, steps and logic blocks disclosed in the embodiments of the present application. A general purpose processor may be a microprocessor or any conventional processor or the like. The steps of the method disclosed in the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software modules may be located in a storage medium located in the memory 3, and the processor 2 reads the program in the memory 3 and in combination with its hardware performs the steps of the aforementioned method.
When the processor 2 executes the program, the corresponding processes of the methods according to the embodiments of the present application are implemented; for brevity, they are not described here again.
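For illustration only, the following minimal sketch shows the kind of parameter aggregation a regional NWDAF node may perform when the processor 2 executes such a program; the function name aggregate_model_parameters, the dictionary-of-arrays parameter format, and the FedAvg-style weighting by sample count are assumptions of this example and are not limited by the present application.

```python
# Illustrative sketch only. The function name and the weighting by sample
# counts are assumptions of this example, not requirements of the method.
from typing import Dict, List

import numpy as np


def aggregate_model_parameters(
    nf_parameters: List[Dict[str, np.ndarray]],
    nf_sample_counts: List[int],
) -> Dict[str, np.ndarray]:
    """FedAvg-style aggregation of the trained model parameters reported by NFs.

    Each NF's MTLF trains on its own network data and reports only parameters,
    so raw training data never leaves the NF.
    """
    total = float(sum(nf_sample_counts))
    return {
        name: sum(
            params[name] * (count / total)
            for params, count in zip(nf_parameters, nf_sample_counts)
        )
        for name in nf_parameters[0]
    }


# Example: a regional NWDAF node aggregating the parameters reported by two NFs.
nf_updates = [
    {"w": np.array([0.2, 0.4]), "b": np.array([0.1])},
    {"w": np.array([0.6, 0.0]), "b": np.array([0.3])},
]
regional_params = aggregate_model_parameters(nf_updates, nf_sample_counts=[100, 300])
# The regional NWDAF node can now build its first machine learning model from
# regional_params and forward them to the corresponding operator NWDAF node.
```

Weighting by sample count is one common federated-averaging choice; an equal-weight average or any other aggregation rule could be substituted without changing the overall flow.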
In an exemplary embodiment, the present application further provides a storage medium, specifically a computer-readable storage medium, for example a memory 3 storing a computer program, where the computer program is executable by the processor 2 to perform the steps performed by any one of a top-level NWDAF node, an operator NWDAF node, a regional NWDAF node, and an NF in the network data analysis system. The computer-readable storage medium may be a memory such as an FRAM, a ROM, a PROM, an EPROM, an EEPROM, a flash memory, a magnetic surface memory, an optical disc, or a CD-ROM.
Those of ordinary skill in the art will understand that all or part of the steps of the above method embodiments may be implemented by hardware instructed by a program; the program may be stored in a computer-readable storage medium and, when executed, performs the steps performed by any one of the top-level NWDAF node, the operator NWDAF node, the regional NWDAF node, and the NF in the network data analysis system. The aforementioned storage medium includes: a removable storage device, a ROM, a RAM, a magnetic disk, an optical disc, or various other media that can store program code.
Alternatively, if the above integrated unit is implemented in the form of a software functional module and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application, in essence or the portions contributing to the prior art, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for enabling an electronic device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the methods described in the embodiments of the present application. The aforementioned storage medium includes: a removable storage device, a ROM, a RAM, a magnetic disk, an optical disc, or various other media capable of storing program code.
The above description covers only specific embodiments of the present application, but the protection scope of the present application is not limited thereto. Any change or substitution that a person skilled in the art can easily conceive within the technical scope disclosed in the present application shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (15)

1. A network data analysis method, applied to a regional NWDAF node, the method comprising:
receiving first target network data, and inputting the first target network data into a first machine learning model which is constructed in advance so as to output a corresponding first data analysis result;
the construction process of the first machine learning model comprises the following steps:
determining a plurality of NFs with learning capabilities;
obtaining a plurality of trained model parameters sent by the plurality of NFs, and aggregating the plurality of model parameters to obtain regional aggregation model parameters; wherein the MTLF in each NF is used for training a machine learning model based on training network data to obtain the trained model parameters; and
locally building the first machine learning model based on the regional aggregation model parameters.
2. The network data analysis method according to claim 1, wherein after aggregating the plurality of model parameters to obtain the regional aggregation model parameters, the method further comprises:
sending the regional aggregation model parameters to a corresponding operator NWDAF node; wherein the operator NWDAF node is used for aggregating all the received regional aggregation model parameters to obtain operator aggregation model parameters and returning the operator aggregation model parameters to the regional NWDAF node; and
receiving the operator aggregation model parameters sent by the operator NWDAF node, and updating the local first machine learning model based on the operator aggregation model parameters.
3. The network data analysis method according to claim 2, wherein the sending the regional aggregation model parameters to a corresponding operator NWDAF node comprises:
when a first model parameter request from the corresponding operator NWDAF node is received, sending the regional aggregation model parameters to the corresponding operator NWDAF node; wherein the operator NWDAF node is configured to determine all corresponding regional NWDAF nodes through the NRF, and send the first model parameter request to all the corresponding regional NWDAF nodes.
4. The network data analysis method according to claim 2, wherein the operator NWDAF node is further configured to locally construct a second machine learning model based on the operator aggregation model parameters, and the second machine learning model is configured to perform inference based on second target network data to obtain a corresponding second data analysis result.
5. The network data analysis method according to claim 4, wherein the operator NWDAF node is further configured to send the operator aggregation model parameters to a top-level NWDAF node, and the top-level NWDAF node is configured to aggregate all the received operator aggregation model parameters to obtain top-level aggregation model parameters and return the top-level aggregation model parameters to the operator NWDAF node;
the operator NWDAF node is further configured to receive the top-level aggregation model parameters sent by the top-level NWDAF node, and update the local second machine learning model based on the top-level aggregation model parameters.
6. The network data analysis method according to claim 5, wherein the operator NWDAF node is configured to send the operator aggregation model parameters to the top-level NWDAF node when receiving a second model parameter request from the top-level NWDAF node, and the top-level NWDAF node is further configured to obtain service addresses of all the operator NWDAF nodes through the NEF, and send the second model parameter request to all the operator NWDAF nodes based on the service addresses.
7. The network data analysis method according to claim 5, wherein the top-level NWDAF node is further configured to locally construct a third machine learning model based on the top-level aggregation model parameters, and the third machine learning model is configured to perform inference based on third target network data to obtain a corresponding third data analysis result.
8. The network data analysis method according to claim 1, wherein the determining a plurality of NFs with learning capabilities comprises:
determining the plurality of NFs with learning capabilities through the NRF.
9. The network data analysis method according to claim 1, wherein each of the NFs is configured to: receive a registration request of the MTLF, wherein the registration request comprises information indicating that the MTLF has learning capability; send the registration request to the NRF to complete the registration of the MTLF; and send training network data to the MTLF, so that the MTLF trains a machine learning model based on the training network data to obtain the trained model parameters.
10. A network data analysis apparatus, applied to a regional NWDAF node, the apparatus comprising:
an analysis module, used for receiving first target network data, and inputting the first target network data into a first machine learning model constructed in advance, so as to output a corresponding first data analysis result;
wherein the apparatus further comprises:
a determining module, used for determining a plurality of NFs with learning capabilities;
a fusion module, used for obtaining a plurality of trained model parameters sent by the plurality of NFs, and aggregating the plurality of model parameters to obtain regional aggregation model parameters; wherein the MTLF in each NF trains a machine learning model based on training network data to obtain the trained model parameters; and
a building module, used for building the first machine learning model locally based on the regional aggregation model parameters.
11. A network data analysis system, comprising a top-level NWDAF node, a plurality of operator NWDAF nodes, a plurality of regional NWDAF nodes, and a plurality of NFs, wherein each of the operator NWDAF nodes corresponds to a plurality of the regional NWDAF nodes, and each of the regional NWDAF nodes corresponds to a plurality of the NFs;
the MTLF in each NF is used for training a machine learning model based on training network data to obtain trained model parameters;
the regional NWDAF node is used for aggregating a plurality of model parameters obtained by training of the plurality of corresponding NFs to obtain regional aggregation model parameters, locally constructing a first machine learning model based on the regional aggregation model parameters, and, when receiving first target network data, inputting the first target network data into the first machine learning model so as to output a corresponding first data analysis result;
the operator NWDAF node is used for aggregating the regional aggregation model parameters obtained by the corresponding regional NWDAF nodes to obtain operator aggregation model parameters, locally constructing a second machine learning model based on the operator aggregation model parameters, and, when receiving second target network data, inputting the second target network data into the second machine learning model so as to output a corresponding second data analysis result;
the top-level NWDAF node is used for aggregating the operator aggregation model parameters obtained by all the operator NWDAF nodes to obtain top-level aggregation model parameters, locally constructing a third machine learning model based on the top-level aggregation model parameters, and, when receiving third target network data, inputting the third target network data into the third machine learning model so as to output a corresponding third data analysis result.
12. The network data analysis system according to claim 11, wherein the operator NWDAF node is further configured to return the operator aggregation model parameters to all corresponding regional NWDAF nodes;
the regional NWDAF node is further configured to update the local first machine learning model based on the received operator aggregation model parameters sent by the corresponding operator NWDAF node.
13. The network data analysis system according to claim 11, wherein the top-level NWDAF node is further configured to return the top-level aggregation model parameters to all the operator NWDAF nodes;
the operator NWDAF node is further configured to update the local second machine learning model based on the received top-level aggregation model parameters.
14. An electronic device, comprising:
a memory for storing a computer program;
a processor for implementing the steps of the network data analysis method according to any one of claims 1 to 9 when executing the computer program.
15. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the network data analysis method according to any one of claims 1 to 9.
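As a purely illustrative companion to claims 8 and 9, the following sketch shows how an NF might register the learning capability of its MTLF and how a regional NWDAF node might then discover the NFs with learning capability; the in-memory registry, field names, and function names are assumptions of this example, and the actual NRF service operations defined by 3GPP are not reproduced here.

```python
# Illustrative sketch of the registration and discovery in claims 8 and 9.
# The message fields and function names are assumptions of this example.
from typing import Dict, List

# A toy in-memory "NRF" registry mapping NF instance ids to their profiles.
nrf_registry: Dict[str, Dict[str, object]] = {}


def register_mtlf(nf_instance_id: str, has_learning_capability: bool) -> None:
    """An NF forwards the MTLF registration request to the NRF (claim 9)."""
    nrf_registry[nf_instance_id] = {"learning_capability": has_learning_capability}


def discover_learning_nfs() -> List[str]:
    """A regional NWDAF node determines the NFs with learning capability (claim 8)."""
    return [
        nf_id
        for nf_id, profile in nrf_registry.items()
        if profile.get("learning_capability")
    ]


register_mtlf("smf-001", has_learning_capability=True)
register_mtlf("amf-001", has_learning_capability=False)
print(discover_learning_nfs())  # ['smf-001']
```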
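The three-tier aggregation recited in claims 11 to 13 can likewise be illustrated by a minimal sketch; the class names RegionalNWDAF, OperatorNWDAF, and TopLevelNWDAF, the equal-weight averaging, and the synchronous call flow are assumptions of this example rather than features of the claimed system.

```python
# Illustrative sketch of the three-tier parameter flow (NF -> regional NWDAF ->
# operator NWDAF -> top-level NWDAF). All names and the equal-weight average
# are assumptions of this example.
from dataclasses import dataclass, field
from typing import Dict, List

import numpy as np

Params = Dict[str, np.ndarray]


def average(updates: List[Params]) -> Params:
    """Equal-weight average of a list of parameter dictionaries."""
    return {k: sum(u[k] for u in updates) / len(updates) for k in updates[0]}


@dataclass
class RegionalNWDAF:
    nf_updates: List[Params]            # parameters trained by the MTLFs of its NFs
    model: Params = field(default_factory=dict)

    def aggregate(self) -> Params:
        self.model = average(self.nf_updates)   # regional aggregation model parameters
        return self.model

    def update(self, operator_params: Params) -> None:
        self.model = operator_params            # claim 12: adopt operator-level parameters


@dataclass
class OperatorNWDAF:
    regions: List[RegionalNWDAF]
    model: Params = field(default_factory=dict)

    def aggregate(self) -> Params:
        self.model = average([r.aggregate() for r in self.regions])
        for r in self.regions:                   # return operator aggregation parameters
            r.update(self.model)
        return self.model


@dataclass
class TopLevelNWDAF:
    operators: List[OperatorNWDAF]
    model: Params = field(default_factory=dict)

    def aggregate(self) -> Params:
        # top-level aggregation model parameters over all operator NWDAF nodes
        self.model = average([o.aggregate() for o in self.operators])
        return self.model
```

In a real deployment, each aggregate step would be driven by the first and second model parameter requests of claims 3 and 6 rather than by direct method calls.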

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination