CN117441327A - Conflict detection in network management - Google Patents
- Publication number: CN117441327A (application CN202180099060.1A)
- Authority: CN (China)
- Prior art keywords: intents, intent, conflict, indication, event
- Legal status: Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L41/00—Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
- H04L41/08—Configuration management of networks or network elements
- H04L41/0866—Checking the configuration
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L41/00—Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
- H04L41/50—Network service management, e.g. ensuring proper service fulfilment according to agreements
- H04L41/5003—Managing SLA; Interaction between SLA and QoS
- H04L41/5009—Determining service level performance parameters or violations of service level contracts, e.g. violations of agreed response time or mean time between failures [MTBF]
Abstract
Various example embodiments relate to intent-based context-dependent conflict detection. The intent-based network device may receive an indication of at least one event and action for each of a plurality of intents and determine at least one conflict between the plurality of intents based on the at least one event and action. The apparatus may receive an indication of at least one context state and provide a conflict report. Apparatus, methods, and computer programs are disclosed.
Description
Technical Field
Various example embodiments relate generally to conflict detection in network management. Some example embodiments relate, at least in part, to intent-based conflict detection in network management.
Background
In intent-based network management, a network administrator provides intents, such as setting up a specific service in a specific area, increasing capacity for certain use cases, reducing energy consumption in an area, or other similar intents. The administrator does not provide instructions on how to achieve the intended goal, but merely provides the goal, and the network then picks a policy, instruction, or rule to achieve this intent. Logic exists to translate the intent into a set of configuration rules. However, the ability of an intent-based network system to operate in a more optimal manner may be further enhanced.
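As a minimal, hypothetical illustration of this principle (the field names below are assumptions for illustration and are not taken from the disclosure or any standard), an intent can be expressed as a declarative goal plus a scope, with no configuration instructions:

```python
# Hypothetical sketch of an intent: only the goal and its scope are given.
# The intent manager, not the administrator, derives the configuration rules.
intent = {
    "intent_id": "intent-1",                                        # illustrative identifier
    "goal": {"kpi": "downlink_latency_ms", "target": "<= 10"},      # what to achieve
    "scope": {"domain": "RAN", "cells": ["cell-101", "cell-102"]},  # where it applies
    # Note: no parameter values or policies here; those are derived by the network.
}
```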
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. The scope of protection sought for the various embodiments of the present disclosure is set forth in the independent claims.
Example embodiments of the present disclosure enable the dynamic detection of context-dependent conflicts between simultaneously active intents. This and other benefits may be achieved by the features of the independent claims. Further advantageous implementations are provided in the dependent claims, the description and the figures.
According to a first aspect, an apparatus may include at least one processor and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus at least to: receiving an indication of at least one event for each intent of a plurality of intents; receiving an indication of at least one action for each intent of the plurality of intents; determining at least one conflict between the plurality of intents based on the at least one event for each of the plurality of intents, or based on the at least one event for each of the plurality of intents and the at least one action for each of the plurality of intents; receiving an indication of at least one context state for each of a plurality of intent ranges; and providing a conflict report including the at least one context state for the determined at least one conflict.
According to an example embodiment of the first aspect, the computer code may be further configured to, with the at least one processor, cause the apparatus to: initiate monitoring of the plurality of intents in response to receiving a request to monitor the plurality of intents, based on a predetermined schedule, and/or in response to detecting expiration of a timer.
According to an example embodiment of the first aspect, the computer code may be further configured to, with the at least one processor, cause the apparatus to: determine the at least one conflict by performing a time correlation analysis between: the at least one event for each intent of the plurality of intents; or the at least one event for each of the plurality of intents and the at least one action for each of the plurality of intents.
According to an example embodiment of the first aspect, the computer code may be further configured to, with the at least one processor, cause the apparatus to: perform the time correlation analysis by examining a history of the following: the at least one event for each intent of the plurality of intents within a time window comprising a plurality of monitoring periods; or the at least one event for each of the plurality of intents and the at least one action for each of the plurality of intents within a time window comprising a plurality of monitoring periods.
According to an example embodiment of the first aspect, the computer code may be further configured to, with the at least one processor, cause the apparatus to: receive a series of context states for each of the plurality of intent ranges from a network analysis function or a management data analysis service.
According to an example embodiment of the first aspect, the computer code may be further configured to, with the at least one processor, cause the apparatus to: receive the indication of the at least one event for each of the plurality of intents and the indication of the at least one action for each of the plurality of intents from different closed loops of a network provisioning or management function.
According to an example embodiment of the first aspect, the computer code may be further configured to, with the at least one processor, cause the apparatus to: detect at least one conflict between intents managed by a single intent manager; or at least one conflict between intents managed by different intent managers.
According to an example embodiment of the first aspect, the computer code may be further configured to, with the at least one processor, cause the apparatus to: provide a conflict report including identifiers of the conflicting intents, the at least one context state, and at least one measurement period during the time window in which the conflict has been detected.
According to an example embodiment of the first aspect, the indication of the at least one event may comprise an indication of a series of events comprising a number of intent failures for each of a plurality of measurement periods.
According to an example embodiment of the first aspect, the indication of the at least one action may comprise an indication of a series of actions, the indication of the series of actions comprising, for each of a plurality of measurement periods, a number of reactions performed for recovering a failed intent.
According to an example embodiment of the first aspect, the indication of the at least one context state may comprise an indication of a series of context states, the indication of the series of context states comprising an identifier of the context state, and/or a description of the context state.
According to example embodiments of the first aspect, the intent ranges of at least one pair of intents may be disjoint, partially overlapping, or identical.
According to an example embodiment of the first aspect, the apparatus may be further configured to operate in an intent based network.
According to a second aspect, a method may comprise: receiving an indication of at least one event for each intent of a plurality of intents; receiving an indication of at least one action for each intent of a plurality of intents; determining at least one conflict between the plurality of intents based on at least one event for each of the plurality of intents or based on at least one event for each of the plurality of intents and at least one action for each of the plurality of intents; receiving an indication of at least one context state for each of a plurality of intent ranges; and providing a conflict report including at least one context state for the determined at least one conflict.
According to an example embodiment of the second aspect, the method may further comprise: initiating monitoring of the plurality of intents in response to receiving a request to monitor the plurality of intents, based on a predetermined schedule, and/or in response to detecting expiration of a timer.
According to an example embodiment of the second aspect, the method may further comprise: determining the at least one conflict by performing a time correlation analysis between: the at least one event for each intent of the plurality of intents; or the at least one event for each of the plurality of intents and the at least one action for each of the plurality of intents.
According to an example embodiment of the second aspect, the method may further comprise: performing the time correlation analysis by examining a history of the following: the at least one event for each intent of the plurality of intents within a time window comprising a plurality of monitoring periods; or the at least one event for each of the plurality of intents and the at least one action for each of the plurality of intents within a time window comprising a plurality of monitoring periods.
According to an example embodiment of the second aspect, the method may further comprise: receiving a series of context states for each of the plurality of intent ranges from a network analysis function or a management data analysis service.
According to an example embodiment of the second aspect, the method may further comprise: receiving an indication of at least one event for each of the plurality of intents and an indication of at least one action for each of the plurality of intents from different closed loops of a network provisioning or management function.
According to an example embodiment of the second aspect, the method may further comprise: detecting at least one conflict between intents managed by a single intent manager; or at least one conflict between intents managed by different intent managers.
According to an example embodiment of the second aspect, the method may further comprise: providing a conflict report including identifiers of the conflicting intents, the at least one context state, and at least one measurement period during the time window in which the conflict has been detected.
According to an example embodiment of the second aspect, the indication of the at least one event may comprise an indication of a series of events comprising a number of intent failures for each of a plurality of measurement periods.
According to an example embodiment of the second aspect, the indication of the at least one action may comprise an indication of a series of actions, the indication of the series of actions comprising, for each of a plurality of measurement periods, a number of reactions performed for recovering a failed intent.
According to an example embodiment of the second aspect, the indication of the at least one context state may comprise an indication of a series of context states, the indication of the series of context states comprising an identifier of the context state, and/or a description of the context state.
According to an example embodiment of the second aspect, the intent ranges of at least one pair of intents may be disjoint, partially overlapping, or identical.
According to an example embodiment of the second aspect, the apparatus may be further configured to operate in an intent based network.
According to a third aspect, a computer program may comprise instructions for causing an apparatus to at least: receiving an indication of at least one event for each intent of a plurality of intents; receiving an indication of at least one action for each intent of a plurality of intents; determining at least one conflict between the plurality of intents based on at least one event for each of the plurality of intents or based on at least one event for each of the plurality of intents and at least one action for each of the plurality of intents; receiving an indication of at least one context state for each of a plurality of intent ranges; and providing a conflict report including at least one context state for the determined at least one conflict.
According to a fourth aspect, an apparatus may include means for receiving an indication of at least one event for each intent of a plurality of intents; means for receiving an indication of at least one action for each of a plurality of intents; means for determining at least one conflict between the plurality of intents based on at least one event for each of the plurality of intents or based on at least one event for each of the plurality of intents and at least one action for each of the plurality of intents; means for receiving an indication of at least one context state for each of a plurality of intent ranges; and means for providing a conflict report including at least one context state for the determined at least one conflict.
Any example embodiment may be combined with one or more other example embodiments. Many of the attendant features will be more readily appreciated as they become better understood by reference to the following detailed description considered in connection with the accompanying drawings.
Drawings
The accompanying drawings, which are included to provide a further understanding of the example embodiments and are incorporated in and constitute a part of this specification, illustrate example embodiments and together with the description help to understand the example embodiments. In the drawings:
FIG. 1 illustrates an example of a network management architecture according to an example embodiment;
FIG. 2 illustrates an example of an apparatus for detecting conflicts in a network according to an example embodiment;
FIG. 3 illustrates an example of a conflict monitoring process according to an example embodiment;
FIGS. 4A and 4B illustrate examples of conflicts between two intents according to example embodiments;
FIGS. 5A and 5B illustrate examples of conflict monitoring architecture for intent management in accordance with example embodiments;
FIG. 6 illustrates an example of a context monitoring architecture for obtaining context data in accordance with example embodiments; and
FIG. 7 illustrates an example of a method for monitoring conflicts in a network according to an example embodiment.
Like reference numerals have been used throughout the drawings to designate like parts.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. The detailed description provided below in connection with the appended drawings is intended as a description of the present examples and is not intended to represent the only forms in which the present examples may be constructed or utilized. The description sets forth the functions of the example and the sequence of steps or operations for constructing and operating the example. However, the same or equivalent functions and sequences may be accomplished by different examples.
According to an example embodiment, context-dependent conflicts between simultaneously active intents are dynamically detected as an alternative, or as an auxiliary mechanism, to static intent admission control. In addition, detected conflicts can be reported along with the specific context state in which the conflict occurred, in order to provide context information to the operator or to the autonomous network management function responsible for resolving the conflict. The context information may allow a fully informed decision to be made on the appropriate action, including a decision not to take any action or to make a context-dependent trade-off between conflicting intents.
Fig. 1 illustrates an example of monitoring conflicts in a network management architecture according to an example embodiment. Conflict monitoring may be related to closed loops and their management in an intent-based network. In the example of fig. 1, the apparatus 100 is part of a network. The network may be, for example, an intent-based network. An intent-based network is one that can interpret intents and is equipped with intelligence that can take steps to fulfil or assure the intents. The apparatus 100 may be located within a core network, such as a mobile communication network. The mobile communication network may also include a Radio Access Network (RAN) configured to enable radio access to the network. In this example, the radio access network is shown by three base stations 180 to 182, which together serve four user equipments 190 to 193. The mobile communication network of fig. 1 is only a simplified example. In general, a managed network may include a large number of network elements and connected user devices. However, configurations may also be local, so they do not necessarily affect the entire network, but rather a few cells, base stations, network elements, users, etc. Thus, in addition to network-wide conflicts, possible conflicts and interference may also be local due to local configurations.
In the example of fig. 1, the apparatus 100 is monitoring a network configuration. The apparatus 100 may detect conflicts between monitored intents (e.g., pairs of monitored intents) that may only occur under certain network conditions or context states. The apparatus 100 may report these conflicts, as well as an indication of the context state in which the conflict occurred. The apparatus 100 may be configured to perform dynamic, context-dependent conflict detection and reporting. In the example of fig. 1, the apparatus 100 is a computing arrangement including at least one processor 202 configured to execute computer program code 206, at least one memory 204 configured to store the computer program code 206 and related data, and the necessary network connections, as shown in the example of fig. 2. Apparatus 100 may be implemented using a plurality of computing devices operating together in accordance with the principles discussed below. Furthermore, the apparatus 100 may use one or more databases instead of just the simple memory 204.
The apparatus 100 may include separate inputs for receiving indications of the event 110, the action 112, and the context state 114. The apparatus 100 may be implemented such that only one input is used for both the event 110 and the action 112.
According to an example embodiment, the apparatus 100 may be configured to receive an indication of at least one event 110 for each of a plurality of intents. The event 110 may occur when multiple intents depend on the same resource (e.g., the capacity of the same cell or cells) and, in a particular context state, the load of the resource becomes high due to increased traffic, which causes the event 110. The apparatus 100 may be configured to receive an indication of at least one action 112 for each of the plurality of intents. The action 112 may be a reaction to correct the event 110 and restore the desired objective specified by the intent. The plurality of events 110 or actions 112 may include a subset of the events 110 and actions 112 that occur in the network. The apparatus 100 may be configured to determine at least one conflict between the plurality of intents based on the at least one event 110 for each of the plurality of intents. The apparatus 100 may alternatively be configured to determine at least one conflict between the plurality of intents based on the at least one event 110 for each of the plurality of intents and the at least one action 112 for each of the plurality of intents. The apparatus 100 may be configured to receive an indication of at least one context state 114 for each of a plurality of intent ranges. The context state 114 may be a description of the state the intent range is in during a measurement period, as well as an abstract identifier. For example, for a subset of RAN cells, the identifier may be "x" and the description may be "low traffic". The apparatus 100 may be configured to provide a conflict report 150, the conflict report 150 comprising at least one context state 114 for the determined at least one conflict.
In the example of fig. 1, the apparatus 100 includes a conflict detector 120, such as a time correlation-based intent conflict detector (TCCD). The conflict detector 120 may be responsible for monitoring a configured set of currently active intents. The conflict detector 120 may, for example, periodically receive intent-specific data from at least one assurance Closed Loop (CL) 130, 140 of a network assurance or management function. An intent manager function may execute the CL 130 to assure a given intent. The conflict detector 120 may also receive network context analyses, e.g., periodically, from a Network Analysis Function (NAF) 180.
According to an example embodiment, the event 110, action 112, and context state all have measurement periods, which may be the same or different. For example, they may all have a measurement period of five minutes. This means that the collision detector 120 can receive their new values every five minutes.
The assurance Closed Loops (CLs) 130, 140 may be responsible for maintaining the intents. Fulfilment or assurance of an intent may be performed by a separate CL 130, 140 for each intent (referred to as "intent #n assurance CL") to maintain the intended goals of the intent under dynamically changing conditions. According to an example embodiment, each intent assurance CL 130, 140 may provide two simple and abstract measurements regarding its operation. The measurement results may be reported in the form of a time series with a certain monitoring period. The first measurement may be indicative of at least one event 110 and provided as an event rate time series 310, e.g., as shown in the example of fig. 3, which may indicate the number of events detected by the CL 130, 140 during the respective measurement period. An event may indicate that the intent is temporarily not fulfilled or assured. This means that the indication of the at least one event 110 may comprise an indication of a series of events comprising a number of intent failures for each of a plurality of measurement periods. The actual activities counted in this value may be lower-level alarms received by the CL 130, 140, threshold violations of the measurements monitored by the CL 130, 140, and/or other types of activity.
The second measurement may indicate at least one action 112 and be provided as an action rate time series 312, for example as shown in the example of fig. 3, which may count the number of times that the CL 130, 140 initiated certain reactions to restore the desired target during the respective measurement period. This means that the indication of the at least one action 112 may comprise an indication of a series of actions including, for each of a plurality of measurement periods, a number of reactions performed for recovering a failed intent. Such actions may include any reconfiguration request issued by the CL 130, 140 to some lower-level network controller.
The Network Analysis Function (NAF) 180 may provide a context state time series 314 for each of the plurality of intents, as shown in the example of fig. 3. This means that the context state 114 may be indicated as a series of context states, each context state in the series being indicated, for example, by an identifier of the context state and/or a description of the context state. The intent scope may be a specific portion or domain of the network to which a given intent applies. For example, in the case of a Radio Access Network (RAN) intent, the intent range may be a set of cells designated as part of the intent specification, and it is for this intent range that the intended target is to be ensured. The intent ranges of two intents monitored by the conflict detector 120 may be disjoint, partially overlapping, or identical. The context state 114 may be indicated by periodic measurements of a categorical type, and it may be the output of the analysis performed by the NAF 180. The identifier may be an abstract value of the context state 114 in which the intent range is located during the measurement period. The abstract context state identifier may only be meaningful to the NAF 180, so the NAF may also provide a textual description of the context state 114. The description may be inserted into the conflict report 150 by the conflict detector 120. For example, for a certain intent range covering a subset of RAN cells, the textual description of a given context state identifier "x" may be "high traffic", "low traffic" or "large number of UEs". The apparatus 100 may receive an indication of at least one context state 114 for each of a plurality of intent ranges. It may receive the at least one context state 114 for each of the plurality of intent ranges from the NAF 180 or from a Management Data Analysis Service (MDAS) producer 600. The NAF 180 may obtain Performance Management (PM) information 116, which may be performance management time series data, from an operations and maintenance system (O&M) 170. The PM information 116 may be measurements or Key Performance Indicators (KPIs).
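As a minimal sketch of how these inputs might be represented on the conflict detector side (the data structures and field names below are assumptions for illustration, not defined by the disclosure), each monitored intent contributes an event rate time series, an action rate time series, and a context state time series of its range, each expressed as (period start time, value) pairs:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ContextState:
    identifier: str               # abstract label produced by the NAF, e.g. "x"
    description: Optional[str]    # human-readable text, e.g. "high traffic"

@dataclass
class IntentMonitoringData:
    intent_id: str
    # One (period_start_time, value) pair per measurement period.
    event_rate: list      # e.g. [(t0, 0), (t1, 3)] - intent failure events per period
    action_rate: list     # e.g. [(t0, 0), (t1, 2)] - corrective reactions per period
    # Context state of the intent's range for each measurement period.
    context_states: list  # e.g. [(t0, ContextState("x", "high traffic"))]
```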
According to an example embodiment, multiple intents may conflict if the fulfilment or assurance of one intent in a given context state 114 of the system impedes or negatively affects the fulfilment of one or more other intents. For example, if the fulfilment of one intent requires an increase in parameter A, while the fulfilment of another intent requires a decrease in parameter A, there is an explicit conflict. Another, less explicit, example is that two intents may not explicitly conflict in their control actions, but may still affect network behavior in opposite ways. Conflicts may cause the network to enter an uncertain or unexpected state, such as an oscillating or divergent control path, thereby rapidly compromising its performance or causing service failures.
According to an example embodiment, the CL 130, 140 may periodically report the number of events 110 and/or actions 112 that occurred in the last period. The period may be the measurement period or reporting period of the event rate time series 310 and/or the action rate time series 312. The reporting period is, for example, a period of one hour or less. The CL 130, 140 may run its own internal monitoring period much more frequently, e.g., it may receive PM measurements every five minutes and compare these measurements to a threshold, and still provide event rate and/or action rate time series values every hour. For example, the value may be the number of threshold violations detected and alarms received in the last hour.
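A minimal sketch of this aggregation, assuming purely for illustration that five-minute internal PM checks are rolled up into an hourly event rate value:

```python
def hourly_event_count(pm_samples, threshold):
    """Count threshold violations within one reporting period.

    pm_samples: the twelve 5-minute PM measurements collected during the last
    hour (an assumed granularity); the returned count is the event rate value
    the assurance closed loop would report for that hour.
    """
    return sum(1 for value in pm_samples if value > threshold)
```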
According to an example embodiment, conflicts are detected by temporal correlation of the event rate time series 310 and the action rate time series 312. The occurrence of a conflict and the time periods during which the conflict occurs may be detected. The context states from the context state time series 314 for those specific time periods may then be collected.
According to an example embodiment, the time series may comprise values for consecutive time periods, e.g. one value per hour or 15 minutes. It can be described as a series of time-value pairs, where time is the start time of the measurement period.
According to an example embodiment, the apparatus 100 includes at least one processor 202 and at least one memory 204, the at least one memory 204 including a computer program 206 as shown in the example of fig. 2. The apparatus may further comprise at least one of: conflict detector 120, assurance closed loops 130, 140, network analysis function 180, management data analysis service producer 600, and/or operation and maintenance function 170, as shown in the examples of fig. 1 and 6.
Fig. 2 shows an example of an apparatus 100 for detecting conflicts in a network according to an example embodiment. The apparatus 100 (e.g., a network device or network node) may be configured to implement at least the functionality of the conflict detector 120. In general, the apparatus 100 may be configured to perform one or more network functions, for example, in accordance with the fifth generation (5G) 3GPP Service Based Architecture (SBA) and Service and System Aspects 5 (SA5) standards and the Zero-touch network and Service Management (ZSM) ETSI standard. The apparatus 100 may include at least one processor 202. The at least one processor 202 may include, for example, one or more of various processing devices or processor circuitry, such as, for example, a coprocessor, a microprocessor, a controller, a Digital Signal Processor (DSP), processing circuitry with or without an accompanying DSP, or various other processing devices including integrated circuits, such as, for example, an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA), a microcontroller unit (MCU), a hardware (HW) accelerator, a special-purpose computer chip, or the like.
The apparatus 100 may also include at least one memory 204. The at least one memory 204 may be configured to store, for example, computer program code, etc., such as operating system software and application software. The at least one memory 204 may include one or more volatile memory devices, one or more non-volatile memory devices, and/or combinations thereof. For example, the at least one memory 204 may be embodied as a magnetic storage device (e.g., hard disk drive, floppy disk, magnetic strips, etc.), an opto-magnetic storage device, or a semiconductor memory (such as a mask ROM, PROM (programmable ROM), EPROM (erasable PROM), flash ROM, RAM (random Access memory), etc.).
The apparatus 100 may also include a communication interface 208 configured to enable the apparatus to send and/or receive information to/from other network devices, nodes, or functions, such as network management related information described herein. For example, the apparatus 100 may use the communication interface 208 to send or receive information over a Service Based Interface (SBI) message bus of the 5G SBA. Thus, the communication interface 208 may be used for internal communication within the apparatus or for external communication with other devices.
When the apparatus 100 is configured to implement some functionality, some components and/or multiple components of the apparatus 100 (such as, for example, the at least one processor 202 and/or the at least one memory 204) may be configured to implement this functionality. Further, when the at least one processor 202 is configured to implement a certain function, the function may be implemented using, for example, the program code 206 included in the at least one memory 204.
The functions described herein may be performed, at least in part, by one or more computer program product components, such as software components. According to an embodiment, the apparatus comprises a processor or processor circuitry (such as, for example, a microcontroller) that, when executed, is configured by program code to perform embodiments of the described operations and functions. Alternatively or additionally, the functions described herein may be performed, at least in part, by one or more hardware logic components. For example, but not limited to, illustrative types of hardware logic components that may be used include Field Programmable Gate Arrays (FPGAs), application Specific Integrated Circuits (ASICs), application Specific Standard Products (ASSPs), systems On Chip (SOCs), complex Programmable Logic Devices (CPLDs), graphics Processing Units (GPUs).
The apparatus 100 includes means for performing at least one example embodiment described herein. In one example, the component includes at least one processor 202, at least one memory 204, the at least one memory 204 including program code 206, the program code 206 configured to, when executed by the at least one processor, cause the apparatus 100 to perform the example embodiment(s).
Apparatus 100 may comprise, for example, a computing device, such as, for example, a server. Although the apparatus 100 is shown as a single device, it should be understood that the functionality of the apparatus 100 may be distributed to multiple devices, as applicable, for example, to implement the example embodiments as a cloud computing service.
Fig. 5A and 5B illustrate examples of a conflict monitoring architecture for intent management in accordance with example embodiments. Fig. 5A and 5B show two possible options for where the conflict detector 120 function may be placed relative to the Intent Manager (IM) functions 500, 510, which are responsible for the closed-loop assurance of the monitored intents. In the example embodiment of FIG. 5A, the conflict detector 120 is placed outside of the IMs 500, 510. The inter-IM location inside the cross-IM supervision function 520 may allow detection of conflicts between intents managed by different IM functions 500, 510. For example, the RAN, core network, and transport network domains may have separate intent managers. This may be accomplished through the use of event 110 and action 112 information that abstracts away the domain- and intent-specific details. The event 110 and action 112 information may include the event rate time series 310 and the action rate time series 312.
In the example embodiment of fig. 5B, the conflict detector 120 is placed inside the IM 500, which enables the IM 500 to detect conflicts between the intents it processes itself. Each IM 500, 510 may have its own conflict detector 120. The IM 500 may be equipped with additional logic to act on a conflict and resolve it by coordinating the intent assurance actions that it may trigger for the various intents. If the IM 500 is unable to complete such coordination, it may further escalate the conflict in the form of the conflict report 150.
The example embodiments of fig. 5A and 5B are not mutually exclusive but may be deployed in a complementary fashion. The conflict detector 120 within each IM 500, 510 may detect conflicts between intents managed by a single IM 500, 510. The conflict detector 120 on top of, or across, the IMs 500, 510 may be responsible for conflicts between intents handled by different IMs 500, 510. In the latter case, the conflict detector 120 may also have the possibility to consider intents submitted at different management hierarchy levels, or for different end-to-end (E2E) scopes. For example, an intent describing an E2E service and an intent describing a RAN service may conflict in a manner that is not visible to the E2E IM or to the RAN-specific IM, since the E2E IM and the RAN-specific IM are unaware of each other's intents.
According to an example embodiment, the apparatus 100 is configured to detect at least one conflict between intents managed by a single intent manager or to detect at least one conflict between intents managed by different intent managers.
Fig. 3 illustrates an example of a conflict monitoring process according to an example embodiment. A flowchart of a monitoring cycle performed by the apparatus 100 periodically or upon triggering (e.g., manually by an operator or in response to a request from a network function) is shown. The apparatus 100 may include the conflict detector 120 to perform the monitoring process. The apparatus 100 may initiate monitoring of the plurality of intents in response to receiving a request to monitor the plurality of intents, based on a predetermined schedule, and/or in response to detecting expiration of a timer. For example, the monitoring may be scheduled to be performed periodically, e.g., with a period of minute(s), hour(s), or day(s), or at irregular time intervals. Alternatively, a timer may be applied to ensure that monitoring is performed frequently enough. For example, the timer may be set to an initial value, which may be equal to the maximum expected time between successive monitoring runs. The timer may be started in response to execution of the monitoring. The monitoring may be initiated in response to detecting expiration of the timer. However, in combination with the predetermined schedule, a timer may also be used to avoid monitoring being performed too frequently. For example, the timer may be set to an initial value, which may be equal to the minimum expected time between successive monitoring runs. The monitoring function may avoid initiating monitoring until the timer expires. For example, if monitoring is performed in response to detecting a performance problem or other unscheduled activity, execution of a subsequent monitoring run scheduled immediately after the performed monitoring may be avoided. This enables unnecessary resource consumption in the communication network to be avoided. Thus, the initial value of the timer may indicate both the maximum time and the minimum time between instances of successive monitoring. Example embodiments described herein enable monitoring to be performed passively (e.g., in response to performance issues) and/or actively (e.g., based on expiration of a timer or a predetermined schedule). Active monitoring enables the discovery of hidden anomalies that may not have been detected as a performance problem.
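A minimal sketch of such a minimum-interval gate, under the assumption (for illustration only) that monitoring runs may be requested, scheduled, or triggered by timer expiry, and that runs arriving before the minimum interval has elapsed are skipped:

```python
import time

class MonitoringTrigger:
    """Gate monitoring runs so that they are not executed too frequently."""

    def __init__(self, min_interval_s=3600):
        # Illustrative default: at most one monitoring run per hour.
        self.min_interval_s = min_interval_s
        self.last_run = None

    def maybe_run(self, run_monitoring):
        now = time.monotonic()
        if self.last_run is not None and now - self.last_run < self.min_interval_s:
            return False   # timer has not expired yet: skip this trigger
        run_monitoring()    # execute one monitoring cycle (e.g. the FIG. 3 loop)
        self.last_run = now
        return True
```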
In the example embodiment of fig. 3, the apparatus 100 detects conflicting intent pairs among the monitored intents by performing a temporal correlation analysis of the events 110 and the actions 112 between the intents (e.g., for each pair of intents). The events 110 and actions 112 may include the event rate time series 310 and the action rate time series 312. The temporal correlation analysis may be performed between the at least one event 110 for each of the plurality of intents, or between the at least one event 110 for each of the plurality of intents and the at least one action 112 for each of the plurality of intents. It may be performed within a time window comprising a plurality of monitoring cycles of the conflict detector 120. For this analysis, at operation 300, the apparatus 100 may review time series data within a backward history window, which may be longer than the periodicity of the monitoring cycle of the conflict detector 120. When a temporal correlation is detected between the event rate time series 310 and the action rate time series 312 of the intents, the apparatus 100 may collect the corresponding disjoint time periods. Thus, when two intents may conflict during the backward history window, the apparatus 100 may find disjoint time periods, such as zero, one, or multiple time periods.
The result of this analysis is either no conflict, or a list of one or more disjoint time periods within the backward history window in which the time series correlation shows at least one conflict between intents. There may be two main types of correlation: simultaneous and alternating.
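As a minimal sketch of the simultaneous case only (the disclosure does not fix a particular correlation algorithm, so this simple heuristic and its names are assumptions), a period can be flagged when both intents show non-zero rates at the same time:

```python
def simultaneous_conflict_periods(series_a, series_b):
    """Return start times of measurement periods where both intents are active.

    series_a, series_b: lists of (period_start_time, value) pairs covering the
    same backward history window, e.g. the event rate (or action rate) time
    series of two monitored intents. A non-zero value in both series for the
    same period is taken here as a hint of a simultaneous-type conflict.
    """
    values_b = dict(series_b)
    return [t for t, value in series_a if value > 0 and values_b.get(t, 0) > 0]
```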
If a correlation exists, then at operation 340 the apparatus 100 collects all context states 114 occurring in the context state time series, for the common range of the two intents or for their two different ranges, during the time periods in which operation 300 detected a conflict.
At operation 320, the apparatus 100 may add an entry for the pair of intents, including the list of collected context states, to the conflict report 150. The result may be a list of context states of at least one range in which the conflict has existed during the backward history window. Only context states that occur during the conflicting time periods may be collected. The scope of an intent may be the object of the intent, i.e., the portion of the network to which the intent is applied, a set of resources in the network, or a managed element. The two intents may have the same range (a common range), or they may have different ranges. The different ranges may be completely separate sets or parts of the network, or they may have some intersection, while not being equal. At operation 330, the apparatus 100 may provide the conflict report 150 to a conflict report consumer. The conflict report 150 may contain the following information for each detected conflict between specific intents: the identifiers of the at least two conflicting intents (e.g., intent A and intent B), and a list of at least one context state for each intent range during which the conflict existed. For each context state listed, the list of context states may contain an abstract identifier and/or a textual state description. Separate lists of context states 114 may be included only where the scopes of the different intents differ from each other. For example, where the scope of intent B is different from the scope of intent A, the conflict report 150 may include a list of context states for the scope of intent A and a similar list of context states for the scope of intent B. The conflict report 150 may also include the relative portion of time during the history window for which the conflict has been detectable, and/or other relevant details that may help the report consumer evaluate the conflict.
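A minimal sketch of one entry of such a report (the field names are illustrative, not prescribed by the disclosure):

```python
from dataclasses import dataclass, field

@dataclass
class ConflictReportEntry:
    intent_ids: tuple               # identifiers of the conflicting intents, e.g. ("A", "B")
    context_states_a: list          # (identifier, description) pairs for intent A's range
    context_states_b: list = field(default_factory=list)  # only if B's range differs from A's
    conflict_periods: list = field(default_factory=list)  # start times of conflicting periods
    conflict_time_fraction: float = 0.0  # share of the history window where the conflict was seen
```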
According to an example embodiment, the apparatus 100 causes the recipient to initiate a reconfiguration of the communication network based on the provided conflict report 150.
According to an example embodiment, the conflict detector 120 may have at least two fixed time window parameters: the periodic monitoring period and the length of the backward history window. The periodic monitoring period determines how often the monitoring cycle is run. For example, if the monitoring period is 1 hour, the algorithm depicted in FIG. 3 may be run periodically every hour. The conflict detector 120 may run the temporal correlation over the length of the backward history window. The conflict detector 120 may not only use the latest single value of each time series, but may retain a previous history of the data. For example, it may retain time series data for the last 4 hours. When the conflict detector 120 runs the algorithm of fig. 3 every hour, for example, it may analyze the time series values over the time interval from T - 4 hours to T, where T is the current time at which it performs the monitoring algorithm of fig. 3. The length of this previous history is, for example, the "length of the backward history window" shown in fig. 3. It should be at least as long as the monitoring period of the conflict detector 120, but may be longer.
Fig. 4A and 4B illustrate examples of conflicts between two intents according to example embodiments. Fig. 4A shows an example of the event rate time series 310 and the action rate time series 312 for two intents and, at the bottom, in the form of abstract labels (A, B, C), the context state 114 value of the intent range for each measurement period of the context state series 314. For both intent 1 and intent 2, there may be time periods for which the event rate time series 310 and the action rate time series 312 have non-zero values at the same time. This means that the event rate time series 310 and the action rate time series 312 may be correlated simultaneously for the two conflicting intents. One possible conflict that causes a simultaneous type of correlation may occur when two intents depend on the same resource, for example the capacity of the same cell or cells, and in a certain context state A the load of the resource becomes high due to increased traffic. This results in events detected by the assurance closed loops 130, 140 of the two intents, and in actions by both that are not effective due to the lack of resources.
Fig. 4B illustrates an example embodiment in which the event rate time series 310 and the action rate time series 312 alternate for the two conflicting intents 1 and 2. Such a temporal correlation may be more difficult to detect, because the non-zero values in the event rate time series 310 and the action rate time series 312 do not occur simultaneously for both intents, but occur in an alternating fashion, for one intent at a time. One possible conflict that may cause an alternating correlation occurs when the closed loops 130, 140 of two intents configure the same scheduler to provide a smaller or larger share of the common bandwidth to traffic of different service classes. For example, intent 1 may be more sensitive to increased load on the scheduler, e.g., due to stricter delay requirements. While in context state A, the scheduler load begins to increase and the closed loop 130 of intent 1 configures the scheduler to give a larger share of the common bandwidth to the service class of intent 1. This solves the problem of intent 1, but may cause an event for intent 2; in response, the closed loop 140 of intent 2 reconfigures the same scheduler to give a larger share to the service class of intent 2, which again causes an event for intent 1.
A benefit of using both the per-intent event rate time series 310 and the action rate time series 312 as inputs to the conflict detector 120, rather than only the event rate time series 310, is that the reliability of detecting the alternating type of temporal correlation can be improved. A clear pattern of peaks in one intent's action rate time series 312 followed by peaks in another intent's event rate time series 310 (and vice versa) may indicate that the actions of the intent assurance CLs 130, 140 are working against each other.
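A minimal sketch of one simple heuristic for this alternating pattern (again an assumption for illustration; the disclosure does not prescribe a specific detector), flagging periods where a corrective action for one intent is followed one period later by failure events for the other:

```python
def alternating_conflict_periods(actions_a, events_b, lag=1):
    """Flag period indices where intent A's actions precede intent B's events.

    actions_a and events_b are equally long lists of per-period counts over the
    same backward history window. A real detector would also test the mirrored
    direction (actions of B followed by events of A); this one-directional check
    is only an illustration of the alternating correlation type.
    """
    return [
        i for i in range(len(actions_a) - lag)
        if actions_a[i] > 0 and events_b[i + lag] > 0
    ]
```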
Fig. 6 illustrates an example of a context monitoring architecture for obtaining context data, in which the apparatus 100 obtains the context state 114 from a Management Data Analysis Service (MDAS) producer 600, according to an example embodiment. The context state 114 may include the context state time series 314 for each intent range. In this example, the MDAS producer 600 may take on the role of the NAF 180, and the conflict detector 120 of the apparatus 100 may be responsible for the processing. The apparatus 100 may request 610 the context state 114 from the MDAS producer 600. The MDAS producer 600 may then begin periodic transmission of the context state 114 using a delivery mechanism specified in the request. The delivery mechanism may be based on streaming data or on files, for example. The MDAS producer 600 may send to the apparatus 100, as the consumer, the context state 114 comprising a context analysis report provided periodically by the MDAS producer 600 for each measurement period.
In the following, a practical example is given for a better understanding of the example of fig. 6. In this example, Table 1 is discussed:
Table 1: Information content of the context analysis report.
Table 1, referred to above, is provided for a better understanding of the description. The values in Table 1 may be the names and descriptions of the fields of the context analysis report. The context analysis report of the context state 114 may include at least one of: a managed object of the scope, a context state identifier, and/or a context state description. The managed object of the scope may correspond to the intent range. The scope managed object may be specified in the request. The context state identifier may comprise a numeric identifier or a short text label that identifies a particular context state. The context state identifier may be automatically generated, for example, by the MDAS producer 600. The same context state identifier may be used in each reporting period for the same context state 114 throughout the reporting session. The context state description may comprise a human-readable text description of the context state identifier.
Sending the text context state description in each reporting period may be redundant because the same context state identifier may be associated with the same text description. Such redundancy may be reduced by populating the context state description only within a reporting period when the context state is first detected (e.g., when a particular context state identifier value first appears in a context analysis report).
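A minimal sketch of this redundancy reduction, assuming for illustration that the producer builds one small report per measurement period and includes the text description only the first time a given identifier appears:

```python
def build_context_reports(per_period_states, descriptions):
    """Build one context analysis report per measurement period.

    per_period_states: list of (period_start_time, context_state_id) pairs.
    descriptions: mapping from context_state_id to its text description.
    The description field is populated only in the period where an identifier
    is first seen, mirroring the redundancy reduction described above.
    """
    seen = set()
    reports = []
    for period_start, state_id in per_period_states:
        report = {"time": period_start, "context_state_id": state_id}
        if state_id not in seen:
            report["context_state_description"] = descriptions.get(state_id)
            seen.add(state_id)
        reports.append(report)
    return reports
```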
Thus, example embodiments of the present disclosure may enable detection of conflicts between intents that may be only temporary runtime phenomena, depending on dynamic cell, user, and E2E system behavior and resource allocation. The detection of conflicts may be applicable to an intent without regard to the domain-specific aspects of the intent or of its fulfilment or assurance. The detection of conflicts may also be suitable for a wide range of intents, since observable metrics that are independent of the detailed semantics of the intents may be considered.
Fig. 7 illustrates an example of a method for monitoring conflicts in a network according to an example embodiment.
At operation 700, the method may include receiving an indication of at least one event 110 for each intent of a plurality of intents.
At operation 710, the method may include receiving an indication of at least one action 112 for each intent of a plurality of intents.
At operation 720, the method may include determining at least one conflict between the plurality of intents based on the at least one event 110 for each of the plurality of intents or based on the at least one event 110 for each of the plurality of intents and the at least one action 112 for each of the plurality of intents.
At operation 730, the method may include receiving an indication of at least one context state 114 for each of a plurality of intent ranges.
At operation 740, the method may include providing a conflict report 150, the conflict report 150 including at least one context state 114 for the determined at least one conflict.
Additional features of the method result directly, for example, from the functions and parameters of the apparatus 100, the conflict detector 120, the closed loops 130, 140, the NAF 180, the O&M 170 or the MDAS producer 600, as described in the appended claims and throughout the specification, and are therefore not repeated here. Different variations of the method may also be applied, as described in connection with the various example embodiments.
An apparatus (e.g., a network device or network node) may be configured to perform or cause performance of any aspect of the methods described herein. Furthermore, the computer program may comprise instructions for causing the apparatus to perform any aspect of the methods described herein when executed. Furthermore, an apparatus may include means for performing any aspect of the methods described herein. According to an example embodiment, the apparatus includes at least one processor and at least one memory including program code, the at least one processor and the program code configured to cause execution of any aspect of the method(s) when executed by the at least one processor.
Any range or device value given herein may be extended or altered without losing the effect sought. In addition, any embodiment may be combined with another embodiment unless explicitly not permitted.
Although the subject matter has been described in language specific to structural features and/or acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as examples of implementing the claims, and other equivalent features and acts are intended to be within the scope of the claims.
It will be appreciated that the benefits and advantages described above may relate to one embodiment or may relate to several embodiments. Embodiments are not limited to embodiments that solve any or all of the problems or those embodiments that have any or all of the benefits and advantages. It will also be understood that references to "an" item may refer to one or more of those items.
The steps or operations of the methods described herein may be performed in any suitable order or concurrently where appropriate. In addition, individual blocks may be deleted from any of the methods without departing from the scope of the subject matter described herein. Any of the aspects of the embodiments described above may be combined with any of the aspects of the other embodiments described to form further embodiments without losing the effect sought.
The term "comprising" is used herein to mean including the identified method, block or element, but that such block or element does not include an exclusive list, and that the method or apparatus may include additional blocks or elements.
As used in this application, the term "circuitry" may refer to one or more or all of the following: (a) Pure hardware circuit implementations (such as implementations in analog and/or digital circuitry only) and (b) combinations of hardware circuits and software, e.g. (as applicable): (i) A combination of analog and/or digital hardware circuit(s) and software/firmware, and (ii) any portion of hardware processor(s) (including digital signal processor(s) having software, and memory(s) that work together to cause a device (such as a mobile phone or server) to perform various functions) and (c) a portion of hardware circuit(s) and/or processor(s), such as microprocessor(s) or microprocessor(s), that require software (e.g., firmware) to operate, but when operation does not require software, the software may not be present. This definition of circuitry applies to all uses of this term in this application, including in any claims.
As a further example, as used in this application, the term circuitry also covers an implementation of merely a hardware circuit or processor (or multiple processors), or a portion of a hardware circuit or processor, and its (or their) accompanying software and/or firmware. The term "circuitry" also covers, for example and if applicable to the particular claim element, a baseband integrated circuit or processor integrated circuit for a mobile device, or a similar integrated circuit in a server, a cellular network device, or another computing or network device.
It will be understood that the above description is given by way of example only and that various modifications may be made by those skilled in the art. The above specification, examples and data provide a complete description of the structure and use of the exemplary embodiments. Although various embodiments have been described above with a certain degree of particularity, or with reference to one or more individual embodiments, those skilled in the art could make numerous alterations to the disclosed embodiments without departing from the scope of this disclosure.
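To make the temporal correlation between intent events and counter actions more concrete, the following is a minimal, illustrative Python sketch. It is not the implementation of the conflict detector 120 described above; the names (IntentSeries, ConflictReport, detect_conflicts), the per-period data layout, and the simple same-period correlation rule are assumptions chosen for readability.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class IntentSeries:
    """Hypothetical per-intent monitoring data, one entry per measurement period."""
    context_states: List[str]   # context-state label observed in each period
    failed_events: List[int]    # number of failed-intent events in each period
    counter_actions: List[int]  # number of counter actions taken in each period

@dataclass
class ConflictReport:
    intent_ids: List[str]       # identifiers of the intents in conflict
    periods: List[int]          # measurement periods in which the conflict was seen
    context_states: List[str]   # context states observed in those periods

def detect_conflicts(series: Dict[str, IntentSeries], window: int) -> List[ConflictReport]:
    """Flag a pair of intents as conflicting if, within the monitoring window,
    one intent fails in the same periods in which the other intent's counter
    actions are executed (or vice versa)."""
    reports: List[ConflictReport] = []
    ids = sorted(series)
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            sa, sb = series[a], series[b]
            horizon = min(window, len(sa.failed_events), len(sb.failed_events))
            periods = [
                t for t in range(horizon)
                if (sa.failed_events[t] > 0 and sb.counter_actions[t] > 0)
                or (sb.failed_events[t] > 0 and sa.counter_actions[t] > 0)
            ]
            if periods:
                states = sorted({sa.context_states[t] for t in periods}
                                | {sb.context_states[t] for t in periods})
                reports.append(ConflictReport([a, b], periods, states))
    return reports
```

In this sketch a conflict is inferred from same-period co-occurrence only; embodiments described above may instead correlate events and actions across several monitoring periods of the time window before declaring a conflict.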
Claims (28)
1. An apparatus, comprising:
at least one processor; and
at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus at least to:
receive an indication of at least one event for each intent of a plurality of intents;
receive an indication of at least one action for each intent of the plurality of intents;
determine at least one conflict between the plurality of intents based on the at least one event for each of the plurality of intents, or based on the at least one event for each of the plurality of intents and the at least one action for each of the plurality of intents;
receive an indication of at least one context state for each of a plurality of intent ranges; and
provide a conflict report, the conflict report including the at least one context state for the determined at least one conflict.
2. The apparatus of claim 1, wherein the at least one memory and the computer program code are further configured to, with the at least one processor, cause the apparatus to:
initiate monitoring of the plurality of intents in response to receiving a request to monitor the plurality of intents based on a predetermined schedule, and/or in response to detecting expiration of a timer.
3. The apparatus of claim 1 or claim 2, wherein the at least one memory and the computer program code are further configured to, with the at least one processor, cause the apparatus to:
determine the at least one conflict by performing a time correlation analysis between:
the at least one event for each intent of the plurality of intents; or
the at least one event for each of the plurality of intents and the at least one action for each of the plurality of intents.
4. The apparatus of claim 3, wherein the at least one memory and the computer program code are further configured to, with the at least one processor, cause the apparatus to:
perform the time correlation analysis by examining previous instances of the following:
the at least one event for each intent of the plurality of intents within a time window comprising a plurality of monitoring periods; or
the at least one event for each intent of the plurality of intents and the at least one action for each intent of the plurality of intents within a time window comprising a plurality of monitoring periods.
5. The apparatus according to any one of claims 1 to 4, wherein the at least one memory and the computer program code are further configured to, with the at least one processor, cause the apparatus to:
receive a series of context states for each of the plurality of intent ranges from a network analysis function or a management data analysis service.
6. The apparatus according to any one of claims 1 to 5, wherein the at least one memory and the computer program code are further configured to, with the at least one processor, cause the apparatus to:
receive the indication of the at least one event for each of the plurality of intents and the indication of the at least one action for each of the plurality of intents from different closed loops of a network provisioning or management function.
7. The apparatus according to any one of claims 1 to 6, wherein the at least one memory and the computer program code are further configured to, with the at least one processor, cause the apparatus to:
detect the at least one conflict between intents managed by a single intent manager; or
detect the at least one conflict between intents managed by different intent managers.
8. The apparatus according to any one of claims 1 to 7, wherein the at least one memory and the computer program code are further configured to, with the at least one processor, cause the apparatus to:
provide the conflict report, the conflict report including identifiers of the intents in conflict, the at least one context state, and at least one measurement period in which the conflict has been detected during the time window.
9. The apparatus of any of claims 1-8, wherein the indication of the at least one event comprises an indication of a series of events including a number of failed intents for each of a plurality of measurement periods.
10. The apparatus of any of claims 1 to 9, wherein the indication of the at least one action comprises an indication of a series of actions including, for each of a plurality of measurement periods, a number of counter actions performed to recover failed intents.
11. The apparatus according to any one of claims 1 to 10, wherein the indication of the at least one context state comprises an indication of a series of context states, the indication of the series of context states comprising an identifier of a context state and/or a description of the context state.
12. The apparatus of any one of claims 1 to 11, wherein the intent ranges of at least one pair of intents are disjoint, partially overlapping, or identical.
13. The apparatus according to any one of claims 1 to 12, wherein the apparatus is configured to operate in an intent-based network.
14. A method, comprising:
receiving an indication of at least one event for each intent of a plurality of intents;
receiving an indication of at least one action for each intent of the plurality of intents;
determining at least one conflict between the plurality of intents based on the at least one event for each of the plurality of intents or based on the at least one event for each of the plurality of intents and the at least one action for each of the plurality of intents;
receiving an indication of at least one context state for each of a plurality of intent ranges; and
providing a conflict report, the conflict report including the at least one context state for the determined at least one conflict.
15. The method of claim 14, further comprising:
initiating monitoring of the plurality of intents in response to receiving a request to monitor the plurality of intents based on a predetermined schedule, and/or in response to detecting expiration of a timer.
16. The method of claim 14 or claim 15, further comprising:
determining the at least one conflict by performing a time correlation analysis between:
the at least one event for each intent of the plurality of intents; or
the at least one event for each of the plurality of intents and the at least one action for each of the plurality of intents.
17. The method of claim 16, further comprising:
performing the time correlation analysis by examining previous instances of the following:
the at least one event for each intent of the plurality of intents within a time window comprising a plurality of monitoring periods; or
the at least one event for each intent of the plurality of intents and the at least one action for each intent of the plurality of intents within a time window comprising a plurality of monitoring periods.
18. The method of any of claims 14 to 17, further comprising:
receiving a series of context states for each of the plurality of intent ranges from a network analysis function or a management data analysis service.
19. The method of any of claims 14 to 18, further comprising:
receiving the indication of the at least one event for each of the plurality of intents and the indication of the at least one action for each of the plurality of intents from different closed loops of a network provisioning or management function.
20. The method of any of claims 14 to 19, further comprising:
detecting the at least one conflict between intents managed by a single intent manager; or
detecting the at least one conflict between intents managed by different intent managers.
21. The method of any of claims 14 to 20, further comprising:
providing the conflict report, the conflict report including identifiers of the intents in conflict, the at least one context state, and at least one measurement period in which the conflict has been detected during the time window.
22. The method of any of claims 14 to 21, wherein the indication of the at least one event comprises an indication of a series of events including a number of failed intents for each of a plurality of measurement periods.
23. The method of any of claims 14 to 22, wherein the indication of the at least one action comprises an indication of a series of actions including, for each of a plurality of measurement periods, a number of counter actions performed to recover failed intents.
24. The method of any of claims 14 to 23, wherein the indication of the at least one context state comprises an indication of a series of context states, the indication of the series of context states comprising an identifier of a context state and/or a description of the context state.
25. The method of any one of claims 14 to 24, wherein the intent ranges of at least one pair of intents are disjoint, partially overlapping, or identical.
26. The method of any of claims 14 to 25, wherein the method is performed by an apparatus configured to operate in an intent-based network.
27. A computer program comprising instructions for causing an apparatus to at least perform:
receiving an indication of at least one event for each intent of a plurality of intents;
receiving an indication of at least one action for each intent of the plurality of intents;
determining at least one conflict between the plurality of intents based on the at least one event for each of the plurality of intents, or based on the at least one event for each of the plurality of intents and the at least one action for each of the plurality of intents;
receiving an indication of at least one context state for each of a plurality of intent ranges; and
providing a conflict report, the conflict report including the at least one context state for the determined at least one conflict.
28. An apparatus, comprising:
means for receiving an indication of at least one event for each intent of a plurality of intents;
means for receiving an indication of at least one action for each intent of the plurality of intents;
means for determining at least one conflict between the plurality of intents based on the at least one event for each of the plurality of intents or based on the at least one event for each of the plurality of intents and the at least one action for each of the plurality of intents;
means for receiving an indication of at least one context state for each of a plurality of intent ranges; and
means for providing a conflict report, the conflict report including the at least one context state for the determined at least one conflict.
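By way of illustration only, a conflict report carrying the fields recited in claims 8 and 21 could be represented as in the following Python sketch; the field names and values are hypothetical and do not correspond to a format defined by this disclosure.

```python
import json

# Hypothetical serialization of a conflict report: identifiers of the intents
# in conflict, the context state in which the conflict was observed, and the
# measurement periods of the monitoring time window in which it was detected.
conflict_report = {
    "conflicting_intents": ["intent-coverage-01", "intent-energy-07"],
    "context_state": "high-load",
    "time_window": {"monitoring_periods": 12},
    "conflicting_measurement_periods": [3, 4, 9],
}

print(json.dumps(conflict_report, indent=2))
```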
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/EP2021/067942 WO2023274515A1 (en) | 2021-06-30 | 2021-06-30 | Conflict detection in network management |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117441327A true CN117441327A (en) | 2024-01-23 |
Family
ID=76807618
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202180099060.1A Pending CN117441327A (en) | 2021-06-30 | 2021-06-30 | Conflict detection in network management |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP4364374A1 (en) |
CN (1) | CN117441327A (en) |
WO (1) | WO2023274515A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20230344714A1 (en) * | 2022-04-22 | 2023-10-26 | Microsoft Technology Licensing, Llc | Global intent-based configuration to local intent targets |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US12034610B2 (en) * | 2018-11-30 | 2024-07-09 | Nokia Solutions And Networks Oy | Network objectives management |
- 2021-06-30: CN CN202180099060.1A patent/CN117441327A/en, status: active Pending
- 2021-06-30: EP EP21739045.9 patent/EP4364374A1/en, status: active Pending
- 2021-06-30: WO PCT/EP2021/067942 patent/WO2023274515A1/en, status: active Application Filing
Also Published As
Publication number | Publication date |
---|---|
EP4364374A1 (en) | 2024-05-08 |
WO2023274515A1 (en) | 2023-01-05 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||