WO2022152515A1 - Apparatus and method for enabling analytics feedback - Google Patents

Apparatus and method for enabling analytics feedback

Info

Publication number
WO2022152515A1
Authority
WO
WIPO (PCT)
Prior art keywords: data analytics, data, analytics, exceed, range
Application number
PCT/EP2021/086737
Other languages
French (fr)
Inventor
Konstantinos Samdanis
Jürgen Goerge
Janne ALI-TOLPPA
Márton KAJÓ
Original Assignee
Nokia Technologies Oy
Application filed by Nokia Technologies Oy
Publication of WO2022152515A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W24/00 Supervisory, monitoring or testing arrangements
    • H04W24/02 Arrangements for optimising operational condition
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L43/00 Arrangements for monitoring or testing data switching networks
    • H04L43/06 Generation of reports
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L41/00 Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
    • H04L41/14 Network analysis or design
    • H04L41/147 Network analysis or design for predicting network behaviour

Definitions

  • TITLE APPARATUS AND METHOD FOR ENABLING ANALYTICS FEEDBACK
  • Some example embodiments may generally relate to mobile or wireless telecommunication systems, such as Long Term Evolution (LTE) or fifth generation (5G) radio access technology or new radio (NR) access technology, or other communications systems.
  • certain embodiments may relate to systems and/or methods for enabling analytics feedback.
  • Examples of mobile or wireless telecommunication systems may include the Universal Mobile Telecommunications System (UMTS) Terrestrial Radio Access Network (UTRAN), Long Term Evolution (LTE) Evolved UTRAN (E-UTRAN), LTE- Advanced (LTE- A), MulteFire, LTE-A Pro, and/or fifth generation (5G) radio access technology or new radio (NR) access technology.
  • 5G wireless systems refer to the next generation (NG) of radio systems and network architecture.
  • 5G is mostly built on a new radio (NR), but a 5G (or NG) network can also build on E-UTRA radio.
  • NR may provide bitrates on the order of 10-20 Gbit/s or higher, and may support at least enhanced mobile broadband (eMBB) and ultrareliable low-latency-communication (URLLC) as well as massive machine type communication (mMTC).
  • NR is expected to deliver extreme broadband and ultra-robust, low latency connectivity and massive networking to support the Internet of Things (IoT).
  • With IoT and machine-to-machine (M2M) communication becoming more widespread, there will be a growing need for networks that meet the needs of lower power, low data rates, and long battery life.
  • the nodes that can provide radio access functionality to a user equipment may be named gNB when built on NR radio and may be named NG-eNB when built on E-UTRA radio.
  • a method may include receiving data analytics from an analytics function.
  • the method may include transmitting feedback information related to a quality of the data analytics and one or more information elements associated with matching the feedback information with respect to an application to the data analytics.
  • the feedback information may include one or more quality indicators.
  • the one or more information elements may indicate one or more dimensions of the data analytics.
  • the one or more dimensions may include at least one of one or more future time periods, one or more time windows, one or more user equipment or network slices, one or more locations or geographic areas, or one or more applications or services.
  • the quality may be based on at least one of an accuracy or an inaccuracy of the data analytics, a range of the accuracy or the inaccuracy, or a range of values experienced at the data consumer.
  • the method may further include detecting that the data analytics deviate from expected values based on at least one of: whether the data analytics are within a range of the expected values; whether the data analytics exceed, or fail to exceed, the expected values, and an amount by which the data analytics exceed or fail to exceed them; whether the data analytics exceed, or fail to exceed, the expected values in a certain geographic location, and an amount by which the data analytics exceed or fail to exceed them; or whether the data analytics exceed, or fail to exceed, the expected values at a certain time window, and an amount by which the data analytics exceed or fail to exceed them.
  • the quality may be indicated by one or more accuracy ratings included in the one or more quality indicators.
  • the one or more accuracy ratings may be based on at least one of a range of values used for the data analytics, a range of values experienced at the data consumer, a range of an accuracy or an inaccuracy of the data analytics, a time period of the data analytics, a geographic area associated with the data analytics, a mobility pattern used for the data analytics, an application associated with the data analytics, or a service associated with the data analytics.
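  • The feedback structure described above (a quality indicator with an accuracy rating, plus information elements that scope the feedback to particular dimensions of the data analytics) can be sketched as below; the class and field names are illustrative assumptions, not standardized information element names.

```python
from dataclasses import dataclass, field
from typing import Optional

# Illustrative sketch of the consumer-to-analytics-function feedback message.
# Field names are assumptions for illustration, not standardized IEs.

@dataclass
class AnalyticsDimensions:
    """Information elements matching the feedback to specific data analytics."""
    future_time_period: Optional[str] = None    # e.g. "predictions > 20 min ahead"
    time_window: Optional[str] = None           # e.g. "Mondays 10:00-12:00"
    target: Optional[str] = None                # UE, group of UEs, or network slice
    geographic_area: Optional[str] = None       # coordinates, cell, or tracking area
    application_or_service: Optional[str] = None

@dataclass
class AnalyticsFeedback:
    """Feedback on the quality of received data analytics."""
    accuracy_rating: str                        # e.g. "high", "medium", "low"
    value_range_experienced: tuple              # actual values seen at the consumer
    dimensions: AnalyticsDimensions = field(default_factory=AnalyticsDimensions)

# Example: a consumer reports low accuracy for a specific cell and time window.
fb = AnalyticsFeedback(
    accuracy_rating="low",
    value_range_experienced=(0.62, 0.97),
    dimensions=AnalyticsDimensions(geographic_area="cell-4711",
                                   time_window="Mon 10:00-12:00"),
)
```

The dimensions carried alongside the rating are what let the analytics function correlate the feedback with the specific statistics or predictions it produced.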
  • an apparatus may include at least one processor and at least one memory comprising computer program code.
  • the at least one memory and computer program code may be configured, with the at least one processor, to cause the apparatus at least to receive data analytics from an analytics function.
  • the at least one memory and computer program code may be configured, with the at least one processor, to cause the apparatus at least to transmit feedback information related to a quality of the data analytics and one or more information elements associated with matching the feedback information with respect to an application to the data analytics.
  • the feedback information may include one or more quality indicators.
  • the one or more information elements may indicate one or more dimensions of the data analytics.
  • the one or more dimensions may include at least one of one or more future time periods, one or more time windows, one or more user equipment or network slices, one or more locations or geographic areas, or one or more applications or services.
  • the quality may be based on at least one of an accuracy or an inaccuracy of the data analytics, a range of the accuracy or the inaccuracy, or a range of values experienced at the data consumer.
  • the at least one memory and computer program code may be configured, with the at least one processor, to cause the apparatus at least to detect that the data analytics deviate from expected values based on at least one of: whether the data analytics are within a range of the expected values; whether the data analytics exceed, or fail to exceed, the expected values, and an amount by which the data analytics exceed or fail to exceed them; whether the data analytics exceed, or fail to exceed, the expected values in a certain geographic location, and an amount by which the data analytics exceed or fail to exceed them; or whether the data analytics exceed, or fail to exceed, the expected values at a certain time window, and an amount by which the data analytics exceed or fail to exceed them.
  • the quality may be indicated by one or more accuracy ratings included in the one or more quality indicators.
  • the one or more accuracy ratings may be based on at least one of a range of values used for the data analytics, a range of values experienced at the data consumer, a range of an accuracy or an inaccuracy of the data analytics, a time period of the data analytics, a geographic area associated with the data analytics, a mobility pattern used for the data analytics, an application associated with the data analytics, or a service associated with the data analytics.
  • a method may include transmitting data analytics to a data consumer.
  • the method may include receiving feedback information related to a quality of the data analytics and one or more information elements associated with matching the feedback information with respect to an application to the data analytics.
  • the feedback information may comprise one or more quality indicators.
  • the one or more information elements may indicate one or more dimensions of the data analytics.
  • the one or more dimensions may include at least one of one or more future time periods, one or more time windows, one or more user equipment or network slices, one or more locations or geographic areas, or one or more applications or services.
  • the quality may be based on at least one of an accuracy or an inaccuracy of the data analytics, a range of the accuracy or the inaccuracy, or a range of values experienced at the data consumer.
  • the method may further include assessing the feedback information based on one or more of a quantity of data used for the data analytics in a range of values or with respect to a geographic location indicated in the feedback information, a reinforcement learning rule used for generating the data analytics, an algorithm used for generating the data analytics, or whether a source of data used for generating the data analytics is an abnormal source of the data.
  • the quality may be indicated by one or more accuracy ratings included in the one or more quality indicators.
  • the one or more accuracy ratings may be based on at least one of a range of values used for the data analytics, a range of values experienced at the data consumer, a range of an accuracy or an inaccuracy of the data analytics, a time period of the data analytics, a geographic area associated with the data analytics, a mobility pattern used for the data analytics, an application associated with the data analytics, or a service associated with the data analytics.
  • the method may further include performing one or more actions that comprise at least one of stopping the analytics function, re-training a model, replacing the model, or using a new reinforcement rule in the model.
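  • A minimal sketch of the corrective actions named above (stopping the analytics function, re-training the model, replacing the model, or using a new reinforcement rule), assuming a simple rating-to-action policy that the text itself does not prescribe:

```python
from enum import Enum, auto

class Action(Enum):
    STOP_ANALYTICS = auto()
    RETRAIN_MODEL = auto()
    REPLACE_MODEL = auto()
    NEW_REINFORCEMENT_RULE = auto()

def choose_actions(accuracy_rating: str, abnormal_source: bool) -> list:
    """Map assessed feedback to corrective actions (illustrative policy)."""
    if abnormal_source:
        # A faulty or hijacked data source invalidates the analytics output.
        return [Action.STOP_ANALYTICS, Action.REPLACE_MODEL]
    if accuracy_rating == "low":
        return [Action.RETRAIN_MODEL, Action.NEW_REINFORCEMENT_RULE]
    if accuracy_rating == "medium":
        return [Action.RETRAIN_MODEL]
    return []  # high accuracy: no corrective action needed
```

The mapping of ratings to actions is a design choice left open by the embodiments; an operator could equally gate re-training on repeated low ratings within the same feedback scope.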
  • an apparatus may include at least one processor and at least one memory comprising computer program code.
  • the at least one memory and computer program code may be configured, with the at least one processor, to cause the apparatus at least to transmit, by an analytics function hosted on the apparatus, data analytics to a data consumer.
  • the at least one memory and computer program code may be configured, with the at least one processor, to cause the apparatus at least to receive feedback information related to a quality of the data analytics and one or more information elements associated with matching the feedback information with respect to an application to the data analytics.
  • the feedback information may comprise one or more quality indicators.
  • the one or more dimensions may include at least one of one or more future time periods, one or more time windows, one or more user equipment or network slices, one or more locations or geographic areas, or one or more applications or services.
  • the quality may be based on at least one of an accuracy or an inaccuracy of the data analytics, a range of the accuracy or the inaccuracy, or a range of values experienced at the data consumer.
  • the at least one memory and computer program code may be configured, with the at least one processor, to cause the apparatus at least to assess the feedback information based on one or more of a quantity of data used for the data analytics in a range of values or with respect to a geographic location indicated in the feedback information, a reinforcement learning rule used for generating the data analytics, an algorithm used for generating the data analytics, or whether a source of data used for generating the data analytics is an abnormal source of the data.
  • the quality may be indicated by one or more accuracy ratings included in the one or more quality indicators.
  • the one or more accuracy ratings may be based on at least one of a range of values used for the data analytics, a range of values experienced at the data consumer, a range of an accuracy or an inaccuracy of the data analytics, a time period of the data analytics, a geographic area associated with the data analytics, a mobility pattern used for the data analytics, an application associated with the data analytics, or a service associated with the data analytics.
  • the at least one memory and computer program code may be configured, with the at least one processor, to cause the apparatus at least to perform one or more actions that comprise at least one of stopping the analytics function, re-training a model, replacing the model, or using a new reinforcement rule in the model.
  • a fifth embodiment may be directed to an apparatus that may include circuitry configured to cause the apparatus to perform at least operations according to the first embodiment, the second embodiment, the third embodiment, or the fourth embodiment, or any of the variants discussed above.
  • a sixth embodiment may be directed to an apparatus that may include means for performing at least operations according to the first embodiment, the second embodiment, the third embodiment, or the fourth embodiment, or any of the variants discussed above. Examples of the means may include one or more processors, memory, and/or computer program codes for causing the performance of the operation.
  • a seventh embodiment may be directed to a computer readable medium comprising program instructions stored thereon for causing an apparatus to perform at least operations according to the first embodiment, the second embodiment, the third embodiment, or the fourth embodiment, or any of the variants discussed above.
  • An eighth embodiment may be directed to a computer program product encoding instructions for causing an apparatus to perform at least operations according to the first embodiment, the second embodiment, the third embodiment, or the fourth embodiment, or any of the variants discussed above.
  • Fig. 1 illustrates an example of a data analytics system comprising a network data analytics function (e.g. NWDAF), according to some embodiments;
  • Fig. 2 illustrates an example of a data analytics system comprising a management data analytics function (e.g. MDAF), according to some embodiments;
  • Fig. 3 illustrates an example flow diagram of a method, according to some embodiments;
  • Fig. 4 illustrates an example flow diagram of a method, according to some embodiments;
  • Fig. 5a illustrates an example block diagram of an apparatus, according to an embodiment; and
  • Fig. 5b illustrates an example block diagram of an apparatus, according to another embodiment.
  • Network data analytics may be utilized in several network architectures.
  • the network architectures include the Third Generation Partnership Project (3GPP) core network, with the advent of the NWDAF, and the management plane, with the management data analytics (MDA) in an end-to-end management domain as well as in subordinated management domains for specific parts of the network.
  • an operator may have to define the scope of data, e.g., the numeric range of data, that the analytics function or service (e.g., NWDAF, MDAF, or similar functions) may consume for training purposes for machine learning (ML) models.
  • An analytics function may be able to use just a subset of the data collected by a telecommunication system.
  • One limiting factor is the amount of data that the various network functions (NFs) in the telecommunication system are collecting.
  • Another factor includes the availability of data via standardized interfaces (e.g., standardized format and standardized semantics) because a network designer may have to integrate NFs from multiple vendors to report their data to the analytics functions. Therefore, the operator may have to decide, at design time, on a subset of data that appears reasonable to support the desired use case of the analytics function.
  • NFs may collect more data about their operations than reported to the analytics function. Therefore, during run-time, the NF may have a broader knowledge about their environment and their own operations due to data that is kept private. As a result, there may be a need for improved accuracy of statistics or predictions provided by the analytics function.
  • the results of an analytics function may be based on just subsets of data.
  • the individual NF might detect, by using its own private data, that the results of the analytics function are not as expected for the specific environment and operation point of the specific network function (e.g., a specific private metric may exceed a threshold which does not conform to the analytics result).
  • Such errors may affect predictions, including: a future time (e.g., predictions that refer to timeframes more than an amount of time in the future may be erroneous); a time window (e.g., a prediction for a particular day may be erroneous); an area (e.g., a prediction relating to a geographic location may not be usable); a target (e.g., a user equipment (UE), group of UEs, or a network slice); a data range (e.g., a prediction above or below a certain data value or data range may not be accurate or may have a different accuracy degree); and a service (e.g., a prediction for certain services may be different if a certain service relies on a private feature and consequently a private metric).
  • the consumers of the analytics function may have no way to inform the analytics function that the results are not as expected.
  • the analytics function may not have a chance to take such feedback into account, e.g., to trigger model re-training.
  • the analytics function e.g., NWDAF or MDAF, may just use newly collected data for improving its accuracy.
  • This process may be too slow since (i) the cycles of re-training may take a significant amount of time; (ii) there may be no indication of the range of data and/or region where inaccuracy was experienced; (iii) there may be no indication as to which application the data deviated from the expectation; and (iv) there may be no means to check if the source of the data experiences an issue, such as a fault or a hijacking.
  • the process of using newly collected data to improve the accuracy of the analytics function may not work in cases where the analytics function is working with models that have been trained by other entities (e.g., by different instances of an NWDAF or management data analytics service (MDAS)) and where models may be exchanged between analytics functions.
  • the analytics function may just receive data required for inference and may just include business logic for making an inference.
  • In addition, the analytics function may neither be able to receive the data needed to assess the quality of the results nor be able to perform retraining. As can be understood from this, there may be a need for timely feedback to an analytics function.
  • the operator or designer of the analytics function may have a limited chance to assess the situation, because the network functions or management services may not offer a standardized format to relate performance data to a specific scope like geographic area, a future time, or a time window of the problems.
  • indicating with a feedback from the consumer that the statistics or predictions were not as expected can improve the output of the corresponding analytics function.
  • the indication that the statistics or predictions were not as expected may be given, for example, as a direct feedback from the consumer.
  • the direct feedback may be immediate or without delay, wherein the delay may be defined based on the application or use scenario.
  • a 3GPP NWDAF may offer the means of enhancing the operation of NFs, AFs, and OAM functions using analytics.
  • the NFs, AFs, or OAM functions may not have means to rate the accuracy of the NWDAF analytics for improving the provided statistics or predictions.
  • Some aspects of NR may include sharing trained data models among multiple NWDAF instances.
  • NR may include use of an artificial intelligence (AI) function that trains and provides AI models to NFs, and federated machine learning (ML) training that handles model training based on data sets that are distributed in different NFs.
  • Each client NWDAF may train, locally, the related ML model with its own data and then may share it with a server NWDAF, which aggregates the local models to create a global or optimal ML model.
  • a provider NWDAF instance may supply the trained data model to a consumer NWDAF instance via a data model provision service and may register this capability (e.g., to expose a trained data model, in the network repository function (NRF)).
  • aspects of NR may not provide feedback on an analytics model in order to improve its performance.
  • the MDAS producer may provide root cause analysis for various performance and fault management issues and other network optimization processes.
  • the MDAS producer may classify the input data to separate it into (i) ML model training data and (ii) data for analytics services. While following the ML model training process, it may provide the ML model training report as one kind of output to the corresponding MDAS consumer.
  • the MDAS producer may perform such data classification considering various input data sources. Although the MDAS consumer may be involved in the ML model training process, there may be no way for the MDAS consumer to provide validation or feedback.
  • Some embodiments described herein may provide for analytics feedback. For example, certain embodiments may improve the estimates of an analytics function by enabling the use of data that is available in the network services or applications using data analytics (e.g., for labelling, in order to improve the internal algorithms and models of the analytics function). Consumers of the analytics services may provide feedback about the quality of the estimates determined by the analytics function. To facilitate this, analytics consumers (e.g., network functions (NFs), application functions (AFs), or management services (MnSs)) may provide feedback and may rate the accuracy of the statistics and/or predictions from the analytics function. Such feedback may include information elements that allow the analytic service to correlate or match the feedback to specific data analytics (e.g., regarding certain applications, geographical areas, and times).
  • the feedback may include a quality indicator (e.g., an action quality indicator). Additionally, or alternatively, the feedback may include information to indicate a scope of the feedback.
  • the scope may include a future time (e.g., a prediction that refers to a timeframe more than an amount of time in the future may be incorrect). Additionally, or alternatively, the scope may include a time window (e.g., a prediction for a particular day may be incorrect). Additionally, or alternatively, the scope may include an area (e.g., a prediction relating to a location or geographic area may not be usable).
  • the analytics function may not be capable of knowing private metrics of the consumer.
  • the interface and payload may be standardized to facilitate multi-vendor deployments.
  • certain embodiments may utilize one or more interfaces that facilitate the NFs, AFs, and MnSs to provide feedback to the analytics function.
  • the feedback (e.g., a message or signalling) may indicate a geographic area of usage or an application of the data analytics (e.g., a session management function (SMF) selection of a user plane function (UPF)). Additionally, or alternatively, the feedback may indicate a time window to which the feedback relates. Additionally, or alternatively, the feedback may include a range of values experienced (e.g., actual values rather than predicted values).
  • after receiving the feedback and a corresponding accuracy rating, the analytics function may assess the feedback.
  • the assessment may be based on a quantity of data values in a range indicated in the feedback (e.g., too few data values may be an indication that more training data is needed, considering the scope of the feedback). Additionally, or alternatively, the assessment may include use of a reinforcement learning rule (e.g., which can serve as an additional source for a reward function in an ML model). Additionally, or alternatively, the assessment may include usage of various algorithms for certain applications (e.g., different algorithms may be better suited for different usages of data analytics). Additionally, or alternatively, the assessment may include an evaluation of whether a source of data used for the data analytics is an abnormal source of data (e.g., if the data received contains metadata that indicates the source).
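  • Two of the assessment criteria above (whether the value range reported in the feedback was under-represented in the training data, and whether the data source looks abnormal) can be sketched as follows; the threshold and the metadata layout are assumptions for illustration:

```python
def assess_feedback(training_samples_in_range: int,
                    source_metadata: dict,
                    known_sources: set,
                    min_samples: int = 100) -> dict:
    """Assess consumer feedback along two of the described criteria:
    (i) whether the value range in the feedback was under-represented
    in training data, and (ii) whether the data source looks abnormal."""
    findings = {}
    # Too few training samples in the reported range suggests that more
    # training data is needed for that scope of the feedback.
    findings["needs_more_training_data"] = training_samples_in_range < min_samples
    # Metadata identifying an unknown source may indicate a fault or hijacking.
    findings["abnormal_source"] = source_metadata.get("source_id") not in known_sources
    return findings

# Example: feedback about a sparsely trained value range from an unknown NF.
result = assess_feedback(training_samples_in_range=12,
                         source_metadata={"source_id": "nf-99"},
                         known_sources={"nf-1", "nf-2"})
```

In a fuller assessment, the outcome would also feed a reward function for reinforcement learning and guide algorithm selection per application, as the bullet above describes.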
  • certain embodiments may provide for data analytics and feedback provided by a data analytics consumer, thereby reducing or eliminating the need to rely on a training report exchange. This facilitates improvement in the performance of an analytics function. For example, certain embodiments may improve the accuracy of statistics or predictions provided by the analytics function by providing a way to rate the accuracy of analytics data with respect to the particular use or application and feeding this information back to the analytics function. In addition, certain embodiments may provide a way to validate the correctness of the data sources, thereby further improving operations of an analytics function.
  • Fig. 1 illustrates an example 100 of a data analytics system comprising an NWDAF, according to some embodiments.
  • the example 100 may include one or more data analytics sources 102 (e.g., one or more NFs, one or more AFs, a unified data repository (UDR), and an operations, administration, and management (OAM) function).
  • the example 100 may include one or more NWDAFs 104 comprising a corresponding analytics instance deployment (analytics ID) 106.
  • analytics ID 106 may include an instance of an ML or AI model used to generate data analytics (e.g., statistics or predictions) based on the data from the data analytics sources 102.
  • the example 100 may include one or more data analytics outputs 108 (e.g., one or more NFs, one or more AFs, and an OAM function).
  • the data analytics outputs 108 may include destinations for data analytics.
  • the data analytics sources 102 may provide data to the NWDAFs 104.
  • the NWDAFs 104 may generate data analytics from the provided data using the analytics ID 106. After generating the data analytics, the NWDAFs 104 may provide the data analytics to the data analytics outputs 108.
  • the analytics consumer of the data analytics outputs 108 may detect that the data analytics deviate from expected values, and may transmit feedback information related to a quality of the data analytics to the NWDAFs 104.
  • the feedback information may include one or more data elements that the NWDAFs 104 can use to match the feedback information to the provided data analytics.
  • the NWDAFs 104 may assess the feedback information (e.g., to determine a cause of the deviation from the expected values, or to try to improve the data analytics). The NWDAFs 104 may then perform, based on assessing the feedback information, one or more actions related to the model that was used to generate the data analytics. These and other aspects of certain embodiments are described elsewhere herein.
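  • The consumer-side detection step in this flow (checking whether received analytics are within a range of expected values, and by how much they exceed or fall short of them) can be sketched as follows; the tolerance parameter is an assumed configuration value:

```python
def detect_deviation(predicted: float, expected: float, tolerance: float) -> dict:
    """Report whether a data-analytics value deviates from the expected value,
    the direction of the deviation, and its amount (illustrative check)."""
    deviation = predicted - expected
    within_range = abs(deviation) <= tolerance
    return {
        "within_range": within_range,
        "exceeds": deviation > tolerance,       # analytics exceed the expectation
        "falls_short": deviation < -tolerance,  # analytics fail to reach it
        "amount": abs(deviation),
    }

# Example: a private metric at the NF shows 0.95 load, but analytics
# predicted 0.60, so the consumer would trigger feedback to the NWDAF.
report = detect_deviation(predicted=0.60, expected=0.95, tolerance=0.10)
```

A report like this, combined with the scoping information elements (area, time window, target), is what the consumer would transmit back as feedback information.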
  • There may be various NWDAF 104 deployment scenarios. For example, just one NWDAF 104 may be deployed in the network, or multiple different instances of the NWDAF 104, supporting certain analytics capabilities and residing at one or more distinct locations, may be deployed. Certain embodiments may provide for configurations of an interface and payload, which may be standardized to enable multi-vendor deployments. Certain embodiments may be deployed in other scenarios; for example, the NWDAFs 104 may be deployed in a UE, in a base station, or in a non-real-time/near-real-time radio intelligent controller (non-RT/near-RT RIC).
  • Fig. 1 is provided as an example. Other examples are possible, according to some embodiments.
  • Fig. 2 illustrates an example 200 of a data analytics system comprising a MDAF, according to some embodiments.
  • the example 200 may include a management data analytics service (MDAS) consumer 202.
  • the example 200 may include a MDAF 204, which may include a MDAS producer 206, an analytics model 208, a MDAS consumer 210, a management services (MnS) consumer 212, and a NWDAF subscriber 214.
  • the example 200 may include another MDAS producer 216, a MnS producer 218, and a NWDAF 220.
  • the elements illustrated in Fig. 2 may perform operations similar to that described elsewhere herein.
  • the MDAS producer 216, the MnS producer 218, and the NWDAF 220 may be data analytics sources that provide data to the MDAF 204.
  • the MDAF 204 may generate data analytics using the analytics model 208, and may provide the data analytics to the MDAS consumer 202. After detecting that the data analytics deviate from expected values, the MDAS consumer 202 may provide feedback information to the MDAF 204, and the MDAF 204 may assess the feedback information.
  • the MDAF 204 may provide for analytics and automation in the management plane for MnSs, self-organizing network (SON) functions, optimization tools, human operators, and/or a core network. Using a service-based management architecture (SBMA), the MDAS consumer 202 may collect MnS performance measurements, network configurations, alarms and/or trace data, as well as quality of experience (QoE) and minimization of drive test (MDT) reports.
  • Analytics consumers may provide an accuracy rating of the received analytics towards the corresponding analytics function, e.g., the MDAF 204, after using it.
  • the MDAS producer 206 may receive rapid feedback regarding data analytics.
  • Such accuracy rating may include a high, medium, or low accuracy rating.
  • the accuracy rating may include other levels related to a range of values of use (e.g., a low accuracy for a network slice management function (NSMF) consumer or gNB). This may indicate to the MDAF 204 that the analytics model is behaving abnormally.
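A high/medium/low rating together with an optional range of values of use could be represented as in the following sketch; the class and field names are illustrative assumptions, not standardized information elements.

```python
from dataclasses import dataclass
from enum import Enum

class AccuracyRating(Enum):
    # The three coarse levels mentioned above.
    HIGH = "high"
    MEDIUM = "medium"
    LOW = "low"

@dataclass
class AccuracyFeedback:
    rating: AccuracyRating
    value_range: tuple   # range of values the rating applies to (assumed field)
    consumer: str        # e.g., an NSMF consumer or a gNB (assumed field)

# A consumer reporting low accuracy for analytics values in [0, 50]:
fb = AccuracyFeedback(AccuracyRating.LOW, (0.0, 50.0), "NSMF")
```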
  • the MDAS producer 206 may receive, from a SON function hosted on a network node (e.g., a gNB), feedback that the predicted radio conditions or expected radio utilization was exceeding indicated values.
  • the timeframe of a prediction may indicate an accuracy for a sub-timeframe duration (e.g., a prediction covering more than 20 minutes of a 30-minute time frame may show a certain percentage drop in accuracy, or predictions on a particular day of the week between 10:00am and 12:00pm may show a certain percentage drop in accuracy).
  • a geographic area may indicate an accuracy (e.g., a load prediction in a train station may be of low accuracy). The geographic area may be identified using coordinates, a cell, a tracking area, and/or the like. Mobility pattern may affect prediction accuracy (e.g., a prediction of radio conditions may have different accuracy levels with respect to different user mobility patterns). An application or service that is used may affect an accuracy of a prediction.
  • an application or a service may rely on private metrics of a data analytics consumer (e.g., MDAS consumer 202), and these metrics may not be visible to the analytics function.
  • a prediction of radio availability may be different for a vehicle service compared to a service for drones, due to the difference in elevation and reception, or due to different selection of a UPF by an SMF.
  • certain embodiments may address the different deployment needs of different applications or services, thereby improving data analytics.
  • An inaccuracy can also be reported in various ways including, e.g., the range of values experienced instead of the ones expected, the average deviation from the expected value, the deviation function from the expected value, and/or the like.
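The reporting forms listed above (the range of values experienced instead of those expected, the average deviation, and a deviation function) might be computed as in this sketch; the function and key names are illustrative.

```python
def inaccuracy_report(expected, experienced):
    # Summarize an inaccuracy in the forms mentioned above: the range of
    # values experienced versus expected, the average deviation from the
    # expected values, and a per-sample deviation function.
    deviations = [obs - exp for exp, obs in zip(expected, experienced)]
    return {
        "expected_range": (min(expected), max(expected)),
        "experienced_range": (min(experienced), max(experienced)),
        "average_deviation": sum(deviations) / len(deviations),
        "deviation_function": deviations,  # deviation per sample
    }

report = inaccuracy_report(expected=[10, 20, 30], experienced=[12, 26, 40])
```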
  • Such consumer feedback may trigger an analytics function, to perform one or more actions.
  • the analytics function may perform model re-training. For example, in the case of an ML algorithm, a range of data indicated as inaccurate or an indication that a data set does not include a sufficient amount of data may be an indication that more training data is needed. This can be combined with other attributes, e.g., the time frame or with respect to a certain geographic area.
  • the analytics function may perform model replacement, if there is information regarding the capabilities of an analytics model (e.g., if the analytics model cannot capture certain value ranges or behaviour with a desired accuracy).
  • the analytics function may use a new reinforcement rule to capture a new behaviour or a certain range of values or for capturing the needs of a new application or service.
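The three actions just described (model re-training, model replacement, or a new reinforcement rule) could be selected from the feedback content roughly as follows; the feedback keys checked here are assumptions, not standardized fields.

```python
def choose_action(feedback):
    # Map consumer feedback to one of the actions described above. The
    # keys checked here are illustrative, not standardized fields.
    if feedback.get("insufficient_training_data"):
        return "retrain"                 # more training data is needed
    if feedback.get("model_cannot_capture_range"):
        return "replace_model"           # model limitations on value ranges
    if feedback.get("new_service"):
        return "add_reinforcement_rule"  # capture a new service's needs
    return "no_action"

action = choose_action({"insufficient_training_data": True})
```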
  • certain embodiments may improve the accuracy of an MDAS producer report by enabling an MDAS consumer to provide feedback directly, e.g., after using a data analytics report, including data related to the quality of the data analytics and, additionally, a rating of the accuracy of the provided statistics or predictions.
  • an MDAS consumer may detect, by using its own private data, that the data analytics results are not as expected for the environment and operation of the MDAS consumer (or an application thereof), e.g., a private metric might exceed a threshold, contrary to the analytics result.
  • Such errors may affect predictions related to a future time where, e.g., predictions that refer to timeframes more than an amount of days in the future are wrong. Additionally, or alternatively, the errors may affect predictions related to a certain time window where, e.g., predictions between times on a day of the week were inaccurate. Additionally, or alternatively, the errors may affect predictions related to a certain geographical area where, e.g., predictions relating to a location are not usable. Additionally, or alternatively, the errors may affect predictions related to a target UE or a group of UEs, or a network slice.
  • the errors may affect predictions related to a certain data range where, e.g., statistics or predictions above or below a certain data value or data range may not be accurate or may have a different accuracy degree. Additionally, or alternatively, the errors may affect predictions related to a MnS where, e.g., statistics or prediction may be different if the MnS relies on a private feature and/or a private metric.
  • the MDAS consumers may not have means to inform the MDAS producer of the above-described issues, i.e., that the analytics results are not as expected. Thus, the MDAS producer may not have the opportunity to update data analytics based on such feedback, e.g., to trigger re-training of the internal algorithms.
  • Certain embodiments described herein may involve a MDAS consumer that can provide feedback to the MDAS producer related to the received analytics report, which may include data and a rating of the accuracy of statistics or predictions in the analytics report.
  • the feedback from the MDAS consumer may allow the MDAS producer to correlate the feedback to specific analytics results regarding a certain MnSs, geographical areas, UE groups, times (or time windows), and/or the like.
  • the MDAS consumer may provide an accuracy rating of the received analytics report after using it.
  • Such accuracy rating may indicate an overall accuracy for data analytics, e.g., a high, medium or low accuracy or inaccuracy, or may include ratings for certain sub-categories of accuracy, such as: an accuracy of the analytic service considering a range of expected or not expected values, a range of inaccuracy (e.g., higher or lower than expected and/or a deviation from the expected value), a geographical area of usage and/or MnS usage, a time frame of statistics or a prediction, and/or a range of actual values experienced at the MDAS consumer. In this way, the MDAS producer may receive rapid feedback.
  • Such MDAS consumer feedback may trigger the MDAS producer to perform, e.g., model re-training, model replacement, or the introduction of a new rule into a model.
  • the feedback may include information that identifies a geographic area, a time, an MnS usage, and/or the like associated with the feedback so that the MDAS producer can match the feedback to data analytics provided to the data consumer and/or a model used to generate the data analytics.
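Correlating feedback with previously delivered analytics via shared dimensions (e.g., geographic area and time window) might look like the following sketch; the keys and values are illustrative.

```python
def match_feedback(feedback, reports):
    # Match feedback to earlier analytics reports that share the same
    # dimensions (here: geographic area and time window). Keys are assumed.
    return [
        r for r in reports
        if r["area"] == feedback["area"]
        and r["time_window"] == feedback["time_window"]
    ]

reports = [
    {"id": 1, "area": "cell-17", "time_window": "10:00-12:00"},
    {"id": 2, "area": "cell-42", "time_window": "10:00-12:00"},
]
feedback = {"area": "cell-17", "time_window": "10:00-12:00", "rating": "low"}
matched = match_feedback(feedback, reports)  # only report 1 shares both keys
```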
  • the MDAS producer may not have an indication with respect to which MDAS consumer the data analytics results did not meet expectations
  • the MDAS producer may not have means to check if a source of the data experiences a fault or has been hijacked.
  • Fig. 2 is provided as an example. Other examples are possible, according to some embodiments.
  • Fig. 3 illustrates an example flow diagram of a method 300, according to some embodiments.
  • Fig. 3 may illustrate example operations of a network node (e.g., apparatus 10 illustrated in, and described with respect to, Fig. 5a) or a UE (e.g., apparatus 20 illustrated in, and described with respect to, Fig. 5b).
  • Fig. 3 may illustrate example operations of an analytics function hosted on the network node or the UE.
  • Some of the operations illustrated in Fig. 3 may be similar to some operations shown in, and described with respect to, Figs. 1 and 2.
  • the method may include, at 302, transmitting data analytics to a data consumer.
  • the method may include, at 304, receiving feedback information related to a quality of the data analytics and one or more information elements associated with matching the feedback information with respect to an application to the data analytics.
  • the feedback information may comprise one or more quality indicators.
  • the method illustrated in Fig. 3 may include one or more additional aspects described below or elsewhere herein.
  • the one or more information elements may indicate one or more dimensions of the data analytics (e.g., one or more characteristics of the data analytics).
  • the one or more dimensions may include at least one of one or more future time periods, one or more time windows, one or more user equipment or network slices, one or more locations or geographic areas, or one or more applications or services.
  • the quality may be based on at least one of an accuracy or an inaccuracy of the data analytics, a range of the accuracy or the inaccuracy, or a range of values experienced at the data consumer.
  • the method may further include assessing the feedback information based on one or more of a quantity of data used for the data analytics in a range of values or with respect to a geographic location indicated in the feedback information, a reinforcement learning rule used for generating the data analytics, an algorithm used for generating the data analytics, or whether a source of data used for generating the data analytics is an abnormal source of the data.
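One of the assessment factors just listed, the quantity of data used for the data analytics in the value range flagged by the feedback, can be sketched as follows; the function name, field names, and threshold are assumptions.

```python
def assess_data_quantity(feedback, training_data, min_samples=100):
    # Count how many training samples fall in the value range that the
    # feedback flagged as inaccurate; too few suggests re-training with
    # more data for that range.
    lo, hi = feedback["inaccurate_range"]
    in_range = [x for x in training_data if lo <= x <= hi]
    return "needs_more_data" if len(in_range) < min_samples else "data_sufficient"

verdict = assess_data_quantity(
    {"inaccurate_range": (80, 100)},
    training_data=list(range(60)),  # no samples fall in the flagged range
    min_samples=10,
)
```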
  • the quality may be indicated by one or more accuracy ratings included in the one or more quality indicators.
  • the one or more accuracy ratings may be based on at least one of a range of values used for the data analytics, a range of values experienced at the data consumer, a range of an accuracy or an inaccuracy of the data analytics, a time period of the data analytics, a geographic area associated with the data analytics, a mobility pattern used for the data analytics, an application associated with the data analytics, or a service associated with the data analytics.
  • the method may further include performing one or more actions that include at least one of stopping the analytics function, re-training the model, replacing a model, or using a new reinforcement rule in the model.
  • the feedback may explicitly trigger performance of the one or more actions.
  • the decision of whether to perform the one or more actions may be made by the analytics function or the data consumer (or analytics service consumer).
  • the data consumer may not know whether the analytics function uses an AI and/or ML model, and the decision of whether to perform one or more actions may be made by the analytics function.
  • the data consumer may decide on the action that it expects from the analytics function, and may send the feedback information to the analytics function for performance of that action.
  • Fig. 3 is provided as an example. Other examples are possible according to some embodiments.
  • Fig. 4 illustrates an example flow diagram of a method 400, according to some embodiments.
  • Fig. 4 may illustrate example operations of a network node (e.g., apparatus 10 illustrated in, and described with respect to, Fig. 5a) or a UE (e.g., apparatus 20 illustrated in, and described with respect to, Fig. 5b).
  • Fig. 4 may illustrate example operations of a data consumer hosted on the network node or the UE.
  • Some of the operations illustrated in Fig. 4 may be similar to some operations shown in, and described with respect to, Figs. 1 and 2.
  • the method may include, at 402, receiving data analytics from an analytics function.
  • the method may include, at 404, transmitting feedback information related to a quality of the data analytics and one or more information elements associated with matching the feedback information with respect to an application to the data analytics.
  • the feedback information may include one or more quality indicators.
  • the method illustrated in Fig. 4 may include one or more additional aspects described below or elsewhere herein.
  • the one or more information elements may indicate one or more dimensions of the data analytics.
  • the one or more dimensions may include at least one of one or more future time periods, one or more time windows, one or more user equipment or network slices, one or more locations or geographic areas, or one or more applications or services.
  • the quality may be based on at least one of an accuracy or an inaccuracy of the data analytics, a range of the accuracy or the inaccuracy, or a range of values experienced at the data consumer.
  • the method may further include detecting that the data analytics deviate from expected values based on at least one of whether the data analytics are within a range of the expected values, whether the data analytics exceed, or fail to exceed, the expected values, and an amount that the data analytics exceed or fail to exceed, whether the data analytics exceed, or fail to exceed, the expected values in a certain geographic location and an amount that the data analytics exceed or fail to exceed, or whether the data analytics exceed, or fail to exceed, the expected values at a certain time window and an amount that the data analytics exceed or fail to exceed.
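The detection criteria listed above (whether the analytics are within the expected range, whether they exceed or fail to reach it, and by what amount) can be sketched as follows; the names are illustrative.

```python
def detect_deviation(value, expected_low, expected_high):
    # Determine whether an analytics value is within the expected range
    # and, if not, in which direction and by what amount it deviates.
    if expected_low <= value <= expected_high:
        return {"deviates": False, "amount": 0.0}
    if value > expected_high:
        return {"deviates": True, "direction": "exceeds",
                "amount": value - expected_high}
    return {"deviates": True, "direction": "fails_to_reach",
            "amount": expected_low - value}

result = detect_deviation(120.0, expected_low=40.0, expected_high=100.0)
```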
  • the quality may be indicated by one or more accuracy ratings included in the one or more quality indicators.
  • the one or more accuracy ratings are based on at least one of a range of values used for the data analytics, a range of values experienced at the data consumer, a range of an accuracy or an inaccuracy of the data analytics, a time period of the data analytics, a geographic area associated with the data analytics, a mobility pattern used for the data analytics, an application associated with the data analytics, or a service associated with the data analytics.
  • Fig. 4 is provided as an example. Other examples are possible according to some embodiments.
  • certain operations of the method 300 or the method 400 may be performed by a UE or a network node.
  • the UE may depend on an analytics function that predicts the surroundings of the UE or its environment (e.g., the number of UEs, the movement of objects, or other devices, in proximity to the UE, interference from other UEs or cells, beam patterns, etc.).
  • the UE may detect that these predictions are not as expected and may report this to the network node.
  • a UE may operate in a manner similar to a network function, such as a data consumer, described herein.
  • In an open radio access network (O-RAN) deployment, for example, a UE may send feedback to a RAN intelligent controller (RIC), e.g., via a base transceiver station (BTS).
  • apparatus 10 may be a node, host, or server in a communications network or serving such a network.
  • apparatus 10 may be a network node, satellite, base station, a Node B, an evolved Node B (eNB), 5G Node B or access point, next generation Node B (NG-NB or gNB), and/or a WLAN access point, associated with a radio access network, such as an LTE network, 5G or NR.
  • apparatus 10 may be an eNB in LTE or gNB in 5G.
  • apparatus 10 may host a NF, AF, UDR, OAM function, NWDAF, analytics model, and/or the like described elsewhere herein.
  • apparatus 10 may be comprised of an edge cloud server as a distributed computing system where the server and the radio node may be stand-alone apparatuses communicating with each other via a radio path or via a wired connection, or they may be located in a same entity communicating via a wired connection.
  • apparatus 10 represents a gNB
  • it may be configured in a central unit (CU) and distributed unit (DU) architecture that divides the gNB functionality.
  • the CU may be a logical node that includes gNB functions such as transfer of user data, mobility control, radio access network sharing, positioning, and/or session management, etc.
  • the CU may control the operation of DU(s) over a front-haul interface.
  • the DU may be a logical node that includes a subset of the gNB functions, depending on the functional split option. It should be noted that one of ordinary skill in the art would understand that apparatus 10 may include components or features not shown in Fig. 5a.
  • apparatus 10 may include a processor 12 for processing information and executing instructions or operations.
  • processor 12 may be any type of general or specific purpose processor.
  • processor 12 may include one or more of general-purpose computers, special purpose computers, microprocessors, digital signal processors (DSPs), field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), and processors based on a multi-core processor architecture, as examples. While a single processor 12 is shown in Fig. 5a, multiple processors may be utilized according to other embodiments.
  • apparatus 10 may include two or more processors that may form a multiprocessor system (e.g., in this case processor 12 may represent a multiprocessor) that may support multiprocessing.
  • the multiprocessor system may be tightly coupled or loosely coupled (e.g., to form a computer cluster).
  • Processor 12 may perform functions associated with the operation of apparatus 10, which may include, for example, precoding of antenna gain/phase parameters, encoding and decoding of individual bits forming a communication message, formatting of information, and overall control of the apparatus 10, including processes related to management of communication or communication resources.
  • Apparatus 10 may further include or be coupled to a memory 14 (internal or external), which may be coupled to processor 12, for storing information and instructions that may be executed by processor 12.
  • Memory 14 may be one or more memories and of any type suitable to the local application environment, and may be implemented using any suitable volatile or nonvolatile data storage technology such as a semiconductor-based memory device, a magnetic memory device and system, an optical memory device and system, fixed memory, and/or removable memory.
  • memory 14 can be comprised of any combination of random access memory (RAM), read only memory (ROM), static storage such as a magnetic or optical disk, hard disk drive (HDD), or any other type of non-transitory machine or computer readable media.
  • the instructions stored in memory 14 may include program instructions or computer program code that, when executed by processor 12, enable the apparatus 10 to perform tasks as described herein.
  • apparatus 10 may further include or be coupled to (internal or external) a drive or port that is configured to accept and read an external computer readable storage medium, such as an optical disc, USB drive, flash drive, or any other storage medium.
  • the external computer readable storage medium may store a computer program or software for execution by processor 12 and/or apparatus 10.
  • apparatus 10 may also include or be coupled to one or more antennas 15 for transmitting and receiving signals and/or data to and from apparatus 10.
  • Apparatus 10 may further include or be coupled to a transceiver 18 configured to transmit and receive information.
  • the transceiver 18 may include, for example, a plurality of radio interfaces that may be coupled to the antenna(s) 15.
  • the radio interfaces may correspond to a plurality of radio access technologies including one or more of GSM, NB-IoT, LTE, 5G, WLAN, Bluetooth, BT-LE, NFC, radio frequency identifier (RFID), ultrawideband (UWB), MulteFire, and the like.
  • the radio interface may include components, such as filters, converters (for example, digital-to-analog converters and the like), mappers, a Fast Fourier Transform (FFT) module, and the like, to generate symbols for a transmission via one or more downlinks and to receive symbols (for example, via an uplink).
  • transceiver 18 may be configured to modulate information on to a carrier waveform for transmission by the antenna(s) 15 and demodulate information received via the antenna(s) 15 for further processing by other elements of apparatus 10.
  • transceiver 18 may be capable of transmitting and receiving signals or data directly.
  • apparatus 10 may include an input and/or output device (I/O device).
  • memory 14 may store software modules that provide functionality when executed by processor 12.
  • the modules may include, for example, an operating system that provides operating system functionality for apparatus 10.
  • the memory may also store one or more functional modules, such as an application or program, to provide additional functionality for apparatus 10.
  • the components of apparatus 10 may be implemented in hardware, or as any suitable combination of hardware and software.
  • processor 12 and memory 14 may be included in or may form a part of processing circuitry or control circuitry.
  • transceiver 18 may be included in or may form a part of transceiver circuitry.
  • circuitry may refer to hardware-only circuitry implementations (e.g., analog and/or digital circuitry), combinations of hardware circuits and software, combinations of analog and/or digital hardware circuits with software/firmware, any portions of hardware processor(s) with software (including digital signal processors) that work together to cause an apparatus (e.g., apparatus 10) to perform various functions, and/or hardware circuit(s) and/or processor(s), or portions thereof, that use software for operation but where the software may not be present when it is not needed for operation.
  • circuitry may also cover an implementation of merely a hardware circuit or processor (or multiple processors), or portion of a hardware circuit or processor, and its accompanying software and/or firmware.
  • the term circuitry may also cover, for example, a baseband integrated circuit in a server, cellular network node or device, or other computing or network device.
  • apparatus 10 may be a network node or RAN node, such as a base station, access point, Node B, eNB, gNB, WLAN access point, or the like.
  • apparatus 10 may be controlled by memory 14 and processor 12 to perform the functions associated with any of the embodiments described herein, such as some operations illustrated in, or described with respect to, Figs. 1-4.
  • apparatus 10 may be controlled by memory 14 and processor 12 to perform the methods of Figs. 3 or 4.
  • Fig. 5b illustrates an example of an apparatus 20 according to another embodiment.
  • apparatus 20 may be a node or element in a communications network or associated with such a network, such as a UE, mobile equipment (ME), mobile station, mobile device, stationary device, IoT device, or other device.
  • a UE may alternatively be referred to as, for example, a mobile station, mobile equipment, mobile unit, mobile device, user device, subscriber station, wireless terminal, tablet, smart phone, IoT device, sensor or NB-IoT device, a watch or other wearable, a head-mounted display (HMD), a vehicle, a drone, a medical device and applications thereof (e.g., remote surgery), an industrial device and applications thereof (e.g., a robot and/or other wireless devices operating in an industrial and/or an automated processing chain context), a consumer electronics device, a device operating on commercial and/or industrial wireless networks, or the like.
  • apparatus 20 may be implemented in, for instance, a wireless handheld device, a wireless plugin accessory, or the like.
  • apparatus 20 may include one or more processors, one or more computer-readable storage medium (for example, memory, storage, or the like), one or more radio access components (for example, a modem, a transceiver, or the like), and/or a user interface.
  • apparatus 20 may be configured to operate using one or more radio access technologies, such as GSM, LTE, LTE-A, NR, 5G, WLAN, WiFi, NB-IoT, Bluetooth, NFC, MulteFire, and/or any other radio access technologies. It should be noted that one of ordinary skill in the art would understand that apparatus 20 may include components or features not shown in Fig. 5b.
  • apparatus 20 may include or be coupled to a processor 22 for processing information and executing instructions or operations.
  • processor 22 may be any type of general or specific purpose processor.
  • processor 22 may include one or more of general-purpose computers, special purpose computers, microprocessors, digital signal processors (DSPs), field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), and processors based on a multi-core processor architecture, as examples. While a single processor 22 is shown in Fig. 5b, multiple processors may be utilized according to other embodiments.
  • apparatus 20 may include two or more processors that may form a multiprocessor system (e.g., in this case processor 22 may represent a multiprocessor) that may support multiprocessing.
  • the multiprocessor system may be tightly coupled or loosely coupled (e.g., to form a computer cluster).
  • Processor 22 may perform functions associated with the operation of apparatus 20 including, as some examples, precoding of antenna gain/phase parameters, encoding and decoding of individual bits forming a communication message, formatting of information, and overall control of the apparatus 20, including processes related to management of communication resources.
  • Apparatus 20 may further include or be coupled to a memory 24 (internal or external), which may be coupled to processor 22, for storing information and instructions that may be executed by processor 22.
  • Memory 24 may be one or more memories and of any type suitable to the local application environment, and may be implemented using any suitable volatile or nonvolatile data storage technology such as a semiconductor-based memory device, a magnetic memory device and system, an optical memory device and system, fixed memory, and/or removable memory.
  • memory 24 can be comprised of any combination of random access memory (RAM), read only memory (ROM), static storage such as a magnetic or optical disk, hard disk drive (HDD), or any other type of non-transitory machine or computer readable media.
  • the instructions stored in memory 24 may include program instructions or computer program code that, when executed by processor 22, enable the apparatus 20 to perform tasks as described herein.
  • apparatus 20 may further include or be coupled to (internal or external) a drive or port that is configured to accept and read an external computer readable storage medium, such as an optical disc, USB drive, flash drive, or any other storage medium.
  • the external computer readable storage medium may store a computer program or software for execution by processor 22 and/or apparatus 20.
  • apparatus 20 may also include or be coupled to one or more antennas 25 for receiving a downlink signal and for transmitting via an uplink from apparatus 20.
  • Apparatus 20 may further include a transceiver 28 configured to transmit and receive information.
  • the transceiver 28 may also include a radio interface (e.g., a modem) coupled to the antenna 25.
  • the radio interface may correspond to a plurality of radio access technologies including one or more of GSM, LTE, LTE-A, 5G, NR, WLAN, NB-IoT, Bluetooth, BT-LE, NFC, RFID, UWB, and the like.
  • the radio interface may include other components, such as filters, converters (for example, digital-to-analog converters and the like), symbol demappers, signal shaping components, an Inverse Fast Fourier Transform (IFFT) module, and the like, to process symbols, such as OFDMA symbols, carried by a downlink or an uplink.
  • transceiver 28 may be configured to modulate information on to a carrier waveform for transmission by the antenna(s) 25 and demodulate information received via the antenna(s) 25 for further processing by other elements of apparatus 20.
  • transceiver 28 may be capable of transmitting and receiving signals or data directly.
  • apparatus 20 may include an input and/or output device (I/O device).
  • apparatus 20 may further include a user interface, such as a graphical user interface or touchscreen.
  • memory 24 stores software modules that provide functionality when executed by processor 22.
  • the modules may include, for example, an operating system that provides operating system functionality for apparatus 20.
  • the memory may also store one or more functional modules, such as an application or program, to provide additional functionality for apparatus 20.
  • the components of apparatus 20 may be implemented in hardware, or as any suitable combination of hardware and software.
  • apparatus 20 may optionally be configured to communicate with apparatus 10 via a wireless or wired communications link 70 according to any radio access technology, such as NR.
  • processor 22 and memory 24 may be included in or may form a part of processing circuitry or control circuitry.
  • transceiver 28 may be included in or may form a part of transceiving circuitry.
  • apparatus 20 may be a UE, mobile device, mobile station, ME, IoT device and/or NB-IoT device, for example.
  • apparatus 20 may be controlled by memory 24 and processor 22 to perform the functions associated with any of the embodiments described herein, such as some operations illustrated in, or described with respect to, Figs. 1-4.
  • apparatus 20 may be controlled by memory 24 and processor 22 to perform the methods of Figs. 3 or 4.
  • an apparatus may include means for performing a method or any of the variants discussed herein, e.g., a method described with reference to Figs. 3 or 4.
  • Examples of the means may include one or more processors, memory, and/or computer program code for causing the performance of the operation.
  • certain example embodiments provide several technological improvements, enhancements, and/or advantages over existing technological processes. For example, one benefit of some example embodiments is facilitating the use of feedback to update data analytics. Accordingly, the use of some example embodiments results in improved functioning of communications networks and their nodes and, therefore, constitutes an improvement at least to the technological field of analytics-based device operations, among others.
  • any of the methods, processes, signaling diagrams, algorithms or flow charts described herein may be implemented by software and/or computer program code or portions of code stored in memory or other computer readable or tangible media, and executed by a processor.
  • an apparatus may be included or be associated with at least one software application, module, unit or entity configured as arithmetic operation(s), or as a program or portions of it (including an added or updated software routine), executed by at least one operation processor.
  • Programs also called program products or computer programs, including software routines, applets and macros, may be stored in any apparatus-readable data storage medium and may include program instructions to perform particular tasks.
• a computer program product may include one or more computer-executable components which, when the program is run, are configured to carry out some example embodiments.
  • the one or more computer-executable components may be at least one software code or portions of code. Modifications and configurations used for implementing functionality of an example embodiment may be performed as routine(s), which may be implemented as added or updated software routine(s). In one example, software routine(s) may be downloaded into the apparatus.
  • software or a computer program code or portions of code may be in a source code form, object code form, or in some intermediate form, and it may be stored in some sort of carrier, distribution medium, or computer readable medium, which may be any entity or device capable of carrying the program.
  • carrier may include a record medium, computer memory, read-only memory, photoelectrical and/or electrical carrier signal, telecommunications signal, and/or software distribution package, for example.
  • the computer program may be executed in a single electronic digital computer or it may be distributed amongst a number of computers.
  • the computer readable medium or computer readable storage medium may be a non-transitory medium.
  • the functionality may be performed by hardware or circuitry included in an apparatus (e.g., apparatus 10 or apparatus 20), for example through the use of an application specific integrated circuit (ASIC), a programmable gate array (PGA), a field programmable gate array (FPGA), or any other combination of hardware and software.
  • ASIC application specific integrated circuit
  • PGA programmable gate array
  • FPGA field programmable gate array
  • the functionality may be implemented as a signal, such as a non-tangible means that can be carried by an electromagnetic signal downloaded from the Internet or other network.
  • an apparatus such as a node, device, or a corresponding component, may be configured as circuitry, a computer or a microprocessor, such as single-chip computer element, or as a chipset, which may include at least a memory for providing storage capacity used for arithmetic operation(s) and/or an operation processor for executing the arithmetic operation(s).
  • Example embodiments described herein apply equally to both singular and plural implementations, regardless of whether singular or plural language is used in connection with describing certain embodiments. For example, an embodiment that describes operations of a single network node equally applies to embodiments that include multiple instances of the network node, and vice versa.

Abstract

Certain example embodiments provide systems, methods, apparatuses, and computer program products for analytics feedback. For example, certain embodiments may improve the estimates of an analytics function by enabling the use of data that is available in the network services or applications using data analytics (e.g., for labelling, in order to improve the internal algorithms and models of the analytics function). Consumers of the analytics services may provide feedback about the quality of the estimates determined by the analytics function. To facilitate this, analytics consumers (e.g., network functions (NFs), application functions (AFs), or management services (MnSs)) may provide feedback and may rate the accuracy of the statistics and/or predictions from the analytics function.

Description

TITLE: APPARATUS AND METHOD FOR ENABLING ANALYTICS FEEDBACK
FIELD:
[0001] Some example embodiments may generally relate to mobile or wireless telecommunication systems, such as Long Term Evolution (LTE) or fifth generation (5G) radio access technology or new radio (NR) access technology, or other communications systems. For example, certain embodiments may relate to systems and/or methods for enabling analytics feedback.
BACKGROUND:
[0002] Examples of mobile or wireless telecommunication systems may include the Universal Mobile Telecommunications System (UMTS) Terrestrial Radio Access Network (UTRAN), Long Term Evolution (LTE) Evolved UTRAN (E-UTRAN), LTE-Advanced (LTE-A), MulteFire, LTE-A Pro, and/or fifth generation (5G) radio access technology or new radio (NR) access technology. 5G wireless systems refer to the next generation (NG) of radio systems and network architecture. 5G is mostly built on a new radio (NR), but a 5G (or NG) network can also build on E-UTRA radio. It is estimated that NR may provide bitrates on the order of 10-20 Gbit/s or higher, and may support at least enhanced mobile broadband (eMBB) and ultra-reliable low-latency communication (URLLC) as well as massive machine type communication (mMTC). NR is expected to deliver extreme broadband and ultra-robust, low latency connectivity and massive networking to support the Internet of Things (IoT). With IoT and machine-to-machine (M2M) communication becoming more widespread, there will be a growing need for networks that meet the needs of low power, low data rates, and long battery life. It is noted that, in 5G, the nodes that can provide radio access functionality to a user equipment (i.e., similar to Node B in UTRAN or eNB in LTE) may be named gNB when built on NR radio and may be named NG-eNB when built on E-UTRA radio.
SUMMARY:
[0003] According to a first embodiment, a method may include receiving data analytics from an analytics function. The method may include transmitting feedback information related to a quality of the data analytics and one or more information elements associated with matching the feedback information with respect to an application to the data analytics. The feedback information may include one or more quality indicators.
[0004] In a variant, the one or more information elements may indicate one or more dimensions of the data analytics. In a variant, the one or more dimensions may include at least one of one or more future time periods, one or more time windows, one or more user equipment or network slices, one or more locations or geographic areas, or one or more applications or services. In a variant, the quality may be based on at least one of an accuracy or an inaccuracy of the data analytics, a range of the accuracy or the inaccuracy, or a range of values experienced at the data consumer.
[0005] In a variant, the method may further include detecting that the data analytics deviate from expected values based on at least one of whether the data analytics are within a range of the expected values, whether the data analytics exceed, or fail to exceed, the expected values, and an amount that the data analytics exceed or fail to exceed, whether the data analytics exceed, or fail to exceed, the expected values in a certain geographic location and an amount that the data analytics exceed or fail to exceed, or whether the data analytics exceed, or fail to exceed, the expected values at a certain time window and an amount that the data analytics exceed or fail to exceed. In a variant, the quality may be indicated by one or more accuracy ratings included in the one or more quality indicators. In a variant, the one or more accuracy ratings may be based on at least one of a range of values used for the data analytics, a range of values experienced at the data consumer, a range of an accuracy or an inaccuracy of the data analytics, a time period of the data analytics, a geographic area associated with the data analytics, a mobility pattern used for the data analytics, an application associated with the data analytics, or a service associated with the data analytics.
[0006] According to a second embodiment, an apparatus may include at least one processor and at least one memory comprising computer program code. The at least one memory and computer program code may be configured, with the at least one processor, to cause the apparatus at least to receive data analytics from an analytics function. The at least one memory and computer program code may be configured, with the at least one processor, to cause the apparatus at least to transmit feedback information related to a quality of the data analytics and one or more information elements associated with matching the feedback information with respect to an application to the data analytics. The feedback information may include one or more quality indicators.
[0007] In a variant, the one or more information elements may indicate one or more dimensions of the data analytics. In a variant, the one or more dimensions may include at least one of one or more future time periods, one or more time windows, one or more user equipment or network slices, one or more locations or geographic areas, or one or more applications or services. In a variant, the quality may be based on at least one of an accuracy or an inaccuracy of the data analytics, a range of the accuracy or the inaccuracy, or a range of values experienced at the data consumer.
[0008] In a variant, the at least one memory and computer program code may be configured, with the at least one processor, to cause the apparatus at least to detect that the data analytics deviate from expected values based on at least one of whether the data analytics are within a range of the expected values, whether the data analytics exceed, or fail to exceed, the expected values, and an amount that the data analytics exceed or fail to exceed, whether the data analytics exceed, or fail to exceed, the expected values in a certain geographic location and an amount that the data analytics exceed or fail to exceed, or whether the data analytics exceed, or fail to exceed, the expected values at a certain time window and an amount that the data analytics exceed or fail to exceed. In a variant, the quality may be indicated by one or more accuracy ratings included in the one or more quality indicators. In a variant, the one or more accuracy ratings may be based on at least one of a range of values used for the data analytics, a range of values experienced at the data consumer, a range of an accuracy or an inaccuracy of the data analytics, a time period of the data analytics, a geographic area associated with the data analytics, a mobility pattern used for the data analytics, an application associated with the data analytics, or a service associated with the data analytics.
[0009] According to a third embodiment, a method may include transmitting data analytics to a data consumer. The method may include receiving feedback information related to a quality of the data analytics and one or more information elements associated with matching the feedback information with respect to an application to the data analytics. The feedback information may comprise one or more quality indicators.
[0010] In a variant, the one or more information elements may indicate one or more dimensions of the data analytics. In a variant, the one or more dimensions may include at least one of one or more future time periods, one or more time windows, one or more user equipment or network slices, one or more locations or geographic areas, or one or more applications or services. In a variant, the quality may be based on at least one of an accuracy or an inaccuracy of the data analytics, a range of the accuracy or the inaccuracy, or a range of values experienced at the data consumer.
[0011] In a variant, the method may further include assessing the feedback information based on one or more of a quantity of data used for the data analytics in a range of values or with respect to a geographic location indicated in the feedback information, a reinforcement learning rule used for generating the data analytics, an algorithm used for generating the data analytics, or whether a source of data used for generating the data analytics is an abnormal source of the data. In a variant, the quality may be indicated by one or more accuracy ratings included in the one or more quality indicators. In a variant, the one or more accuracy ratings may be based on at least one of a range of values used for the data analytics, a range of values experienced at the data consumer, a range of an accuracy or an inaccuracy of the data analytics, a time period of the data analytics, a geographic area associated with the data analytics, a mobility pattern used for the data analytics, an application associated with the data analytics, or a service associated with the data analytics. In a variant, the method may further include performing one or more actions that comprise at least one of stopping the analytics function, re-training a model, replacing the model, or using a new reinforcement rule in the model.
[0012] According to a fourth embodiment, an apparatus may include at least one processor and at least one memory comprising computer program code. The at least one memory and computer program code may be configured, with the at least one processor, to cause the apparatus at least to transmit, by an analytics function hosted on the apparatus, data analytics to a data consumer. The at least one memory and computer program code may be configured, with the at least one processor, to cause the apparatus at least to receive feedback information related to a quality of the data analytics and one or more information elements associated with matching the feedback information with respect to an application to the data analytics. The feedback information may comprise one or more quality indicators.
[0013] In a variant, the one or more dimensions may include at least one of one or more future time periods, one or more time windows, one or more user equipment or network slices, one or more locations or geographic areas, or one or more applications or services. In a variant, the quality may be based on at least one of an accuracy or an inaccuracy of the data analytics, a range of the accuracy or the inaccuracy, or a range of values experienced at the data consumer.
[0014] In a variant, the at least one memory and computer program code may be configured, with the at least one processor, to cause the apparatus at least to assess the feedback information based on one or more of a quantity of data used for the data analytics in a range of values or with respect to a geographic location indicated in the feedback information, a reinforcement learning rule used for generating the data analytics, an algorithm used for generating the data analytics, or whether a source of data used for generating the data analytics is an abnormal source of the data. In a variant, the quality may be indicated by one or more accuracy ratings included in the one or more quality indicators. In a variant, the one or more accuracy ratings may be based on at least one of a range of values used for the data analytics, a range of values experienced at the data consumer, a range of an accuracy or an inaccuracy of the data analytics, a time period of the data analytics, a geographic area associated with the data analytics, a mobility pattern used for the data analytics, an application associated with the data analytics, or a service associated with the data analytics. In a variant, the at least one memory and computer program code may be configured, with the at least one processor, to cause the apparatus at least to perform one or more actions that comprise at least one of stopping the analytics function, re-training a model, replacing the model, or using a new reinforcement rule in the model.
[0015] A fifth embodiment may be directed to an apparatus that may include circuitry configured to cause the apparatus to perform at least operations according to the first embodiment, the second embodiment, the third embodiment, or the fourth embodiment, or any of the variants discussed above.
[0016] A sixth embodiment may be directed to an apparatus that may include means for performing at least operations according to the first embodiment, the second embodiment, the third embodiment, or the fourth embodiment, or any of the variants discussed above. Examples of the means may include one or more processors, memory, and/or computer program code for causing the performance of the operation.
[0017] A seventh embodiment may be directed to a computer readable medium comprising program instructions stored thereon for causing an apparatus to perform at least operations according to the first embodiment, the second embodiment, the third embodiment, or the fourth embodiment, or any of the variants discussed above.
[0018] An eighth embodiment may be directed to a computer program product encoding instructions for causing an apparatus to perform at least operations according to the first embodiment, the second embodiment, the third embodiment, or the fourth embodiment, or any of the variants discussed above.
BRIEF DESCRIPTION OF THE DRAWINGS:
[0019] For proper understanding of example embodiments, reference should be made to the accompanying drawings, wherein:
[0020] Fig. 1 illustrates an example of a data analytics system comprising a network data analytics function (e.g. NWDAF), according to some embodiments;
[0021] Fig. 2 illustrates an example of a data analytics system comprising a management data analytics function (e.g. MDAF), according to some embodiments;
[0022] Fig. 3 illustrates an example flow diagram of a method, according to some embodiments;
[0023] Fig. 4 illustrates an example flow diagram of a method, according to some embodiments;
[0024] Fig. 5a illustrates an example block diagram of an apparatus, according to an embodiment; and
[0025] Fig. 5b illustrates an example block diagram of an apparatus, according to another embodiment.
DETAILED DESCRIPTION:
[0026] It will be readily understood that the components of certain example embodiments, as generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of some example embodiments of systems, methods, apparatuses, and computer program products for enabling analytics feedback is not intended to limit the scope of certain embodiments but is representative of selected example embodiments.
[0027] The features, structures, or characteristics of example embodiments described throughout this specification may be combined in any suitable manner in one or more example embodiments. For example, the usage of the phrases “certain embodiments,” “some embodiments,” or other similar language, throughout this specification refers to the fact that a particular feature, structure, or characteristic described in connection with an embodiment may be included in at least one embodiment. Thus, appearances of the phrases “in certain embodiments,” “in some embodiments,” “in other embodiments,” or other similar language, throughout this specification do not necessarily all refer to the same group of embodiments, and the described features, structures, or characteristics may be combined in any suitable manner in one or more example embodiments. In addition, the phrase “set of” refers to a set that includes one or more of the referenced set members. As such, the phrases “set of,” “one or more of,” and “at least one of,” or equivalent phrases, may be used interchangeably. Further, “or” is intended to mean “and/or,” unless explicitly stated otherwise.
[0028] Additionally, if desired, the different functions or operations discussed below may be performed in a different order and/or concurrently with each other. Furthermore, if desired, one or more of the described functions or operations may be optional or may be combined. As such, the following description should be considered as merely illustrative of the principles and teachings of certain example embodiments, and not in limitation thereof.
[0029] Network data analytics may be utilized in several network architectures. For example, such architectures include the Third Generation Partnership Project (3GPP) core network, with the advent of the NWDAF, and the management plane, with the management data analytics (MDA) in an end-to-end management domain as well as in subordinated management domains for specific parts of the network.
[0030] During network planning, when launching an analytics service, an operator may have to define the scope of data, e.g., the numeric range of data, that the analytics function or service (e.g., NWDAF, MDAF, or similar functions) may consume for training purposes for machine learning (ML) models.
[0031] An analytics function may be able to use just a subset of the data collected by a telecommunication system. One limiting factor is the amount of data that the various network functions (NFs) in the telecommunication system are collecting. Another factor includes the availability of data via standardized interfaces (e.g., standardized format and standardized semantics) because a network designer may have to integrate NFs from multiple vendors to report their data to the analytics functions. Therefore, the operator may have to decide, at design time, on a subset of data that appears reasonable to support the desired use case of the analytics function.
[0032] NFs may collect more data about their operations than they report to the analytics function. Therefore, during run-time, NFs may have broader knowledge about their environment and their own operations due to data that is kept private. As a result, there may be a need for improved accuracy of statistics or predictions provided by the analytics function.
[0033] During runtime, the results of an analytics function (e.g., a NWDAF, MDAF, or analytics functions in zero-touch network and service management (ZSM) architecture) may be based on just subsets of data. The individual NF might detect, by using its own private data, that the results of the analytics function are not as expected for the specific environment and operation point of the specific network function (e.g., a specific private metric may exceed a threshold which does not conform to the analytics result). Such errors may affect predictions, including: a future time (e.g., predictions that refer to timeframes more than an amount of time in the future may be erroneous); a time window (e.g., a prediction for a particular day may be erroneous); an area (e.g., a prediction relating to the geographic location may not be usable); a target (e.g., a user equipment (UE), group of UEs, or a network slice); a data range (e.g., a prediction above or below a certain data value or data range may not be accurate or may have a different accuracy degree); and a service (e.g., a prediction for certain services may be different if a certain service relies on a private feature and consequently a private metric).
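As an illustrative sketch of the detection step described above (all names, values, and the tolerance band are hypothetical, not taken from any specification), an NF's runtime check of a received prediction against one of its private metrics might look like:

```python
# Hypothetical sketch: an NF compares a received analytics prediction
# against a private metric to decide whether the analytics result
# deviates from what it observes locally. Names and thresholds are
# purely illustrative.
from dataclasses import dataclass

@dataclass
class Prediction:
    value: float       # predicted metric (e.g., expected load)
    tolerance: float   # acceptable deviation band around the value

def deviates(prediction: Prediction, private_metric: float) -> bool:
    """True if the locally observed metric falls outside the
    tolerance band implied by the analytics prediction."""
    return abs(private_metric - prediction.value) > prediction.tolerance

# Example: a predicted value of 0.8 with a 0.1 band, observed 0.95
print(deviates(Prediction(value=0.8, tolerance=0.1), 0.95))  # True
```

A real NF would apply such a check per target (UE, slice), per time window, and per area, matching the prediction dimensions listed above.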
[0034] In certain architectures, the consumers of the analytics function may not have the possibility to inform the analytics function that the results are not as expected. Thus, the analytics function may not have a chance to take such feedback into account, e.g., to trigger model re-training. For example, the analytics function, e.g., NWDAF or MDAF, may just use newly collected data for improving its accuracy. This process may be too slow since (i) the cycles of re-training may take a significant amount of time; (ii) there may be no indication of the range of data and/or region where inaccuracy was experienced; (iii) there may be no indication of the application for which the data deviated from the expectation; and (iv) there may be no means to check if the source of the data experiences an issue, such as a fault or a hijacking. The process of using newly collected data to improve the accuracy of the analytics function may not work in cases where the analytics function is working with models that have been trained by other entities (e.g., by different instances of an NWDAF or management data analytics service (MDAS)) and where models may be exchanged between analytics functions. In such cases, the analytics function may just receive data required for inference and may just include business logic for making an inference. The analytics function, in addition, may neither be able to receive the data needed to assess the quality of the results nor be able to perform retraining. As can be understood from this, there may be a need for timely feedback to an analytics function.
[0035] Further, the operator or designer of the analytics function may have a limited chance to assess the situation, because the network functions or management services may not offer a standardized format to relate performance data to a specific scope, such as a geographic area, a future time, or a time window of the problems. Hence, feedback from the consumer indicating that the statistics or predictions were not as expected can improve the output of the corresponding analytics function. The indication that the statistics or predictions were not as expected may be given, for example, as a direct feedback from the consumer. The direct feedback may be immediate or without delay, wherein the delay may be defined based on the application or use scenario.
[0036] A 3GPP NWDAF may offer the means of enhancing the operation of NFs, AFs, and OAM functions using analytics. However, the NFs, AFs, or OAM function may not have means to rate the accuracy of the NWDAF analytics for improving the provided statistics or predictions. Some aspects of NR may include sharing trained data models among multiple NWDAF instances. In addition, NR may include use of an artificial intelligence (AI) function that trains and provides AI models to NFs, and federated machine learning (ML) training that handles model training based on data sets that are distributed in different NFs. Each client NWDAF may train, locally, the related ML model with its own data and then may share it with a server NWDAF, which aggregates the models to create a global or optimal ML model. A provider NWDAF instance may supply the trained data model to a consumer NWDAF instance via a data model provision service and may register this capability (e.g., to expose a trained data model) in the network repository function (NRF). However, aspects of NR may not provide feedback on an analytics model in order to improve its performance. For example, in the management plane, the MDAS producer may provide root cause analysis for various performance and fault management issues and other network optimization processes. The MDAS producer may classify the input data to separate it into: (i) ML model training data and (ii) data for analytic services; following the ML model training process, it may provide the ML model training report as one kind of output to the corresponding MDAS consumer. The MDAS producer may perform such data classification considering various input data sources. Although the MDAS consumer may be involved in the ML model training process, there is no way for the MDAS consumer to provide validation or feedback.
[0037] Some embodiments described herein may provide for analytics feedback. For example, certain embodiments may improve the estimates of an analytics function by enabling the use of data that is available in the network services or applications using data analytics (e.g., for labelling, in order to improve the internal algorithms and models of the analytics function). Consumers of the analytics services may provide feedback about the quality of the estimates determined by the analytics function. To facilitate this, analytics consumers (e.g., network functions (NFs), application functions (AFs), or management services (MnSs)) may provide feedback and may rate the accuracy of the statistics and/or predictions from the analytics function. Such feedback may include information elements that allow the analytics service to correlate or match the feedback to specific data analytics (e.g., regarding certain applications, geographical areas, and times).
[0038] Certain embodiments may utilize an interface that enables the consumer of data analytics to provide feedback to an analytics function. In certain embodiments, the feedback may include a quality indicator (e.g., an action quality indicator). Additionally, or alternatively, the feedback may include information to indicate a scope of the feedback. For example, the scope may include a future time (e.g., a prediction that refers to a timeframe more than an amount of time in the future may be incorrect). Additionally, or alternatively, the scope may include a time window (e.g., a prediction for a particular day may be incorrect). Additionally, or alternatively, the scope may include an area (e.g., a prediction relating to a location or geographic area may not be usable).
[0039] In certain embodiments, the analytics function may not be capable of knowing private metrics of the consumer. Additionally, or alternatively, the interface and payload may be standardized to facilitate multi-vendor deployments. As such, certain embodiments may utilize one or more interfaces that facilitate the NFs, AFs, and MnSs to provide feedback to the analytics function. In such an interface, feedback (e.g., a message or signalling) may indicate an accuracy of the data analytics service (e.g., considering a range of expected or unexpected values) or a range of an inaccuracy (e.g., higher or lower than expected or a deviation from the expected value). Additionally, or alternatively, the feedback may indicate a geographic area of usage or an application of the data analytics (e.g., a session management function (SMF) selection of a user plane function (UPF)). Additionally, or alternatively, the feedback may indicate a time window to which the feedback relates. Additionally, or alternatively, the feedback may include a range of values experienced (e.g., actual values rather than predicted values).
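One way to picture such a feedback payload is the following sketch; the field names and value formats are illustrative assumptions, not part of any standardized interface:

```python
# Hypothetical feedback payload carrying a quality indicator plus the
# information elements that let the analytics function match the
# feedback to specific data analytics. All field names are illustrative.
from dataclasses import dataclass, field

@dataclass
class AnalyticsFeedback:
    analytics_id: str               # which analytics output is being rated
    accuracy_rating: int            # quality indicator, e.g. 1 (poor) to 5 (good)
    deviation: str                  # "higher" or "lower" than expected
    time_window: tuple              # (start, end) the feedback relates to
    geographic_area: str            # area of usage, e.g. a tracking area
    application: str                # e.g. "SMF UPF selection"
    experienced_values: list = field(default_factory=list)  # actual, not predicted

fb = AnalyticsFeedback(
    analytics_id="load-prediction-42",
    accuracy_rating=2,
    deviation="higher",
    time_window=("2021-12-01T08:00", "2021-12-01T09:00"),
    geographic_area="TA-1001",
    application="SMF UPF selection",
    experienced_values=[0.91, 0.94, 0.95],
)
print(fb.accuracy_rating)  # 2
```

The `experienced_values` field corresponds to the range of values experienced at the consumer, which the analytics function can compare against its own predicted values.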
[0040] The analytics function, after receiving the feedback and a corresponding accuracy rating, may assess the feedback. The assessment may be based on a quantity of data values in a range indicated in the feedback (e.g., too few data values may be an indication that more training data is needed, considering the scope of the feedback). Additionally, or alternatively, the assessment may include use of a reinforcement learning rule (e.g., the feedback can serve as an additional source for a reward function in a ML model). Additionally, or alternatively, the assessment may include usage of various algorithms for certain applications (e.g., different algorithms may be better suited for different usages of data analytics). Additionally, or alternatively, the assessment may include an evaluation of whether a source of data used for the data analytics is an abnormal source of data (e.g., if the data received contains metadata that indicates the source).
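A minimal sketch of how an analytics function might triage such feedback follows; the rules, thresholds, and action names are assumptions chosen only to mirror the assessment criteria above:

```python
# Hypothetical triage of consumer feedback by an analytics function.
# Each check mirrors one assessment criterion; thresholds are illustrative.
def assess_feedback(training_samples_in_range: int,
                    source_flagged_abnormal: bool,
                    accuracy_rating: int) -> list:
    actions = []
    if source_flagged_abnormal:
        # Data source may be faulty or hijacked: validate before retraining.
        actions.append("validate_data_source")
    if training_samples_in_range < 100:      # illustrative threshold
        # Too few samples in the reported range: collect more training data.
        actions.append("collect_more_training_data")
    if accuracy_rating <= 2:                 # poor rating on a 1-5 scale
        # Use the rating, e.g. as a reward signal, and re-train the model.
        actions.append("retrain_model")
    return actions

print(assess_feedback(42, False, 2))
# -> ['collect_more_training_data', 'retrain_model']
```

The returned action list maps onto the remedies described elsewhere herein, such as re-training or replacing the model.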
[0041] In this way, certain embodiments may provide for data analytics and feedback provided by a data analytics consumer, thereby reducing or eliminating the need to rely on a training report exchange. This facilitates improvement in the performance of an analytics function. For example, certain embodiments may improve the accuracy of statistics or predictions provided by the analytics function by providing a way to rate the accuracy of analytics data with respect to the particular use or application and feeding this information back to the analytics function. In addition, certain embodiments may provide a way to validate the correctness of the data sources, thereby further improving operations of an analytics function.
[0042] Fig. 1 illustrates an example 100 of a data analytics system comprising an NWDAF, according to some embodiments. As illustrated in Fig. 1, the example 100 may include one or more data analytics sources 102 (e.g., one or more NFs, one or more AFs, a unified data repository (UDR), and an operations, administration, and management (OAM) function). Additionally, the example 100 may include one or more NWDAFs 104 comprising a corresponding analytics instance deployment (analytics ID) 106. For example, analytics ID 106 may include an instance of a ML or AI model used to generate data analytics (e.g., statistics or predictions) based on the data from the data analytics sources 102. Further, the example 100 may include one or more data analytics outputs 108 (e.g., one or more NFs, one or more AFs, and an OAM function). The data analytics outputs 108 may include destinations for data analytics.
[0043] The data analytics sources 102 may provide data to the NWDAFs 104. The NWDAFs 104 may generate data analytics from the provided data using the analytics ID 106. After generating the data analytics, the NWDAFs 104 may provide the data analytics to the data analytics outputs 108. The analytics consumer of the data analytics outputs 108 may detect that the data analytics deviate from expected values, and may transmit feedback information related to a quality of the data analytics to the NWDAFs 104. The feedback information may include one or more data elements that the NWDAFs 104 can use to match the feedback information to the provided data analytics. After receiving the feedback information, the NWDAFs 104 may assess the feedback information (e.g., to determine a cause of the deviation from the expected values, or to try to improve the data analytics). The NWDAFs 104 may then perform, based on assessing the feedback information, one or more actions related to the model that was used to generate the data analytics. These and other aspects of certain embodiments are described elsewhere herein.
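The matching step described above, in which the NWDAFs 104 use data elements carried in the feedback to correlate it with previously provided analytics, might be sketched as follows. The report-identifier scheme and all field names are hypothetical illustrations:

```python
# The analytics function keeps the analytics it has provided, indexed by a
# report identifier; feedback carries that identifier so it can be matched.
produced_reports = {}

def publish_analytics(report_id, analytics):
    """Record analytics under an identifier before sending to a consumer."""
    produced_reports[report_id] = analytics
    return {"report_id": report_id, **analytics}

def receive_feedback(feedback):
    """Match incoming feedback to a previously provided analytics report;
    unmatched feedback is returned as None so it can be logged or dropped."""
    report = produced_reports.get(feedback["report_id"])
    if report is None:
        return None
    return {"matched": report, "quality": feedback["quality"]}

publish_analytics("r1", {"predicted_load": 0.7, "area": "cell-17"})
result = receive_feedback({"report_id": "r1", "quality": "low"})
assert result["matched"]["area"] == "cell-17"
```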
[0044] Some embodiments may utilize different NWDAF 104 deployment scenarios. For example, just one NWDAF 104 may be deployed in the network, or multiple different instances of the NWDAF 104, supporting certain analytics capabilities residing at one or more distinct locations, may be deployed. Certain embodiments may provide for configurations of an interface and payload, which may be standardized to enable multi-vendor deployments. Certain embodiments may be deployed in other scenarios, for example, the NWDAFs 104 may be deployed in a UE, in a base station, or in a non-real-time/near-real-time radio intelligent controller (non-RT/near-RT RIC). [0045] As described above, Fig. 1 is provided as an example. Other examples are possible, according to some embodiments.
[0046] Fig. 2 illustrates an example 200 of a data analytics system comprising a MDAF, according to some embodiments. As illustrated in Fig. 2, the example 200 may include a management data analytics service (MDAS) consumer 202. Additionally, the example 200 may include a MDAF 204, which may include a MDAS producer 206, an analytics model 208, a MDAS consumer 210, a management services (MnS) consumer 212, and a NWDAF subscriber 214. Additionally, the example 200 may include another MDAS producer 216, a MnS producer 218, and a NWDAF 220.
[0047] The elements illustrated in Fig. 2 may perform operations similar to those described elsewhere herein. For example, the MDAS producer 216, the MnS producer 218, and the NWDAF 220 may be data analytics sources that provide data to the MDAF 204. The MDAF 204 may generate data analytics using the analytics model 208, and may provide the data analytics to the MDAS consumer 202. After detecting that the data analytics deviate from expected values, the MDAS consumer 202 may provide feedback information to the MDAF 204, and the MDAF 204 may assess the feedback information. [0048] The MDAF 204 may provide for analytics and automation in the management plane for MnSs, self-organizing network (SON) functions, optimization tools, human operators, and/or a core network. Using a service-based management architecture (SBMA), the MDAS consumer 202 may collect MnS performance measurements, network configurations, alarms and/or trace data, as well as quality of experience (QoE) and minimization of drive test (MDT) reports.
[0049] Analytics consumers may provide an accuracy rating of the received analytics towards the corresponding analytics function, e.g., the MDAF 204, after using it. In this way, the MDAS producer 206 may receive rapid feedback regarding data analytics. Such accuracy rating may include a high, medium, or low accuracy rating. The accuracy rating may include other levels related to a range of values of use (e.g., a low accuracy for a network slice management function (NSMF) consumer or gNB). This may indicate to the MDAF 204 that the analytics model is behaving abnormally. Similarly, the MDAS producer 206 may receive, from a SON function hosted on a network node (e.g., a gNB), feedback that the predicted radio conditions or expected radio utilization exceeded indicated values.
[0050] The timeframe of a prediction may indicate an accuracy for a sub-timeframe duration (e.g., a prediction more than 20 minutes into a time frame of 30 minutes may show a certain percentage drop in accuracy, or a prediction on a particular day of the week between 10:00am-12:00pm may show a certain percentage drop in accuracy). A geographic area may indicate an accuracy (e.g., a load prediction in a train station may be of low accuracy). The geographic area may be identified using coordinates, a cell, a tracking area, and/or the like. A mobility pattern may affect prediction accuracy (e.g., a prediction of radio conditions may have different accuracy levels with respect to different user mobility patterns). An application or service that is used may affect an accuracy of a prediction. For example, an application or a service may rely on private metrics of a data analytics consumer (e.g., MDAS consumer 202), and these metrics may not be visible to the analytics function. For example, a prediction of radio availability may be different for a vehicle service compared to a service for drones, due to the difference in elevation and reception, or due to different selection of a UPF by an SMF. By considering the application or service, certain embodiments may address the different deployment needs of different applications or services, thereby improving data analytics. An inaccuracy can also be reported in various ways including, e.g., the range of values experienced instead of the ones expected, the average deviation from the expected value, the deviation function from the expected value, and/or the like.
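Two of the inaccuracy-reporting options listed above, the range of values actually experienced and the average deviation from the expected values, could be computed as in this minimal sketch (the function and field names are illustrative assumptions):

```python
def inaccuracy_report(expected, observed):
    """Summarize an inaccuracy as the range of values actually experienced
    at the consumer and the average deviation from the expected values."""
    deviations = [o - e for e, o in zip(expected, observed)]
    return {
        "experienced_range": (min(observed), max(observed)),
        "average_deviation": sum(deviations) / len(deviations),
    }

# The consumer expected loads around 10-13 but observed 14-17.
report = inaccuracy_report([10.0, 12.0, 11.0, 13.0], [14.0, 16.0, 15.0, 17.0])
assert report["average_deviation"] == 4.0
assert report["experienced_range"] == (14.0, 17.0)
```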
[0051] Such consumer feedback may trigger an analytics function to perform one or more actions. For example, the analytics function may perform model re-training. For example, in the case of an ML algorithm, a range of data indicated as inaccurate or an indication that a data set does not include a sufficient amount of data may be an indication that more training data is needed. This can be combined with other attributes, e.g., the time frame or with respect to a certain geographic area. The analytics function may perform model replacement, if there is information regarding the capabilities of an analytics model (e.g., if the analytics model cannot capture certain value ranges or behaviour with a desired accuracy). The analytics function may use a new reinforcement rule to capture a new behaviour or a certain range of values, or for capturing the needs of a new application or service.
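A purely illustrative decision rule mapping such feedback to the actions above (model re-training, model replacement, or a new reinforcement rule) is sketched below; the specific conditions and names are assumptions, not prescribed by the embodiments:

```python
def select_action(feedback, model_capabilities):
    """Map consumer feedback to one of the actions in paragraph [0051]."""
    low, high = feedback["inaccurate_range"]
    cap_low, cap_high = model_capabilities["supported_range"]
    # The model cannot capture the flagged value range at all: replace it.
    if low < cap_low or high > cap_high:
        return "replace_model"
    # Feedback names a new service the model was not built for: new rule.
    if feedback.get("new_service"):
        return "add_reinforcement_rule"
    # Otherwise assume the flagged range simply lacks training data.
    return "retrain_model"

caps = {"supported_range": (0, 100)}
assert select_action({"inaccurate_range": (50, 150)}, caps) == "replace_model"
assert select_action({"inaccurate_range": (10, 20)}, caps) == "retrain_model"
```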
[0052] In this way, certain embodiments may improve the accuracy of an MDAS producer report by having an MDAS consumer, e.g., after using a data analytics report, provide directly to the MDAS producer data related to the quality of the data analytics, and additionally rate the accuracy of the provided statistics or predictions. For instance, an MDAS consumer may detect, by using its own private data, that the data analytics results are not as expected for the environment and operation of the MDAS consumer (or an application thereof), e.g., a private metric might exceed a threshold in a manner that is not consistent with the analytics result.
[0053] Such errors may affect predictions related to a future time where, e.g., predictions that refer to timeframes more than a certain number of days in the future are wrong. Additionally, or alternatively, the errors may affect predictions related to a certain time window where, e.g., predictions between certain times on a day of the week are inaccurate. Additionally, or alternatively, the errors may affect predictions related to a certain geographical area where, e.g., predictions relating to a location are not usable. Additionally, or alternatively, the errors may affect predictions related to a target UE or a group of UEs, or a network slice. Additionally, or alternatively, the errors may affect predictions related to a certain data range where, e.g., statistics or predictions above or below a certain data value or data range may not be accurate or may have a different accuracy degree. Additionally, or alternatively, the errors may affect predictions related to a MnS where, e.g., statistics or predictions may be different if the MnS relies on a private feature and/or a private metric. As explained above, in NR the MDAS consumers may not have means to inform the MDAS producer of the above-described issues, i.e., that the analytics results are not as expected. Thus, the MDAS producer may not have the opportunity to update data analytics based on such feedback, e.g., to trigger re-training of the internal algorithms.
[0054] Certain embodiments described herein may involve a MDAS consumer that can provide feedback to the MDAS producer related to the received analytics report, which may include data and a rating of the accuracy of statistics or predictions in the analytics report. The feedback from the MDAS consumer may allow the MDAS producer to correlate the feedback to specific analytics results regarding a certain MnSs, geographical areas, UE groups, times (or time windows), and/or the like. The MDAS consumer may provide an accuracy rating of the received analytics report after using it. Such accuracy rating may indicate an overall accuracy for data analytics, e.g., a high, medium or low accuracy or inaccuracy, or may include ratings for certain sub-categories of accuracy, such as: an accuracy of the analytic service considering a range of expected or not expected values, a range of inaccuracy (e.g., higher or lower than expected and/or a deviation from the expected value), a geographical area of usage and/or MnS usage, a time frame of statistics or a prediction, and/or a range of actual values experienced at the MDAS consumer. In this way, the MDAS producer may receive rapid feedback.
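The accuracy rating with its sub-categories described above might be encoded as a simple data structure such as the following sketch; all field names and value formats are illustrative assumptions:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class AccuracyRating:
    """Overall accuracy level plus optional sub-category ratings."""
    overall: str                                            # "high" | "medium" | "low"
    inaccuracy_range: Optional[Tuple[float, float]] = None  # deviation bounds
    geographic_area: Optional[str] = None                   # area of usage
    time_frame: Optional[Tuple[str, str]] = None            # (start, end)
    experienced_values: Optional[Tuple[float, float]] = None  # seen at consumer

rating = AccuracyRating(
    overall="low",
    geographic_area="tracking-area-7",
    time_frame=("10:00", "12:00"),
    experienced_values=(0.82, 0.97),
)
assert rating.overall == "low"
```

Leaving sub-categories optional mirrors the text: a consumer may report only an overall rating, or refine it per area, time frame, or value range.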
[0055] Such MDAS consumer feedback may trigger the MDAS producer to perform, e.g., model re-training, model replacement, or the introduction of a new rule into a model. Furthermore, in certain embodiments, the feedback may include information that identifies a geographic area, a time, an MnS usage, and/or the like associated with the feedback so that the MDAS producer can match the feedback to data analytics provided to the data consumer and/or a model used to generate the data analytics.
[0056] This may solve problems in NR where an MDAS producer just relies on newly collected data for improving its accuracy, a process that may be further improved for certain scenarios since: (i) the cycles of data collection may take too much time, (ii) the MDAS producer may not have an indication of the range of data and/or region where consumers experience inaccuracy, (iii) the MDAS producer may not have an indication with respect to which MDAS consumer the data analytics results did not meet the expectation, and (iv) the MDAS producer may not have means to check whether a source of the data experiences a fault or has been hijacked.
[0057] As indicated above, Fig. 2 is provided as an example. Other examples are possible, according to some embodiments.
[0058] Fig. 3 illustrates an example flow diagram of a method 300, according to some embodiments. For example, Fig. 3 may illustrate example operations of a network node (e.g., apparatus 10 illustrated in, and described with respect to, Fig. 5a) or a UE (e.g., apparatus 20 illustrated in, and described with respect to, Fig. 5b). Specifically, Fig. 3 may illustrate example operations of an analytics function hosted on the network node or the UE. Some of the operations illustrated in Fig. 3 may be similar to some operations shown in, and described with respect to, Figs. 1 and 2.
[0059] In an embodiment, the method may include, at 302, transmitting data analytics to a data consumer. The method may include, at 304, receiving feedback information related to a quality of the data analytics and one or more information elements associated with matching the feedback information with respect to an application to the data analytics. The feedback information may comprise one or more quality indicators.
[0060] The method illustrated in Fig. 3 may include one or more additional aspects described below or elsewhere herein. In some embodiments, the one or more information elements may indicate one or more dimensions of the data analytics (e.g., one or more characteristics of the data analytics). In some embodiments, the one or more dimensions may include at least one of one or more future time periods, one or more time windows, one or more user equipment or network slices, one or more locations or geographic areas, or one or more applications or services. In some embodiments, the quality may be based on at least one of an accuracy or an inaccuracy of the data analytics, a range of the accuracy or the inaccuracy, or a range of values experienced at the data consumer.
[0061] In some embodiments, the method may further include assessing the feedback information based on one or more of a quantity of data used for the data analytics in a range of values or with respect to a geographic location indicated in the feedback information, a reinforcement learning rule used for generating the data analytics, an algorithm used for generating the data analytics, or whether a source of data used for generating the data analytics is an abnormal source of the data. In some embodiments, the quality may be indicated by one or more accuracy ratings included in the one or more quality indicators. In some embodiments, the one or more accuracy ratings may be based on at least one of a range of values used for the data analytics, a range of values experienced at the data consumer, a range of an accuracy or an inaccuracy of the data analytics, a time period of the data analytics, a geographic area associated with the data analytics, a mobility pattern used for the data analytics, an application associated with the data analytics, or a service associated with the data analytics. In some embodiments, the method may further include performing one or more actions that include at least one of stopping the analytics function, re-training the model, replacing a model, or using a new reinforcement rule in the model. In certain embodiments, the feedback may explicitly trigger performance of the one or more actions. Alternatively, the decision of whether to perform the one or more actions may be made by the analytics function or the data consumer (or analytics service consumer). In some embodiments, the data consumer may not know whether the analytics function uses an AI and/or ML model, and the decision of whether to perform one or more actions may be made by the analytics function. In other embodiments, the data consumer may decide on the action that it expects from the analytics function, and may send the feedback information to the analytics function for performance of that action.
[0062] As described above, Fig. 3 is provided as an example. Other examples are possible according to some embodiments.
[0063] Fig. 4 illustrates an example flow diagram of a method 400, according to some embodiments. For example, Fig. 4 may illustrate example operations of a network node (e.g., apparatus 10 illustrated in, and described with respect to, Fig. 5a) or a UE (e.g., apparatus 20 illustrated in, and described with respect to, Fig. 5b). Specifically, Fig. 4 may illustrate example operations of a data consumer hosted on the network node or the UE. Some of the operations illustrated in Fig. 4 may be similar to some operations shown in, and described with respect to, Figs. 1 and 2.
[0064] In an embodiment, the method may include, at 402, receiving data analytics from an analytics function. The method may include, at 404, transmitting feedback information related to a quality of the data analytics and one or more information elements associated with matching the feedback information with respect to an application to the data analytics. The feedback information may include one or more quality indicators.
[0065] The method illustrated in Fig. 4 may include one or more additional aspects described below or elsewhere herein. In some embodiments, the one or more information elements may indicate one or more dimensions of the data analytics. In some embodiments, the one or more dimensions may include at least one of one or more future time periods, one or more time windows, one or more user equipment or network slices, one or more locations or geographic areas, or one or more applications or services. In some embodiments, the quality may be based on at least one of an accuracy or an inaccuracy of the data analytics, a range of the accuracy or the inaccuracy, or a range of values experienced at the data consumer.
[0066] In some embodiments, the method may further include detecting that the data analytics deviate from expected values based on at least one of whether the data analytics are within a range of the expected values, whether the data analytics exceed, or fail to exceed, the expected values, and an amount that the data analytics exceed or fail to exceed, whether the data analytics exceed, or fail to exceed, the expected values in a certain geographic location and an amount that the data analytics exceed or fail to exceed, or whether the data analytics exceed, or fail to exceed, the expected values at a certain time window and an amount that the data analytics exceed or fail to exceed. In some embodiments, the quality may be indicated by one or more accuracy ratings included in the one or more quality indicators. In some embodiments, the one or more accuracy ratings are based on at least one of a range of values used for the data analytics, a range of values experienced at the data consumer, a range of an accuracy or an inaccuracy of the data analytics, a time period of the data analytics, a geographic area associated with the data analytics, a mobility pattern used for the data analytics, an application associated with the data analytics, or a service associated with the data analytics.
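The deviation detection described above, i.e., whether the data analytics fall within a range of the expected values and by what amount they exceed, or fail to exceed, them, can be sketched as follows (the return format is an illustrative assumption):

```python
def detect_deviation(analytics_value, expected_range, tolerance=0.0):
    """Check whether an analytics value lies within the expected range,
    and if not, report the direction and amount of the deviation."""
    low, high = expected_range
    if analytics_value > high + tolerance:
        return {"deviates": True, "direction": "exceeds",
                "amount": analytics_value - high}
    if analytics_value < low - tolerance:
        return {"deviates": True, "direction": "falls_short",
                "amount": low - analytics_value}
    return {"deviates": False}

# A predicted load of 0.95 exceeds the expected range (0.4, 0.8) by ~0.15.
result = detect_deviation(0.95, expected_range=(0.4, 0.8))
assert result["deviates"] and result["direction"] == "exceeds"
assert abs(result["amount"] - 0.15) < 1e-9
```

The same check could be applied per geographic location or time window by running it over analytics scoped to that location or window, as the paragraph above describes.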
[0067] As described above, Fig. 4 is provided as an example. Other examples are possible according to some embodiments.
[0068] As described above, certain operations of the method 300 or the method 400 may be performed by a UE or a network node. For example, the UE may depend on an analytics function that predicts the surroundings of the UE or its environment (e.g., the number of UEs, the movement of objects, or other devices, in proximity to the UE, interference from other UEs or cells, beam patterns, etc.). The UE may detect that these predictions are not as expected and may report this to the network node. Based on this, a UE may operate in a manner similar to a network function, such as a data consumer, described herein. This may be applicable to various deployment scenarios, including open radio access network (O-RAN), where the RAN intelligent controller (RIC) may host the AI and/or ML model and may send information or commands to a base transceiver station (BTS), which may forward these to the UE. Then the UE may send feedback to the RIC.
[0069] Fig. 5a illustrates an example of an apparatus 10 according to an embodiment. In an embodiment, apparatus 10 may be a node, host, or server in a communications network or serving such a network. For example, apparatus 10 may be a network node, satellite, base station, a Node B, an evolved Node B (eNB), 5G Node B or access point, next generation Node B (NG-NB or gNB), and/or a WLAN access point, associated with a radio access network, such as an LTE network, 5G or NR. In some example embodiments, apparatus 10 may be an eNB in LTE or gNB in 5G. In some embodiments, apparatus 10 may host a NF, AF, UDR, OAM function, NWDAF, analytics model, and/or the like described elsewhere herein.
[0070] It should be understood that, in some example embodiments, apparatus 10 may be comprised of an edge cloud server as a distributed computing system where the server and the radio node may be stand-alone apparatuses communicating with each other via a radio path or via a wired connection, or they may be located in a same entity communicating via a wired connection. For instance, in certain example embodiments where apparatus 10 represents a gNB, it may be configured in a central unit (CU) and distributed unit (DU) architecture that divides the gNB functionality. In such an architecture, the CU may be a logical node that includes gNB functions such as transfer of user data, mobility control, radio access network sharing, positioning, and/or session management, etc. The CU may control the operation of DU(s) over a front-haul interface. The DU may be a logical node that includes a subset of the gNB functions, depending on the functional split option. It should be noted that one of ordinary skill in the art would understand that apparatus 10 may include components or features not shown in Fig. 5a.
[0071] As illustrated in the example of Fig. 5a, apparatus 10 may include a processor 12 for processing information and executing instructions or operations. Processor 12 may be any type of general or specific purpose processor. In fact, processor 12 may include one or more of general-purpose computers, special purpose computers, microprocessors, digital signal processors (DSPs), field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), and processors based on a multi-core processor architecture, as examples. While a single processor 12 is shown in Fig. 5a, multiple processors may be utilized according to other embodiments. For example, it should be understood that, in certain embodiments, apparatus 10 may include two or more processors that may form a multiprocessor system (e.g., in this case processor 12 may represent a multiprocessor) that may support multiprocessing. In certain embodiments, the multiprocessor system may be tightly coupled or loosely coupled (e.g., to form a computer cluster).
[0072] Processor 12 may perform functions associated with the operation of apparatus 10, which may include, for example, precoding of antenna gain/phase parameters, encoding and decoding of individual bits forming a communication message, formatting of information, and overall control of the apparatus 10, including processes related to management of communication or communication resources.
[0073] Apparatus 10 may further include or be coupled to a memory 14 (internal or external), which may be coupled to processor 12, for storing information and instructions that may be executed by processor 12. Memory 14 may be one or more memories and of any type suitable to the local application environment, and may be implemented using any suitable volatile or nonvolatile data storage technology such as a semiconductor-based memory device, a magnetic memory device and system, an optical memory device and system, fixed memory, and/or removable memory. For example, memory 14 can be comprised of any combination of random access memory (RAM), read only memory (ROM), static storage such as a magnetic or optical disk, hard disk drive (HDD), or any other type of non-transitory machine or computer readable media. The instructions stored in memory 14 may include program instructions or computer program code that, when executed by processor 12, enable the apparatus 10 to perform tasks as described herein.
[0074] In an embodiment, apparatus 10 may further include or be coupled to (internal or external) a drive or port that is configured to accept and read an external computer readable storage medium, such as an optical disc, USB drive, flash drive, or any other storage medium. For example, the external computer readable storage medium may store a computer program or software for execution by processor 12 and/or apparatus 10.
[0075] In some embodiments, apparatus 10 may also include or be coupled to one or more antennas 15 for transmitting and receiving signals and/or data to and from apparatus 10. Apparatus 10 may further include or be coupled to a transceiver 18 configured to transmit and receive information. The transceiver 18 may include, for example, a plurality of radio interfaces that may be coupled to the antenna(s) 15. The radio interfaces may correspond to a plurality of radio access technologies including one or more of GSM, NB-IoT, LTE, 5G, WLAN, Bluetooth, BT-LE, NFC, radio frequency identifier (RFID), ultrawideband (UWB), MulteFire, and the like. The radio interface may include components, such as filters, converters (for example, digital-to-analog converters and the like), mappers, a Fast Fourier Transform (FFT) module, and the like, to generate symbols for a transmission via one or more downlinks and to receive symbols (for example, via an uplink).
[0076] As such, transceiver 18 may be configured to modulate information on to a carrier waveform for transmission by the antenna(s) 15 and demodulate information received via the antenna(s) 15 for further processing by other elements of apparatus 10. In other embodiments, transceiver 18 may be capable of transmitting and receiving signals or data directly. Additionally or alternatively, in some embodiments, apparatus 10 may include an input and/or output device (I/O device).
[0077] In an embodiment, memory 14 may store software modules that provide functionality when executed by processor 12. The modules may include, for example, an operating system that provides operating system functionality for apparatus 10. The memory may also store one or more functional modules, such as an application or program, to provide additional functionality for apparatus 10. The components of apparatus 10 may be implemented in hardware, or as any suitable combination of hardware and software.
[0078] According to some embodiments, processor 12 and memory 14 may be included in or may form a part of processing circuitry or control circuitry. In addition, in some embodiments, transceiver 18 may be included in or may form a part of transceiver circuitry.
[0079] As used herein, the term “circuitry” may refer to hardware-only circuitry implementations (e.g., analog and/or digital circuitry), combinations of hardware circuits and software, combinations of analog and/or digital hardware circuits with software/firmware, any portions of hardware processor(s) with software (including digital signal processors) that work together to cause an apparatus (e.g., apparatus 10) to perform various functions, and/or hardware circuit(s) and/or processor(s), or portions thereof, that use software for operation but where the software may not be present when it is not needed for operation. As a further example, as used herein, the term “circuitry” may also cover an implementation of merely a hardware circuit or processor (or multiple processors), or portion of a hardware circuit or processor, and its accompanying software and/or firmware. The term circuitry may also cover, for example, a baseband integrated circuit in a server, cellular network node or device, or other computing or network device. [0080] As introduced above, in certain embodiments, apparatus 10 may be a network node or RAN node, such as a base station, access point, Node B, eNB, gNB, WLAN access point, or the like.
[0081] According to certain embodiments, apparatus 10 may be controlled by memory 14 and processor 12 to perform the functions associated with any of the embodiments described herein, such as some operations illustrated in, or described with respect to, Figs. 1-4. For instance, apparatus 10 may be controlled by memory 14 and processor 12 to perform the methods of Figs. 3 or 4.
[0082] Fig. 5b illustrates an example of an apparatus 20 according to another embodiment. In an embodiment, apparatus 20 may be a node or element in a communications network or associated with such a network, such as a UE, mobile equipment (ME), mobile station, mobile device, stationary device, IoT device, or other device. As described herein, a UE may alternatively be referred to as, for example, a mobile station, mobile equipment, mobile unit, mobile device, user device, subscriber station, wireless terminal, tablet, smart phone, IoT device, sensor or NB-IoT device, a watch or other wearable, a head-mounted display (HMD), a vehicle, a drone, a medical device and applications thereof (e.g., remote surgery), an industrial device and applications thereof (e.g., a robot and/or other wireless devices operating in an industrial and/or an automated processing chain context), a consumer electronics device, a device operating on commercial and/or industrial wireless networks, or the like. As one example, apparatus 20 may be implemented in, for instance, a wireless handheld device, a wireless plugin accessory, or the like.
[0083] In some example embodiments, apparatus 20 may include one or more processors, one or more computer-readable storage medium (for example, memory, storage, or the like), one or more radio access components (for example, a modem, a transceiver, or the like), and/or a user interface. In some embodiments, apparatus 20 may be configured to operate using one or more radio access technologies, such as GSM, LTE, LTE-A, NR, 5G, WLAN, WiFi, NB-IoT, Bluetooth, NFC, MulteFire, and/or any other radio access technologies. It should be noted that one of ordinary skill in the art would understand that apparatus 20 may include components or features not shown in Fig. 5b.
[0084] As illustrated in the example of Fig. 5b, apparatus 20 may include or be coupled to a processor 22 for processing information and executing instructions or operations. Processor 22 may be any type of general or specific purpose processor. In fact, processor 22 may include one or more of general-purpose computers, special purpose computers, microprocessors, digital signal processors (DSPs), field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), and processors based on a multi-core processor architecture, as examples. While a single processor 22 is shown in Fig. 5b, multiple processors may be utilized according to other embodiments. For example, it should be understood that, in certain embodiments, apparatus 20 may include two or more processors that may form a multiprocessor system (e.g., in this case processor 22 may represent a multiprocessor) that may support multiprocessing. In certain embodiments, the multiprocessor system may be tightly coupled or loosely coupled (e.g., to form a computer cluster).
[0085] Processor 22 may perform functions associated with the operation of apparatus 20 including, as some examples, precoding of antenna gain/phase parameters, encoding and decoding of individual bits forming a communication message, formatting of information, and overall control of the apparatus 20, including processes related to management of communication resources.
[0086] Apparatus 20 may further include or be coupled to a memory 24 (internal or external), which may be coupled to processor 22, for storing information and instructions that may be executed by processor 22. Memory 24 may be one or more memories and of any type suitable to the local application environment, and may be implemented using any suitable volatile or nonvolatile data storage technology such as a semiconductor-based memory device, a magnetic memory device and system, an optical memory device and system, fixed memory, and/or removable memory. For example, memory 24 can be comprised of any combination of random access memory (RAM), read only memory (ROM), static storage such as a magnetic or optical disk, hard disk drive (HDD), or any other type of non-transitory machine or computer readable media. The instructions stored in memory 24 may include program instructions or computer program code that, when executed by processor 22, enable the apparatus 20 to perform tasks as described herein.
[0087] In an embodiment, apparatus 20 may further include or be coupled to (internal or external) a drive or port that is configured to accept and read an external computer readable storage medium, such as an optical disc, USB drive, flash drive, or any other storage medium. For example, the external computer readable storage medium may store a computer program or software for execution by processor 22 and/or apparatus 20.
[0088] In some embodiments, apparatus 20 may also include or be coupled to one or more antennas 25 for receiving a downlink signal and for transmitting via an uplink from apparatus 20. Apparatus 20 may further include a transceiver 28 configured to transmit and receive information. The transceiver 28 may also include a radio interface (e.g., a modem) coupled to the antenna 25. The radio interface may correspond to a plurality of radio access technologies including one or more of GSM, LTE, LTE-A, 5G, NR, WLAN, NB-IoT, Bluetooth, BT-LE, NFC, RFID, UWB, and the like. The radio interface may include other components, such as filters, converters (for example, digital-to-analog converters and the like), symbol demappers, signal shaping components, an Inverse Fast Fourier Transform (IFFT) module, and the like, to process symbols, such as OFDMA symbols, carried by a downlink or an uplink.

[0089] For instance, transceiver 28 may be configured to modulate information onto a carrier waveform for transmission by the antenna(s) 25 and demodulate information received via the antenna(s) 25 for further processing by other elements of apparatus 20. In other embodiments, transceiver 28 may be capable of transmitting and receiving signals or data directly. Additionally or alternatively, in some embodiments, apparatus 20 may include an input and/or output device (I/O device). In certain embodiments, apparatus 20 may further include a user interface, such as a graphical user interface or touchscreen.
[0090] In an embodiment, memory 24 stores software modules that provide functionality when executed by processor 22. The modules may include, for example, an operating system that provides operating system functionality for apparatus 20. The memory may also store one or more functional modules, such as an application or program, to provide additional functionality for apparatus 20. The components of apparatus 20 may be implemented in hardware, or as any suitable combination of hardware and software. According to an example embodiment, apparatus 20 may optionally be configured to communicate with apparatus 10 via a wireless or wired communications link 70 according to any radio access technology, such as NR.
[0091] According to some embodiments, processor 22 and memory 24 may be included in or may form a part of processing circuitry or control circuitry. In addition, in some embodiments, transceiver 28 may be included in or may form a part of transceiving circuitry. As discussed above, according to some embodiments, apparatus 20 may be a UE, mobile device, mobile station, ME, IoT device and/or NB-IoT device, for example. According to certain embodiments, apparatus 20 may be controlled by memory 24 and processor 22 to perform the functions associated with any of the embodiments described herein, such as some operations illustrated in, or described with respect to, Figs. 1-4. For instance, in one embodiment, apparatus 20 may be controlled by memory 24 and processor 22 to perform the methods of Figs. 3 or 4.
[0092] In some embodiments, an apparatus (e.g., apparatus 10 and/or apparatus 20) may include means for performing a method or any of the variants discussed herein, e.g., a method described with reference to Figs. 3 or 4. Examples of the means may include one or more processors, memory, and/or computer program code for causing the performance of the operation.

[0093] Therefore, certain example embodiments provide several technological improvements, enhancements, and/or advantages over existing technological processes. For example, one benefit of some example embodiments is facilitating the use of feedback to update data analytics. Accordingly, the use of some example embodiments results in improved functioning of communications networks and their nodes and therefore constitutes an improvement at least to the technological field of analytics-based device operations, among others.
[0094] In some example embodiments, the functionality of any of the methods, processes, signaling diagrams, algorithms or flow charts described herein may be implemented by software and/or computer program code or portions of code stored in memory or other computer readable or tangible media, and executed by a processor.
[0095] In some example embodiments, an apparatus may be included or be associated with at least one software application, module, unit or entity configured as arithmetic operation(s), or as a program or portions of it (including an added or updated software routine), executed by at least one operation processor. Programs, also called program products or computer programs, including software routines, applets and macros, may be stored in any apparatus-readable data storage medium and may include program instructions to perform particular tasks.
[0096] A computer program product may include one or more computer-executable components which, when the program is run, are configured to carry out some example embodiments. The one or more computer-executable components may be at least one software code or portions of code. Modifications and configurations used for implementing functionality of an example embodiment may be performed as routine(s), which may be implemented as added or updated software routine(s). In one example, software routine(s) may be downloaded into the apparatus.
[0097] As an example, software or a computer program code or portions of code may be in a source code form, object code form, or in some intermediate form, and it may be stored in some sort of carrier, distribution medium, or computer readable medium, which may be any entity or device capable of carrying the program. Such carriers may include a record medium, computer memory, read-only memory, photoelectrical and/or electrical carrier signal, telecommunications signal, and/or software distribution package, for example. Depending on the processing power needed, the computer program may be executed in a single electronic digital computer or it may be distributed amongst a number of computers. The computer readable medium or computer readable storage medium may be a non-transitory medium.
[0098] In other example embodiments, the functionality may be performed by hardware or circuitry included in an apparatus (e.g., apparatus 10 or apparatus 20), for example through the use of an application specific integrated circuit (ASIC), a programmable gate array (PGA), a field programmable gate array (FPGA), or any other combination of hardware and software. In yet another example embodiment, the functionality may be implemented as a signal, such as a non-tangible means that can be carried by an electromagnetic signal downloaded from the Internet or other network.
[0099] According to an example embodiment, an apparatus, such as a node, device, or a corresponding component, may be configured as circuitry, a computer or a microprocessor, such as single-chip computer element, or as a chipset, which may include at least a memory for providing storage capacity used for arithmetic operation(s) and/or an operation processor for executing the arithmetic operation(s).
[0100] Example embodiments described herein apply equally to both singular and plural implementations, regardless of whether singular or plural language is used in connection with describing certain embodiments. For example, an embodiment that describes operations of a single network node equally applies to embodiments that include multiple instances of the network node, and vice versa.
[0101] One having ordinary skill in the art will readily understand that the example embodiments as discussed above may be practiced with operations in a different order, and/or with hardware elements in configurations which are different than those which are disclosed. Therefore, although some embodiments have been described based upon these example embodiments, it would be apparent to those of skill in the art that certain modifications, variations, and alternative constructions may be made while remaining within the spirit and scope of example embodiments.
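As a purely illustrative sketch, and not part of the claimed embodiments, the feedback loop described above can be expressed in Python. All names (`AnalyticsReport`, `rate_accuracy`), the load metric, the rating formula, and the 0.8 threshold are assumptions introduced only for illustration:

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class AnalyticsReport:
    """Data analytics produced by an analytics function (e.g., an NWDAF or MDAF)."""
    predicted_load: float        # hypothetical predicted metric
    time_window: tuple           # (start, end) the prediction applies to
    area: str                    # geographic-area dimension
    service: str                 # application/service dimension

@dataclass
class Feedback:
    """Feedback from the data consumer: a quality indicator plus the
    information elements (dimensions) that match it to the analytics."""
    accuracy_rating: float       # quality indicator in [0, 1]
    experienced_range: tuple     # range of values experienced at the consumer
    time_window: tuple
    area: str
    service: str

def rate_accuracy(report: AnalyticsReport, observed: list) -> Feedback:
    """Consumer side: compare the prediction with experienced values and
    derive a simple accuracy rating (1.0 = perfect, 0.0 = far off)."""
    err = abs(report.predicted_load - mean(observed))
    rating = max(0.0, 1.0 - err / max(abs(report.predicted_load), 1e-9))
    return Feedback(rating, (min(observed), max(observed)),
                    report.time_window, report.area, report.service)

def assess_feedback(fb: Feedback, threshold: float = 0.8) -> str:
    """Producer side: act on the feedback, e.g., keep or re-train the model."""
    return "keep-model" if fb.accuracy_rating >= threshold else "re-train-model"
```

Because the feedback carries the matching dimensions (time window, area, service), a producer receiving a low rating could re-train or replace a model for exactly that slice of the analytics rather than for the analytics as a whole.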
PARTIAL GLOSSARY
[0102] Application Function AF
[0103] Management Data Analytics Function MDAF
[0104] Management Data Analytics Service MDAS
[0105] Management Service MnS
[0106] Network Data Analytics Function NWDAF
[0107] Network Function NF

Claims

WE CLAIM:
1. An apparatus, comprising: at least one processor; and at least one memory including computer program code, wherein the at least one memory and the computer program code are configured to, with the at least one processor, cause the apparatus at least to: transmit, by an analytics function hosted on the apparatus, data analytics to a data consumer; and receive feedback information related to a quality of the data analytics and one or more information elements associated with matching the feedback information with respect to an application to the data analytics, wherein the feedback information comprises one or more quality indicators.
2. The apparatus according to claim 1, wherein the one or more information elements indicate one or more dimensions of the data analytics.
3. The apparatus according to claim 2, wherein the one or more dimensions comprise at least one of: one or more future time periods, one or more time windows, one or more user equipment or network slices, one or more locations or geographic areas, or one or more applications or services.
4. The apparatus according to one or more of claims 1-3, wherein the quality is based on at least one of: an accuracy or an inaccuracy of the data analytics, a range of the accuracy or the inaccuracy, or a range of values experienced at the data consumer.
5. The apparatus according to one or more of claims 1-4, wherein the at least one memory and the computer program code are configured to, with the at least one processor, further cause the apparatus at least to: assess the feedback information based on one or more of: a quantity of data used for the data analytics in a range of values or with respect to a geographic location indicated in the feedback information, a reinforcement learning rule used for generating the data analytics, an algorithm used for generating the data analytics, or whether a source of data used for generating the data analytics is an abnormal source of the data.
6. The apparatus according to one or more of claims 1-5, wherein the quality is indicated by one or more accuracy ratings included in the one or more quality indicators.
7. The apparatus according to claim 6, wherein the one or more accuracy ratings are based on at least one of: a range of values used for the data analytics, a range of values experienced at the data consumer, a range of an accuracy or an inaccuracy of the data analytics, a time period of the data analytics, a geographic area associated with the data analytics, a mobility pattern used for the data analytics, an application associated with the data analytics, or a service associated with the data analytics.
8. The apparatus according to one or more of claims 1-7, wherein the at least one memory and the computer program code are configured to, with the at least one processor, further cause the apparatus at least to: perform one or more actions comprising at least one of: stopping the analytics function, re-training a model, replacing the model, or using a new reinforcement rule in the model.
9. An apparatus, comprising: at least one processor; and at least one memory including computer program code, wherein the at least one memory and the computer program code are configured to, with the at least one processor, cause the apparatus at least to: receive, by a data consumer hosted on the apparatus, data analytics from an analytics function; and transmit feedback information related to a quality of the data analytics and one or more information elements associated with matching the feedback information with respect to an application to the data analytics, wherein the feedback information comprises one or more quality indicators.
10. The apparatus according to claim 9, wherein the one or more information elements indicate one or more dimensions of the data analytics.
11. The apparatus according to claim 10, wherein the one or more dimensions comprise at least one of: one or more future time periods, one or more time windows, one or more user equipment or network slices, one or more locations or geographic areas, or one or more applications or services.
12. The apparatus according to one or more of claims 9-11, wherein the quality is based on at least one of: an accuracy or an inaccuracy of the data analytics, a range of the accuracy or the inaccuracy, or a range of values experienced at the data consumer.
13. The apparatus according to one or more of claims 9-12, wherein the at least one memory and the computer program code are configured to, with the at least one processor, further cause the apparatus at least to: detect that the data analytics deviate from expected values based on at least one of: whether the data analytics are within a range of the expected values, whether the data analytics exceed, or fail to exceed, the expected values, and an amount that the data analytics exceed or fail to exceed, whether the data analytics exceed, or fail to exceed, the expected values in a certain geographic location and an amount that the data analytics exceed or fail to exceed, or whether the data analytics exceed, or fail to exceed, the expected values at a certain time window and an amount that the data analytics exceed or fail to exceed.
14. The apparatus according to one or more of claims 9-13, wherein the quality is indicated by one or more accuracy ratings included in the one or more quality indicators.
15. The apparatus according to claim 14, wherein the one or more accuracy ratings are based on at least one of: a range of values used for the data analytics, a range of values experienced at the data consumer, a range of an accuracy or an inaccuracy of the data analytics, a time period of the data analytics, a geographic area associated with the data analytics, a mobility pattern used for the data analytics, an application associated with the data analytics, or a service associated with the data analytics.
16. A method, comprising: transmitting, by an analytics function hosted on a device, data analytics to a data consumer; and receiving feedback information related to a quality of the data analytics and one or more information elements associated with matching the feedback information with respect to an application to the data analytics, wherein the feedback information comprises one or more quality indicators.
17. The method according to claim 16, wherein the one or more information elements indicate one or more dimensions of the data analytics.
18. The method according to claim 17, wherein the one or more dimensions comprise at least one of: one or more future time periods, one or more time windows, one or more user equipment or network slices, one or more locations or geographic areas, or one or more applications or services.
19. The method according to one or more of claims 16-18, wherein the quality is based on at least one of: an accuracy or an inaccuracy of the data analytics, a range of the accuracy or the inaccuracy, or a range of values experienced at the data consumer.
20. The method according to one or more of claims 16-19, further comprising: assessing the feedback information based on one or more of: a quantity of data used for the data analytics in a range of values or with respect to a geographic location indicated in the feedback information, a reinforcement learning rule used for generating the data analytics, an algorithm used for generating the data analytics, or whether a source of data used for generating the data analytics is an abnormal source of the data.
21. The method according to one or more of claims 16-20, wherein the quality is indicated by one or more accuracy ratings included in the one or more quality indicators.
22. The method according to claim 21, wherein the one or more accuracy ratings are based on at least one of: a range of values used for the data analytics, a range of values experienced at the data consumer, a range of an accuracy or an inaccuracy of the data analytics, a time period of the data analytics, a geographic area associated with the data analytics, a mobility pattern used for the data analytics, an application associated with the data analytics, or a service associated with the data analytics.
23. The method according to one or more of claims 16-22, further comprising: performing one or more actions comprising at least one of: re-training a model, replacing the model, or using a new reinforcement rule in the model.
24. A method, comprising: receiving, by a data consumer hosted on a device, data analytics from an analytics function; and transmitting feedback information related to a quality of the data analytics and one or more information elements associated with matching the feedback information with respect to an application to the data analytics, wherein the feedback information comprises one or more quality indicators.
25. The method according to claim 24, wherein the one or more information elements indicate one or more dimensions of the data analytics.
26. The method according to claim 25, wherein the one or more dimensions indicate at least one of: one or more future time periods, one or more time windows, one or more user equipment or network slices, one or more locations or geographic areas, or one or more applications or services.
27. The method according to one or more of claims 24-26, wherein the quality is based on at least one of: an accuracy or an inaccuracy of the data analytics, a range of the accuracy or the inaccuracy, or a range of values experienced at the data consumer.
28. The method according to one or more of claims 24-27, further comprising: detecting that the data analytics deviate from expected values based on at least one of: whether the data analytics are within a range of the expected values, whether the data analytics exceed, or fail to exceed, the expected values, and an amount that the data analytics exceed or fail to exceed, whether the data analytics exceed, or fail to exceed, the expected values in a certain geographic location and an amount that the data analytics exceed or fail to exceed, or whether the data analytics exceed, or fail to exceed, the expected values at a certain time window and an amount that the data analytics exceed or fail to exceed.
29. The method according to one or more of claims 24-28, wherein the quality is indicated by one or more accuracy ratings included in the one or more quality indicators.
30. The method according to claim 29, wherein the one or more accuracy ratings are based on at least one of: a range of values used for the data analytics, a range of values experienced at the data consumer, a range of an accuracy or an inaccuracy of the data analytics, a time period of the data analytics, a geographic area associated with the data analytics, a mobility pattern used for the data analytics, an application associated with the data analytics, or a service associated with the data analytics.
31. An apparatus, comprising: means for performing the method according to any of claims 16-30.
32. An apparatus, comprising: circuitry configured to perform the method according to any of claims 16-30.
33. A non-transitory computer readable medium comprising program instructions stored thereon for performing the method according to any of claims 16-30.
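For illustration only, the deviation checks recited in claims 13 and 28 can be sketched as a small function; the symmetric tolerance band and all names are assumptions introduced for this sketch, not claim language:

```python
def detect_deviation(value: float, expected: float, tolerance: float) -> dict:
    """Report whether data analytics are within a range of expected values
    and, if they exceed or fail to exceed that range, by what amount."""
    low, high = expected - tolerance, expected + tolerance
    if low <= value <= high:
        return {"within_range": True, "deviation": 0.0}
    # Positive: the analytics exceed expectations; negative: they fall short.
    amount = value - high if value > high else value - low
    return {"within_range": False, "deviation": amount}
```

The same check could be evaluated per geographic location or per time window by keying the expected values and tolerances on those dimensions, mirroring the location- and time-window-specific checks in the claims.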
PCT/EP2021/086737 2021-01-13 2021-12-20 Apparatus and method for enabling analytics feedback WO2022152515A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202163136934P 2021-01-13 2021-01-13
US63/136,934 2021-01-13

Publications (1)

Publication Number Publication Date
WO2022152515A1 (en)

Family

ID=80112374

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2021/086737 WO2022152515A1 (en) 2021-01-13 2021-12-20 Apparatus and method for enabling analytics feedback

Country Status (1)

Country Link
WO (1) WO2022152515A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024027941A1 (en) * 2022-08-03 2024-02-08 Lenovo (Singapore) Pte. Ltd Improved accuracy of analytics in a wireless communications network
WO2024078402A1 (en) * 2022-10-12 2024-04-18 维沃移动通信有限公司 Model supervision processing method and apparatus, and network side device and readable storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021001085A1 (en) * 2019-06-30 2021-01-07 Telefonaktiebolaget Lm Ericsson (Publ) Estimating quality metric for latency sensitive traffic flows in communication networks
WO2021052556A1 (en) * 2019-09-16 2021-03-25 Huawei Technologies Co., Ltd. A device for applying artificial intelligence in a communication network
WO2021231734A1 (en) * 2020-05-14 2021-11-18 Intel Corporation Techniques for management data analytics (mda) process and service

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"3rd Generation Partnership Project; Technical Specification Group Services and System Aspects; Architecture enhancements for 5G System (5GS) to support network data analytics services (Release 16)", 3GPP STANDARD; TECHNICAL SPECIFICATION; 3GPP TS 23.288, 3RD GENERATION PARTNERSHIP PROJECT (3GPP), MOBILE COMPETENCE CENTRE ; 650, ROUTE DES LUCIOLES ; F-06921 SOPHIA-ANTIPOLIS CEDEX ; FRANCE, no. V1.0.0, 3 June 2019 (2019-06-03), pages 1 - 52, XP051753923 *
"3rd Generation Partnership Project; Technical Specification Group Services and System Aspects; Study of Enablers for Network Automation for 5G (Release 16)", 3GPP STANDARD; TECHNICAL REPORT; 3GPP TR 23.791, 3RD GENERATION PARTNERSHIP PROJECT (3GPP), MOBILE COMPETENCE CENTRE ; 650, ROUTE DES LUCIOLES ; F-06921 SOPHIA-ANTIPOLIS CEDEX ; FRANCE, vol. SA WG2, no. V16.2.0, 11 June 2019 (2019-06-11), pages 1 - 124, XP051753968 *



Legal Events

121 Ep: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 21843897; Country of ref document: EP; Kind code of ref document: A1)

NENP: Non-entry into the national phase (Ref country code: DE)

122 Ep: PCT application non-entry in European phase (Ref document number: 21843897; Country of ref document: EP; Kind code of ref document: A1)