WO2024049338A1 - Causal encoding of channel state information - Google Patents
Causal encoding of channel state information
- Publication number
- WO2024049338A1 (PCT/SE2023/050763)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- csi
- data
- computing device
- model
- measurement
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L25/00—Baseband systems
- H04L25/02—Details ; arrangements for supplying electrical power along data transmission lines
- H04L25/0202—Channel estimation
- H04L25/024—Channel estimation channel estimation algorithms
- H04L25/0254—Channel estimation channel estimation algorithms using neural network algorithms
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/042—Knowledge-based neural networks; Logical representations of neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
- G06N3/0455—Auto-encoder networks; Encoder-decoder networks
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04B—TRANSMISSION
- H04B17/00—Monitoring; Testing
- H04B17/30—Monitoring; Testing of propagation channels
- H04B17/309—Measuring or estimating channel quality parameters
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04B—TRANSMISSION
- H04B17/00—Monitoring; Testing
- H04B17/30—Monitoring; Testing of propagation channels
- H04B17/391—Modelling the propagation channel
- H04B17/3913—Predictive models, e.g. based on neural network models
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L25/00—Baseband systems
- H04L25/02—Details ; arrangements for supplying electrical power along data transmission lines
- H04L25/03—Shaping networks in transmitter or receiver, e.g. adaptive shaping networks
- H04L25/03006—Arrangements for removing intersymbol interference
- H04L25/03165—Arrangements for removing intersymbol interference using neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/047—Probabilistic or stochastic networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/084—Backpropagation, e.g. using gradient descent
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/088—Non-supervised learning, e.g. competitive learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/098—Distributed learning, e.g. federated learning
-
- H—ELECTRICITY
- H03—ELECTRONIC CIRCUITRY
- H03M—CODING; DECODING; CODE CONVERSION IN GENERAL
- H03M7/00—Conversion of a code where information is represented by a given sequence or number of digits to a code where the same, similar or subset of information is represented by a different sequence or number of digits
- H03M7/30—Compression; Expansion; Suppression of unnecessary data, e.g. redundancy reduction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04B—TRANSMISSION
- H04B17/00—Monitoring; Testing
- H04B17/30—Monitoring; Testing of propagation channels
- H04B17/309—Measuring or estimating channel quality parameters
- H04B17/345—Interference values
Definitions
- the present disclosure relates generally to computer-implemented methods performed by a first computing device for causal encoding of channel state information (CSI), and related methods and apparatuses.
- CSI channel state information
- CSI can be sent between a base station (BS) and a user equipment (UE).
- BS base station
- UE user equipment
- potential gains may be realized in a massive multi-input-multi-output (MIMO) system; handovers with other antenna types; in a frequency division duplexing (FDD) system; a point-to-point system for obtaining downlink CSI from uplink; distributed MIMO (D-MIMO) for obtaining CSI from a subset of access points for use with other remaining access points, etc.
- MIMO massive multi-input-multi-output
- FDD frequency division duplexing
- D-MIMO distributed MIMO
- the overhead of bidirectional signaling for channel state estimation may sometimes offset the benefits of sending CSI.
- Some approaches may include compression of CSI feedback using autoencoders.
- a computer-implemented method performed by a first computing device for causal encoding of CSI includes obtaining an indication of incorrect reporting of CSI; retrieving parameter data including a plurality of specific parameters and a plurality of general parameters; obtaining a measurement of CSI data; and activating a machine learning (ML) model for CSI reporting.
- the ML model includes causal relationships among the parameter data and the measurement of CSI data.
- the method further includes encoding the measurement of CSI data to obtain encoded CSI measurement data; applying the ML model to the parameter data and the encoded CSI measurement data to obtain an adapted CSI measurement data; and transmitting the adapted CSI measurement data to a second computing device.
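The first device's encode-adapt-transmit flow described above can be sketched as follows; all class, function, and parameter names here are hypothetical illustrations, not identifiers from the disclosure:

```python
from dataclasses import dataclass, field

@dataclass
class ParameterData:
    # Hypothetical split into vendor-specific and general/universal parameters.
    specific: dict = field(default_factory=dict)
    general: dict = field(default_factory=dict)

def causal_csi_report(incorrect_csi_indicated, params, csi_measurement,
                      encoder, ml_model, transmit):
    """Sketch of the first computing device's reporting flow:
    activate the causal ML model only when incorrect reporting is indicated,
    encode the CSI measurement, adapt it via the ML model, and transmit."""
    if not incorrect_csi_indicated:
        return None                         # legacy path; ML model stays inactive
    encoded = encoder(csi_measurement)      # encode the measurement of CSI data
    adapted = ml_model(params, encoded)     # apply ML model to params + encoding
    transmit(adapted)                       # send adapted CSI to second device
    return adapted
```

The encoder, ML model, and transmit hooks are injected so the same flow works whether the model runs on the UE or the gNB side.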
- a computer-implemented method performed by a second computing device for decoding causally encoded CSI includes receiving an adapted CSI data from a first computing device, where the adapted CSI data was output from a ML model for CSI reporting including causal relationships among parameter data comprising a plurality of specific parameters and a plurality of general parameters and a measurement of CSI data.
- the method further includes decoding the adapted CSI measurement data to obtain decoded CSI; and using the decoded CSI for a resource allocation.
- a computer-implemented method performed by a second computing device for causal encoding and decoding of CSI includes detecting an incorrect reporting of CSI for a first computing device; retrieving parameter data comprising a plurality of specific parameters and a plurality of general parameters; and obtaining a measurement of CSI data from the first computing device.
- the method further includes activating a ML model for CSI reporting.
- the ML model includes causal relationships among the parameter data and the measured CSI data.
- the method further includes encoding the measurement of CSI data to obtain encoded CSI measurement data; and applying the ML model to the parameter data and the encoded CSI measurement data to obtain an adapted CSI measurement data.
- a first computing device is provided.
- the first computing device is configured for causal encoding of CSI.
- the first computing device includes processing circuitry; and at least one memory coupled with the processing circuitry.
- the memory includes instructions that, when executed by the processing circuitry, cause the first computing device to perform operations.
- the operations include to obtain an indication of incorrect reporting of CSI; retrieve parameter data including a plurality of specific parameters and a plurality of general parameters; obtain a measurement of CSI data; and activate a ML model for CSI reporting.
- the ML model includes causal relationships among the parameter data and the measurement of CSI data.
- the operations further include to encode the measurement of CSI data to obtain encoded CSI measurement data; apply the ML model to the parameter data and the encoded CSI measurement data to obtain an adapted CSI measurement data; and transmit the adapted CSI measurement data to a second computing device.
- a first computing device is provided that is configured for causal encoding of CSI.
- the first computing device is adapted to perform operations.
- the operations include to obtain an indication of incorrect reporting of CSI; retrieve parameter data including a plurality of specific parameters and a plurality of general parameters; obtain a measurement of CSI data; and activate a ML model for CSI reporting.
- the ML model includes causal relationships among the parameter data and the measurement of CSI data.
- the operations further include to encode the measurement of CSI data to obtain encoded CSI measurement data; apply the ML model to the parameter data and the encoded CSI measurement data to obtain an adapted CSI measurement data; and transmit the adapted CSI measurement data to a second computing device.
- a computer program comprising program code is provided to be executed by processing circuitry of a first computing device configured for causal encoding of CSI. Execution of the program code causes the first computing device to perform operations. The operations include to obtain an indication of incorrect reporting of CSI; retrieve parameter data including a plurality of specific parameters and a plurality of general parameters; obtain a measurement of CSI data; and activate a ML model for CSI reporting.
- the ML model includes causal relationships among the parameter data and the measurement of CSI data.
- the operations further include to encode the measurement of CSI data to obtain encoded CSI measurement data; apply the ML model to the parameter data and the encoded CSI measurement data to obtain an adapted CSI measurement data; and transmit the adapted CSI measurement data to a second computing device.
- a computer program product comprising a non-transitory storage medium including program code to be executed by processing circuitry of a first computing device configured for causal encoding of CSI. Execution of the program code causes the first computing device to perform operations. The operations include to obtain an indication of incorrect reporting of CSI; retrieve parameter data including a plurality of specific parameters and a plurality of general parameters; obtain a measurement of CSI data; and activate a ML model for CSI reporting.
- the ML model includes causal relationships among the parameter data and the measurement of CSI data.
- the operations further include to encode the measurement of CSI data to obtain encoded CSI measurement data; apply the ML model to the parameter data and the encoded CSI measurement data to obtain an adapted CSI measurement data; and transmit the adapted CSI measurement data to a second computing device.
- a second computing device is provided.
- the second computing device is configured for decoding causally encoded CSI.
- the second computing device includes processing circuitry; and at least one memory coupled with the processing circuitry.
- the memory includes instructions that, when executed by the processing circuitry, cause the second computing device to perform operations.
- the operations include to receive an adapted CSI data from a first computing device, where the adapted CSI data was output from a ML model for CSI reporting including causal relationships among parameter data comprising a plurality of specific parameters and a plurality of general parameters and a measurement of CSI data.
- the operations further include to decode the adapted CSI measurement data to obtain decoded CSI; and use the decoded CSI for a resource allocation.
- a second computing device is provided that is configured for decoding causally encoded CSI.
- the second computing device is adapted to perform operations.
- the operations include to receive an adapted CSI data from a first computing device, where the adapted CSI data was output from a ML model for CSI reporting including causal relationships among parameter data comprising a plurality of specific parameters and a plurality of general parameters and a measurement of CSI data.
- the operations further include to decode the adapted CSI measurement data to obtain decoded CSI; and use the decoded CSI for a resource allocation.
- a computer program comprising program code is provided to be executed by processing circuitry of a second computing device configured for decoding causally encoded CSI. Execution of the program code causes the second computing device to perform operations.
- the operations include to receive an adapted CSI data from a first computing device, where the adapted CSI data was output from a ML model for CSI reporting including causal relationships among parameter data comprising a plurality of specific parameters and a plurality of general parameters and a measurement of CSI data.
- the operations further include to decode the adapted CSI measurement data to obtain decoded CSI; and use the decoded CSI for a resource allocation.
- a computer program product including a non-transitory storage medium including program code to be executed by processing circuitry of a second computing device configured for decoding causally encoded CSI.
- Execution of the program code causes the second computing device to perform operations.
- the operations include to receive an adapted CSI data from a first computing device, where the adapted CSI data was output from a ML model for CSI reporting including causal relationships among parameter data comprising a plurality of specific parameters and a plurality of general parameters and a measurement of CSI data.
- the operations further include to decode the adapted CSI measurement data to obtain decoded CSI; and use the decoded CSI for a resource allocation.
- a second computing device is provided.
- the second computing device is configured for causal encoding and decoding of CSI.
- the second computing device includes processing circuitry; and at least one memory coupled with the processing circuitry.
- the memory includes instructions that, when executed by the processing circuitry, cause the second computing device to perform operations.
- the operations include to detect an incorrect reporting of CSI for a first computing device; retrieve parameter data comprising a plurality of specific parameters and a plurality of general parameters; and obtain a measurement of CSI data from the first computing device.
- the operations further include to activate a ML model for CSI reporting.
- the ML model includes causal relationships among the parameter data and the measured CSI data.
- the operations further include to encode the measurement of CSI data to obtain encoded CSI measurement data; and apply the ML model to the parameter data and the encoded CSI measurement data to obtain an adapted CSI measurement data.
- a second computing device is provided that is configured for causal encoding and decoding of CSI.
- the second computing device is adapted to perform operations.
- the operations include to detect an incorrect reporting of CSI for a first computing device; retrieve parameter data comprising a plurality of specific parameters and a plurality of general parameters; and obtain a measurement of CSI data from the first computing device.
- the operations further include to activate a ML model for CSI reporting.
- the ML model includes causal relationships among the parameter data and the measured CSI data.
- the operations further include to encode the measurement of CSI data to obtain encoded CSI measurement data; and apply the ML model to the parameter data and the encoded CSI measurement data to obtain an adapted CSI measurement data.
- a computer program comprising program code is provided to be executed by processing circuitry of a second computing device configured for causal encoding and decoding of CSI.
- Execution of the program code causes the second computing device to perform operations.
- the operations include to detect an incorrect reporting of CSI for a first computing device; retrieve parameter data comprising a plurality of specific parameters and a plurality of general parameters; and obtain a measurement of CSI data from the first computing device.
- the operations further include to activate a ML model for CSI reporting.
- the ML model includes causal relationships among the parameter data and the measured CSI data.
- the operations further include to encode the measurement of CSI data to obtain encoded CSI measurement data; and apply the ML model to the parameter data and the encoded CSI measurement data to obtain an adapted CSI measurement data.
- a computer program product including a non-transitory storage medium including program code to be executed by processing circuitry of a second computing device configured for causal encoding and decoding of CSI. Execution of the program code causes the second computing device to perform operations.
- the operations include to detect an incorrect reporting of CSI for a first computing device; retrieve parameter data comprising a plurality of specific parameters and a plurality of general parameters; and obtain a measurement of CSI data from the first computing device.
- the operations further include to activate a ML model for CSI reporting.
- the ML model includes causal relationships among the parameter data and the measured CSI data.
- the operations further include to encode the measurement of CSI data to obtain encoded CSI measurement data; and apply the ML model to the parameter data and the encoded CSI measurement data to obtain an adapted CSI measurement data.
- Certain embodiments may provide one or more of the following technical advantages. Based on the inclusion of causal aware CSI encoding, domain adaptation and/or power savings may be achieved.
- Figure 1 is a schematic diagram illustrating an overview of operations in accordance with some embodiments of the present disclosure
- Figure 2 is a signalling diagram illustrating operations performed by a first computing device (e.g., a UE) for causal encoding of CSI in accordance with the present disclosure
- Figure 3 is a signalling diagram illustrating operations performed by a second computing device (e.g., a gNB) for causal encoding and/or decoding of CSI in accordance with some embodiments of the present disclosure
- Figure 4 is a block diagram illustrating a causally aware CSI encoding system in accordance with some embodiments of the present disclosure
- Figure 5 is a block diagram of an encoder that captures causal relationships in accordance with some embodiments of the present disclosure
- Figure 6 shows images of sample CSI data from an indoor and an outdoor setting
- Figures 7 and 8 are train and test loss plots, respectively, for a simulation of an example embodiment of the present disclosure
- Figures 9A-D are images of sample reconstructions from a variational autoencoder (VAE) of the simulation of the example embodiment
- Figures 10A-F are images of results of VAE reconstruction where causal parameters were changed in the simulation of the example embodiment
- Figure 11 shows an original CSI image and a CSI image reconstructed in the simulation of the example embodiment
- Figure 12 shows an original CSI image and a CSI image reconstructed with intervention on scale in the simulation of the example embodiment
- Figure 13 shows an original CSI image and a CSI counterfactual image reconstructed in the simulation of the example embodiment
- Figure 14 is a flow chart of operations of a first computing device in accordance with some embodiments of the present disclosure.
- Figures 15 and 16 are flow charts of operations of a second computing device in accordance with some embodiments of the present disclosure
- Figure 17 is a block diagram of a network in accordance with some embodiments
- Figure 18 is a block diagram of a computing device in accordance with some embodiments of the present disclosure.
- Figure 19 is a block diagram of a computing device in accordance with some embodiments of the present disclosure.
- Figure 20 is a block diagram of a virtualization environment in accordance with some embodiments of the present disclosure.
- first computing device refers to equipment capable, configured, arranged, and/or operable for causal encoding of CSI.
- first computing devices include, but are not limited to, a computer, a decentralized edge device, a decentralized edge server, a distributed collection of access points (e.g., base stations) that cooperate via a central processing unit (CPU), and a UE.
- CPU central processing unit
- the UE may include, e.g., a smart phone, mobile phone, cell phone, voice over IP (VoIP) phone, wireless local loop phone, desktop computer, personal digital assistant (PDA), wireless cameras, gaming console or device, music storage device, playback appliance, wearable terminal device, wireless endpoint, mobile station, tablet, laptop, laptop-embedded equipment (LEE), laptop-mounted equipment (LME), smart device, wireless customer-premise equipment (CPE), vehicle-mounted or vehicle embedded/integrated wireless device, etc.
- Other examples include any UE identified by the 3rd Generation Partnership Project (3GPP), including a narrow band internet of things (NB-loT) UE, a machine type communication (MTC) UE, and/or an enhanced MTC (eMTC) UE.
- 3GPP 3rd Generation Partnership Project
- NB-loT narrow band internet of things
- MTC machine type communication
- eMTC enhanced MTC
- the first computing device may include a distributed collection of access points (APs) that cooperate via a CPU, and channel estimation can be done at an AP or centrally at the CPU where channel estimates from all APs in the distributed collection can be combined for precoding or receive combining.
- APs access points
- ML machine learning
- second computing device refers to equipment capable, configured, arranged, and/or operable for causal encoding and/or decoding of CSI.
- second computing devices include, but are not limited to, a computer, a decentralized edge device, a decentralized edge server, a cloud node, a cloud server, and centralized or distributed APs (e.g., base stations) in a radio access network (RAN) (e.g., gNodeBs (gNBs), evolved NodeBs (eNBs), core network nodes, access points (APs) (e.g., radio access points), etc.).
- RAN radio access network
- gNBs gNodeBs
- eNBs evolved NodeBs
- APs access points
- parameter data refers to parameter data provided by a vendor (e.g., a UE vendor, a gNB vendor, etc.) and includes, without limitation, vendor-specific parameter data and general/universal parameter data as discussed further herein.
- the operations further include creation of a ML model of underlying constraints that generate observational data including CSI data.
- the ML model may be a structural causal model (SCM), where the ML model captures cause-and-effect relationships between observational or endogenous variables from the CSI data and unobserved or exogenous variables from the parameter data (e.g., a UE speed, frequency, etc., discussed further herein).
- SCM structural causal model
- a causal graph is represented by a directed acyclic graph(s) (DAG(s)), where nodes (e.g., vertices) correspond to the endogenous variables, and directed edges account for a causal parent-child relationship.
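A causal graph of this kind can be represented as a plain parent-to-children adjacency mapping; the sketch below (illustrative, not from the disclosure) verifies that the graph is a valid DAG and recovers a node's causal parents:

```python
def is_acyclic(edges):
    """edges: dict mapping each node to a list of its children
    (directed parent -> child edges). Returns True iff the graph is a DAG."""
    nodes = set(edges) | {c for cs in edges.values() for c in cs}
    state = {n: 0 for n in nodes}   # 0 = unvisited, 1 = in progress, 2 = done
    def dfs(n):
        if state[n] == 1:
            return False            # back edge found -> cycle
        if state[n] == 2:
            return True
        state[n] = 1
        ok = all(dfs(c) for c in edges.get(n, []))
        if ok:
            state[n] = 2
        return ok
    return all(dfs(n) for n in nodes)

def parents(edges, node):
    """Causal parents of `node` under the parent -> child edge convention."""
    return sorted(p for p, cs in edges.items() if node in cs)
```

For example, a toy graph where UE speed and carrier frequency are causal parents of the channel H passes the DAG check, while any feedback loop fails it.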
- an endogenous variable can be an input vector of a channel matrix H, and exogenous variables can include the parameter data (e.g., power thresholds, channel quality indicator, etc., discussed further herein).
- Certain embodiments may provide one or more of the following technical advantages. Inclusion of causally aware encoding may circumvent a need for retraining a ML model for a data distribution or environmental change. Moreover, only one ML model may need to be stored on the network side (e.g., at a BS) and a device side (e.g., at a UE). As a consequence, power consumption and bandwidth may be reduced based on a decrease of signaling overhead.
- parameter data may include thresholds and/or different environmental operational conditions (e.g., UE mobility such as speed and direction of movement in relation to a gNB, line of sight or non-line of sight propagation, channel richness, channel correlations, power usage of the UE and/or the gNB, etc.).
- a ML model performs operations to model underlying constraints that generate observational data including CSI.
- the ML model may be a structural causal model (SCM).
- SCM structural causal model
- the ML model captures cause-and-effect relationships (e.g., dependencies of CSI on UE speed, CSI dependencies on frequency, etc.) and can also address hidden confounders.
- domain adaptation can be obtained by matching a causal layer (e.g., in a neural network based SCM).
- the operations may be scalable.
- Various embodiments of the present disclosure are directed to operations for a single-cell downlink massive MIMO system with N_t >> 1 transmit antennas at a second computing device (e.g., a base station) and a single receiver antenna on the first computing device side (e.g., on a UE side).
- An orthogonal frequency division multiplexing (OFDM) system can include N_c subcarriers and, thus, a received signal at the n-th subcarrier can be given by: y_n = h_n^H v_n x_n + z_n, where h_n, v_n, x_n, and z_n denote a channel vector, precoding vector, data-bearing symbol, and additive noise of the n-th subcarrier, respectively.
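The per-subcarrier model can be exercised numerically; a minimal sketch using Python complex arithmetic, with the vector length and the matched-filter precoder chosen purely for illustration:

```python
def received_symbol(h, v, x, z):
    """y_n = h_n^H v_n x_n + z_n for one OFDM subcarrier.
    h, v: length-N_t complex vectors (channel, precoder);
    x: data-bearing symbol; z: additive noise sample."""
    hH_v = sum(hi.conjugate() * vi for hi, vi in zip(h, v))  # inner product h^H v
    return hH_v * x + z
```

With a matched-filter precoder v = h / ||h||, the effective channel h^H v collapses to the real gain ||h||, which is a convenient sanity check on the sign and conjugation conventions.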
- An autoencoder may take a channel matrix H as an input, and reconstruct the channel matrix H.
- An autoencoder may be trained in an unsupervised learning setting, where a set of parameters is updated by an adaptive moment estimation (ADAM) algorithm.
- a loss function may be mean square error (MSE), calculated as an average over all the training samples.
- MSE mean square error
- An ADAM optimizer and MSE are examples and other optimization and error calculation methods can be used.
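As a concrete reference point, the MSE loss averaged over all training samples and entries can be computed as follows (a plain-Python sketch over flattened channel matrices; a real training loop would use a framework's built-in loss and an Adam-style optimizer):

```python
def mse_loss(batch_true, batch_pred):
    """Mean square error between original and reconstructed CSI,
    averaged over all training samples and all matrix entries.
    Each sample is a flattened (real-valued) channel matrix H."""
    total, count = 0.0, 0
    for H, H_hat in zip(batch_true, batch_pred):
        for a, b in zip(H, H_hat):
            total += (a - b) ** 2
            count += 1
    return total / count
```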
- Various embodiments of the present disclosure are directed to operations performed by a first computing device and/or a second computing device that augment an autoencoder based on use of a ML model.
- the ML model may model observational or endogenous variables, such as CSI H, as a function of exogenous (unobserved) variables from parameter data, using structural causal equations or functional causal models (FCMs).
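A functional causal model of this form, together with a do()-style intervention, might be sketched as follows; the functional form, variable names, and coefficient are purely illustrative assumptions, not taken from the disclosure:

```python
import math

def fcm_channel_gain(u_speed, u_freq, noise):
    """Toy functional causal model (FCM): an endogenous channel statistic
    written as a deterministic function of exogenous causes (UE speed,
    carrier frequency) plus independent noise. Purely illustrative."""
    doppler_penalty = 0.01 * u_speed * u_freq   # faster UE / higher freq -> more fading
    return math.exp(-doppler_penalty) + noise

def intervene(fcm, **fixed):
    """Sketch of the do(-) operator: clamp a cause to a fixed value,
    regardless of the value it would naturally take."""
    def intervened(u_speed, u_freq, noise):
        return fcm(fixed.get("u_speed", u_speed),
                   fixed.get("u_freq", u_freq),
                   noise)
    return intervened
```

Intervening on a cause (e.g., do(u_speed = 0)) is what lets a causally aware encoder reason about how CSI would look under changed operating conditions without retraining.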
- FIG. 1 is a schematic diagram illustrating an overview of operations in accordance with some embodiments of the present disclosure.
- parameter data 102 and CSI data H from a channel data source 104 may be included in a channel data service 100.
- the CSI data H 104 is provided to encoder 106 and loss function 116.
- ML model 110 can be, e.g., a SCM as illustrated.
- the ML model 110 captures causal relationships among the parameter data and the measured CSI data.
- ML model 110, therefore, can be vendor agnostic based on inclusion of the parameter data, which can account for changes in the autoencoded output of encoder 106 that are specific to a parameter data change.
- ML model 110 can be trained using the same loss function 116 as decoder 114.
- encoder 106 uses CSI channel data H, e.g., available from channel data service 100.
- the CSI channel data H can include features indicating an estimated channel quality.
- the CSI channel data H can be a three-dimensional tensor where dimensions correspond to a second computing device's (e.g., a base station such as a gNB) transmission antenna ports, a first computing device's (e.g., a UE) receive antenna ports, and frequency (either divided in subcarriers or subbands).
- Encoder 106 outputs a temporary latent space representation, Y_temp, which is transformed by ML model 110 to a latent space that is input to decoder 114.
- ML model 110 uses parameter data 102 to transform Y_temp to Y.
- Loss function 116 is provided to decoder backpropagation 118 to propagate the loss function to individual nodes of the autoencoder, calculate each weight's contribution to the loss function, and adjust the weights accordingly using gradient descent.
- the gradients are provided to encoder backpropagation 120 to propagate the gradients back through the layers of the autoencoder, and update 122 the encoder weights and biases.
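- The weight adjustment described for the decoder and encoder backpropagation can be sketched as a single gradient-descent update; the learning rate and weight values are illustrative assumptions:

```python
def gradient_step(weights, grads, lr=0.01):
    # Adjust each weight in proportion to its contribution to the loss
    # (plain gradient descent), as described for backpropagation above.
    return [w - lr * g for w, g in zip(weights, grads)]
```

In practice the gradients are computed layer by layer, decoder first and then encoder, before this update is applied to every weight and bias.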
- the ML model 110 can be in either the first computing device (e.g., a UE) or the second computing device (e.g., a gNB).
- the first or second computing devices can also include encoder 106, and the second computing device can include decoder 114.
- ML model 110 is used when decoder 114 decodes the CSI measurement data incorrectly (e.g., consistently decodes the CSI measurement data incorrectly).
- Incorrect decoding of channel quality can be observed by the first computing device (e.g., UE) and/or second computing device (e.g., gNB) indirectly, for example via an unusually high packet error rate, low throughput, handover of a UE to a target cell despite reporting of good channel quality, dropped calls, failure to establish a radio access bearer, etc.
- such observations are triggers of the method.
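- A trigger of the kind described can be sketched as a simple heuristic; all threshold values and metric names here are illustrative assumptions, not values from the disclosure:

```python
def incorrect_csi_suspected(packet_error_rate, throughput_mbps, reported_cqi,
                            per_max=0.1, thr_min=5.0, cqi_good=10):
    # Indirect symptoms from the text: poor observed performance while the
    # reported channel quality stays good. All thresholds are assumptions.
    poor_performance = packet_error_rate > per_max or throughput_mbps < thr_min
    return poor_performance and reported_cqi >= cqi_good
```

The combination matters: poor performance alone may reflect a genuinely bad channel, whereas poor performance despite good reported CSI suggests the CSI is being decoded incorrectly.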
- Figure 2 is a signalling diagram illustrating operations performed by a first computing device 200 (e.g., a UE) for causal encoding of CSI.
- ML model 110 is part of the first computing device 200.
- Figure 3 is a signalling diagram illustrating operations performed by a second computing device 202 (e.g., a gNB) for causal encoding and/or decoding of CSI.
- ML model 110 is part of the second computing device 202.
- first and second computing devices 200, 202 maintain one encoder and decoder, respectively, and ML model 110 transforms the latent space produced by the encoder to the expected input of the decoder. It is noted that when the first computing device 200 is a UE, and the UE changes a cell, the process resets.
- In operation 204, UE 200 obtains an indication that CSI is being reported incorrectly.
- UE 200 detects 206 incorrect reporting of CSI.
- UE 200 receives 208 a reporting of incorrect CSI from gNB 202.
- gNB 202 requests that UE 200 retrieve parameter data.
- UE 200, in operation 212, obtains a measurement of CSI data and augments the measurement with the parameter data.
- UE 200 activates ML model 110 for CSI reporting.
- gNB 202 in operation 216, requests that UE 200 obtain a CSI measurement and report the CSI measurement to gNB 202.
- UE 200 obtains the measurement and, in operation 220, uses encoder 106 to compress the obtained measurement to a latent space representation, Y_temp.
- UE 200, in operation 222, uses ML model 110 to transform Y_temp to Y, and, in operation 224, transmits a response including the CSI measurement Y to gNB 202.
- gNB 202 decodes the CSI information given Y.
- gNB 202 uses the CSI information for a resource allocation (e.g., a physical resource block (PRB) allocation as illustrated in the example embodiment of Figure 2).
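- The UE-side reporting flow of Figure 2 (encode, transform, transmit) can be sketched as a pipeline; the encoder and ML-model stand-ins below are toy linear maps introduced purely for illustration:

```python
def encode(csi, weight=0.5):
    # Stand-in for encoder 106: compress the measurement to Y_temp
    return [weight * x for x in csi]

def ml_transform(y_temp, params):
    # Stand-in for ML model 110: adapt Y_temp to Y using parameter data
    gain = params.get("power", 1.0)
    return [gain * y for y in y_temp]

def report_csi(csi, params):
    y_temp = encode(csi)               # operation 220
    y = ml_transform(y_temp, params)   # operation 222
    return y                           # transmitted in operation 224
```

In the Figure 3 variant the same transform runs at the gNB instead, with the UE transmitting Y_temp directly.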
- gNB 202 requests that UE 200 obtain a CSI measurement and report the CSI measurement to gNB 202.
- UE 200 obtains the CSI measurement and compresses the CSI measurement data with encoder 106 to a latent space representation, Y_temp.
- UE 200 transmits a response including the CSI measurement Y_temp to gNB 202.
- gNB 202 monitors behavior of UE 200.
- gNB 202 detects incorrect reporting of CSI for UE 200.
- gNB 202 in operation 310, retrieves parameter data.
- UE 200 receives a request to augment UE measurements of channel data with the general and specific parameters.
- gNB 202 activates ML model 110 for CSI reporting.
- gNB 202 uses ML model 110 to transform Y_temp to Y and, in operation 318, decodes the CSI information given Y.
- gNB 202 uses the CSI information for a resource allocation (e.g., a physical resource block (PRB) allocation as illustrated in the example embodiment of Figure 3).
- Figure 4 is a block diagram illustrating a causally aware CSI encoding system in accordance with some embodiments of the present disclosure.
- specific parameter data 400 is quantized and encoded as a one-hot encoded tensor.
- General parameter data 402 is quantized and encoded as a one-hot encoded tensor.
- Observation data 404 including CSI data may be collected from multiple vendors.
- Specific parameter data 400 is input to a ML model illustrated as a graph neural network 408 based structural causal model.
- Specific parameter data 400 and general parameter data 402, as respective tensors, are input to GNN 408.
- CSI data 404 is auto encoded by encoder 406, and the output of encoder 406 is input to GNN 408.
- A dynamically changed tensor 410 at run time is also input to GNN 408. Responsive to these inputs, GNN 408 produces an adapted output.
- General parameter data of various embodiments of the present disclosure includes, without limitation:
- Speed of a first computing device (e.g., a UE).
- the speed can be determined by a network-based solution (e.g., a fifth generation (5G) positioning type of approach that uses triangulation and/or beamforming); and/or speed can be determined by the first computing device itself (e.g., using over the top technologies such as satellite positioning information (e.g., a global positioning system (GPS)).
- Line of Sight or non-line-of-sight (NLOS) propagation, which can be a Boolean, for example, indicating whether a wave travels in a direct path or there are diffractions, refractions, and/or reflections due to obstacles.
- Spatial channel correlation (e.g., a measure of independence between adjacent antennas in a MIMO antenna array). Spatial channel correlation can be, e.g., measured by an antenna correlation coefficient (ACC) that indicates interdependence. For example, the lower the ACC, the more independent the antennas are, generally leading to higher bandwidth.
- Far-field radiation pattern and S-parameter characterization are examples of processes that can be used to measure ACC.
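- The antenna correlation coefficient discussed above can be sketched as a complex correlation between two antennas' channel samples; this definition is one common illustrative choice, not the specific measurement process named in the text:

```python
def antenna_correlation(samples_a, samples_b):
    # Complex correlation coefficient between two antennas' channel samples;
    # |rho| near 0 indicates nearly independent antennas.
    n = len(samples_a)
    mean_a = sum(samples_a) / n
    mean_b = sum(samples_b) / n
    cov = sum((a - mean_a) * (b - mean_b).conjugate()
              for a, b in zip(samples_a, samples_b))
    var_a = sum(abs(a - mean_a) ** 2 for a in samples_a)
    var_b = sum(abs(b - mean_b) ** 2 for b in samples_b)
    return cov / (var_a * var_b) ** 0.5

rho = antenna_correlation([1 + 1j, 2 + 0j, 3 - 1j], [1 + 1j, 2 + 0j, 3 - 1j])
```

Two identical sample streams yield |rho| = 1, the fully correlated extreme; lower magnitudes generally correspond to higher MIMO capacity.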
- Sub-carrier frequency (e.g., a frequency range of each of the subcarriers in a MIMO antenna).
- Ambient environmental conditions that may affect signal propagation such as precipitation, temperature, etc.
- Instantaneous available bandwidth (IBW).
- The general parameters can be one-hot encoded where each of the general parameters is quantized to n levels. In an example embodiment, n is 3, and the result is input as an n×8 bit tensor to the auto-encoding module. The general parameters and specific parameters discussed below can be input as a tensor when building a causality graph.
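- The quantize-then-one-hot step above can be sketched as follows; the parameter bounds and dictionary layout are illustrative assumptions (in the example embodiment, eight general parameters at n = 3 levels would produce the 3×8 tensor):

```python
def one_hot_quantize(value, lo, hi, n_levels=3):
    # Quantize a parameter value in [lo, hi] to n levels, then one-hot encode
    span = (hi - lo) / n_levels
    level = min(int((value - lo) / span), n_levels - 1)
    return [1 if i == level else 0 for i in range(n_levels)]

def encode_general_params(params, bounds, n_levels=3):
    # One column per parameter -> an n_levels x len(params) tensor
    return [one_hot_quantize(v, *bounds[k], n_levels) for k, v in params.items()]
```

For example, a UE speed near the top of its range maps to the highest of the three levels, so only the last one-hot position is set.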
- Specific parameter data of various embodiments of the present disclosure includes, without limitation:
- Perceived interference, such as a measure of signal to noise ratio or signal quality using metrics such as reference signal received power (RSRP), reference signal received quality (RSRQ), etc.
- First computing device (e.g., UE) thresholds for a particular vendor (e.g., an acceptable level of loss in channel quality information (CQI) reconstruction at the network vendor).
- Second computing device (e.g., gNB) thresholds for a particular vendor.
- Power available at the first computing device (e.g., UE), which can include a plurality of metrics (e.g., battery state of charge, rate of discharge, etc.).
- Nominal power parameter of the second computing device (e.g., gNB), such as input power to a power amplifier (e.g., single or multiple transmission).
- Specified power class for a first computing device (e.g., UE), such as a defined industry power class for a maximum transmission power of a device, and maximum and minimum effective isotropic radiated power (EIRP) over new radio (NR).
- the specific parameter data can be input as a tensor when building a causality graph.
- Some embodiments include operations to build the ML model (e.g., a SCM), for example, using a Graph Neural Network (GNN).
- a GNN can be an additional encoding step at the encoder that is amenable for adaptation with changes in the exogenous variables, depending on parameter data (e.g., vendor or environmental changes).
- The GNN can answer two types of queries: (a) interventional queries, such as what would happen if one of the exogenous variables changes by a certain amount; and/or (b) counterfactual queries, such as what would have happened to the CSI had one of the exogenous variables taken a different value.
- Mean squared error (MSE) metrics can be used to build the ML model (e.g., the GNN).
- Figure 5 is a block diagram of an encoder that captures causal relationships in accordance with some embodiments of the present disclosure. As illustrated in Figure 5, CSI data 501 is input to an autoencoder 406 trained with ADAM optimization and MSE. The output of autoencoder 406 is input to a trained GNN 408 that captures causal relationships. Trained GNN 408 outputs latent samples 502.
- a ML model layer (e.g., a SCM layer) can be trained in a federated learning setting where data from multiple vendors or from different environmental conditions for the same vendor can incrementally improve the model. Online adaptation also may be included, which can be vendor specific for changes in environmental conditions, can be federated across vendors, etc.
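- The federated setting described above can be sketched with a FedAvg-style weighted average of per-vendor model weights; the flat weight lists and sample counts are illustrative assumptions:

```python
def fed_avg(vendor_weights, sample_counts):
    # Weighted average of per-vendor model weights, weighted by how many
    # samples each vendor contributed (FedAvg-style aggregation sketch).
    total = sum(sample_counts)
    n_params = len(vendor_weights[0])
    return [sum(w[i] * c for w, c in zip(vendor_weights, sample_counts)) / total
            for i in range(n_params)]
```

Each vendor (or each environmental condition for the same vendor) trains locally and only the weights are shared, so the model improves incrementally without pooling raw CSI data.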
- In an example simulation, CSI data from Wen et al. was used.
- The CSI data includes data collected from two settings: (1) indoor data from a Pico cellular scenario at a 5.3 GHz band, with a base station positioned at the center of a 20 m square; and (2) outdoor data from a rural setting at a 300 MHz band, with a base station positioned at the center of a 400 m square. For both scenarios, UEs were positioned randomly within the square. There were 32 antennas at the base station and 1024 subcarriers in use.
- The CSI data was used to generate a synthetic parameter dataset, where matrices can be transformed using scaling, changing orientation, changing position x, and changing position y. In a deployment scenario, it may be easier to collect the data by changing settings such as power and keeping a note of the distance and speed of the UE from the base station, and the vendors of the UE(s) and gNB as available metadata.
- Figure 6 shows sample CSI data from the indoor and outdoor settings.
- causal factors were chosen that give rise to the CSI dataset, including indoor/outdoor setting labels, scale (similar to power in a real setting), orientation, and (x,y) positions. Orientation and (x,y) positions can be the position of the antenna array in real setting.
- the example embodiment included the following operations: (1) a variational autoencoder (VAE) was constructed from the underlying labels and images; (2) a SCM was built using the VAE and the underlying labels as inputs; and (3) interventional and counterfactual analysis was demonstrated when the labels were changed.
- the method can include adapting the encoding to newer vendors. While in the simulation, a SCM was manually formed, in an actual deployment, the ML model can also be learnt using algorithms for causal discovery.
- Position X: 32 values in (0, 1)
- Position Y: 32 values in (0, 1)
- Training for the VAE was done on a graphics processing unit (GPU), with 200 epochs and an ADAM optimizer, and a learning rate of 1.0e-3. Test loss was calculated after every 5 training epochs. Figures 7 and 8 show train and test loss plots, respectively, for the simulation.
- Figures 9A-D are images of sample reconstructions from the VAE. Accuracy can be increased by increasing the number of training epochs.
- Figures 10 A-F are images of results of the VAE reconstruction where causal parameters were changed.
- Figure 10B is a sample VAE reconstruction of the image of Figure 10A.
- Figure 10D is a sample VAE reconstruction of the image of Figure 10C.
- Figure 10F is a sample VAE reconstruction of the image of Figure 10E.
- Figure 11 includes an original CSI image and a CSI image reconstructed with the SCM.
- Figure 12 includes an original CSI image and a CSI image reconstructed with intervention on scale.
- Figure 13 includes an original CSI image and a CSI counterfactual image reconstructed.
- certain embodiments may provide one or more of the following technical advantages.
- Use of causal autoencoding may result in adaptation to different vendors based on interventional queries.
- the method can adapt to Gaussian noise, with sufficient training, and can work for random noise with other distributions as well. Energy and power savings may be achieved as different ML models do not need to be stored for each vendor.
- one ML model can adapt based on operation conditions and vendor settings. Additionally, the ML model may be explainable via counterfactuals.
- the method may adapt to new vendors and/or different environmental settings.
- Figure 14 is a flowchart of operations of a first computing device 200, 18300 (implemented using the structure of the block diagram of Figure 18) in accordance with some embodiments of the present disclosure.
- modules may be stored in memory 18304 of Figure 18, and these modules may provide instructions so that when the instructions of a module are executed by respective first computing device processing circuitry 18302, processing circuitry 18302 performs respective operations of the flow charts.
- a computer-implemented method performed by the first computing device for causal encoding of CSI is provided.
- The method includes obtaining (1402) an indication of incorrect reporting of CSI; retrieving (1404) parameter data including a plurality of specific parameters and a plurality of general parameters; obtaining (1406) a measurement of CSI data; and activating (1408) a ML model for CSI reporting.
- the ML model includes causal relationships among the parameter data and the measurement of CSI data.
- the method further includes encoding (1410) the measurement of CSI data to obtain encoded CSI measurement data; applying (1412) the ML model to the parameter data and the encoded CSI measurement data to obtain an adapted CSI measurement data; and transmitting (1414) the adapted CSI measurement data to a second computing device.
- the ML model can include a SCM.
- the ML model models the CSI measurement data as a function of the plurality of specific parameters and the plurality of general parameters.
- the ML model can include a directed acyclic graph comprising a plurality of nodes that correspond to the CSI measurement data and a plurality of directed edges that account for a causal parent-child relationship.
- the obtaining (1402) can include at least one of (i) detecting the incorrect reporting of CSI, and (ii) receiving a message from the second computing device comprising the incorrect reporting of CSI.
- the plurality of general parameters can include a plurality of at least one or more of (i) a speed of the first computing device, (ii) a location of the first computing device, (iii) a direction of movement of the first computing device, (iv) a line-of-sight or a non-line-of-sight propagation of a signal between the first computing device and the second computing device, (v) a spatial channel correlation, (vi) one or more sub-carrier frequencies, (vii) an environmental condition, (viii) an available bandwidth, and (ix) a current state of the first computing device.
- the plurality of specific parameters can include a plurality of at least one or more of (i) a measurement of perceived interference, (ii) a threshold for the first computing device for a particular vendor, (iii) a threshold for the second computing device for a particular vendor, (iv) a power available at the first computing device, (v) a nominal power parameter of the second computing device, (vi) an input power to the second computing device, and (vii) a specified power class for the first computing device.
- the method further includes building (1400) the ML model using a GNN, the plurality of specific parameters, the plurality of general parameters, and CSI data.
- the built ML model can model how a distribution of the CSI measurement data changes with changes in the specific and general parameters.
- the building (1400) can include encoding the plurality of specific parameters and the plurality of general parameters, respectively; inputting to the GNN (i) the encoded plurality of specific parameters as a tensor, (ii) the encoded plurality of general parameters as a tensor, and (iii) the CSI data; and training the GNN with an adaptive algorithm and an error metric.
- the trained GNN can capture causal relationships between the plurality of specific parameters, the plurality of general parameters, and the CSI data.
- the ML model is trained in a federated learning setting.
- the first computing device is a UE
- the second computing device is a network node
- FIGS 15 and 16 are flowcharts of operations of a second computing device 202, 19300 (implemented using the structure of the block diagram of Figure 19) in accordance with some embodiments of the present disclosure.
- modules may be stored in memory 19304 of Figure 19, and these modules may provide instructions so that when the instructions of a module are executed by respective second computing device processing circuitry 19302, processing circuitry 19302 performs respective operations of the flow chart.
- A computer-implemented method performed by the second computing device for decoding causally encoded CSI includes receiving (1500) adapted CSI data from a first computing device, where the adapted CSI data was output from a ML model for CSI reporting including causal relationships among parameter data (comprising a plurality of specific parameters and a plurality of general parameters) and a measurement of CSI data.
- the method further includes decoding (1502) the adapted CSI measurement data to obtain decoded CSI; and using (1504) the decoded CSI for a resource allocation.
- a computer-implemented method performed by the second computing device for causal encoding and decoding of CSI includes detecting (1602) an incorrect reporting of CSI for a first computing device; retrieving (1604) parameter data comprising a plurality of specific parameters and a plurality of general parameters; and obtaining (1606) a measurement of CSI data from the first computing device.
- the method further includes activating (1608) a ML model for CSI reporting.
- the ML model includes causal relationships among the parameter data and the measured CSI data.
- The method further includes encoding (1610) the measurement of CSI data to obtain encoded CSI measurement data; and applying (1612) the ML model to the parameter data and the encoded CSI measurement data to obtain adapted CSI measurement data.
- The method further includes decoding (1614) the adapted CSI measurement data to obtain decoded CSI; and using (1616) the decoded CSI for a resource allocation.
- the ML model can include a SCM.
- the ML model models the CSI measurement data as a function of the plurality of specific parameters and the plurality of general parameters.
- the ML model can include a directed acyclic graph comprising a plurality of nodes that correspond to the CSI measurement data and a plurality of directed edges that account for a causal parent-child relationship between the CSI measurement data and the parameter data.
- the plurality of general parameters can include a plurality of at least one or more of (i) a speed of the first computing device, (ii) a location of the first computing device, (iii) a direction of movement of the first computing device, (iv) a line-of-sight or a non-line-of-sight propagation of a signal between the first computing device and the second computing device, (v) a spatial channel correlation, (vi) one or more sub-carrier frequencies, (vii) an environmental condition, (viii) an available bandwidth, and (ix) a current state of the first computing device.
- the plurality of specific parameters can include a plurality of at least one or more of (i) a measurement of perceived interference, (ii) a threshold for the first computing device for a particular vendor, (iii) a threshold for the second computing device for a particular vendor, (iv) a power available at the first computing device, (v) a nominal power parameter of the second computing device, (vi) an input power to the second computing device, and (vii) a specified power class for the first computing device.
- the method further includes building (1600) the ML model using a GNN, the plurality of specific parameters, the plurality of general parameters, and CSI data.
- the built ML model can model how a distribution of the CSI measurement data changes with changes in the specific and general parameters.
- the building (1600) can include encoding the plurality of specific parameters and the plurality of general parameters, respectively; inputting to the GNN (i) the encoded plurality of specific parameters as a tensor, (ii) the encoded plurality of general parameters as a tensor, and (iii) the CSI data; and training the GNN with an adaptive algorithm and an error metric.
- the trained GNN can capture causal relationships between the plurality of specific parameters, the plurality of general parameters, and the CSI data.
- the ML model is trained in a federated learning setting.
- the first computing device is a UE
- the second computing device is a network node
- Example embodiments of the methods of the present disclosure may be implemented in a network that includes, without limitation, a telecommunication network, as illustrated in Figure 17.
- the telecommunications network 17102 may include an access network 17104, such as a RAN, and a core network 17106, which includes one or more core network nodes 17108.
- The access network 17104 may include one or more access nodes 17110A, 17110B, such as network nodes (e.g., base stations), or any other similar Third Generation Partnership Project (3GPP) access node or non-3GPP access point.
- The network nodes 17110 facilitate direct or indirect connection of first computing devices 17112A-D (e.g., UEs) and/or other first computing devices to the core network 17106 over one or more wireless connections.
- Example wireless communications over a wireless connection include transmitting and/or receiving wireless signals using electromagnetic waves, radio waves, infrared waves, and/or other types of signals suitable for conveying information without the use of wires, cables, or other material conductors.
- the network may include any number of wired or wireless networks, network nodes, UEs, computing devices, and/or any other components or systems that may facilitate or participate in the communication of data and/or signals whether via wired or wireless connections.
- the network may include and/or interface with any type of communication, telecommunication, data, cellular, radio network, and/or other similar type of system.
- the network 17100 enables connectivity between the first computing devices 17112 and second computing device(s) 17110.
- The network 17100 may be configured to operate according to predefined rules or procedures, such as specific standards that include, but are not limited to: Global System for Mobile Communications (GSM); Universal Mobile Telecommunications System (UMTS); Long Term Evolution (LTE), and/or other suitable 2G, 3G, 4G, 5G standards, or any applicable future generation standard (e.g., 6G); wireless local area network (WLAN) standards, such as the Institute of Electrical and Electronics Engineers (IEEE) 802.11 standards (WiFi); and/or any other appropriate wireless communication standard, such as Worldwide Interoperability for Microwave Access (WiMax), Bluetooth, Z-Wave, Near Field Communication (NFC), ZigBee, LiFi, and/or any low-power wide-area network (LPWAN) standards such as LoRa and Sigfox.
- The telecommunication network 17102 is a cellular network that implements 3GPP standardized features. Accordingly, the telecommunications network 17102 may support network slicing to provide different logical networks to different devices that are connected to the telecommunication network. For example, the telecommunications network 17102 may provide Ultra Reliable Low Latency Communication (URLLC) services to some first computing devices (e.g., UEs), while providing Enhanced Mobile Broadband (eMBB) services to other first computing devices, and/or Massive Machine Type Communication (mMTC)/Massive IoT services to yet further first computing devices.
- The network 17100 is not limited to including a RAN, and rather includes any network that includes any programmable/configurable decentralized access point or network element that also records data from performance measurement points in the network 17100.
- first computing devices and/or second computing devices are configured as a computer without radio/baseband, etc. attached.
- the method of the present disclosure is light-weight and, thus, amenable to distributed node and cloud implementation.
- Various distributed processing options may be used that suit data source, storage, compute, and coordination.
- data sampling may be done at a node (e.g., worker node), with data analysis, inference, ML model creation, ML model sharing, and CSI encoding performed at a cloud server (e.g., master).
- implementation includes implementation in an open RAN (ORAN).
- Methods of the present disclosure may be performed by a first computing device (e.g., any first computing device 17112A-D of Figure 17 (one or more of which may be generally referred to as first computing device 17112), or computing device 18300 of Figure 18).
- A first computing device can include first computing device 200 of Figure 2 or computing device 18300 of Figure 18.
- A first computing device includes equipment capable, configured, arranged, and/or operable for causal encoding of CSI.
- first computing devices include, but are not limited to, a computer, a decentralized edge device, a decentralized edge server, and a UE.
- the first computing device 18300 includes processing circuitry 18302 that is operatively coupled to a memory 18304, a ML model and an autoencoder (not illustrated), and/or any other component, or any combination thereof.
- Certain first computing devices may utilize all or a subset of the components shown in Figure 18. The level of integration between the components may vary from one first computing device to another first computing device. Further, certain first computing devices may contain multiple instances of a component, such as multiple processors, memories, transceivers, transmitters, receivers, etc.
- the processing circuitry 18302 is configured to process instructions and data and may be configured to implement any sequential state machine operative to execute instructions stored as machine-readable computer programs in the memory 18304 and/or the ML model.
- the processing circuitry 18302 may be implemented as one or more hardware-implemented state machines (e.g., in discrete logic, field-programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), etc.); programmable logic together with appropriate firmware; one or more stored computer programs, general-purpose processors, such as a microprocessor or digital signal processor (DSP), together with appropriate software; or any combination of the above.
- the processing circuitry 18302 may include multiple central processing units (CPUs).
- the memory 18304 and/or the ML model may be or be configured to include memory such as random access memory (RAM), read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), magnetic disks, optical disks, hard disks, removable cartridges, flash drives, and so forth.
- the memory 18304 and/or the ML model includes one or more application programs, such as an operating system, web browser application, a widget, gadget engine, or other application, and corresponding data.
- the memory 18304 and/or the ML model may store, for use by the first computing device 18300, any of a variety of various operating systems or combinations of operating systems.
- The memory 18304 and/or the ML model may be configured to include a number of physical drive units, such as redundant array of independent disks (RAID), flash memory, USB flash drive, external hard disk drive, thumb drive, pen drive, key drive, high-density digital versatile disc (HD-DVD) optical disc drive, internal hard disk drive, Blu-Ray optical disc drive, holographic digital data storage (HDDS) optical disc drive, external mini dual in-line memory module (DIMM), synchronous dynamic random access memory (SDRAM), external micro-DIMM SDRAM, smartcard memory such as a tamper resistant module in the form of a universal integrated circuit card (UICC) including one or more subscriber identity modules (SIMs), such as a USIM and/or ISIM, other memory, or any combination thereof.
- The UICC may for example be an embedded UICC (eUICC), integrated UICC (iUICC), or a removable UICC commonly known as a 'SIM card'.
- the memory 18304 and/or the ML model may allow the first computing device 18300 to access instructions, application programs and the like, stored on transitory or non-transitory memory media, to off-load data, or to upload data.
- An article of manufacture, such as one utilizing a network may be tangibly embodied as or in the memory 18304 and/or ML model, which may be or comprise a device-readable storage medium.
- the processing circuitry 18302 may be configured to communicate with an access network or other network using a communication interface 18306.
- the communication interface may comprise one or more communication subsystems and may include or be communicatively coupled to an optional antenna.
- the communication interface 18306 may include one or more transceivers used to communicate, such as by communicating with one or more remote transceivers of another device capable of wireless communication (e.g., another first computing device or a second computing device such as a network node).
- Each transceiver may include a transmitter and/or a receiver appropriate to provide network communications (e.g., optical, electrical, frequency allocations, and so forth).
- the optional transmitter and receiver may be coupled to one or more optional antennas and may share circuit components, software or firmware, or alternatively be implemented separately.
- communication functions of the communication interface 18306 may include cellular communication, Wi-Fi communication, LPWAN communication, data communication, voice communication, multimedia communication, short-range communications such as Bluetooth, near-field communication, location-based communication such as the use of the global positioning system (GPS) to determine a location, another like communication function, or any combination thereof.
- GPS global positioning system
- Communications may be implemented according to one or more communication protocols and/or standards, such as IEEE 802.11, Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), GSM, LTE, New Radio (NR), UMTS, WiMax, Ethernet, transmission control protocol/internet protocol (TCP/IP), synchronous optical networking (SONET), Asynchronous Transfer Mode (ATM), QUIC, Hypertext Transfer Protocol (HTTP), and so forth.
- CDMA Code Division Multiple Access
- WCDMA Wideband Code Division Multiple Access
- GSM Global System for Mobile communications
- LTE Long Term Evolution
- NR New Radio
- UMTS Universal Mobile Telecommunications System
- WiMax Worldwide Interoperability for Microwave Access
- TCP/IP transmission control protocol/internet protocol
- SONET synchronous optical networking
- ATM Asynchronous Transfer Mode
- HTTP Hypertext Transfer Protocol
- second computing device includes equipment capable, configured, arranged, and/or operable for causal encoding of CSI and/or decoding of CSI.
- second computing devices include, but are not limited to, a computer, a decentralized edge device, a decentralized edge server, a cloud node, a cloud server, and centralized or distributed BS in a RAN (e.g., gNBs, eNBs, core network nodes, APs (e.g., radio access points) etc.).
- the second computing device 19300 includes modules that may be stored in the memory 19304, a ML model, autoencoder, and/or decoder (not illustrated), and these modules may provide instructions so that when the instructions of a module are executed by the processing circuitry 19302 of Figure 19, the second computing device performs respective operations of methods in accordance with various embodiments of the present disclosure.
- Certain second computing devices may utilize all or a subset of the components shown in Figure 19.
- the level of integration between the components may vary from one second computing device to another second computing device.
- certain second computing devices may contain multiple instances of a component, such as multiple processors, memories, transceivers, transmitters, receivers, etc.
- the processing circuitry 19302 is configured to process instructions and data and may be configured to implement any sequential state machine operative to execute instructions stored as machine-readable computer programs in the memory 19304, the ML model, autoencoder, and/or the decoder.
- the processing circuitry 19302 may be implemented as one or more hardware-implemented state machines (e.g., in discrete logic, field-programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), etc.); programmable logic together with appropriate firmware; one or more stored computer programs, general-purpose processors, such as a microprocessor or digital signal processor (DSP), together with appropriate software; or any combination of the above.
- the processing circuitry 19302 may include multiple central processing units (CPUs).
- the memory 19304, the ML model, autoencoder, and/or the decoder may be or be configured to include memory such as random access memory (RAM), read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), magnetic disks, optical disks, hard disks, removable cartridges, flash drives, and so forth.
- the memory 19304, the ML model, autoencoder, and/or the decoder includes one or more application programs, such as an operating system, web browser application, a widget, gadget engine, or other application, and corresponding data.
- the memory 19304, the ML model, autoencoder, and/or the decoder may store, for use by the second computing device 19300, any of a variety of various operating systems or combinations of operating systems.
- the memory 19304, the ML model, autoencoder, and/or the decoder may be configured to include a number of physical drive units, such as redundant array of independent disks (RAID), flash memory, USB flash drive, external hard disk drive, thumb drive, pen drive, key drive, high-density digital versatile disc (HD-DVD) optical disc drive, internal hard disk drive, Blu-Ray optical disc drive, holographic digital data storage (HDDS) optical disc drive, external mini-dual in-line memory module (DIMM), synchronous dynamic random access memory (SDRAM), external micro-DIMM SDRAM, smartcard memory such as a tamper resistant module in the form of a universal integrated circuit card (UICC) including one or more subscriber identity modules (SIMs), such as a USIM and/or ISIM, other memory, or any combination thereof.
- RAID redundant array of independent disks
- HD-DVD high-density digital versatile disc
- HDDS holographic digital data storage
- DIMM mini-dual in-line memory module
- SDRAM synchronous dynamic random access memory
- the UICC may for example be an embedded UICC (eUICC), integrated UICC (iUICC) or a removable UICC commonly known as a 'SIM card'.
- eUICC embedded UICC
- iUICC integrated UICC
- SIM card removable UICC
- the memory 19304, the ML model, autoencoder, and/or the decoder may allow the second computing device 19300 to access instructions, application programs and the like, stored on transitory or non-transitory memory media, to off-load data, or to upload data.
- An article of manufacture, such as one utilizing a network, may be tangibly embodied as or in the memory 19304, the ML model, autoencoder, and/or the decoder, which may be or comprise a device-readable storage medium.
- the processing circuitry 19302 may be configured to communicate with an access network or other network using a communication interface 19306.
- the communication interface 19306 may comprise one or more communication subsystems and may include or be communicatively coupled to an optional antenna.
- the communication interface 19306 may include one or more transceivers used to communicate, such as by communicating with one or more remote transceivers of another device capable of wireless communication (e.g., a first computing device or another second computing device).
- Each transceiver may include a transmitter and/or a receiver appropriate to provide network communications (e.g., optical, electrical, frequency allocations, and so forth).
- the optional transmitter and receiver may be coupled to one or more optional antennas and may share circuit components, software or firmware, or alternatively be implemented separately.
- communication functions of the communication interface 19306 may include cellular communication, Wi-Fi communication, LPWAN communication, data communication, voice communication, multimedia communication, short-range communications such as Bluetooth, near-field communication, location-based communication such as the use of the GPS to determine a location, another like communication function, or any combination thereof.
- Communications may be implemented according to one or more communication protocols and/or standards, such as IEEE 802.11, CDMA, WCDMA, GSM, LTE, NR, UMTS, WiMax, Ethernet, TCP/IP, SONET, ATM, QUIC, HTTP, and so forth.
- FIG. 20 is a block diagram illustrating a virtualization environment 20500 in which functions implemented by some embodiments may be virtualized.
- virtualizing means creating virtual versions of apparatuses or devices which may include virtualizing hardware platforms, storage devices and networking resources.
- virtualization can be applied to any device described herein, or components thereof, and relates to an implementation in which at least a portion of the functionality is implemented as one or more virtual components.
- VMs virtual machines
- hardware nodes such as a hardware computing device that operates as a second computing device (e.g., a network node), a first computing device (e.g., a UE), core network node, or host.
- a second computing device e.g., a network node
- a first computing device e.g., a UE
- core network node or host
- the node may be entirely virtualized.
- Applications 20502 (which may alternatively be called software instances, virtual appliances, network functions, virtual nodes, virtual network functions, etc.) are run in the virtualization environment 20500 to implement some of the features, functions, and/or benefits of some of the embodiments disclosed herein.
- Hardware 20504 includes processing circuitry, memory that stores software and/or instructions executable by hardware processing circuitry, and/or other hardware devices as described herein, such as a network interface, input/output interface, and so forth.
- Software may be executed by the processing circuitry to instantiate one or more virtualization layers 20506 (also referred to as hypervisors or virtual machine monitors (VMMs)), provide VMs 20508a and 20508b (one or more of which may be generally referred to as VMs 20508), and/or perform any of the functions, features and/or benefits described in relation with some embodiments described herein.
- the virtualization layer 20506 may present a virtual operating platform that appears like networking hardware to the VMs 20508.
- the VMs 20508 comprise virtual processing, virtual memory, virtual networking or interface and virtual storage, and may be run by a corresponding virtualization layer 20506.
- Different embodiments of the instance of a virtual appliance 20502 may be implemented on one or more of VMs 20508, and the implementations may be made in different ways.
- Virtualization of the hardware is in some contexts referred to as network function virtualization (NFV).
- NFV network function virtualization
- NFV may be used to consolidate many network equipment types onto industry standard high volume server hardware, physical switches, and physical storage, which can be located in data centers and customer premise equipment.
- a VM 20508 may be a software implementation of a physical machine that runs programs as if they were executing on a physical, nonvirtualized machine.
- Each of the VMs 20508, and that part of the hardware 20504 that executes that VM, be it hardware dedicated to that VM and/or hardware shared by that VM with others of the VMs, forms a separate virtual network element.
- a virtual network function is responsible for handling specific network functions that run in one or more VMs 20508 on top of the hardware 20504 and corresponds to the application 20502.
- Hardware 20504 may be implemented in a standalone network node with generic or specific components. Hardware 20504 may implement some functions via virtualization. Alternatively, hardware 20504 may be part of a larger cluster of hardware (e.g. such as in a data center or CPE) where many hardware nodes work together and are managed via management and orchestration 20510, which, among others, oversees lifecycle management of applications 20502. In some embodiments, hardware 20504 is coupled to one or more radio units that each include one or more transmitters and one or more receivers that may be coupled to one or more antennas.
- Radio units may communicate directly with other hardware nodes via one or more appropriate network interfaces and may be used in combination with the virtual components to provide a virtual node with radio capabilities, such as a radio access node or a base station.
- some signaling can be provided with the use of a control system 20512, which may alternatively be used for communication between hardware nodes and radio units.
- Although the first and second computing devices described herein may include the illustrated combination of hardware components, other embodiments may comprise first and/or second computing devices with different combinations of components. It is to be understood that these first and/or second computing devices may comprise any suitable combination of hardware and/or software needed to perform the tasks, features, functions and methods disclosed herein. Determining, calculating, obtaining or similar operations described herein may be performed by processing circuitry, which may process information by, for example, converting the obtained information into other information, comparing the obtained information or converted information to information stored in the first and/or second computing device, and/or performing one or more operations based on the obtained information or converted information, and as a result of said processing making a determination.
- first and/or second computing devices may comprise multiple different physical components that make up a single illustrated component, and functionality may be partitioned between separate components.
- a communication interface may be configured to include any of the components described herein, and/or the functionality of the components may be partitioned between the processing circuitry and the communication interface.
- non-computationally intensive functions of any of such components may be implemented in software or firmware and computationally intensive functions may be implemented in hardware.
- processing circuitry executing instructions stored in memory, which in certain embodiments may be a computer program product in the form of a non-transitory computer-readable storage medium.
- some or all of the functionality may be provided by the processing circuitry without executing instructions stored on a separate or discrete device-readable storage medium, such as in a hard-wired manner.
- the processing circuitry can be configured to perform the described functionality. The benefits provided by such functionality are not limited to the processing circuitry alone or to other components of the first and/or second computing device, but are enjoyed by the first and/or second computing device as a whole, and/or by end users and a wireless network generally.
- the terms “comprise”, “comprising”, “comprises”, “include”, “including”, “includes”, “have”, “has”, “having”, or variants thereof are open-ended, and include one or more stated features, integers, elements, steps, components or functions but do not preclude the presence or addition of one or more other features, integers, elements, steps, components, functions or groups thereof.
- the common abbreviation “e.g.”, which derives from the Latin phrase “exempli gratia” may be used to introduce or specify a general example or examples of a previously mentioned item, and is not intended to be limiting of such item.
- the common abbreviation “i.e.”, which derives from the Latin phrase “id est,” may be used to specify a particular item from a more general recitation.
- Example embodiments are described herein with reference to block diagrams and/or flowchart illustrations of computer-implemented methods, apparatus (systems and/or devices) and/or computer program products. It is understood that a block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by computer program instructions that are performed by one or more computer circuits.
- These computer program instructions may be provided to a processor circuit of a general purpose computer circuit, special purpose computer circuit, and/or other programmable data processing circuit to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, transform and control transistors, values stored in memory locations, and other hardware components within such circuitry to implement the functions/acts specified in the block diagrams and/or flowchart block or blocks, and thereby create means (functionality) and/or structure for implementing the functions/acts specified in the block diagrams and/or flowchart block(s).
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Evolutionary Computation (AREA)
- Artificial Intelligence (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Theoretical Computer Science (AREA)
- Computing Systems (AREA)
- Software Systems (AREA)
- Computational Linguistics (AREA)
- General Health & Medical Sciences (AREA)
- Molecular Biology (AREA)
- Biophysics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Mathematical Physics (AREA)
- Data Mining & Analysis (AREA)
- Biomedical Technology (AREA)
- Power Engineering (AREA)
- Life Sciences & Earth Sciences (AREA)
- Health & Medical Sciences (AREA)
- Electromagnetism (AREA)
- Quality & Reliability (AREA)
- Mobile Radio Communication Systems (AREA)
Abstract
A computer-implemented method performed by a first computing device (200, 18300) for causal encoding of channel state information, CSI. The method comprises obtaining (1402) an indication of an incorrect report of CSI; retrieving (1404) parameter data; obtaining (1406) a CSI data measurement; and activating (1408) a machine learning, ML, model to report CSI. The ML model comprises a causal relationship among the parameter data and the CSI data measurement. The method further comprises encoding (1410) the CSI data measurement to obtain encoded CSI measurement data; applying (1412) the ML model to the parameter data and the encoded CSI measurement data to obtain adapted CSI measurement data; and transmitting (1414) the adapted CSI measurement data to a second computing device. Further provided are a computer-implemented method performed by a second computing device, and related methods and apparatus.
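The sequence of steps recited in the abstract can be illustrated with a minimal sketch. Everything concrete here is an assumption for illustration only: the stub encoder, the linear "causal adaptation", the data sizes, and all function names are hypothetical; the claims do not prescribe any particular model or representation.

```python
"""Illustrative sketch of the encoder-side method (steps 1402-1414).

All names and shapes are hypothetical placeholders, not the claimed
implementation.
"""
import numpy as np

rng = np.random.default_rng(0)

def obtain_incorrect_report_indication() -> bool:
    # Step 1402: e.g. the network flags that a prior CSI report was incorrect.
    return True

def retrieve_parameter_data() -> np.ndarray:
    # Step 1404: contextual parameter data (e.g. resource allocation,
    # environmental data) -- here 4 arbitrary values.
    return rng.normal(size=4)

def obtain_csi_measurement() -> np.ndarray:
    # Step 1406: a raw CSI data measurement over, say, 16 subbands.
    return rng.normal(size=16)

def encode_csi(measurement: np.ndarray) -> np.ndarray:
    # Step 1410: stub encoder compressing 16 values to 4, standing in for
    # the encoder half of an autoencoder.
    return measurement.reshape(4, 4).mean(axis=1)

def apply_causal_ml_model(params: np.ndarray, encoded: np.ndarray) -> np.ndarray:
    # Step 1412: placeholder for the ML model capturing a causal
    # relationship among the parameter data and the encoded CSI
    # measurement data; here a simple fixed linear adaptation.
    weights = np.full_like(params, 0.1)
    return encoded + weights * params

if obtain_incorrect_report_indication():              # step 1402
    params = retrieve_parameter_data()                # step 1404
    csi = obtain_csi_measurement()                    # step 1406
    # step 1408: the ML model is assumed activated for CSI reporting.
    encoded = encode_csi(csi)                         # step 1410
    adapted = apply_causal_ml_model(params, encoded)  # step 1412
    print("adapted CSI measurement data:", adapted)   # step 1414: transmit
```

The sketch only shows the control flow of the claimed steps; in the disclosure the encoder and the causal adaptation would be trained components, and step 1414 would transmit the adapted CSI measurement data to the second computing device rather than print it.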
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GR20220100721 | 2022-09-01 | ||
GR20220100721 | 2022-09-01 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2024049338A1 true WO2024049338A1 (fr) | 2024-03-07 |
Family
ID=90098455
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
- PCT/SE2023/050763 WO2024049338A1 (fr) | 2022-09-01 | 2023-08-01 | Causal encoding of channel state information
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2024049338A1 (fr) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- EP4443785A1 (fr) * | 2023-04-05 | 2024-10-09 | Nokia Solutions and Networks Oy | Methods and devices for compressing and decompressing channel state information |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- WO2020213964A1 (fr) * | 2019-04-16 | 2020-10-22 | Samsung Electronics Co., Ltd. | Method and apparatus for reporting channel state information |
- WO2022040046A1 (fr) * | 2020-08-18 | 2022-02-24 | Qualcomm Incorporated | Reporting of neural network-based processing configurations at a UE |
- WO2022227081A1 (fr) * | 2021-04-30 | 2022-11-03 | Qualcomm Incorporated | Channel state information techniques and channel compression change |
- WO2023282804A1 (fr) * | 2021-07-07 | 2023-01-12 | Telefonaktiebolaget Lm Ericsson (Publ) | CSI compression quality classification |
- WO2023097591A1 (fr) * | 2021-12-02 | 2023-06-08 | Qualcomm Incorporated | Techniques for reporting channel state information for machine learning-based channel feedback |
- 2023-08-01 WO PCT/SE2023/050763 patent/WO2024049338A1/fr unknown
Non-Patent Citations (2)
Title |
---|
BANERJEE SERENE; KARAPANTELAKIS ATHANASIOS; ELEFTHERIADIS LACKIS; FARHADI HAMED; SINGH VANDITA; KARTHICK R M: "Causality-Aware Channel State Information Encoding", 2023 15TH INTERNATIONAL CONFERENCE ON COMMUNICATION SYSTEMS & NETWORKS (COMSNETS), IEEE, 3 January 2023 (2023-01-03), pages 617 - 625, XP034295960, DOI: 10.1109/COMSNETS56262.2023.10041353 * |
- SHINYA KUMAGAI, NTT DOCOMO, INC.: "Discussion on other aspects on AI/ML for CSI feedback enhancement", 3GPP Draft R1-2303706, 3GPP TSG RAN WG1, online meeting, 17-26 April 2023, published 7 April 2023, XP052294264 *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11637607B2 (en) | Generic reciprocity based channel state information acquisition frameworks for advanced networks | |
US12068816B2 (en) | Hybrid common/independent FD-basis for Type II CSI enhancement | |
US20240275451A1 (en) | Reporting of coefficients for channel state information | |
US20220343167A1 (en) | Methods and apparatus for machine learning model life cycle | |
US10917150B2 (en) | Non-constant modulus codebook design | |
US11233553B2 (en) | Method and devices for estimation of MIMO channel state information | |
US20240202542A1 (en) | Methods for transfer learning in csi-compression | |
- WO2024049338A1 (fr) | Causal encoding of channel state information | |
US11271623B2 (en) | Codebook design for virtualized active antenna system (AAS) | |
- WO2022172198A1 (fr) | Deep generative models for downlink channel estimation in FDD massive MIMO systems | |
- CN116615893A (zh) | Codebook and PMI overriding in downlink MU-MIMO transmission | |
- EP3915198A1 (fr) | Methods, apparatus and computer-readable media for adjusting beam gain in wireless communication networks | |
- WO2023192409A1 (fr) | User equipment reporting of machine learning model performance | |
US20240015667A1 (en) | System and methods for configurable eirp restriction | |
US20230164000A1 (en) | Channel Estimation in a Wireless Communication Network | |
Banerjee et al. | Causality-Aware Channel State Information Encoding | |
US20240356783A1 (en) | Detection of pulsed radar signal | |
US20240364486A1 (en) | Sounding Reference Signal Transmission in a Wireless Communication Network | |
US20240121766A1 (en) | Method and apparatus for determining channel parameter | |
- WO2021072625A1 (fr) | Method and apparatus for beamforming generation for a downlink channel | |
US20210399853A1 (en) | Methods, Apparatus and Machine-Readable Mediums for Signalling in a Base Station | |
- WO2023140767A1 (fr) | Beam sweeping with artificial intelligence (AI)-based compressed sensing | |
- CN118828902A (zh) | Method performed by a node in a wireless communication system and electronic device therefor |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 23860977 Country of ref document: EP Kind code of ref document: A1 |