CN114747250A - Feedback of channel state information - Google Patents

Feedback of channel state information

Info

Publication number
CN114747250A
CN114747250A
Authority
CN
China
Prior art keywords
state information
channel state
channel
type
parameters
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201980102635.3A
Other languages
Chinese (zh)
Inventor
刘皓
蔡立羽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nokia Shanghai Bell Co Ltd
Nokia Solutions and Networks Oy
Original Assignee
Nokia Shanghai Bell Co Ltd
Nokia Solutions and Networks Oy
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Shanghai Bell Co Ltd and Nokia Solutions and Networks Oy
Publication of CN114747250A
Legal status: Pending

Classifications

    (H04B7 entries fall under H04B7/0619: multi-antenna transmission of weighted versions of the same signal using feedback from the receiving side; H04W entries fall under H04W24/00: supervisory, monitoring or testing arrangements.)
    • H04B7/0626 Channel coefficients, e.g. channel state information [CSI] (under H04B7/0621 Feedback content)
    • H04B7/0632 Channel quality parameters, e.g. channel quality indicator [CQI] (under H04B7/0621 Feedback content)
    • H04B7/0643 Feedback on request (under H04B7/0636 Feedback format)
    • H04W24/02 Arrangements for optimising operational condition
    • H04W24/10 Scheduling measurement reports; Arrangements for measurement reports

Abstract

Example embodiments of the present disclosure relate to devices, methods, apparatuses, and computer-readable storage media for feedback of Channel State Information (CSI). In an example embodiment, the first device determines a type of CSI to be fed back and then generates the type of CSI. The first device transmits the generated CSI to the second device. Accordingly, if the type of CSI comprises high-precision CSI, the second device trains a machine learning model using at least the received CSI to obtain at least a first set of parameters for generation of the CSI and at least a second set of parameters for reconstruction of the CSI. The second device also sends the first set of parameters to the first device for CSI generation.

Description

Feedback of channel state information
Technical Field
Example embodiments of the present disclosure relate generally to the field of communications and, in particular, to devices, methods, apparatuses, and computer-readable storage media for feedback of Channel State Information (CSI).
Background
For the Third Generation Partnership Project (3GPP) Release 17 (Rel-17), further enhanced Multiple Input Multiple Output (MIMO) has been agreed. Knowledge of the channel characteristics is beneficial for enhanced MIMO or massive MIMO. For Time Division Duplex (TDD) systems, a base station such as a New Radio NodeB (gNB) may acquire Downlink (DL) channel characteristics from Uplink (UL) sounding signals of different User Equipments (UEs), based on the reciprocity of the UL and DL channels.
Frequency Division Duplex (FDD) systems do not have the reciprocity characteristics of the UL and DL channels. In FDD systems, downlink Channel State Information (CSI) is reported by the UE to the gNB via an uplink feedback channel. In terms of system efficiency, a trade-off needs to be made between high accuracy of CSI feedback and reasonable overhead. Machine Learning (ML) techniques may be used to enhance CSI feedback. For example, by utilizing ML techniques, a multi-layer Neural Network (NN) is first trained using channel measurement data, and then the NN is used for compression and recovery of CSI during CSI feedback. However, obtaining sufficient channel measurement data for NN training is a significant challenge.
Disclosure of Invention
In general, example embodiments of the present disclosure provide devices, methods, apparatuses, and computer-readable storage media for feedback of CSI.
In a first aspect, a first apparatus is provided that includes at least one processor and at least one memory including computer program code. The at least one memory and the computer program code are configured to, with the at least one processor, cause the first apparatus to determine the type of channel state information to be fed back. The first apparatus is further caused to generate the determined type of channel state information and then transmit the generated channel state information to a second apparatus.
In a second aspect, a second apparatus is provided that includes at least one processor and at least one memory including computer program code. The at least one memory and the computer program code are configured to, with the at least one processor, cause the second apparatus to receive channel state information from a first apparatus. If the type of the received channel state information comprises high-precision channel state information, the second apparatus is caused to train a machine learning model using at least the received channel state information, to obtain at least a first set of parameters for the generation of the channel state information and at least a second set of parameters for the reconstruction of the channel state information. The second apparatus is also caused to send the first set of parameters to the first apparatus.
In a third aspect, a method is provided. In the method, a first device determines a type of channel state information to be fed back. The first device then generates the type of channel state information and transmits the generated channel state information to the second device.
In a fourth aspect, a method is provided. In the method, a second device receives channel state information from a first device. If the type of the received channel state information includes high precision channel state information, the second device trains the machine learning model by using at least the received channel state information to obtain at least a first set of parameters for the generation of the channel state information and at least a second set of parameters for the reconstruction of the channel state information. The second device then sends the first set of parameters to the first device.
In a fifth aspect, there is provided an apparatus comprising means for performing a method according to the third or fourth aspect.
In a sixth aspect, a computer-readable storage medium comprising program instructions stored thereon is provided. The instructions, when executed by a processor of an apparatus, cause the apparatus to perform the method according to the third or fourth aspect.
It should be understood that the summary is not intended to identify key or essential features of the example embodiments of the disclosure, nor is it intended to be used to limit the scope of the disclosure. Other features of the present disclosure will become readily apparent from the following description.
Drawings
Some example embodiments will now be described with reference to the accompanying drawings, in which:
fig. 1 illustrates an example architecture of machine learning-based CSI feedback;
FIG. 2 illustrates an example environment in which example embodiments of the present disclosure may be implemented;
fig. 3 illustrates an example process between a first device and a second device, according to some example embodiments of the present disclosure;
fig. 4 illustrates an example configuration for feedback for two types of CSI in accordance with some example embodiments of the present disclosure;
fig. 5 illustrates an example process between a first device and a second device, according to some other example embodiments of the present disclosure;
fig. 6 shows a flow diagram of an example method according to some example embodiments of the present disclosure;
fig. 7 shows a flowchart of an example method according to some other example embodiments of the present disclosure; and
fig. 8 shows a simplified block diagram of a device suitable for implementing an example embodiment of the present disclosure.
Throughout the drawings, the same or similar reference numbers refer to the same or similar elements.
Detailed Description
The principles of the present disclosure will now be described with reference to some exemplary embodiments. It is understood that these example embodiments are described merely to illustrate and assist those of ordinary skill in the art in understanding and enabling the disclosure, and are not intended to limit the scope of the disclosure in any way. The disclosure described herein may be implemented in a variety of other ways besides those described below.
In the following description and claims, unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs.
As used herein, the term "terminal device" or "user equipment" (UE) refers to any terminal device capable of wireless communication with each other or a base station. Communication may involve the transmission and/or reception of wireless signals using electromagnetic signals, radio waves, infrared signals, and/or other types of signals suitable for the transmission of information over the air. In some example embodiments, the UE may be configured to transmit and/or receive information without direct human-machine interaction. For example, when triggered by an internal or external event, or in response to a request from the network side, the UE may transmit information to the base station according to a predetermined schedule.
Examples of UEs include, but are not limited to, smart phones, wireless-enabled tablets, laptop-embedded equipment (LEE), laptop-mounted equipment (LME), customer premises equipment (CPE), sensors, metering devices, personal wearable devices such as watches, and/or vehicles capable of communication. For purposes of discussion, some example embodiments will be described with reference to a UE as an example of a terminal device, and the terms "terminal device" and "user equipment" (UE) may be used interchangeably in the context of this disclosure. The UE may also correspond to the Mobile Termination (MT) part of an Integrated Access and Backhaul (IAB) node (also referred to as a relay node).
As used herein, the term "network device" refers to a device via which services may be provided to terminal devices in a communication network. As an example, the network device may include a base station. As used herein, the term "base station" (BS) refers to a network device via which services may be provided to terminal devices in a communication network, and may comprise any suitable device via which a terminal device or UE may access the communication network. Examples of a base station include a relay, an Access Point (AP), a transmission and reception point (TRP), a NodeB (NB), an evolved NodeB (eNodeB or eNB), a New Radio (NR) NodeB (gNB), a remote radio unit (RRU), a radio head (RH), a Remote Radio Head (RRH), and a low-power node such as a femto or pico node. A relay node may correspond to the Distributed Unit (DU) part of an IAB node.
As used herein, the term "circuitry" may refer to one or more or all of the following:
(a) hardware-only circuit implementations (such as implementations in only analog and/or digital circuitry), and
(b) combinations of hardware circuits and software, such as (as applicable):
(i) a combination of analog and/or digital hardware circuit(s) with software/firmware, and
(ii) any portions of hardware processor(s) with software (including digital signal processor(s)), software, and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions, and
(c) hardware circuit(s) and/or processor(s), such as a microprocessor(s) or a portion of a microprocessor(s), that requires software (e.g., firmware) for operation, but the software may not be present when it is not needed for operation.
This definition of circuitry applies to all uses of this term in this application, including in any claims. As a further example, as used in this application, the term circuitry also covers an implementation of merely a hardware circuit or processor (or multiple processors), or a portion of a hardware circuit or processor, and its (or their) accompanying software and/or firmware. The term circuitry also covers, for example and if applicable to the particular claim element, a baseband integrated circuit or processor integrated circuit for a mobile device, or a similar integrated circuit in a server, a cellular network device, or another computing or network device.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. The term "include" and its variants should be understood as open-ended terms meaning "including, but not limited to". The term "based on" should be understood as "based at least in part on". The terms "one embodiment" and "an embodiment" should be understood as "at least one embodiment". The term "another embodiment" should be understood as "at least one other embodiment". Other definitions (explicit and implicit) may be included below.
As used herein, the terms "first," "second," and the like may be used herein to describe various elements, which should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of example embodiments. As used herein, the term "and/or" includes any and all combinations of one or more of the listed terms.
As described above, Machine Learning (ML) techniques may be used to enhance CSI feedback. An example architecture for ML-based CSI feedback is shown in fig. 1. In the architecture 100 shown in fig. 1, the CSI encoder 105 and the CSI decoder 110 are jointly designed using a Convolutional Neural Network (CNN). The CSI encoder 105 may learn a transformation from the input channel samples to a compressed CSI representation. The compressed CSI is then quantized and reported to the CSI decoder 110 over the feedback channel 115. The CSI decoder 110 may learn to recover the original channel matrix from the received CSI.
For example, CSI encoder 105 may transform a channel matrix H having N coefficients into an M-dimensional vector s as CSI using CNN, where M < N. The CSI compression ratio γ is M/N. The first layer of the CSI encoder 105 may be a convolutional layer that generates two feature maps using a kernel with dimensions 3 × 3. The feature map can be reshaped and then compressed into a vector s by a fully connected layer.
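The encoder steps described above (a 3 × 3 convolution producing two feature maps, a reshape, and a fully connected compression to an M-dimensional vector s) can be sketched in plain NumPy. This is a rough illustration only: the random weights, the ReLU activation, and the zero padding are assumptions for the sketch, not the trained parameters or exact layer choices of any actual model.

```python
import numpy as np

def csi_encode(H, W_fc, kernel):
    """Sketch of the CSI encoder: one 3x3 convolution producing feature
    maps, a reshape, and a fully connected layer compressing the N channel
    coefficients into an M-dimensional vector s."""
    rows, cols = H.shape
    padded = np.pad(H, 1)                      # zero padding for 'same' output
    feats = []
    for k in kernel:                           # kernel has shape (2, 3, 3)
        fmap = np.empty_like(H)
        for i in range(rows):
            for j in range(cols):
                fmap[i, j] = np.sum(padded[i:i + 3, j:j + 3] * k)
        feats.append(np.maximum(fmap, 0))      # ReLU activation (assumed)
    x = np.concatenate([f.ravel() for f in feats])  # reshape step
    return W_fc @ x                            # fully connected compression

rng = np.random.default_rng(0)
N = 32 * 32                    # channel matrix with N coefficients
M = 64                         # compressed CSI dimension, M < N
gamma = M / N                  # compression ratio γ = M/N
H = rng.standard_normal((32, 32))
kernel = rng.standard_normal((2, 3, 3)) * 0.1
W_fc = rng.standard_normal((M, 2 * N)) * 0.01  # 2 feature maps of N elements
s = csi_encode(H, W_fc, kernel)
print(s.shape, gamma)          # (64,) 0.0625
```

With these illustrative sizes, 1024 channel coefficients are compressed into a 64-dimensional vector, giving γ = 1/16.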
On the other hand, the CSI decoder 110 may perform the inverse transformation from the vector s back to the channel matrix H over several layers using a CNN. The first layer of the CSI decoder 110 may be a fully connected layer that provides an initial estimate of the channel matrix H. This initial estimate can then be fed to several "RefineNet units", each comprising four layers, which progressively refine the reconstruction. After refinement by the RefineNet units, the result is passed to a final convolutional layer to derive the reconstructed channel matrix H.
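The decoder side can be sketched in the same style: a fully connected layer forms an initial estimate of H, then successive refinement units improve it. The refinement units below are simplified to small residual blocks of dense layers purely to keep the sketch short; the actual units described above use four convolutional layers each, and all weight values here are placeholders.

```python
import numpy as np

def refine_unit(h, weights):
    """One refinement unit, sketched as a residual block of dense layers
    (a stand-in for the four-layer RefineNet units described above)."""
    x = h.ravel()
    for W in weights:
        x = np.maximum(W @ x, 0.0)             # ReLU after each layer
    return h + x.reshape(h.shape)              # residual connection

def csi_decode(s, W_init, units, shape):
    """Inverse transform: fully connected layer gives an initial estimate
    of H, then successive refinement units refine it."""
    h = (W_init @ s).reshape(shape)            # initial estimate of H
    for unit in units:
        h = refine_unit(h, unit)
    return h

rng = np.random.default_rng(1)
M, shape = 64, (16, 16)
N = shape[0] * shape[1]
s = rng.standard_normal(M)                     # compressed CSI from encoder
W_init = rng.standard_normal((N, M)) * 0.05
units = [[rng.standard_normal((N, N)) * 0.01 for _ in range(4)]
         for _ in range(2)]                    # two units, four layers each
H_hat = csi_decode(s, W_init, units, shape)
print(H_hat.shape)                             # (16, 16)
```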
After the end-to-end training process based on sufficient channel samples, two sets of parameters related to CSI encoder 105 and CSI decoder 110 may be jointly determined and used jointly to achieve satisfactory CSI feedback and recovery capability. The end-to-end NN training process may be implemented on the gNB side or the UE side based on a large number of channel samples. Due to the reciprocity of the bidirectional channels in TDD deployment, a large amount of NN training data is easily obtained. However, obtaining a large amount of channel measurement data for NN training in FDD deployments is a significant challenge.
Example embodiments of the present disclosure propose a CSI feedback mechanism to feed back different types of CSI, where one type of CSI has high precision and another type has low precision. The high-precision CSI may be obtained by any existing or future-developed method for improving CSI feedback precision, and is used to train a machine learning model for CSI feedback. Various types of Machine Learning (ML) models may be used here, including, but not limited to, logistic regression models, Neural Networks (NNs) such as convolutional NNs and recurrent NNs, support vector machines, random forests, and the like. The trained ML model may then be used to generate low-precision CSI to improve CSI feedback efficiency.
The training of the ML model may be implemented by a terminal device, such as a UE, or a network device, such as a gNB. Some example embodiments of the present disclosure will be discussed in the context of a network device performing training. Training at the network device may provide further benefits since the network device has greater NN processing power and has easier access to channel measurement data for multiple UEs.
FIG. 2 illustrates an example environment 200 in which example embodiments of the present disclosure may be implemented.
The environment 200, which may be part of a communication network, includes two devices 210 and 220, referred to as a first device 210 and a second device 220, respectively, in communication with each other. In this example, the first device 210 is implemented by a terminal device, such as a UE, and the second device 220 is implemented by a network device, such as a base station. However, the first device 210 and the second device 220 may be implemented by any other suitable devices. For example, both may be implemented by terminal devices that communicate via a device-to-device (D2D) link or sidelink.
The communication between the first device 210 and the second device 220 may conform to any suitable communication standard or protocol that already exists or is developed in the future, such as the Universal Mobile Telecommunications System (UMTS), Long Term Evolution (LTE), LTE-Advanced (LTE-A), fifth generation (5G) New Radio (NR), wireless fidelity (Wi-Fi), and Worldwide Interoperability for Microwave Access (WiMAX) standards, and may employ any suitable communication techniques, including, for example, multiple-input multiple-output (MIMO), Orthogonal Frequency Division Multiplexing (OFDM), Time Division Multiplexing (TDM), Frequency Division Multiplexing (FDM), Code Division Multiplexing (CDM), Bluetooth, ZigBee, Machine Type Communication (MTC), enhanced Machine Type Communication (eMTC), enhanced Mobile Broadband (eMBB), massive Machine Type Communication (mMTC), Ultra-Reliable Low Latency Communication (URLLC), Carrier Aggregation (CA), Dual Connectivity (DC), and New Radio Unlicensed (NR-U) techniques.
The first device 210 may feed back CSI to the second device 220. In various embodiments of the present disclosure, the first device 210 may report different types of CSI, including high-precision CSI and low-precision CSI. The second device 220 trains the ML model using the high-precision CSI to obtain a set of parameters for generation of the CSI and a set of parameters for reconstruction of the CSI. These two sets of parameters will be used for CSI generation and reconstruction to improve CSI feedback efficiency.
It should be understood that two devices are shown in environment 200 for illustration purposes only and do not imply any limitations. In some example embodiments, there may be more devices feeding back CSI to the second device 220. Thus, the second device 220 may obtain more CSI from more devices and thereby improve training efficiency.
Fig. 3 illustrates an example process 300 between a first device 210 and a second device 220, according to some example embodiments of the present disclosure.
In process 300, the first device 210 determines (305) the type of CSI to be fed back. The type of CSI may be high-precision CSI or low-precision CSI. In some example embodiments, feedback instances or opportunities for particular types of CSI may be predefined or preconfigured in the network. In this example, the first device 210 may determine high-precision or low-precision CSI to be fed back in the respective feedback instance or opportunity.
In some example embodiments, the first device 210 may determine the type of CSI to be fed back based on an indication from the second device 220. For example, the second device 220 may send, to the first device 210 and other devices, an indication of the type of CSI to be fed back. The indication may be sent for each feedback instance or opportunity, to indicate the type of CSI to be fed back at that opportunity. An example process of indicating the type of CSI to be fed back will be discussed below with reference to fig. 4.
Fig. 4 illustrates an example configuration 400 for feedback of two types of CSI in accordance with some example embodiments of the present disclosure.
The boxes 405-1, …, 405-K shown represent a continuous time sequence, where K is a positive integer. In configuration 400, instances 405-1 and 405-9 are configured as feedback instances for high-precision CSI, while instances 405-3, 405-5, 405-7, and 405-11 are configured as feedback instances for low-precision CSI. In this example, the two types of CSI may be fed back periodically (or aperiodically) at unequal intervals: the feedback interval for high-precision CSI is longer than that for low-precision CSI. Although high-precision CSI incurs higher feedback overhead per report, it is fed back less frequently, so the overall feedback overhead can be kept within a reasonable range.
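The interleaved schedule of Fig. 4 can be expressed as two periodicities, one long and one short. The following sketch reproduces the pattern of the figure; the specific periods are read off the example instance indices and are illustrative, not drawn from any specification.

```python
# Sketch of the Fig. 4 schedule: high-precision CSI on a long period,
# low-precision CSI on a short period. Periods chosen to match the
# example instances (405-1, 405-9 high; 405-3, 405-5, 405-7, 405-11 low).
HIGH_PERIOD = 8   # high-precision CSI every 8th instance
LOW_PERIOD = 2    # low-precision CSI every 2nd instance

def csi_type_for_instance(k):
    """Return which type of CSI, if any, is fed back at instance k (1-based).
    High-precision instances take priority over low-precision ones."""
    if (k - 1) % HIGH_PERIOD == 0:
        return "high-precision"
    if (k - 1) % LOW_PERIOD == 0:
        return "low-precision"
    return None

schedule = {k: csi_type_for_instance(k) for k in range(1, 12)}
print(schedule[1], schedule[3], schedule[9])   # high-precision low-precision high-precision
```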
Thus, the second device 220 needs to indicate to one or more devices, including the first device 210, the type of CSI to be fed back for each feedback instance. In example embodiments where the first device 210 is implemented by a terminal device and the second device 220 is implemented by a network device, a new parameter or field may be defined in Downlink Control Information (DCI) to carry the indication. The new DCI parameter may have only 1 bit to indicate both types of CSI, including high-precision and low-precision CSI. For example, "1" indicates feedback of high-precision CSI, and "0" indicates feedback of low-precision CSI. Alternatively or additionally, the indication may be carried in other signaling such as Radio Resource Control (RRC) signaling.
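A 1-bit field of this kind can be manipulated with simple bit operations. The sketch below is purely illustrative: the bit position within the DCI payload is an assumption, not a defined field of any DCI format.

```python
# Hypothetical handling of the 1-bit CSI-type field in an integer DCI
# payload: "1" = high-precision CSI, "0" = low-precision CSI.
CSI_TYPE_BIT = 0  # assumed bit position, for illustration only

def set_csi_type(dci: int, high_precision: bool) -> int:
    """Set or clear the assumed CSI-type bit in the DCI payload."""
    if high_precision:
        return dci | (1 << CSI_TYPE_BIT)
    return dci & ~(1 << CSI_TYPE_BIT)

def csi_type(dci: int) -> str:
    """Decode the assumed CSI-type bit back to a CSI type."""
    return "high-precision" if (dci >> CSI_TYPE_BIT) & 1 else "low-precision"

dci = set_csi_type(0b0000, high_precision=True)
print(csi_type(dci))   # high-precision
```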
Still referring to fig. 3, after the first device 210 determines (305) the type of CSI to be fed back, the first device 210 generates (310) that type of CSI. If high-precision CSI is to be fed back, the first device 210 generates (310) CSI whose precision is higher than a predetermined threshold precision. Any suitable method may be employed to improve the CSI feedback precision.
In some example embodiments, the first device 210 may quantize the channel measurement data with a number of bits higher than a threshold number of bits and then form the high-precision CSI from the quantized channel measurement data. The threshold number may be predefined or preconfigured. In some example embodiments, the threshold number may be predefined or preconfigured for the entire payload of CSI. In some other example embodiments, where the channel measurement data is represented by one or more channel matrices, the threshold number may be predefined or preconfigured for the channel elements in the channel matrices. In this case, the first device 210 may quantize a larger number of channel elements or even each channel element in the channel matrix with a number of bits greater than the threshold number of bits. For example, 5 to 6 bits may be used for quantization of amplitude and phase or real and imaginary parts in channel elements. The first device 210 then generates high-precision CSI from the quantized channel elements.
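As an illustrative sketch of the per-element quantization described above, the following quantizes the real and imaginary parts (alternatively, amplitude and phase) of each channel element with a configurable number of bits per component, showing how the bit budget trades payload size against precision. The normalized value range and uniform quantizer are assumptions for the sketch.

```python
import numpy as np

def quantize_channel(H, bits_per_component):
    """Per-element uniform quantization of a complex channel matrix.
    Each element's real and imaginary parts get `bits_per_component` bits;
    the [-1, 1] range is an assumed normalization."""
    levels = 2 ** bits_per_component

    def q(x, lo=-1.0, hi=1.0):
        idx = np.clip(np.round((x - lo) / (hi - lo) * (levels - 1)),
                      0, levels - 1)
        return lo + idx / (levels - 1) * (hi - lo)

    payload_bits = 2 * bits_per_component * H.size  # 2 components/element
    return q(H.real) + 1j * q(H.imag), payload_bits

rng = np.random.default_rng(2)
H = rng.uniform(-1, 1, (4, 4)) + 1j * rng.uniform(-1, 1, (4, 4))
Hq6, bits6 = quantize_channel(H, 6)   # high precision: 6 bits/component
Hq2, bits2 = quantize_channel(H, 2)   # coarse: 2 bits/component
print(bits6, bits2)                   # 192 64
```

For a 4 × 4 channel matrix, 6 bits per component costs 192 payload bits versus 64 bits at 2 bits per component, with a correspondingly smaller quantization error.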
The first device 210 then transmits (315) the generated CSI to the second device 220. Accordingly, the second device 220 performs (320) operations based on the type of CSI received. If the second device 220 has sent an indication to the first device 210 for feedback of a certain type of CSI, such as through DCI or RRC signaling, the second device 220 may know the type of CSI received.
If high-precision CSI is received, the second device 220 trains the ML model using at least the received CSI. In some example embodiments, the second device 220 uses the newly received CSI together with previously received and stored CSI to further improve the training efficiency of the ML model. After training the ML model, the second device 220 obtains a set of parameters for generation of CSI (referred to as the first set) and a set of parameters for reconstruction of CSI (referred to as the second set). The parameters may be associated with any suitable characteristics of the ML model, including, for example, the architecture of the ML model and the encoding and decoding weight coefficients. Because of the end-to-end training, these two sets of parameters are applied jointly for CSI generation and reconstruction (or recovery).
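A minimal sketch of the end-to-end idea: a linear autoencoder trained on channel samples, standing in for the CNN of Fig. 1. The encoder matrix plays the role of the first set of parameters (to be sent to the first device for CSI generation) and the decoder matrix the second set (kept for reconstruction). All sizes, the synthetic training data, and the plain gradient-descent loop are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
N, M, n_samples = 64, 8, 500
# Correlated synthetic samples standing in for high-precision CSI reports
X = rng.standard_normal((n_samples, N)) @ rng.standard_normal((N, N)) * 0.1

W_enc = rng.standard_normal((M, N)) * 0.1   # first set: encoder parameters
W_dec = rng.standard_normal((N, M)) * 0.1   # second set: decoder parameters

def mse(W_enc, W_dec):
    """Reconstruction error of encode-then-decode over all samples."""
    E = X @ W_enc.T @ W_dec.T - X
    return float(np.mean(E ** 2))

loss_init = mse(W_enc, W_dec)
lr = 1e-3
for _ in range(200):                        # joint (end-to-end) training
    S = X @ W_enc.T                         # compress: CSI generation
    E = S @ W_dec.T - X                     # reconstruct and compare
    W_dec -= lr * (E.T @ S) / n_samples     # gradient step for decoder
    W_enc -= lr * (W_dec.T @ E.T @ X) / n_samples  # gradient step for encoder
loss_final = mse(W_enc, W_dec)
print(loss_final < loss_init)               # joint training reduced the MSE
```

After training, W_enc would be sent to the first device while W_dec stays at the second device, mirroring how the two parameter sets are split between CSI generation and reconstruction.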
The high-precision CSI may also be used by the second device 220 for other purposes. For example, the second device 220 may use the high-precision CSI for multi-user MIMO (MU-MIMO) scheduling and precoding to improve communication efficiency.
In some example embodiments, the second device 220 may send a first set of parameters to the first device 210 such that the first device 210 may generate CSI (such as low-precision CSI) using the first set of parameters. The first set of parameters may be sent by the second device 220 to the first device 210 via RRC signaling.
If it is determined (305) that low-precision CSI is to be fed back, the first device 210 may generate the CSI using the first set of parameters. For example, where the ML model includes an NN, the first device 210 may configure the NN encoder using the first set of parameters. The first device 210 may then transform and compress the channel matrix (representing the channel measurement data) with the NN encoder, and quantize the result with fewer bits to generate the low-precision CSI. The low-precision CSI may be transmitted (315) by the first device 210 to the second device 220 with a smaller payload than the high-precision CSI. For example, the payload may contain a number of bits below a predefined or preconfigured threshold number of bits.
It is also possible to generate low-precision CSI without using the ML model. In this case, the first device 210 may quantize the channel measurement data with a number of bits lower than the threshold number of bits and form the low-precision CSI from the quantized channel measurement data.
After receiving the low-precision CSI, the second device 220 may reconstruct the CSI based on the received CSI using the second set of parameters. For example, the second device 220 may configure the NN decoder using the second set of parameters and perform an inverse transform by the NN decoder to reconstruct or recover the original channel matrix from the received CSI. The reconstructed or recovered channel matrix may be used for MU-MIMO scheduling and precoding.
In some example embodiments, the ML model may be trained by the second device for each channel scenario, to obtain more than one set of parameters for CSI generation and more than one set of parameters for CSI reconstruction, further improving CSI feedback efficiency. In these example embodiments, the first device 210 may identify the channel scenario, such as urban micro (UMi), urban macro (UMa), indoor, a mixed type, and so on. The first device 210 may then transmit an indication of the identified channel scenario to the second device 220. The indication of the channel scenario may take any suitable form. For example, the indication may be implemented as an index of the channel type. In practice, the indication may also include a specific value (e.g., 0) that is not associated with any channel type but instead indicates that CSI generation and reconstruction do not use the ML model.
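One possible encoding of such a channel-type index is sketched below. The specific index values are hypothetical; the text only suggests that a reserved value (e.g., 0) marks the no-ML case.

```python
# Hypothetical channel-type index mapping for the scenario indication.
# Index 0 is reserved to mean "CSI generated/reconstructed without the ML model".
CHANNEL_TYPE_INDEX = {
    "no_ml": 0,
    "UMi": 1,     # urban micro
    "UMa": 2,     # urban macro
    "indoor": 3,
    "mixed": 4,
}

def scenario_indication(scenario: str) -> int:
    """Return the index the first device would report for a scenario."""
    return CHANNEL_TYPE_INDEX[scenario]
```

The second device would use the received index to select the ML model (and parameter sets) trained for that scenario, or skip the ML path entirely for index 0.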
Thus, if high-precision CSI is received, the second device 220 performs training of the ML model for the particular channel scenario, and the first and second sets of parameters are likewise generated for that particular channel scenario. If low-precision CSI is received, the second device 220 may construct a decoder for the particular channel scenario.
In these example embodiments, the second device 220 may transmit multiple sets of parameters for multiple ML models and indications of associated channel scenarios such that the first device 210 may generate CSI using the parameters associated with the current channel scenario.
An ML model may also be trained by the second device 220 for a plurality of similar channel scenarios. Alternatively or additionally, the second device 220 may train a generic ML model for various channel scenarios.
Fig. 5 illustrates an example process 500 between the first device 210 and the second device 220, according to some other example embodiments of the present disclosure. Process 500 is an example implementation of process 300 as shown in fig. 3. In this example, the first device 210 is implemented by a terminal device and the second device 220 is implemented by a network device. The ML model is implemented by an NN, and the type of CSI to be fed back is indicated by a 1-bit DCI parameter.
In the process 500, the second device 220 indicates (505) the different channel scenarios and the related sets of parameters for the NN encoder to the first device 210 through upper-layer RRC signaling. Different channel scenarios may be indicated by different NN types. The second device 220 indicates (510) the type of CSI to be fed back to the first device 210 and other devices, such as other terminal devices, through a new 1-bit DCI parameter, where "1" indicates feedback of high-precision CSI and "0" indicates feedback of low-precision CSI.
The first device 210 identifies (515) the current channel scenario, such as UMi, UMa, indoor, a mixed type, etc. The first device 210 may also quantize the identified channel scenario into an index of the channel type to be indicated to the second device 220.
The first device 210 performs (520) CSI quantization. If the DCI parameter is set to 1, the first device 210 may quantize the channel elements with a greater number of bits than usual to generate the high-precision CSI. If the DCI parameter is set to 0, the first device 210 may configure the NN encoder according to the first set of parameters indicated in the RRC signaling and then transform and compress the channel matrix (such as a DL channel matrix) into fewer channel elements through the NN encoder. The first device 210 may then quantize the channel elements in the compressed channel matrix, resulting in a smaller payload than the high-precision CSI.
The first device 210 reports (525) the CSI, together with the index of the channel type as an indication of the current channel scenario, to the second device 220. The second device 220 processes (530) the received CSI. When the DCI parameter is set to 1, the second device 220 receives the high-precision CSI report and the channel type index as the channel scenario indication. The second device 220 may then perform NN training using the channel samples included in the newly received CSI together with previously stored channel samples from previously received CSI with the same or a similar channel scenario.
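This training step can be illustrated with a toy example. A plain linear autoencoder trained by gradient descent stands in for the (unspecified) NN architecture, and the dimensions, learning rate, and iteration count below are arbitrary choices, not values from the patent.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy training set: 200 "high-precision" channel samples, flattened.
N_IN, N_CODE, N_SAMPLES = 64, 8, 200
X = rng.standard_normal((N_SAMPLES, N_IN))

# Encoder weights = "first set of parameters", decoder = "second set".
W_enc = rng.standard_normal((N_CODE, N_IN)) * 0.1
W_dec = rng.standard_normal((N_IN, N_CODE)) * 0.1

lr = 0.01
for _ in range(500):
    code = X @ W_enc.T            # encode each sample
    X_hat = code @ W_dec.T        # decode / reconstruct
    err = X_hat - X               # reconstruction error
    # Gradient-descent updates minimizing the mean squared error.
    g_dec = err.T @ code / N_SAMPLES
    g_enc = (err @ W_dec).T @ X / N_SAMPLES
    W_dec -= lr * g_dec
    W_enc -= lr * g_enc

mse_final = float(np.mean((X @ W_enc.T @ W_dec.T - X) ** 2))
mse_baseline = float(np.mean(X ** 2))
```

After training, the second device would send `W_enc` (the first set of parameters) to the first device and keep `W_dec` (the second set) for reconstruction.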
When the DCI parameter is set to 0, the second device 220 may receive the low-precision CSI and an index of a channel type as an indication of a channel scenario. The second device 220 may configure the NN decoder using the second set of parameters according to the index of the channel type and perform inverse transformation through the NN decoder and reconstruct or restore the original channel matrix from the received CSI. The second device 220 may also use the recovered channel matrix for MU-MIMO scheduling and precoding.
The CSI feedback mechanism significantly improves CSI feedback efficiency. For example, simulation results show that high-precision CSI improves end-to-end CSI training efficiency. The simulation was implemented in a Long Term Evolution (LTE) three-dimensional (3D) UMa scenario. A large number of channel samples were generated to form the training set. All channel samples have 16 Tx antennas and 1 Rx antenna in the 2 GHz band, with 600 active subcarriers within a 10 MHz bandwidth.
Frequency domain compression is performed based on a Discrete Fourier Transform (DFT) operation and 32 main taps are selected for each channel sample representing a channel measurement. All channel samples of dimension 32 × 16 in the training and validation sets are used to train the full parameter set in the encoder and decoder of the NN architecture. The channel samples in the test set are used to evaluate the performance or capability of channel reconstruction using the trained parameter set.
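The DFT-based frequency-domain compression can be sketched as follows. The assumption that the 32 "main" taps are the leading delay-domain taps after an inverse FFT is mine; the synthetic random channel below has no real delay-domain structure and only illustrates the shapes involved.

```python
import numpy as np

rng = np.random.default_rng(1)

# Per the simulation description: 600 subcarriers, 16 Tx antennas, 32 taps kept.
N_SUBCARRIERS, N_TX, N_TAPS = 600, 16, 32

# Synthetic frequency-domain channel: 600 subcarriers x 16 Tx antennas.
H_freq = (rng.standard_normal((N_SUBCARRIERS, N_TX))
          + 1j * rng.standard_normal((N_SUBCARRIERS, N_TX)))

# IFFT along the subcarrier axis converts the response to delay-domain taps.
# For realistic channels the energy concentrates in the first taps, so keeping
# the leading 32 taps compresses each sample to a 32 x 16 matrix.
H_delay = np.fft.ifft(H_freq, axis=0)
H_compressed = H_delay[:N_TAPS, :]
```

The resulting 32 x 16 matrices are exactly the sample dimension quoted for the training, validation, and test sets.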
The channel recovery capability is evaluated in terms of two metrics: the Normalized Mean Square Error (NMSE) and the channel correlation coefficient ρ. The NMSE may be defined by equation (1) as follows:

$$\mathrm{NMSE} = E\left\{\frac{\|H - \hat{H}\|_2^2}{\|H\|_2^2}\right\} \qquad (1)$$

where $\hat{H}$ represents the reconstructed channel matrix and $H$ represents the original channel sample in the test set. The channel correlation coefficient ρ may be defined by equation (2):

$$\rho = E\left\{\frac{1}{N}\sum_{n=1}^{N}\frac{|\hat{h}_n^{H} h_n|}{\|\hat{h}_n\|_2\,\|h_n\|_2}\right\} \qquad (2)$$

where $\hat{h}_n$ represents the reconstructed channel vector for subcarrier $n$, $h_n$ represents the original channel vector for subcarrier $n$ in the test set, and $N$ is the number of subcarriers.
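Equations (1) and (2) can be computed directly, for example as in the following numpy sketch, where the rows of each matrix are taken as the per-subcarrier channel vectors (an illustrative convention, not one fixed by the text).

```python
import numpy as np

def nmse_db(H, H_hat):
    """NMSE of equation (1): ||H - H_hat||^2 / ||H||^2, expressed in dB."""
    err = np.linalg.norm(H - H_hat) ** 2 / np.linalg.norm(H) ** 2
    return 10.0 * np.log10(err)

def correlation_rho(H, H_hat):
    """Correlation coefficient of equation (2), averaged over subcarriers (rows)."""
    num = np.abs(np.sum(np.conj(H_hat) * H, axis=1))          # |h_hat_n^H h_n|
    den = np.linalg.norm(H_hat, axis=1) * np.linalg.norm(H, axis=1)
    return float(np.mean(num / den))

# Synthetic example: a reconstruction with ~1% relative noise power.
rng = np.random.default_rng(2)
H = rng.standard_normal((32, 16)) + 1j * rng.standard_normal((32, 16))
H_hat = H + 0.1 * (rng.standard_normal((32, 16)) + 1j * rng.standard_normal((32, 16)))

nmse = nmse_db(H, H_hat)        # roughly -20 dB for 1% noise power
rho = correlation_rho(H, H_hat) # close to 1 for a good reconstruction
```

By the Cauchy-Schwarz inequality, ρ is bounded above by 1, with equality when each reconstructed vector is a scaled copy of the original.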
Table 1 compares different quantization methods applied to the channel samples in the training and validation sets that are used as input for NN training.
TABLE 1

| Quantization of input samples | NMSE (dB) | Correlation ρ |
|---|---|---|
| No quantization | -11.46 | 0.96 |
| 6-bit quantization | -10.08 | 0.95 |
| 5-bit quantization | -7.82 | 0.92 |
The comparison quantizes the real and imaginary parts of each channel element with 5 or 6 bits each. The baseline uses the raw channel samples in the training and validation sets without quantization. The compression ratio γ of the NN training is set to the same fixed value in all cases.
As shown in Table 1, compared to unquantized input samples, 6-bit quantization of the input channel samples incurs an NMSE loss of only 1.38 dB, and the channel correlation capability is almost the same. Further reducing the CSI feedback precision to 5-bit quantization may, however, degrade the performance of NN training.
Fig. 6 illustrates a flow diagram of an example method 600, according to some example embodiments of the present disclosure. The method 600 may be implemented by the first device 210 shown in fig. 2. For ease of discussion, the method 600 will be described with reference to fig. 2.
At block 605, the first device 210 determines the type of CSI to be fed back. The types of CSI may include high-precision CSI and low-precision CSI. At block 610, the first device 210 generates the type of CSI. At block 615, the first device 210 sends the generated CSI to the second device 220.
In some example embodiments, the first device 210 may receive an indication of feedback for this type of CSI from the second device 220. Based on the indication, the first device 210 may determine the type of CSI to feedback. The type of CSI to be fed back may also be determined according to a predefined configuration of CSI feedback.
In some example embodiments, if high-precision CSI is to be fed back, the first device 210 may quantize the channel measurement data with a number of bits higher than a predefined or preconfigured threshold number of bits and then form the high-precision CSI from the quantized channel measurement data.
In some example embodiments, the first device 210 may receive at least a first set of parameters for the generation of CSI from the second device 220. The first device 210 may also receive an indication of a channel scenario associated with the first set of parameters from the second device 220. In the case where the low-precision CSI is to be fed back, the first device 210 may generate the low-precision CSI having the number of bits lower than the threshold number of bits based on the channel measurement data using the first set of parameters. For example, the first device 210 may transform the channel measurement data into low-precision CSI by using the first set of parameters. The channel measurement data may be represented in any suitable form. As an example, the channel measurement data may be represented by a plurality of channel elements in one or more channel matrices.
In some example embodiments, the first device 210 may identify a channel scenario associated with channel measurement data used to generate CSI and then send an indication of the identified channel scenario to the second device 220.
Fig. 7 illustrates a flow diagram of an example method 700 in accordance with some example embodiments of the present disclosure. The method 700 may be implemented by the second device 220 shown in fig. 2. For discussion purposes, the method 700 will be described with reference to fig. 2.
At block 705, the second device 220 receives CSI from the first device 210. In some example embodiments, the second device 220 may send an indication of feedback for this type of CSI to the first device 210 to indicate the type of CSI that the first device 210 feeds back. At block 710, if the type of received CSI comprises high-precision CSI, the second device 220 trains a machine learning model using at least the received CSI to obtain at least a first set of parameters for generation of CSI and at least a second set of parameters for reconstruction of CSI.
In some example embodiments, the second device 220 may train the machine learning model using the received CSI and previously stored CSI. In some example embodiments, the second device 220 may receive an indication of a channel scenario associated with the received CSI from the first device 210. Thus, the received CSI may be used to train an ML model associated with the channel scene.
At block 715, the second device 220 transmits a first set of parameters to the first device 210 so that the first device 210 may generate CSI using the first set of parameters. In some example embodiments, the second device 220 may send an indication of a channel scenario associated with the first set of parameters to the first device 210.
In some example embodiments, if the type of received CSI comprises low-precision CSI, the second device 220 may reconstruct the CSI based on the received CSI using a second set of parameters. In example embodiments where the indication of the channel scenario is received from the first device 210, the second device 220 may use the set of parameters associated with the channel scenario for CSI reconstruction or recovery.
All operations and features in processes 300 and 500 as described above with reference to fig. 3-5 are equally applicable to methods 600 and 700 and have similar effects. Details will be omitted for simplicity.
Fig. 8 is a simplified block diagram of a device 800 suitable for implementing an example embodiment of the present disclosure. The device 800 may be implemented at or as part of the first device 210 or the second device 220 as shown in fig. 2.
As shown, the device 800 includes a processor 810, a memory 820 coupled to the processor 810, a communication module 830 coupled to the processor 810, and a communication interface (not shown) coupled to the communication module 830. The memory 820 stores at least a program 840. The communication module 830 is used for bi-directional communication, e.g., via multiple antennas. The communication interface may represent any interface required for communication.
The program 840 is assumed to include program instructions that, when executed by the associated processor 810, enable the device 800 to operate in accordance with example embodiments of the present disclosure, as discussed herein with reference to fig. 2-7. The example embodiments herein may be implemented by computer software executable by the processor 810 of the device 800, or by hardware, or by a combination of software and hardware. The processor 810 may be configured to implement various example embodiments of the present disclosure.
The memory 820 may be of any type suitable to the local technology network and may be implemented using any suitable data storage technology, such as non-transitory computer-readable storage media, semiconductor-based memory devices, magnetic memory devices and systems, optical memory devices and systems, fixed memory and removable memory, as non-limiting examples. Although only one memory 820 is shown in device 800, there may be several physically distinct memory modules in device 800. The processor 810 may be of any type suitable to the local technology network, and may include one or more of general purpose computers, special purpose computers, microprocessors, Digital Signal Processors (DSPs) and processors based on a multi-core processor architecture, as non-limiting examples. Device 800 may have multiple processors, such as an application-specific integrated circuit chip that is slaved in time to a clock which synchronizes the main processor.
When the device 800 is acting as the first device 210 or part of the first device 210, the processor 810 and the communication module 830 may cooperate to implement the method 600 as described above with reference to fig. 6. When the device 800 is acting as the second device 220 or part of the second device 220, the processor 810 and the communication module 830 may cooperate to implement the method 700 as described above with reference to fig. 7. All of the operations and features described above with reference to fig. 2-7 are equally applicable to the device 800 and have similar effects. Details will be omitted for simplicity.
In general, the various example embodiments of this disclosure may be implemented using hardware or special purpose circuits, software, logic or any combination thereof. Some aspects may be implemented using hardware, while other aspects may be implemented using firmware or software which may be executed by a controller, microprocessor or other computing device. While various aspects of the example embodiments of this disclosure are illustrated and described as block diagrams, flow charts, or using some other pictorial representation, it is well understood that the blocks, apparatus, systems, techniques or methods described herein may be implemented in, as non-limiting examples, hardware, software, firmware, special purpose circuits or logic, general purpose hardware or controller or other computing devices, or some combination thereof.
The present disclosure also provides at least one computer program product tangibly stored on a non-transitory computer-readable storage medium. The computer program product comprises computer-executable instructions, such as instructions included in program modules, that execute in the device on the target real or virtual processor to perform the processes 300 and 500 and methods 600 and 700 as described above with reference to fig. 2-7. Generally, program modules include routines, programs, libraries, objects, classes, components, data structures, etc. that perform particular tasks or implement particular abstract data types. In various embodiments, the functionality of the program modules may be combined or split between program modules as desired. Machine-executable instructions of program modules may be executed within local or distributed devices. In a distributed facility, program modules may be located in both local and remote memory storage media.
Program code for performing the methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present disclosure, computer program code or related data may be carried by any suitable carrier to enable a device, apparatus or processor to perform the various processes and operations described above. Examples of the carrier include a signal and a computer-readable medium.
The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a computer-readable storage medium would include an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
Further, while operations are described in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In some cases, multitasking and parallel processing may be advantageous. Also, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure, but rather as descriptions of features that may be specific to particular example embodiments. Certain features that are described in the context of separate example embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple example embodiments separately or in any suitable subcombination.
Although the disclosure has been described in language specific to structural features and/or methodological acts, it is to be understood that the disclosure defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
Various example embodiments of the technology have been described. In addition to or in place of the foregoing, the following embodiments are described. Features described in any of the examples below may be used with any of the other examples described herein.
In some aspects, a first device comprises: at least one processor; and at least one memory including computer program code; the at least one memory and the computer program code configured to, with the at least one processor, cause the first apparatus to: determining a type of channel state information to be fed back; generating channel state information of the type; and transmitting the generated channel state information to the second device.
In some example embodiments, the first device is caused to determine the type of channel state information to be fed back by: receiving an indication of feedback of channel state information of a type from a second device; and determining a type of channel state information to be fed back based on the indication.
In some example embodiments, the type of channel state information comprises high accuracy channel state information, and the first device is caused to generate the type of channel state information by: quantizing the channel measurement data with a number of bits higher than a threshold number of bits; and forming high accuracy channel state information from the quantized channel measurement data.
In some example embodiments, the first device is further caused to: at least a first set of parameters for generation of channel state information is received from a second device.
In some example embodiments, the first device is further caused to: an indication of a channel scenario associated with the first set of parameters is received from the second device.
In some example embodiments, the type of channel state information comprises low precision channel state information, and the first device is further caused to generate the type of channel state information by: the channel measurement data is transformed into low precision channel state information having a number of bits lower than a threshold number of bits by using a first set of parameters.
In some example embodiments, the first device is further caused to: identifying a channel scenario associated with channel measurement data used to generate channel state information; and transmitting an indication of the identified channel scenario to the second device.
In some aspects, a second device comprises: at least one processor; and at least one memory including computer program code; the at least one memory and the computer program code configured to, with the at least one processor, cause the second apparatus to: receiving channel state information from a first device; training a machine learning model by using at least the received channel state information to obtain at least a first set of parameters for generation of the channel state information and at least a second set of parameters for reconstruction of the channel state information according to a type of the received channel state information including high-precision channel state information; and sending the first set of parameters to the first device.
In some example embodiments, the second device is further caused to: an indication of feedback of the type of channel state information is sent to the first device.
In some example embodiments, the second device is caused to train the machine learning model by: the machine learning model is trained using the received channel state information and previously stored channel state information.
In some example embodiments, the second device is further caused to: an indication of a channel scenario associated with the first set of parameters is sent to the first device.
In some example embodiments, the second device is further caused to: reconstructing channel state information based on the received channel state information using a second set of parameters according to a type of the received channel state information including the low precision channel state information.
In some example embodiments, the second device is further caused to: an indication of a channel scenario associated with the received channel state information is received from the first device.
In some aspects, a method implemented at a first device includes: determining a type of channel state information to be fed back; generating channel state information of the type; and transmitting the generated channel state information to the second device.
In some example embodiments, determining the type of channel state information to feedback comprises: receiving an indication of feedback of channel state information of a type from a second device; and determining the type of channel state information to be fed back based on the indication.
In some example embodiments, the type of channel state information comprises high precision channel state information, and generating the type of channel state information comprises: quantizing the channel measurement data with a number of bits higher than a threshold number of bits; and forming high accuracy channel state information from the quantized channel measurement data.
In some example embodiments, the method further comprises: at least a first set of parameters for generation of channel state information is received from a second device.
In some example embodiments, the method further comprises: an indication of a channel scenario associated with the first set of parameters is received from the second device.
In some example embodiments, the type of channel state information comprises low precision channel state information, and generating the type of channel state information comprises: the channel measurement data is transformed into low precision channel state information having a number of bits lower than a threshold number of bits by using a first set of parameters.
In some example embodiments, the method further comprises: identifying a channel scenario associated with channel measurement data used to generate channel state information; and transmitting an indication of the identified channel scenario to the second device.
In some aspects, a method implemented at a second device includes: receiving channel state information from a first device; training a machine learning model by using at least the received channel state information to obtain at least a first set of parameters for generation of the channel state information and at least a second set of parameters for reconstruction of the channel state information according to a type of the received channel state information including high-precision channel state information; and sending the first set of parameters to the first device.
In some example embodiments, the method further comprises: an indication of feedback of the type of channel state information is sent to the first device.
In some example embodiments, training the machine learning model comprises: the machine learning model is trained using the received channel state information and previously stored channel state information.
In some example embodiments, the method further comprises: an indication of a channel scenario associated with the first set of parameters is sent to the first device.
In some example embodiments, the method further comprises: reconstructing channel state information based on the received channel state information using a second set of parameters according to a type of the received channel state information including the low precision channel state information.
In some example embodiments, the method further comprises: an indication of a channel scenario associated with the received channel state information is received from the first device.
In some aspects, an apparatus comprises: means for determining a type of channel state information to be fed back; means for generating channel state information for a type; and means for transmitting the generated channel state information to the second device.
In some example embodiments, the means for determining the type of channel state information to feedback comprises: means for receiving an indication of feedback of channel state information of a type from a second device; and means for determining a type of channel state information to be fed back based on the indication.
In some example embodiments, the type of channel state information comprises high precision channel state information, and the means for generating the type of channel state information comprises: means for quantizing the channel measurement data with a number of bits above a threshold number of bits; and means for forming high accuracy channel state information from the quantized channel measurement data.
In some example embodiments, the apparatus further comprises: means for receiving at least a first set of parameters for generation of channel state information from a second device.
In some example embodiments, the apparatus further comprises: means for receiving, from the second device, an indication of a channel scenario associated with the first set of parameters.
In some example embodiments, the type of channel state information comprises low precision channel state information, and the means for generating the type of channel state information comprises: means for transforming the channel measurement data into low precision channel state information having a number of bits below a threshold number of bits by using a first set of parameters.
In some example embodiments, the apparatus further comprises: means for identifying a channel scenario associated with channel measurement data used to generate channel state information; and means for transmitting an indication of the identified channel scenario to the second device.
In some aspects, an apparatus comprises: means for receiving channel state information from a first device; means for training a machine learning model by using at least the received channel state information to obtain at least a first set of parameters for generation of the channel state information and at least a second set of parameters for reconstruction of the channel state information according to a type of the received channel state information including high precision channel state information; and means for transmitting the first set of parameters to the first device.
In some example embodiments, the apparatus further comprises: means for transmitting an indication of feedback of channel state information of the type to the first device.
In some example embodiments, the means for training the machine learning model comprises: means for training a machine learning model using the received channel state information and previously stored channel state information.
In some example embodiments, the apparatus further comprises: means for transmitting, to the first device, an indication of a channel scenario associated with the first set of parameters.
In some example embodiments, the apparatus further comprises: means for reconstructing channel state information based on the received channel state information using a second set of parameters according to a type of the received channel state information including low precision channel state information.
In some example embodiments, the apparatus further comprises: means for receiving, from a first device, an indication of a channel scenario associated with the received channel state information.
In some aspects, a computer-readable storage medium includes program instructions stored thereon that, when executed by a processor of a device, cause the device to perform a method according to some example embodiments of the present disclosure.

Claims (30)

1. A first device, comprising:
at least one processor; and
at least one memory including computer program code;
the at least one memory and the computer program code configured to, with the at least one processor, cause the first apparatus to:
determining a type of channel state information to be fed back;
generating the type of channel state information; and
transmitting the generated channel state information to a second device.
2. The first device of claim 1, wherein the first device is caused to determine the type of channel state information to feed back by:
receiving an indication of feedback of the type of channel state information from the second device; and
determining that the type of channel state information is to be fed back based on the indication.
3. The first device of claim 1, wherein the type of channel state information comprises high precision channel state information, and the first device is caused to generate the type of channel state information by:
quantizing channel measurement data with a number of bits higher than a threshold number of bits; and
forming the high precision channel state information from the quantized channel measurement data.
4. The first device of claim 1, wherein the first device is further caused to:
receiving at least a first set of parameters for generation of channel state information from the second device.
5. The first device of claim 4, wherein the first device is further caused to:
receiving, from the second device, an indication of a channel scenario associated with the first set of parameters.
6. The first device of claim 4, wherein the type of channel state information comprises low precision channel state information, and the first device is further caused to generate the type of channel state information by:
transforming, by using the first set of parameters, channel measurement data into the low precision channel state information having a number of bits lower than a threshold number of bits.
7. The first device of claim 1, wherein the first device is further caused to:
identifying a channel scenario associated with channel measurement data used to generate the channel state information; and
transmitting an indication of the identified channel scenario to the second device.
8. A second device, comprising:
at least one processor; and
at least one memory including computer program code;
the at least one memory and the computer program code configured to, with the at least one processor, cause the second device to:
receiving channel state information from a first device;
training a machine learning model by using at least the received channel state information to obtain at least a first set of parameters for generation of channel state information and at least a second set of parameters for reconstruction of channel state information according to the type of the received channel state information including high precision channel state information; and
transmitting the first set of parameters to the first device.
9. The second device of claim 8, wherein the second device is further caused to:
sending an indication of feedback of the type of channel state information to the first device.
10. The second device of claim 8, wherein the second device is caused to train the machine learning model by:
training the machine learning model using the received channel state information and previously stored channel state information.
11. The second device of claim 8, wherein the second device is further caused to:
transmitting, to the first device, an indication of a channel scenario associated with the first set of parameters.
12. The second device of claim 8, wherein the second device is further caused to:
reconstructing channel state information based on the received channel state information using the second set of parameters according to the type of the received channel state information that includes low precision channel state information.
13. The second device of claim 8, wherein the second device is further caused to:
receiving, from the first device, an indication of a channel scenario associated with the received channel state information.
14. A method implemented at a first device, comprising:
determining a type of channel state information to be fed back;
generating the type of channel state information; and
transmitting the generated channel state information to a second device.
15. The method of claim 14, wherein determining the type of channel state information to feed back comprises:
receiving an indication of feedback of the type of channel state information from the second device; and
determining that the type of channel state information is to be fed back based on the indication.
16. The method of claim 14, wherein the type of channel state information comprises high precision channel state information, and generating the type of channel state information comprises:
quantizing channel measurement data with a number of bits higher than a threshold number of bits; and
forming the high precision channel state information from the quantized channel measurement data.
17. The method of claim 14, further comprising:
receiving at least a first set of parameters for generation of channel state information from the second device.
18. The method of claim 17, further comprising:
receiving, from the second device, an indication of a channel scenario associated with the first set of parameters.
19. The method of claim 17, wherein the type of channel state information comprises low precision channel state information, and generating the type of channel state information comprises:
transforming, by using the first set of parameters, channel measurement data into the low precision channel state information having a number of bits lower than a threshold number of bits.
20. The method of claim 14, further comprising:
identifying a channel scenario associated with channel measurement data used to generate the channel state information; and
transmitting an indication of the identified channel scenario to the second device.
21. A method implemented at a second device, comprising:
receiving channel state information from a first device;
training a machine learning model by using at least the received channel state information to obtain at least a first set of parameters for generation of channel state information and at least a second set of parameters for reconstruction of channel state information according to the type of the received channel state information including high precision channel state information; and
transmitting the first set of parameters to the first device.
22. The method of claim 21, further comprising:
sending an indication of feedback of the type of channel state information to the first device.
23. The method of claim 21, wherein training the machine learning model comprises:
training the machine learning model using the received channel state information and previously stored channel state information.
24. The method of claim 21, further comprising:
transmitting, to the first device, an indication of a channel scenario associated with the first set of parameters.
25. The method of claim 21, further comprising:
reconstructing channel state information based on the received channel state information using the second set of parameters according to the type of the received channel state information that includes low precision channel state information.
26. The method of claim 21, further comprising:
receiving, from the first device, an indication of a channel scenario associated with the received channel state information.
27. An apparatus, comprising:
means for determining a type of channel state information to be fed back;
means for generating the type of channel state information; and
means for transmitting the generated channel state information to a second device.
28. An apparatus, comprising:
means for receiving channel state information from a first device;
means for training a machine learning model by using at least the received channel state information to obtain at least a first set of parameters for generation of channel state information and at least a second set of parameters for reconstruction of channel state information according to the type of the received channel state information including high precision channel state information; and
means for transmitting the first set of parameters to the first device.
29. A computer-readable storage medium comprising program instructions stored thereon which, when executed by a processor of an apparatus, cause the apparatus to perform the method of any of claims 14 to 20.
30. A computer-readable storage medium comprising program instructions stored thereon which, when executed by a processor of an apparatus, cause the apparatus to perform the method of any of claims 21 to 26.
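The type-dependent generation in claims 1, 3, and 6 (mirrored by method claims 14, 16, and 19) amounts to a single branch: fine scalar quantization above the bit threshold for high precision feedback, or a transform by the first set of parameters below it. A hedged sketch follows; the threshold value, the uniform quantizer, and all names are illustrative assumptions, not taken from the claims.

```python
import numpy as np

BIT_THRESHOLD = 8  # illustrative threshold number of bits (claims 3 and 6)

def quantize(values, n_bits):
    """Uniform scalar quantization over the observed dynamic range."""
    levels = 2 ** n_bits
    lo, hi = float(values.min()), float(values.max())
    step = (hi - lo) / (levels - 1)
    return np.round((values - lo) / step) * step + lo

def generate_csi(measurements, csi_type, encoder=None):
    """Type-dependent generation as in claim 1: high precision CSI is a
    fine quantization of the measurements (claim 3); low precision CSI is
    the measurements transformed by the first set of parameters (claim 6)."""
    if csi_type == "high":
        # More bits than the threshold -> high precision feedback.
        return quantize(measurements, BIT_THRESHOLD + 4)
    # Otherwise compress with the encoder received from the second device.
    return encoder @ measurements
```

For a 16-sample measurement vector, the high precision path returns 16 finely quantized values, while the low precision path returns only as many values as the encoder has rows.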
CN201980102635.3A 2019-11-29 2019-11-29 Feedback of channel state information Pending CN114747250A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/121999 WO2021102917A1 (en) 2019-11-29 2019-11-29 Feedback of channel state information

Publications (1)

Publication Number Publication Date
CN114747250A true CN114747250A (en) 2022-07-12

Family

ID=76128986

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980102635.3A Pending CN114747250A (en) 2019-11-29 2019-11-29 Feedback of channel state information

Country Status (2)

Country Link
CN (1) CN114747250A (en)
WO (1) WO2021102917A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024026793A1 (en) * 2022-08-04 2024-02-08 北京小米移动软件有限公司 Data transmission method and apparatus, and device, storage medium and system
WO2024031689A1 (en) * 2022-08-12 2024-02-15 北京小米移动软件有限公司 Csi reporting method and apparatus, device, and system
WO2024037321A1 (en) * 2022-08-16 2024-02-22 中国移动通信有限公司研究院 Ai model training method and apparatus, and device and readable storage medium

Families Citing this family (7)

Publication number Priority date Publication date Assignee Title
CN115642938A (en) * 2021-07-20 2023-01-24 维沃移动通信有限公司 Information transmission method, information receiving method, device, terminal and network side equipment
CN113922936B (en) * 2021-08-31 2023-04-28 中国信息通信研究院 AI technology channel state information feedback method and equipment
WO2023113677A1 (en) * 2021-12-15 2023-06-22 Telefonaktiebolaget Lm Ericsson (Publ) Nodes, and methods for proprietary ml-based csi reporting
CN116436500A (en) * 2021-12-31 2023-07-14 展讯通信(上海)有限公司 Channel data processing or de-processing method and device, terminal and network equipment
WO2024000440A1 (en) * 2022-06-30 2024-01-04 Shenzhen Tcl New Technology Co., Ltd. Communication device and method for determining channel state information report based on artificial intelligence/machine learning
WO2024031538A1 (en) * 2022-08-11 2024-02-15 Qualcomm Incorporated Frequency domain compression of channel state information
WO2024065800A1 (en) * 2022-09-30 2024-04-04 富士通株式会社 Channel state information feedback method and apparatus

Citations (8)

Publication number Priority date Publication date Assignee Title
US20180091992A1 (en) * 2016-09-28 2018-03-29 Telefonaktiebolaget Lm Ericsson (Publ) Dynamic CSI reporting type
US20180123653A1 (en) * 2016-11-03 2018-05-03 At&T Intellectual Property I, L.P. Providing a format indicator comprising rank indication and channel state information spatial domain resolution type
WO2018127038A1 (en) * 2017-01-05 2018-07-12 华为技术有限公司 Method of transmitting channel state information, device, and system
CN108574521A (en) * 2017-03-10 2018-09-25 上海诺基亚贝尔股份有限公司 Method and apparatus for MIMO communications
CN109155714A (en) * 2016-05-13 2019-01-04 瑞典爱立信有限公司 Multiresolution CSI feedback
CN110300075A (en) * 2019-04-30 2019-10-01 北京科技大学 A kind of radio channel estimation method
CN110419175A (en) * 2017-03-23 2019-11-05 高通股份有限公司 Difference CSI report for higher resolution channel state information (CSI)
US20200059282A1 (en) * 2017-05-05 2020-02-20 Qualcomm Incorporated Procedures for differential channel state information (csi) reporting

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
CN112165349A (en) * 2016-03-31 2021-01-01 华为技术有限公司 Channel state measuring method and device
CN108631847B (en) * 2017-03-24 2021-06-01 华为技术有限公司 Method for transmitting channel state information, terminal equipment and network equipment
CN108809374B (en) * 2017-05-05 2021-08-13 华为技术有限公司 Data transmission method, terminal equipment and network equipment
CN109428639B (en) * 2017-08-24 2021-04-09 上海诺基亚贝尔股份有限公司 Method and apparatus for determining channel state information

Patent Citations (8)

Publication number Priority date Publication date Assignee Title
CN109155714A (en) * 2016-05-13 2019-01-04 瑞典爱立信有限公司 Multiresolution CSI feedback
US20180091992A1 (en) * 2016-09-28 2018-03-29 Telefonaktiebolaget Lm Ericsson (Publ) Dynamic CSI reporting type
US20180123653A1 (en) * 2016-11-03 2018-05-03 At&T Intellectual Property I, L.P. Providing a format indicator comprising rank indication and channel state information spatial domain resolution type
WO2018127038A1 (en) * 2017-01-05 2018-07-12 华为技术有限公司 Method of transmitting channel state information, device, and system
CN108574521A (en) * 2017-03-10 2018-09-25 上海诺基亚贝尔股份有限公司 Method and apparatus for MIMO communications
CN110419175A (en) * 2017-03-23 2019-11-05 高通股份有限公司 Difference CSI report for higher resolution channel state information (CSI)
US20200059282A1 (en) * 2017-05-05 2020-02-20 Qualcomm Incorporated Procedures for differential channel state information (csi) reporting
CN110300075A (en) * 2019-04-30 2019-10-01 北京科技大学 A kind of radio channel estimation method

Non-Patent Citations (2)

Title
ZTE, SANECHIPS: "RP-191845 "Evolution of NR MIMO in Rel-17"", 3GPP TSG_RAN\\TSG_RAN, no. 85, 10 September 2019 (2019-09-10) *
ZTE: "Evolution of NR MIMO in Rel-17", 3GPP TSG RAN Meeting #85, RP-191845, 9 September 2019 (2019-09-09), page 6 *

Also Published As

Publication number Publication date
WO2021102917A1 (en) 2021-06-03

Similar Documents

Publication Publication Date Title
CN114747250A (en) Feedback of channel state information
WO2021108940A1 (en) Channel state information feedback
US20180191410A1 (en) Subset of w2 method and apparatus for sending precoding information and feeding back precoding information
KR101566136B1 (en) Method and system for channel feedback in wireless communications
WO2019032642A1 (en) Method for wireless communication
US20220239362A1 (en) Method, device and computer readable medium for uplink control information transmission
US20230216567A1 (en) Methods and devices for channel state information transmission
EP3429089A1 (en) Electronic device for communication device having multiple antennas and communication method
CN107404344B (en) Communication method, network equipment and terminal equipment
CN102684797B (en) Method of constructing spectrum map by using compressed sensing and related communication device
TWI762898B (en) Uplink control information
WO2019034121A1 (en) Method, device and computer readable medium for mimo communication
RU2589314C1 (en) Method, system and device for signal transmission
CN109479047B (en) Terminal, base station and method for obtaining channel information
CN110943943B (en) Method and device for determining channel state information
CN110943804B (en) Method and device for determining channel state information
US20230246693A1 (en) Configurations for channel state feedback
WO2022012256A1 (en) Communication method and communication device
CN110999114B (en) Codebook subset restriction based on wideband amplitude
CN114762276B (en) Channel state information feedback
CN116114183A (en) Information transmission method, related device and equipment
CN116806437A (en) Method, apparatus and computer readable medium for communication
WO2020073788A1 (en) Precoding vector indication and determination method and communication device
WO2024027393A1 (en) Channel state information feedback method and apparatus
WO2023197298A1 (en) Method, device and computer storage medium of communication

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination