CN111835459A - Transmission method, terminal and network side equipment for CSI report - Google Patents

Transmission method, terminal and network side equipment for CSI report

Info

Publication number
CN111835459A
CN111835459A CN201910786073.0A
Authority
CN
China
Prior art keywords
bitmap
csi report
mapping
information
coefficient
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910786073.0A
Other languages
Chinese (zh)
Other versions
CN111835459B (en)
Inventor
施源
塔玛拉卡·拉盖施
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201910786073.0A priority Critical patent/CN111835459B/en
Publication of CN111835459A publication Critical patent/CN111835459A/en
Application granted granted Critical
Publication of CN111835459B publication Critical patent/CN111835459B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L1/00: Arrangements for detecting or preventing errors in the information received
    • H04L1/0001: Systems modifying transmission characteristics according to link quality, e.g. power backoff
    • H04L1/0023: Systems modifying transmission characteristics according to link quality, characterised by the signalling
    • H04L1/0026: Transmission of channel quality indication
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L5/00: Arrangements affording multiple use of the transmission path
    • H04L5/003: Arrangements for allocating sub-channels of the transmission path
    • H04L5/0053: Allocation of signaling, i.e. of overhead other than pilot signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Quality & Reliability (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

The invention discloses a transmission method of a Channel State Information (CSI) report, a terminal and a network side device, and belongs to the field of communications technologies. The CSI report transmission method is applied to the terminal side and comprises the following steps: grouping quantized coefficients of non-zero coefficients to be fed back in a polarization coefficient matrix according to a first grouping length to obtain a plurality of information groups; discarding, according to priority information of the information groups, at least one of the following information in the CSI report: a common part, the quantized coefficients, and a bitmap used for indicating the quantized coefficients, wherein the number of bits of the CSI report after discarding is equal to the number of bits that can be carried by the uplink channel resource used for sending the CSI report; and sending the CSI report after the discarding process on the uplink channel resource. The technical solution of the invention enables the network side device to accurately determine the channel condition.

Description

Transmission method, terminal and network side equipment for CSI report
Technical Field
The present invention relates to the field of communications technologies, and in particular, to a transmission method, a terminal, and a network side device for a CSI report.
Background
In a wireless communication system, feedback of Channel State Information (CSI) is enhanced, and CSI feedback has two modes: type I and type II. Type II uses a linear combination (LC) of spatially orthogonal beams to approximate the CSI, such as the eigenvectors of the channel. Specifically, L orthogonal beams are selected from oversampled two-dimensional Discrete Fourier Transform (2D DFT) beams, a combining coefficient (a complex number) of the L orthogonal beams is calculated for each layer (or each eigenvector), and its amplitude value and phase value are quantized. Here L is configured by the network side device, and the selection of the orthogonal beams is wideband-based and applies to all ranks, i.e. to all layers. The amplitude quantization of the combining coefficients may be configured as wideband quantization, or as wideband quantization plus subband quantization: wideband quantization is indicated when the subband amplitude (subband amplitude) parameter is false, and wideband plus subband quantization is indicated when it is true. The phase quantization of the combining coefficients is done on each subband.
Further, for the CSI report corresponding to CSI feedback type II, the codebook at frequency-domain granularity m can be written with a 2L × R combining coefficient matrix.
If all the combining coefficients at the different frequency-domain granularities are concatenated together, a precoding matrix of layer r over the frequency domain can be obtained, which can be written as a 2L × M matrix.
In order to reduce CSI feedback overhead, the 2L × M matrix can be compressed into a 2L × K compression matrix by methods such as frequency-domain compression exploiting frequency-domain correlation, time-domain compression exploiting the sparsity of the time-domain impulse response, and frequency-domain differencing.
Specifically, the type II CSI report includes a first part (part1) and a second part (part2). part1 has a fixed payload size and specifically includes: a Rank Indication (RI), a Channel Quality Indication (CQI), and an indication of the number of non-zero wideband amplitude combining coefficients of each layer. part2 includes a Precoding Matrix Indicator (PMI). part1 and part2 of the CSI report are encoded separately, and the payload size of part2 is determined according to the information in part1.
When a CSI report is transmitted on a Physical Uplink Shared Channel (PUSCH), the network side device cannot predict the size of the CSI report, especially the payload size of part2, so the PUSCH resource allocated by the network side device may not accommodate all the content of the CSI report. In this case, the terminal may discard subband CSI information to ensure that the CSI report fits into the uplink resource configured by the network. However, for a codebook compressed based on the Fourier transform in New Radio (NR), the information placed in the CSI report has no concept of a subband, and thus the existing discarding scheme cannot be used.
Disclosure of Invention
The embodiment of the invention provides a transmission method of a Channel State Information (CSI) report, a terminal and network side equipment.
In a first aspect, an embodiment of the present invention provides a method for transmitting a CSI report, which is applied to a terminal side, and includes:
grouping the quantization coefficients of the nonzero coefficients to be fed back in the polarization coefficient matrix according to the first grouping length to obtain a plurality of information groups;
discarding, according to the priority information of the information groups, at least one of the following information in the CSI report: a common part of the CSI report, the quantized coefficients, and a bitmap used for indicating the quantized coefficients, wherein the number of bits of the CSI report after discarding is equal to the number of bits that can be carried by the uplink channel resource used for sending the CSI report, and the number of bits that can be carried by the uplink channel resource used for sending the CSI report is determined according to a code rate configured by the network side and the uplink channel resource allocated for the CSI report;
and sending the CSI report after the discarding processing on the uplink channel resource.
In a second aspect, an embodiment of the present invention provides a method for transmitting a CSI report, which is applied to a network side device, and includes:
receiving a Channel State Information (CSI) report of a terminal, wherein quantized coefficients of nonzero coefficients to be fed back in a polarization coefficient matrix in the CSI report are divided into a plurality of information groups;
and demodulating the bitmap and the quantized coefficients of the CSI report according to the priority information of the information groups.
In a third aspect, an embodiment of the present invention further provides a transmission apparatus for CSI report, which is applied to a terminal side, and includes:
the grouping module is used for grouping the quantization coefficients of the nonzero coefficients to be fed back in the polarization coefficient matrix according to the first grouping length to obtain a plurality of information groups;
a discarding module, configured to discard at least one of the following information in the CSI report according to the priority information of the information groups: a common part of the CSI report, the quantized coefficients, and a bitmap used for indicating the quantized coefficients, wherein the number of bits of the CSI report after discarding is equal to the number of bits that can be carried by the uplink channel resource used for sending the CSI report, and the number of bits that can be carried by the uplink channel resource used for sending the CSI report is determined according to a code rate configured by the network side and the uplink channel resource allocated for the CSI report;
and the sending module is used for sending the CSI report after the discarding processing on the uplink channel resource.
In a fourth aspect, an embodiment of the present invention provides a transmission apparatus for CSI reports, which is applied to a network side device side, and includes:
the terminal comprises a receiving module, a feedback module and a feedback module, wherein the receiving module is used for receiving a Channel State Information (CSI) report of the terminal, and quantized coefficients of nonzero coefficients to be fed back in a polarization coefficient matrix in the CSI report are divided into a plurality of information groups;
and the processing module is used for demodulating the bit map and the quantized coefficients of the CSI report according to the priority information of the information group.
In a fifth aspect, an embodiment of the present invention further provides a communication device, where the communication device includes a processor, a memory, and a computer program stored in the memory and running on the processor, and the processor, when executing the computer program, implements the steps of the transmission method for CSI report as described above.
In a sixth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the steps of the transmission method for CSI report as described above.
In the above scheme, quantization coefficients of non-zero coefficients to be fed back in a polarization coefficient matrix are grouped to obtain a plurality of information groups, information in a CSI report is discarded according to priority information of the information groups, so that the bit number of the CSI report after discarding the information is equal to the bit number that can be borne by an uplink channel resource used for sending the CSI report, and a network side device can receive and analyze the CSI report according to the priority information of the information groups to determine the discarded part of the content of a terminal, which is beneficial for the network side device to accurately know a channel state and optimize CSI feedback performance.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments of the present invention will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without inventive labor.
Fig. 1 shows a block diagram of a mobile communication system to which an embodiment of the present invention is applicable;
fig. 2 is a flowchart illustrating a method for transmitting a CSI report of a terminal according to an embodiment of the present invention;
fig. 3 is a schematic block diagram of a terminal according to an embodiment of the present invention;
FIG. 4 shows a block diagram of a terminal of an embodiment of the invention;
fig. 5 is a flowchart illustrating a method for transmitting a CSI report of a network side device according to an embodiment of the present invention;
fig. 6 is a schematic block diagram of a network device according to an embodiment of the present invention;
FIG. 7 is a block diagram of a network-side device according to an embodiment of the invention;
FIGS. 8-10 are diagrams illustrating mapping of a bitmap according to an embodiment of the present invention;
FIGS. 11-12 are diagrams illustrating grouping of bitmap mapping queues according to embodiments of the present invention;
FIGS. 13-20 are diagrams illustrating grouping of quantized coefficient map queues according to embodiments of the present invention;
fig. 21-25 are diagrams illustrating discarding of information in CSI reports according to embodiments of the present invention.
Detailed Description
Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the invention are shown in the drawings, it should be understood that the invention can be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
The terms first, second and the like in the description and in the claims of the present application are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. In the description and in the claims "and/or" means at least one of the connected objects.
The techniques described herein are not limited to Long Term Evolution (LTE)/LTE Evolution (LTE-Advanced) systems, and may also be used for various wireless communication systems, such as Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Frequency Division Multiple Access (FDMA), Orthogonal Frequency Division Multiple Access (OFDMA), Single-carrier Frequency Division Multiple Access (SC-FDMA), and other systems. The terms "system" and "network" are often used interchangeably. CDMA systems may implement Radio technologies such as CDMA2000, Universal Terrestrial Radio Access (UTRA), and so on. UTRA includes Wideband CDMA (Wideband code division Multiple Access, WCDMA) and other CDMA variants. TDMA systems may implement radio technologies such as Global System for Mobile communications (GSM). The OFDMA system may implement radio technologies such as Ultra Mobile Broadband (UMB), evolved-UTRA (E-UTRA), IEEE 802.11(Wi-Fi), IEEE 802.16(WiMAX), IEEE 802.20, Flash-OFDM, etc. UTRA and E-UTRA are parts of the Universal Mobile Telecommunications System (UMTS). LTE and higher LTE (e.g., LTE-A) are new UMTS releases that use E-UTRA. UTRA, E-UTRA, UMTS, LTE-A, and GSM are described in documents from an organization named "third Generation partnership project" (3 GPP). CDMA2000 and UMB are described in documents from an organization named "third generation partnership project 2" (3GPP 2). The techniques described herein may be used for both the above-mentioned systems and radio technologies, as well as for other systems and radio technologies. However, the following description describes the NR system for purposes of example, and NR terminology is used in much of the description below, although the techniques may also be applied to applications other than NR system applications.
The following description provides examples and does not limit the scope, applicability, or configuration set forth in the claims. Changes may be made in the function and arrangement of elements discussed without departing from the spirit and scope of the disclosure. Various examples may omit, substitute, or add various procedures or components as appropriate. For example, the described methods may be performed in an order different than described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Referring to fig. 1, fig. 1 is a block diagram of a wireless communication system to which an embodiment of the present invention is applicable. The wireless communication system includes a terminal 11 and a network-side device 12. The terminal 11 may also be referred to as a terminal Device or a User Equipment (UE), where the terminal 11 may be a Mobile phone, a Tablet Personal Computer (Tablet Personal Computer), a Laptop Computer (Laptop Computer), a Personal Digital Assistant (PDA), a Mobile Internet Device (MID), a Wearable Device (Wearable Device), or a vehicle-mounted Device, and the specific type of the terminal 11 is not limited in the embodiment of the present invention. The network-side device 12 may be a Base Station or a core network, wherein the Base Station may be a 5G or later-version Base Station (e.g., a gNB, a 5G NR NB, etc.), or a Base Station in other communication systems (e.g., an eNB, a WLAN access point, or other access points, etc.), wherein the Base Station may be referred to as a node B, an evolved node B, an access point, a Base Transceiver Station (BTS), a radio Base Station, a radio Transceiver, a Basic Service Set (BSS), an Extended Service Set (ESS), a node B, an evolved node B (eNB), a home node B, a home evolved node B, a WLAN access point, a WiFi node, or some other suitable terminology in the field, as long as the same technical effect is achieved, the Base Station is not limited to a specific technical vocabulary, it should be noted that, in the embodiment of the present invention, only the Base Station in the NR system is taken as an example, but does not limit the specific type of base station.
The base stations may communicate with the terminals 11 under the control of a base station controller, which may be part of the core network or some of the base stations in various examples. Some base stations may communicate control information or user data with the core network through a backhaul. In some examples, some of the base stations may communicate with each other, directly or indirectly, over backhaul links, which may be wired or wireless communication links. A wireless communication system may support operation on multiple carriers (waveform signals of different frequencies). A multi-carrier transmitter can transmit modulated signals on the multiple carriers simultaneously. For example, each communication link may be a multi-carrier signal modulated according to various radio technologies. Each modulated signal may be transmitted on a different carrier and may carry control information (e.g., reference signals, control channels, etc.), overhead information, data, and so on.
The base station may communicate wirelessly with the terminal 11 via one or more access point antennas. Each base station may provide communication coverage for a respective coverage area. The coverage area of an access point may be divided into sectors that form only a portion of the coverage area. A wireless communication system may include different types of base stations (e.g., macro, micro, or pico base stations). The base stations may also utilize different radio technologies, such as cellular or WLAN radio access technologies. The base stations may be associated with the same or different access networks or operator deployments. The coverage areas of different base stations (including coverage areas of base stations of the same or different types, coverage areas utilizing the same or different radio technologies, or coverage areas belonging to the same or different access networks) may overlap.
The communication link in the wireless communication system may include an Uplink for carrying Uplink (UL) transmission (e.g., from the terminal 11 to the network side device 12) or a Downlink for carrying Downlink (DL) transmission (e.g., from the network side device 12 to the terminal 11). The UL transmission may also be referred to as reverse link transmission, while the DL transmission may also be referred to as forward link transmission. Downlink transmissions may be made using licensed frequency bands, unlicensed frequency bands, or both. Similarly, uplink transmissions may be made using licensed frequency bands, unlicensed frequency bands, or both.
In a wireless communication system, a CSI report of type II includes a first part (part1) and a second part (part2), wherein part1 has a fixed payload size and specifically includes: a Rank Indication (RI), a Channel Quality Indication (CQI), and an indication of the number of non-zero wideband amplitude combining coefficients of each layer. part2 includes a Precoding Matrix Indicator (PMI). part1 and part2 of the CSI report are encoded separately, and the payload size of part2 is determined according to the information in part1.
The two-stage codebook for CSI reporting at frequency-domain granularity m can be written as:

W(m) = [ Σ_{l=1}^{2L} b'_l · c_{l,1}(m), …, Σ_{l=1}^{2L} b'_l · c_{l,R}(m) ],

wherein N_1 and N_2 are the numbers of CSI Reference Signal (CSI-RS) ports in the two dimensions, R is the rank or number of layers, b'_l is an orthogonal vector formed from the 2D-DFT beam vectors, and c_{l,r}(m) is the combining coefficient of the l-th orthogonal beam vector of layer r at frequency-domain granularity m, where r = 1, 2, …, R, l = 1, 2, …, 2L, and L is the number of selected orthogonal beams. The frequency-domain granularity may be a subband or a Resource Block (RB), and the wideband can be divided into M frequency-domain resources in units of this granularity.
If the combining coefficients at all frequency-domain granularities are concatenated together, the precoding coefficient matrix of layer r over the wideband (or, equivalently, over the frequency domain) is obtained. This matrix can be written as the following 2L × M matrix:

W_{2,r} = [ c_{1,r}(0)   c_{1,r}(1)   …  c_{1,r}(M-1)
            c_{2,r}(0)   c_{2,r}(1)   …  c_{2,r}(M-1)
            …
            c_{2L,r}(0)  c_{2L,r}(1)  …  c_{2L,r}(M-1) ],

where c_{l,r}(m) is the combining coefficient of the l-th orthogonal beam vector of layer r at frequency-domain granularity m. The l-th row of W_{2,r} is the combining coefficient vector of the beam vector b'_l over all frequency-domain granularities:

[ c_{l,r}(0)  c_{l,r}(1)  …  c_{l,r}(M-1) ].
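For illustration only, the following sketch (in Python, with numpy) builds the combining-coefficient matrix W_{2,r} and a per-granularity precoder from 2D-DFT beam vectors, following the definitions above. All dimensions, the DFT-based beam construction and the random coefficients are assumptions chosen for the example, not values taken from this disclosure.

```python
import numpy as np

# Assumed example dimensions: N1*N2 ports per polarization, L selected
# orthogonal 2D-DFT beams, R layers, M frequency-domain granularities.
N1, N2, L, R, M = 4, 2, 2, 2, 8
num_ports = 2 * N1 * N2                      # two polarizations

# b'_l: orthogonal beam vectors taken from an orthonormal DFT basis (one simple choice).
dft = np.fft.fft(np.eye(N1 * N2)) / np.sqrt(N1 * N2)
beams = dft[:, :L]                           # (N1*N2) x L, columns play the role of b'_l

# c_{l,r}(m): combining coefficients, here random complex numbers for illustration.
rng = np.random.default_rng(0)
coeff = rng.standard_normal((2 * L, R, M)) + 1j * rng.standard_normal((2 * L, R, M))

def w2(r):
    """W_{2,r}: the 2L x M combining-coefficient matrix of layer r."""
    return coeff[:, r, :]

def precoder(r, m):
    """Precoding vector of layer r at frequency-domain granularity m:
    both polarizations use the same L beams, each weighted by its coefficient."""
    c = coeff[:, r, m]
    top = beams @ c[:L]                      # first polarization
    bottom = beams @ c[L:]                   # second polarization
    return np.concatenate([top, bottom])     # length 2*N1*N2

W_m = np.stack([precoder(r, 0) for r in range(R)], axis=1)  # num_ports x R at m = 0
print(W_m.shape)   # (16, 2)
```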
Due to the frequency-domain correlation, these 2L × M coefficient matrices W_{2,r} can be further compressed in the frequency domain; on the other hand, the sparsity of the time-domain channel impulse response allows time-domain compression, and frequency-domain compression and time-domain compression are equivalent in a certain sense.
The compression matrix is obtained by extracting elements from the product of the precoding matrix and an initial orthogonal basis vector matrix; the extracted elements form a 2L × K matrix, where K is a value smaller than M that can be configured by the network side device, agreed by the protocol, or determined by the terminal. For example, using the spatial compression of CSI feedback type II, W_{2,r} is transformed by an M × M matrix W_3 whose columns are mutually orthogonal, i.e.

W̃_{2,r} = W_{2,r} · W_3.

Suppose W_3 is an M × M Inverse Discrete Fourier Transform (IDFT) matrix; the transformation is then equivalent to transforming the combining coefficients from the frequency domain to the time domain. If the frequency-domain coefficients after spatial compression are sparse in the time domain, only a small number of time-domain coefficients with large amplitude need to be fed back, and the other time-domain coefficients are treated as zero. Assuming that only the k_r time-domain coefficients with the largest amplitude after the IDFT transform are fed back, the number of complex numbers to be fed back per layer is reduced from (2L-1)·M to 2L·k_r; by additionally feeding back the indices of the selected non-zero coefficients, time-domain compression is realized.

The orthogonal basis vector matrix selected at the corresponding positions is denoted W_f, an M × k_r matrix. Suppose W_f contains the k_r best orthogonal vectors, where k_r < M; then W_{2,r} can be approximately restored. For example, W_f may contain k_r selected orthogonal DFT vectors, or the k_r right principal singular vectors obtained by Singular Value Decomposition (SVD), and so on. Transforming W_{2,r} gives:

W̃_{2,r} = W_{2,r} · W_f,

so the content to be fed back changes from the 2L × M matrix W_{2,r} to the 2L × k_r matrix W̃_{2,r} together with the indices of the selected k_r orthogonal vectors. The number of complex numbers to be fed back per layer is reduced from (2L-1)·M to 2L·k_r, and frequency-domain compression is realized.
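A minimal sketch of the frequency-domain compression just described, under the assumption that W_3 is an orthonormal DFT basis and that the k_r basis vectors are selected by retained energy (the selection rule here is illustrative, not the normative one): W_{2,r} is projected onto the basis, the strongest k_r columns are kept as W_f, and the compressed matrix W̃_{2,r} = W_{2,r}·W_f is what would be fed back together with the selected indices.

```python
import numpy as np

def compress_freq_domain(W2r: np.ndarray, k_r: int):
    """Frequency-domain compression sketch for a 2L x M combining-coefficient matrix."""
    M = W2r.shape[1]
    W3 = np.fft.fft(np.eye(M)) / np.sqrt(M)        # M x M orthonormal DFT basis (role of W_3)
    projected = W2r @ W3                           # coefficients transformed to the delay domain
    energy = np.sum(np.abs(projected) ** 2, axis=0)
    selected = np.sort(np.argsort(energy)[-k_r:])  # indices of the k_r strongest basis vectors
    Wf = W3[:, selected]                           # M x k_r selected basis, W_f
    W2r_tilde = W2r @ Wf                           # 2L x k_r compressed matrix
    return W2r_tilde, selected, Wf

# W2r can then be approximately restored as W2r_tilde @ Wf.conj().T
# when W2r is (nearly) sparse on the chosen basis.
```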
Therefore, what the terminal needs to feed back is the quantized compressed coefficient matrices W̃_{2,r} of the layers and the indication information of the selected orthogonal basis vector matrix W_f corresponding to the selected positions.
Currently, a polarization-difference method is used for quantization in NR. The specific quantization method is as follows: in the W̃_{2,r} matrix, the maximum amplitude coefficient is found in the first L rows and in the last L rows respectively to form a polarization matrix, where the L rows in which the strongest amplitude coefficient is located are called the strong polarization part and the other L rows are called the weak polarization part. The first L rows and the last L rows of W̃_{2,r} are then normalized by their respective maximum amplitude coefficients. After this per-polarization normalization, the strongest coefficient of the strong polarization part is normalized to 1 and needs no amplitude or phase quantization; the terminal informs the network side device of it through the strongest coefficient indication information, while the reference amplitude of the weak polarization part needs to be quantized. For the normalized W̃_{2,r} matrix, i.e. for {c_{l,k}, (l,k) ≠ (l*,k*)}, both amplitude and phase require quantization.
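The sketch below illustrates the polarization-difference quantization described above. The uniform amplitude/phase quantizers and their bit widths are assumptions made for the example; only the structure (per-polarization normalization, strongest-coefficient indication, quantization of the remaining coefficients) follows the description.

```python
import numpy as np

def polarization_quantize(W2r_tilde: np.ndarray, amp_bits: int = 3, phase_bits: int = 4):
    """Polarization-difference quantization sketch for a 2L x k_r matrix.

    Bit widths and the uniform quantizers are illustrative assumptions.
    """
    two_L = W2r_tilde.shape[0]
    L = two_L // 2
    halves = [W2r_tilde[:L, :], W2r_tilde[L:, :]]
    max_amp = [max(np.max(np.abs(h)), 1e-12) for h in halves]   # per-polarization maxima
    strong = int(np.argmax(max_amp))                            # index of the strong polarization

    # Position of the overall strongest coefficient (strongest coefficient indication).
    strongest_pos = np.unravel_index(np.argmax(np.abs(W2r_tilde)), W2r_tilde.shape)

    amp_levels, phase_levels = 2 ** amp_bits, 2 ** phase_bits
    quantized = np.zeros_like(W2r_tilde)
    for p, half in enumerate(halves):
        normalized = half / max_amp[p]                          # per-polarization normalization
        amp = np.round(np.abs(normalized) * (amp_levels - 1)) / (amp_levels - 1)
        phase = np.round(np.angle(normalized) / (2 * np.pi) * phase_levels)
        quantized[p * L:(p + 1) * L, :] = amp * np.exp(1j * 2 * np.pi * phase / phase_levels)

    quantized[strongest_pos] = 1.0     # strongest coefficient: no amplitude/phase quantization needed
    return quantized, strongest_pos, strong
```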
After codebook compression, the following parameters exist:
K0, the number of subset coefficients: configured by the network side device, it indicates how many coefficients of the multi-layer compressed coefficient matrices W̃_{2,r} need to be fed back; that is, coefficients of W̃_{2,r} other than the K0 selected coefficients are considered 0 and are not fed back. For rank 1/2, K0 coefficients are selected independently for each layer; for rank 3/4, all layers together pick 2 × K0 coefficients. In other words, for rank 1 only K0 coefficients need to be quantized, while for rank > 1 all layers together need to quantize at most 2 × K0 coefficients.
Indication information of the Number of Non-Zero Coefficients (NNZCI): for the multi-layer compressed coefficient matrices W̃_{2,r}, the total number of non-zero coefficients is K1 and the total number of subset coefficients is K0 or 2 × K0. If K1 is less than the total number of subset coefficients, the terminal only needs to feed back K1 coefficients. Therefore, part1 needs to carry the indication information of the number of non-zero coefficients.
Bitmap: a matrix of X rows and Xi columns, where the number of rows relates to the number of spatial-domain (SD) beams, the number of columns relates to the number of orthogonal basis vectors, and i denotes the layer index. Each layer independently owns a bitmap, which indicates the positions and the number of the non-zero coefficients of that layer.
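As a small illustration of the K0 / NNZCI / bitmap relationship above, the sketch below counts the number of non-zero coefficients K1 from per-layer bitmaps and caps it by the configured subset size; the rank handling is a simplified reading of the rule stated above, and all names are chosen for the example.

```python
import numpy as np

def count_nonzero_coefficients(bitmaps, K0: int, rank: int):
    """bitmaps: list of per-layer 0/1 arrays (rows = 2 x SD beams, cols = FD basis vectors)."""
    per_layer = [int(b.sum()) for b in bitmaps]
    if rank <= 2:
        # rank 1/2: at most K0 non-zero coefficients selected per layer
        K1 = sum(min(n, K0) for n in per_layer)
    else:
        # rank 3/4: at most 2*K0 non-zero coefficients over all layers together
        K1 = min(sum(per_layer), 2 * K0)
    return K1, per_layer

# Example: two layers, 8x7 bitmaps, K0 = 14
layers = [np.random.default_rng(1).integers(0, 2, (8, 7)) for _ in range(2)]
K1, per_layer = count_nonzero_coefficients(layers, K0=14, rank=2)
```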
When the type-two CSI report is transmitted on the PUSCH, since the network side device cannot know the CSI feedback in advance, especially the payload size of part2, the allocated PUSCH resource may not accommodate the complete CSI report content, and therefore the terminal needs to discard part of the content of part2 of the CSI without feedback. Assuming that N CSI reports need to be fed back in one timeslot, part2 content discard Priority (Priority) of CSI is shown in table 1: the priority 0 is the highest priority, namely the CSI report content which is sent preferentially; the priority 2N is the lowest priority, that is, the CSI report content discarded first, and the CSI report content of each priority is discarded as a whole.
Table 1: Part 2 CSI reporting priority

Priority 0: Part 2 wideband CSI of CSI reports 1 to N
Priority 1: Part 2 subband CSI of even subbands of CSI report 1
Priority 2: Part 2 subband CSI of odd subbands of CSI report 1
…
Priority 2N-1: Part 2 subband CSI of even subbands of CSI report N
Priority 2N: Part 2 subband CSI of odd subbands of CSI report N
When a UE is scheduled to transmit uplink data (a transport block) and one or more CSI reports on the PUSCH, if

ceil( (O_CSI-2 + L_CSI-2) · β_offset^PUSCH · Σ_{l=0}^{N_symb,all^PUSCH − 1} M_sc^UCI(l) / Σ_{r=0}^{C_UL-SCH − 1} K_r )

is greater than

ceil( α · Σ_{l=0}^{N_symb,all^PUSCH − 1} M_sc^UCI(l) ) − Q'_ACK − Q'_CSI-1,

part2 of the CSI is discarded step by step in the above priority order until the former quantity is less than or equal to the latter. Wherein:
O_CSI-2 is the number of bits of CSI part2;
if O_CSI-2 ≥ 360, L_CSI-2 = 11; otherwise L_CSI-2, the number of Cyclic Redundancy Check (CRC) bits of CSI part2, is determined according to a preset rule;
β_offset^PUSCH is the configured CSI offset value;
N_symb,all^PUSCH is the total number of OFDM symbols within the PUSCH, including all OFDM symbols carrying DMRS;
M_sc^UCI(l) is the number of REs used for transmitting UCI on OFDM symbol l of the PUSCH;
C_UL-SCH is the number of code blocks of the UL-SCH transmitted on the PUSCH;
if the DCI scheduling the PUSCH contains a Code Block Group Transmission Information (CBGTI) field indicating that the UE does not need to send the r-th code block, K_r = 0; otherwise K_r is the size of the r-th code block of the UL-SCH transmitted on the PUSCH;
Q'_CSI-1 is the number of coded modulation symbols per layer of CSI part1 transmitted on the PUSCH;
if the number of HARQ-ACK information bits is greater than 2, Q'_ACK is the number of coded modulation symbols per layer of the HARQ-ACK transmitted on the PUSCH; if it is equal to 1 or 2, Q'_ACK = 0;
α is a scaling parameter configured by higher layers.
when the UE only transmits the CSI report on the PUSCH, part2 of the CSI is discarded step by step according to the sequence until the code rate of the part 2CSI is lower than a threshold code rate c less than 1TTo therein, wherein
Figure BDA0002178053710000121
Figure BDA0002178053710000122
For the set CSI offset value, R is the code rate indicated by the DCI.
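The following sketch evaluates the Part 2 omission criterion as reconstructed above, purely for illustration: the quantities are passed in as plain numbers and lists, and the normative formula and parameter definitions are those of the 3GPP specifications, not this sketch.

```python
import math

def must_omit_part2(O_csi2, L_csi2, beta_offset, M_sc_uci, K_r, alpha, Q_ack, Q_csi1):
    """Return True while lower-priority Part 2 content still has to be discarded.

    M_sc_uci: list of RE counts per PUSCH OFDM symbol available for UCI;
    K_r:      list of UL-SCH code block sizes (0 for code blocks not transmitted).
    """
    resources = sum(M_sc_uci)
    left = math.ceil((O_csi2 + L_csi2) * beta_offset * resources / sum(K_r))
    right = math.ceil(alpha * resources) - Q_ack - Q_csi1
    return left > right   # True: keep discarding Part 2 content, lowest priority first
```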
When the type-two CSI is transmitted on the PUCCH, the part 2CSI is discarded from the lowest priority according to the priority of the table 1 until the code rate of the part 2CSI is less than or equal to the parameter maxCodeRate configured by the higher layer.
In the related art, the bitmap and the quantized coefficients corresponding to the bitmap are grouped based on the DFT-compressed codebook, and after the grouping is completed, the groups are discarded according to their priorities. Correspondingly, after decoding, the network side device obtains the corresponding bit buffer and then performs joint detection of the bitmap and the quantized coefficients according to a set rule, where joint detection means that the corresponding bitmaps are decoded group by group, the number of bits of the quantized coefficients corresponding to those bitmaps is obtained from the bitmap information, the quantized coefficients are then decoded, and by analogy the information of the next group of bitmaps and quantized coefficients is obtained.
Wherein, the discarding principle is as follows: the contents in the same priority level are all discarded when the contents are discarded.
It may be that the bitmap and the quantization coefficients corresponding to the bitmap are in the same group:
the priorities of the quantization coefficients indicated by a certain group of bitmaps and the group of bitmaps are equal, if the group needs to be discarded, the bitmaps of the group and the quantization coefficients indicated by the bitmaps are discarded at the same time;
the priority of a certain group of bitmaps is greater than the priority of the quantized coefficients indicated by the group of bitmaps, and if the group needs to be discarded, the quantized coefficients indicated by the bitmaps of the group are discarded preferentially.
Alternatively, the bitmap and the quantized coefficients corresponding to the bitmap may be placed in two adjacent groups:
the priority of a bitmap of a certain group is greater than the priority of the quantized coefficients of the neighboring groups indicated by the bitmap of the group.
When bitmap and the quantization coefficient indicated by bitmap are subjected to bit mapping, the element with the smaller row index value is mapped preferentially, or the element with the smaller bitmap column index value is mapped preferentially, or the element with the larger row index value is mapped preferentially, or the element with the larger bitmap column index value is mapped preferentially.
Wherein, the element of the position of bitmap where the strongest coefficient is located may not be mapped.
In the related art, the terminal may discard the information of the sub-band CSI to ensure that the CSI report can be placed in the uplink resource configured by the corresponding network. However, for a codebook compressed based on fourier transform in a New Radio (NR), information put into a CSI report has no concept of a subband, and thus, the existing discarding scheme cannot be used.
In order to solve the above problem, embodiments of the present invention provide a transmission method, a terminal, and a network side device for CSI report.
An embodiment of the present invention provides a transmission method for CSI report, which is applied to a terminal side, and as shown in fig. 2, the method may include the following steps:
step 101: grouping the quantization coefficients of the nonzero coefficients to be fed back in the polarization coefficient matrix according to the first grouping length to obtain a plurality of information groups;
step 102: discarding, according to the priority information of the information groups, at least one of the following information in the CSI report: a common part of the CSI report, the quantized coefficients, and a bitmap used for indicating the quantized coefficients, wherein the number of bits of the CSI report after discarding is equal to the number of bits that can be carried by the uplink channel resource used for sending the CSI report, and the number of bits that can be carried by the uplink channel resource used for sending the CSI report is determined according to a code rate configured by the network side and the uplink channel resource allocated for the CSI report;
step 103: and sending the CSI report after the discarding processing on the uplink channel resource.
In this embodiment, quantization coefficients of non-zero coefficients to be fed back in a polarization coefficient matrix are grouped to obtain a plurality of information groups, information in a CSI report is discarded according to priority information of the information groups, so that the number of bits of the discarded CSI report is equal to the number of bits that can be borne by an uplink channel resource used for sending the CSI report, and a network side device can receive and analyze the CSI report according to the priority information of the information groups to determine a part of discarded contents of a terminal, which is beneficial for the network side device to accurately know a channel state and optimize CSI feedback performance.
In a specific embodiment, before discarding at least one of the following information in the CSI report according to the priority information of the information group, the method further includes:
acquiring uplink channel resources for sending the CSI report;
calculating uplink channel resources required for transmitting the CSI report;
and judging that the acquired uplink channel resources are smaller than the transmission resources required by the CSI report.
The uplink channel resource includes, but is not limited to, a Physical Uplink Control Channel (PUCCH) and/or a Physical Uplink Shared Channel (PUSCH). Optionally, the uplink channel resource may be configured to the terminal semi-statically by the network side device through Radio Resource Control (RRC) signaling, or may be dynamically indicated to the terminal through a Physical Downlink Control Channel (PDCCH).
After compression is completed, the terminal obtains the bitmaps of all layers and the quantized coefficients indicated by the bitmaps, where the quantized coefficients include at least one of the following: an amplitude quantization value and a phase quantization value. The quantized coefficients represent values that have already been quantized to bits. The bitmap is an uncompressed matrix of complete dimension, with the number of rows equal to 2 × the number of SD beams and the number of columns equal to the number of FD bases, where SD beams are spatial-domain beams and the FD basis is the frequency-domain orthogonal basis.
Optionally, grouping the quantized coefficients of the non-zero coefficients to be fed back in the polarization coefficient matrix according to the first grouping length includes:
mapping the quantization coefficients indicated by the elements in the bitmap into a quantization coefficient mapping queue according to the bitmap mapping queue or the mapping sequence of the bitmap, wherein the bitmap mapping queue is obtained by mapping the elements in the bitmap;
grouping the quantized coefficient mapping queue according to the first grouping length, and dividing the quantized coefficient mapping queue into E quantized coefficient information groups, wherein E-1 of the quantized coefficient information groups each comprise F quantized coefficients, F is the first grouping length, E is an integer greater than 1, and E is equal to ceil(BF/F) or floor(BF/F), where ceil denotes rounding up, floor denotes rounding down, and BF is the total number of quantized coefficients.
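A minimal sketch of this grouping rule: the quantized-coefficient mapping queue is split into E = ceil(BF/F) information groups of length F, with the shorter remainder group placed either at the head (group 0) or at the tail (group E-1); which end receives the remainder is one of the options listed further below. The helper name group_queue is introduced here for illustration only.

```python
import math

def group_queue(queue, F: int, remainder_first: bool = True):
    """Split a mapping queue into E = ceil(len(queue)/F) information groups.

    If remainder_first is True the shorter group (when len(queue) is not a
    multiple of F) becomes group 0, otherwise it becomes group E-1.
    """
    BF = len(queue)
    E = math.ceil(BF / F)
    remainder = BF - (E - 1) * F                 # size of the shorter group
    if remainder_first and remainder != F:
        cuts = [0, remainder] + [remainder + i * F for i in range(1, E)]
    else:
        cuts = [i * F for i in range(E)] + [BF]
    return [queue[cuts[i]:cuts[i + 1]] for i in range(E)]

# Example matching embodiment eight below: 28 quantized coefficients, F = 8
groups = group_queue(list(range(28)), 8)          # group sizes [4, 8, 8, 8]
```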
Optionally, the dividing the quantized coefficient mapping queue into E quantized coefficient information groups comprises:
and according to the sequencing of the quantization coefficient mapping queues, sequentially dividing the quantization coefficient mapping queues into a quantization coefficient information group 0, a quantization coefficient information group 1, …, a quantization coefficient information group E-2 and a quantization coefficient information group E-1 from beginning to end, wherein the priority of the quantization coefficient information group E is higher than that of the quantization coefficient information group E-1 or the priority of the quantization coefficient information group E-1 is higher than that of the quantization coefficient information group E, and E is an integer which is more than or equal to 1 and less than or equal to E.
Optionally, the quantized coefficient information group 0 includes a smaller number of quantized coefficients than the other quantized coefficient information groups; or
The quantized coefficient information group 0 includes a larger number of quantized coefficients than the other quantized coefficient information groups; or
The quantized coefficient information group E-1 includes quantized coefficients whose number is smaller than the quantized coefficients included in the other quantized coefficient information groups; or
The quantized coefficient information group E-1 includes a larger number of quantized coefficients than the other quantized coefficient information groups.
Optionally, the method further comprises:
and grouping the bitmap mapping queue according to a second packet length, and dividing the bitmap mapping queue into G bitmap information groups, wherein G-1 of the bitmap information groups each comprise H elements, H is the second packet length, G is an integer greater than 1, and G is equal to ceil(BG/H) or floor(BG/H), where ceil denotes rounding up, floor denotes rounding down, and BG is the total number of bitmap elements.
Optionally, dividing the bitmap mapping queue into G bitmap information groups includes:
and according to the sequence of the bitmap mapping queue, sequentially dividing the bitmap mapping queue into a bitmap information group 0, bitmap information groups 1 and …, a bitmap information group G-2 and a bitmap information group G-1 from beginning to end, wherein if the priority of a quantization coefficient information group e is higher than that of a quantization coefficient information group e-1, the priority of a bitmap information group G is higher than that of a bitmap information group G-1, if the priority of the quantization coefficient information group e-1 is higher than that of the quantization coefficient information group e, the priority of the bitmap information group G-1 is higher than that of the bitmap information group G, and G is an integer which is greater than or equal to 1 and less than or equal to G.
Optionally, the bitmap information group 0 includes fewer elements than the other bitmap information groups; or
the bitmap information group 0 includes more elements than the other bitmap information groups; or
the bitmap information group G-1 includes fewer elements than the other bitmap information groups; or
the bitmap information group G-1 includes more elements than the other bitmap information groups.
Optionally, the first packet length employs any of:
greater than or equal to the second packet length;
equal to the number of elements of the bitmap information group containing the most elements.
Optionally, mapping the elements in the bitmap to a bitmap mapping queue includes any one of:
mapping each row of elements in the bitmap of each layer into a sub-bitmap mapping queue, sequencing the sub-bitmap mapping queues according to the sequence of the layer indexes, and sequencing the sub-bitmap mapping queues with the same layer index according to the sequence of the row indexes to obtain the bitmap mapping queue;
mapping each row of elements in the bitmap of each layer into a sub-bitmap mapping queue, sequencing the sub-bitmap mapping queues according to the sequence of the row indexes, and sequencing the sub-bitmap mapping queues with the same row index according to the sequence of the layer indexes to obtain the bitmap mapping queue;
mapping each row of elements in each layer of bitmap into a sub-bitmap mapping queue, sequencing the sub-bitmap mapping queues according to the sequence of layer indexes, and sequencing the sub-bitmap mapping queues with the same layer index according to the sequence of the row indexes to obtain the bitmap mapping queue;
mapping each row of elements in each layer of bitmap into a sub-bitmap mapping queue, sequencing the sub-bitmap mapping queues according to the sequence of row indexes, and sequencing the sub-bitmap mapping queues with the same row index according to the sequence of layer indexes to obtain the bitmap mapping queue.
Optionally, the order of the layer indexes adopts any one of the following:
according to the layer index from large to small;
from small to large according to the layer index;
according to the preset sequence of the layer indexes;
the order of the row indexes adopts any one of the following orders:
according to the line index from large to small;
from small to large according to the row index;
according to the preset sequence of the line indexes;
the order of the column indexes adopts any one of the following orders:
from large to small by column index;
from small to large in terms of column index;
in a preset order of column indices.
Optionally, mapping a row of elements in the bitmap to the bitmap mapping queue comprises any one of:
mapping the row element into a bitmap mapping queue according to the column index of the row element from large to small;
mapping the row element into a bitmap mapping queue according to the column index of the row element from small to large;
and mapping the row elements into a bitmap mapping queue according to the preset sequence of the column indexes of the row elements.
Optionally, mapping a column of elements in the bitmap to the bitmap mapping queue includes any one of:
mapping the column elements into a bitmap mapping queue according to the row indexes of the column elements from large to small;
mapping the column elements into a bitmap mapping queue according to the row indexes of the column elements from small to large;
and mapping the column elements into a bitmap mapping queue according to the preset sequence of the row indexes of the column elements.
Optionally, the element in the bitmap corresponding to the strongest coefficient is not mapped to the bitmap mapping queue.
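As one concrete reading of the mapping options above, the sketch below flattens per-layer bitmaps into a single bitmap mapping queue, mapping column by column (or row by row) within each layer, ordering layers by layer index, and optionally skipping the element at the strongest-coefficient position. The function name bitmap_mapping_queue and its parameters are assumptions for illustration; the other orderings listed above are obtained by changing the sort directions.

```python
import numpy as np

def bitmap_mapping_queue(bitmaps, column_major=True, ascending=True, strongest=None):
    """Flatten per-layer bitmaps into a single bitmap mapping queue.

    bitmaps:      list of 2D 0/1 numpy arrays, one per layer, ordered by layer index.
    column_major: True = map column by column, False = map row by row.
    ascending:    True = increasing row/column indices, False = decreasing.
    strongest:    optional {layer: (row, col)} position that is not mapped.
    """
    queue = []
    for layer, bm in enumerate(bitmaps):
        rows, cols = bm.shape
        row_order = range(rows) if ascending else range(rows - 1, -1, -1)
        col_order = range(cols) if ascending else range(cols - 1, -1, -1)
        outer, inner = (col_order, row_order) if column_major else (row_order, col_order)
        for o in outer:
            for i in inner:
                r, c = (i, o) if column_major else (o, i)
                if strongest and strongest.get(layer) == (r, c):
                    continue                      # strongest-coefficient element is skipped
                queue.append((layer, r, c, int(bm[r, c])))
    return queue
```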
Optionally, the first packet length and/or the second packet length is determined by any one of the following methods:
protocol provision;
configuring network side equipment;
the terminal sets and reports the data to the network side equipment;
determining the number of spatial domain beams configured by the network side equipment;
calculated through a grouping coefficient, wherein the grouping coefficient is greater than or equal to 0 and less than or equal to 1.
Optionally, the grouping coefficient is determined by any one of the following methods:
protocol provision;
configuring network side equipment;
and the terminal sets and reports the data to the network side equipment.
Optionally, the grouping coefficient includes a first grouping coefficient and/or a second grouping coefficient. When the first packet length is calculated by the first grouping coefficient, the first packet length is calculated by any one of the following methods:
the first packet length = floor(the first grouping coefficient × BL);
the first packet length = ceil(the first grouping coefficient × BL), where BL is the number of non-zero coefficients to be fed back;
when the second packet length is calculated by the second grouping coefficient, the second packet length is calculated by any one of the following methods:
the second packet length = ceil(the second grouping coefficient × dimension), where the dimension is equal to the length of a row or column of the bitmap;
the second packet length = floor(the second grouping coefficient × dimension), where the dimension is equal to the length of a row or column of the bitmap;
the second packet length = floor(the second grouping coefficient × BS);
the second packet length = ceil(the second grouping coefficient × BS), where BS is the total number of elements of the bitmap.
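For illustration, the two packet lengths can be computed from the grouping coefficients as sketched below; whether floor or ceil is used, and whether the second length is derived from a bitmap row/column length or from the total number BS of bitmap elements, are exactly the alternatives enumerated above.

```python
import math

def first_packet_length(grouping_coeff: float, BL: int, use_ceil: bool = True):
    # BL: number of non-zero coefficients to be fed back
    return math.ceil(grouping_coeff * BL) if use_ceil else math.floor(grouping_coeff * BL)

def second_packet_length(grouping_coeff: float, basis: int, use_ceil: bool = True):
    # basis: length of a bitmap row or column, or the total number BS of bitmap elements
    return math.ceil(grouping_coeff * basis) if use_ceil else math.floor(grouping_coeff * basis)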
Optionally, the priority of the common part is any one of the following:
higher than the priority of all bitmap information groups and/or higher than the priority of all quantized coefficient information groups;
equal to the priority of the bitmap information group with the highest priority and/or equal to the priority of the quantized coefficient information group with the highest priority.
Optionally, the mapping queue of the CSI report adopts any one of:
the mapping order of the common part precedes the mapping orders of the bitmap information groups and the quantized coefficient information groups, and the mapping order of the bitmap information groups precedes the mapping order of the quantized coefficient information groups;
the mapping order of the common part precedes the mapping orders of the bitmap information groups and the quantized coefficient information groups, and the bitmap information groups and the quantized coefficient information groups are arranged alternately from the tail of the mapping queue of the CSI report.
Optionally, if the mapping order of the bitmap information groups precedes the mapping order of the quantized coefficient information groups, then in the direction from the head of the mapping queue of the CSI report to its tail, the priorities of the bitmap information groups decrease in turn and the priorities of the quantized coefficient information groups decrease in turn;
if the bitmap information groups and the quantized coefficient information groups are arranged alternately from the tail of the mapping queue of the CSI report, then in the direction from the head of the mapping queue of the CSI report to its tail, the priorities of the bitmap information groups decrease in turn, or several consecutive bitmap information groups have the same priority, and the priority of a bitmap information group is the same as the priority of the adjacent quantized coefficient information group that follows it.
Optionally, the bit number of the common part is smaller than a bit number D that can be carried by an uplink channel resource used for sending the CSI report, and discarding at least one of the following information in the CSI report according to the priority information of the information group includes:
and discarding the quantization coefficient information group with the lowest priority and the bitmap information group with the lowest priority in the mapping queue of the CSI report until the bit number of the CSI report after information discarding is less than or equal to the bit number D which can be borne by the uplink channel resource used for sending the CSI report.
Optionally, the number of bits of the common part is greater than or equal to a number of bits D that can be carried by uplink channel resources used for sending the CSI report, and the discarding at least one of the following information in the CSI report includes any one of:
discarding all quantized coefficient information groups and bitmap information groups, and discarding the content of the common part bit by bit until the number of bits of the CSI report after information discarding is equal to the number of bits D that can be carried by the uplink channel resource used for sending the CSI report;
discarding all quantized coefficient information groups, all bitmap information groups, and the common part.
Optionally, discarding at least one of the following information in the CSI report further includes:
after all quantized coefficient information groups are discarded, all bitmap information groups are discarded.
Optionally, if the bit number of the CSI report after discarding information is smaller than the bit number D that can be carried by the uplink channel resource used to send the CSI report, before sending the discarded CSI report on the uplink channel resource, the method further includes any one of the following:
supplementing specific bits in the CSI report after information discarding so that the bit number of the CSI report after bit supplementing is equal to D;
and adjusting the code rate to ensure that the bit number which can be borne by the uplink channel resource used for sending the CSI report is equal to the bit number of the CSI report after the information is discarded.
Optionally, the specific bit is 0 or 1.
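Putting the discard and padding rules above together, the following sketch (group bit strings and the carried-bit budget D are assumed inputs) drops the lowest-priority quantized-coefficient and bitmap groups until the report fits within D bits, trims the common part when even it alone does not fit, and fills any remaining gap with a specific padding bit. The ordering in which coefficient and bitmap groups are dropped is one possible reading of the rules above, not the only one.

```python
def discard_csi_report(common_bits, bitmap_groups, coeff_groups, D, pad_bit="0"):
    """Sketch of the priority-based discarding described above.

    common_bits:   bit string of the common part (highest priority).
    bitmap_groups: list of bit strings, ordered from highest to lowest priority.
    coeff_groups:  list of bit strings, ordered from highest to lowest priority.
    D:             number of bits the uplink channel resource can carry.
    """
    bitmap_groups, coeff_groups = list(bitmap_groups), list(coeff_groups)

    def total():
        return len(common_bits) + sum(map(len, bitmap_groups)) + sum(map(len, coeff_groups))

    if len(common_bits) >= D:
        # Common part alone does not fit: all groups are dropped and the common
        # part is trimmed bit by bit (one of the two options described above).
        return common_bits[:D]

    while total() > D and (coeff_groups or bitmap_groups):
        # Drop the lowest-priority quantized-coefficient group first, then the
        # lowest-priority bitmap group once all coefficient groups are gone.
        if coeff_groups:
            coeff_groups.pop()
        else:
            bitmap_groups.pop()

    payload = common_bits + "".join(bitmap_groups) + "".join(coeff_groups)
    return payload + pad_bit * (D - len(payload))   # pad with a specific bit up to D bits

# Example: 20-bit common part, 3 bitmap groups of 4 bits, 5 coefficient groups of 4 bits, D = 40
report = discard_csi_report("1" * 20, ["0101"] * 3, ["1100"] * 5, D=40)
assert len(report) == 40
```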
Optionally, the common part comprises at least one of the following information: spatial beam information, oversampling information, information of a strongest coefficient, indication information of a frequency domain orthogonal basis, and weak polarization reference amplitude quantization information.
The following further introduces a CSI report transmission method for a terminal with reference to specific embodiments:
example one
In this embodiment, as shown in fig. 8, the number of SD beams configured on the network side is 4, and the number of FD bases to be fed back is 7, so the dimension of bitmap is 8 rows and 7 columns. When the bitmap is mapped into the bitmap mapping queue, the bitmap is mapped column by column according to the size of the column index, and when each column element in the bitmap is mapped, the bitmap is mapped according to the size of the row index.
Example two
In this embodiment, as shown in fig. 9, the number of SD beams configured on the network side is 4, and the number of FD bases to be fed back is 7, so the dimension of bitmap is 8 rows and 7 columns. When the bitmap is mapped into the bitmap mapping queue, the bitmap is mapped column by column according to the size of the column index, and when each column element in the bitmap is mapped, the bitmap is mapped according to the size of the row index.
EXAMPLE III
In this embodiment, as shown in fig. 10, the number of SD beams configured on the network side is 4, and the number of FD bases to be fed back is 7, so the dimension of the bitmap is 8 rows and 7 columns. When the bitmap is mapped into the bitmap mapping queue, it is mapped column by column in order of the column index; when each column of elements in the bitmap is mapped, the columns are mapped alternately in ascending and descending order of the row index, for example, column 0 is mapped in ascending order of the row index, column 1 in descending order of the row index, column 2 in ascending order, column 3 in descending order, …, and so on.
Example four
In this embodiment, as shown in fig. 11, the number of SD beams configured on the network side is 4, and the number of FD bases to be fed back is 7, so the dimension of the bitmap is 8 rows and 7 columns. When the bitmap is mapped into the bitmap mapping queue, it is mapped column by column in order of the column index; when each column of elements in the bitmap is mapped, the columns are mapped alternately in ascending and descending order of the row index, for example, column 0 is mapped in ascending order of the row index, column 1 in descending order of the row index, column 2 in ascending order, column 3 in descending order, …, and so on.
After the bitmap mapping queues are obtained, grouping is performed on the bitmap mapping queues according to a preset grouping length to obtain bitmap information groups, specifically, as shown in fig. 11, the grouping length may be equal to 8, and the bitmap mapping queues are divided into groups 0 to 6.
EXAMPLE five
In this embodiment, as shown in fig. 12, the number of SD beams configured on the network side is 4, and the number of FD bases to be fed back is 7, so the dimension of bitmap is 8 rows and 7 columns. When mapping the bitmap into a bitmap mapping queue, mapping the bitmap row by row according to the line index from small to large, and mapping according to the column index from small to large when mapping each row element in the bitmap. Specifically, as shown in fig. 12, the packet length may be equal to 7, dividing the bitmap mapping queues into groups 0-7.
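A numeric check of this embodiment, reusing the illustrative helper functions bitmap_mapping_queue and group_queue sketched earlier in this description (it is assumed here that they are available in scope): an all-ones 8 × 7 bitmap mapped row by row and grouped with a packet length of 7 yields 56 elements divided into groups 0-7 of 7 elements each.

```python
import numpy as np

# 8 rows (2 x 4 SD beams) x 7 columns (FD bases); all-ones bitmap for simplicity.
bm = np.ones((8, 7), dtype=int)
queue = bitmap_mapping_queue([bm], column_major=False, ascending=True)   # row-by-row mapping
groups = group_queue(queue, 7, remainder_first=False)                    # packet length 7
print(len(queue), [len(g) for g in groups])   # 56 [7, 7, 7, 7, 7, 7, 7, 7] -> groups 0-7
```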
Example six
In this embodiment, as shown in fig. 13, the number of SD beams configured on the network side is 4 and the number of FD bases to be fed back is 7, so the dimension of the bitmap is 8 rows and 7 columns, and the number of non-zero quantized coefficients to be fed back is 28. The quantized coefficients indicated by the elements in the bitmap are mapped into a quantized coefficient mapping queue according to the mapping order of the bitmap. After the quantized coefficient mapping queue is obtained, it is grouped according to the grouping length to obtain quantized coefficient information groups; as shown in fig. 13, the grouping length may be equal to 8, and the quantized coefficient mapping queue is divided into groups 0 to 3, where group 0 contains fewer quantized coefficients than the other groups. The priority of the quantized coefficient information groups may be: group 0 has a higher priority than group 1, group 1 has a higher priority than group 2, and group 2 has a higher priority than group 3.
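A minimal sketch of the coefficient grouping in this example is given below; the row-major collection order of the quantized coefficients and the placement of the remainder in group 0 are assumptions used purely for illustration, chosen so that the group sizes 4, 8, 8, 8 described above are reproduced.

```python
# Sketch of Example six: collect the quantized coefficients of the non-zero
# bitmap elements in the bitmap mapping order, then group them so that only
# group 0 may be shorter than the grouping length (28 coefficients with
# grouping length 8 -> group sizes 4, 8, 8, 8).
import math

GROUP_LEN = 8

def coefficient_queue(bitmap, coeffs):
    """coeffs[(row, col)] holds the quantized coefficient of a non-zero element."""
    queue = []
    for row, bits in enumerate(bitmap):        # assumed mapping order: row-major
        for col, bit in enumerate(bits):
            if bit:
                queue.append(coeffs[(row, col)])
    return queue

def group_with_short_head(queue, group_len):
    """Only the first group may hold fewer than group_len coefficients."""
    total = len(queue)
    num_groups = math.ceil(total / group_len)
    head_len = total - (num_groups - 1) * group_len
    groups = [queue[:head_len]]
    groups += [queue[head_len + i * group_len: head_len + (i + 1) * group_len]
               for i in range(num_groups - 1)]
    return groups

if __name__ == "__main__":
    # Toy 8x7 bitmap with 28 non-zero elements and dummy coefficient values.
    bitmap = [[1 if (r + c) % 2 == 0 else 0 for c in range(7)] for r in range(8)]
    coeffs = {(r, c): (r, c) for r in range(8) for c in range(7) if bitmap[r][c]}
    queue = coefficient_queue(bitmap, coeffs)
    print([len(g) for g in group_with_short_head(queue, GROUP_LEN)])  # [4, 8, 8, 8]
```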
Example seven
In this embodiment, as shown in fig. 14, the number of SD beams configured on the network side is 4 and the number of FD bases to be fed back is 7, so the dimension of the bitmap is 8 rows and 7 columns, and the number of non-zero quantized coefficients to be fed back is 28. The elements in the bitmap are mapped to obtain a bitmap mapping queue, and the bitmap mapping queue is used to obtain a quantized coefficient mapping queue. When the bitmap is mapped into the bitmap mapping queue, the bitmap is mapped column by column in order of the column index, and the elements of each column are mapped in order of the row index. As shown in fig. 14, the grouping length of the quantized coefficient mapping queue may be equal to 8, and the quantized coefficient mapping queue is divided into groups 0 to 3, where group 0 contains fewer quantized coefficients than the other groups. The priority of the quantized coefficient information groups may be: group 0 has a higher priority than group 1, group 1 has a higher priority than group 2, and group 2 has a higher priority than group 3.
Example eight
In this embodiment, as shown in fig. 15, the number of SD beams configured on the network side is 4 and the number of FD bases to be fed back is 7, so the dimension of the bitmap is 8 rows and 7 columns, and the number of non-zero quantized coefficients to be fed back is 28. The elements in the bitmap are mapped to obtain a bitmap mapping queue; when the bitmap is mapped into the bitmap mapping queue, the bitmap is mapped column by column in ascending order of the column index, and the elements of each column are mapped in ascending order of the row index. The bitmap mapping queue is grouped according to a grouping length of 8, so the maximum number of elements in a bitmap information group is equal to 8, and the grouping length of the quantized coefficient mapping queue is therefore also set to 8. A total of 28 quantized coefficients need to be fed back; the quantized coefficients are grouped, and the quantized coefficient mapping queue is divided into groups 0 to 3. Since 28 is not divisible by 8, group 0 contains fewer quantized coefficients than the other groups: group 0 contains 4 quantized coefficients, and each of the other groups contains 8, equal to the grouping length of the quantized coefficient mapping queue. The priority of the quantized coefficient information groups may be: group 0 has a higher priority than group 1, group 1 has a higher priority than group 2, and group 2 has a higher priority than group 3.
Example nine
In this embodiment, there are multiple layers of bitmaps, as shown in fig. 16. When the bitmaps are mapped into the bitmap mapping queue, each column of elements in the bitmap of each layer is mapped into a sub-bitmap mapping queue; the sub-bitmap mapping queues are first sorted by column index, and the sub-bitmap mapping queues with the same column index are then sorted by layer index, to obtain the bitmap mapping queue. The bitmap mapping queue is grouped according to a grouping length of 8 to obtain bitmap information groups 0 to 13, where the bitmap of layer 0 is mapped into groups 0, 2, 4, 6, 8, 10, and 12, and the bitmap of layer 1 is mapped into groups 1, 3, 5, 7, 9, 11, and 13.
The quantized coefficients indicated by the elements in the bitmap are mapped into a quantized coefficient mapping queue according to the mapping order of the bitmap or according to the bitmap mapping queue. After the quantized coefficient mapping queue is obtained, it is grouped according to the grouping length to obtain quantized coefficient information groups; as shown in fig. 16, the grouping length may be equal to 8, and the quantized coefficient mapping queue is divided into groups 0 to 7, where the quantized coefficients of layer 0 are mapped into groups 0, 2, 4, and 6, and the quantized coefficients of layer 1 are mapped into groups 1, 3, 5, and 7. Layer 0 has 28 quantized coefficients to be fed back in total, and 28 is not divisible by 8, so group 0 contains fewer quantized coefficients than the other groups of layer 0: group 0 contains 4 quantized coefficients, and each of the other groups contains 8, equal to the grouping length of the quantized coefficient mapping queue. Similarly, layer 1 has 28 quantized coefficients to be fed back in total, and 28 is not divisible by 8, so group 1 contains fewer quantized coefficients than the other groups of layer 1: group 1 contains 4 quantized coefficients, and each of the other groups contains 8, equal to the grouping length of the quantized coefficient mapping queue.
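A minimal Python sketch of the ordering used in this example for two layers follows: one sub-queue per column per layer, ordered by column index first and by layer index within the same column (ascending index order is an assumption for illustration).

```python
# Sketch of Example nine: interleave the two layers column by column, so that
# with a grouping length of 8 the even-numbered bitmap information groups come
# from layer 0 and the odd-numbered ones from layer 1.

def multilayer_column_first(bitmaps):
    """bitmaps[layer][row][col] -> flat bitmap mapping queue."""
    num_layers, num_rows, num_cols = len(bitmaps), len(bitmaps[0]), len(bitmaps[0][0])
    queue = []
    for col in range(num_cols):            # sub-queues sorted by column index first
        for layer in range(num_layers):    # then by layer index within the same column
            queue.extend(bitmaps[layer][row][col] for row in range(num_rows))
    return queue

def group(queue, group_len):
    return [queue[i:i + group_len] for i in range(0, len(queue), group_len)]

if __name__ == "__main__":
    bitmaps = [[[1] * 7 for _ in range(8)] for _ in range(2)]  # 2 layers of 8 x 7
    print(len(group(multilayer_column_first(bitmaps), 8)))     # 14 groups: 0 to 13
```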
Example ten
In this embodiment, there are multiple layers of bitmaps, as shown in fig. 17. When the bitmaps are mapped into the bitmap mapping queue, each row of elements in the bitmap of each layer is mapped into a sub-bitmap mapping queue; the sub-bitmap mapping queues are first sorted by layer index, and the sub-bitmap mapping queues with the same layer index are then sorted by row index, to obtain the bitmap mapping queue. The bitmap mapping queue is grouped according to a grouping length of 8 to obtain bitmap information groups 0 to 13, where the bitmap of layer 0 is mapped into groups 0 to 6 and the bitmap of layer 1 is mapped into groups 7 to 13.
The quantized coefficients indicated by the elements in the bitmap are mapped into a quantized coefficient mapping queue according to the mapping order of the bitmap or according to the bitmap mapping queue. After the quantized coefficient mapping queue is obtained, it is grouped according to the grouping length to obtain quantized coefficient information groups; as shown in fig. 17, the grouping length may be equal to 8, and the quantized coefficient mapping queue is divided into groups 0 to 7, where the quantized coefficients of layer 0 are mapped into groups 0 to 3 and the quantized coefficients of layer 1 are mapped into groups 4 to 7. Layer 0 has 28 quantized coefficients to be fed back in total, and 28 is not divisible by 8, so group 0 contains fewer quantized coefficients than the other groups of layer 0: group 0 contains 4 quantized coefficients, and each of the other groups contains 8, equal to the grouping length of the quantized coefficient mapping queue. Similarly, layer 1 has 28 quantized coefficients to be fed back in total, and 28 is not divisible by 8, so group 4 contains fewer quantized coefficients than the other groups of layer 1: group 4 contains 4 quantized coefficients, and each of the other groups contains 8, equal to the grouping length of the quantized coefficient mapping queue.
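For comparison with the previous sketch, the ordering of this example can be sketched as follows: one sub-queue per row per layer, ordered by layer index first and by row index within the same layer, so that layer 0 occupies the first half of the queue (ascending index order is again an assumption).

```python
# Sketch of Example ten: concatenate the layers, each flattened row by row, so
# that with a grouping length of 8 groups 0 to 6 come from layer 0 and groups
# 7 to 13 come from layer 1.

def multilayer_layer_first(bitmaps):
    queue = []
    for layer_bitmap in bitmaps:      # sub-queues sorted by layer index first
        for row in layer_bitmap:      # then by row index within the same layer
            queue.extend(row)         # columns taken in ascending index order
    return queue

if __name__ == "__main__":
    bitmaps = [[[1] * 7 for _ in range(8)] for _ in range(2)]
    print(len(multilayer_layer_first(bitmaps)))  # 112 elements -> 14 groups of 8
```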
Example eleven
In this embodiment, there are multiple layers of bitmaps, as shown in fig. 18. When the bitmaps are mapped into the bitmap mapping queue, each column of elements in the bitmap of each layer is mapped into a sub-bitmap mapping queue; the sub-bitmap mapping queues are first sorted by column index, and the sub-bitmap mapping queues with the same column index are then sorted by layer index, to obtain the bitmap mapping queue. The bitmap mapping queue is grouped according to a grouping length of 16 to obtain bitmap information groups 0 to 6, where column 0 of layer 0 and layer 1 is mapped to group 0, column 1 of layer 0 and layer 1 to group 1, column 2 to group 2, column 3 to group 3, column 4 to group 4, column 5 to group 5, and column 6 to group 6.
The quantized coefficients indicated by the elements in the bitmap are mapped into a quantized coefficient mapping queue according to the bitmap mapping queue. After the quantized coefficient mapping queue is obtained, it is grouped according to the grouping length 16 to obtain quantized coefficient information groups, and the quantized coefficient mapping queue is divided into groups 0 to 3. Layer 0 and layer 1 have 56 quantized coefficients to be fed back in total, and 56 is not divisible by 16, so group 0 contains fewer quantized coefficients than the other groups: group 0 contains 8 quantized coefficients, and each of the other groups contains 16, equal to the grouping length of the quantized coefficient mapping queue.
Example twelve
In this embodiment, there are multiple layers of bitmaps, as shown in fig. 19. When the bitmaps are mapped into the bitmap mapping queue, each column of elements in the bitmap of each layer is mapped into a sub-bitmap mapping queue; the sub-bitmap mapping queues are first sorted by column index, and the sub-bitmap mapping queues with the same column index are then sorted by layer index, to obtain the bitmap mapping queue. The bitmap mapping queue is grouped according to a grouping length of 32 to obtain bitmap information groups 0 to 3, where columns 0 and 1 of layer 0 and layer 1 are mapped to group 0, columns 2 and 3 to group 1, columns 4 and 5 to group 2, and columns 6 and 7 to group 3.
The quantized coefficients indicated by the elements in the bitmap are mapped into a quantized coefficient mapping queue according to the bitmap mapping queue. After the quantized coefficient mapping queue is obtained, it is grouped according to the grouping length 32 to obtain quantized coefficient information groups, and the quantized coefficient mapping queue is divided into group 0 and group 1. Layer 0 and layer 1 have 56 quantized coefficients to be fed back in total, and 56 is not divisible by 32, so group 0 contains fewer quantized coefficients than group 1: group 0 contains 24 quantized coefficients, and group 1 contains 32, equal to the grouping length of the quantized coefficient mapping queue.
Example thirteen
In this embodiment, there are multiple layers of bitmaps, as shown in fig. 20. When the bitmaps are mapped into the bitmap mapping queue, each column of elements in the bitmap of each layer is mapped into a sub-bitmap mapping queue; the sub-bitmap mapping queues are first sorted by column index, and the sub-bitmap mapping queues with the same column index are then sorted by layer index, to obtain the bitmap mapping queue. The bitmap mapping queue is grouped according to a grouping length of 32 to obtain bitmap information groups 0 to 3, where columns 0 and 1 of layer 0 and layer 1 are mapped to group 0, columns 2 and 3 to group 1, columns 4 and 5 to group 2, and columns 6 and 7 to group 3.
The quantized coefficients indicated by the elements in the bitmap are mapped into a quantized coefficient mapping queue according to the bitmap mapping queue. After the quantized coefficient mapping queue is obtained, it is grouped according to the grouping length 32 to obtain quantized coefficient information groups, and the quantized coefficient mapping queue is divided into group 0 and group 1. Layer 0 and layer 1 have 56 quantized coefficients to be fed back in total, and 56 is not divisible by 32, so group 1 contains fewer quantized coefficients than group 0: group 1 contains 24 quantized coefficients, and group 0 contains 32, equal to the grouping length of the quantized coefficient mapping queue.
Example fourteen
In this embodiment, as shown in fig. 21, the mapping queue of the CSI report consists of, in order: the common part, the bitmap information groups, and the quantized coefficient information groups. The bitmap information groups comprise information groups 0 to 6, and their priority decreases in order from information group 0 to information group 6; the quantized coefficient information groups comprise information groups 0 to 3, and their priority decreases in order from information group 0 to information group 3. When information in the CSI report is discarded, that is, when the number of bits of the CSI report is greater than the number of bits D that can be carried by the uplink channel resource used for sending the CSI report, the lowest-priority bitmap information group and the lowest-priority quantized coefficient information group in the current mapping queue of the CSI report are discarded at the same time. As shown in fig. 21, if the number of bits of the CSI report is smaller than D after quantized coefficient information group 3 and bitmap information group 6 are discarded, the discarding stops; if the number of bits of the CSI report is still greater than D after quantized coefficient information group 3 and bitmap information group 6 are discarded, the lowest-priority bitmap information group and the lowest-priority quantized coefficient information group in the current mapping queue of the CSI report continue to be discarded.
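The discarding rule of this example can be illustrated with the following Python sketch. The bit-width bookkeeping (1 bit per bitmap element, a fixed number of bits per quantized coefficient) is a simplifying assumption; the rule itself, dropping the lowest-priority bitmap information group and the lowest-priority quantized coefficient information group together until the report fits, follows the description above.

```python
# Sketch of the discarding in Example fourteen: while the CSI report exceeds
# the D bits the uplink channel resource can carry, drop the lowest-priority
# bitmap information group and the lowest-priority quantized coefficient
# information group at the same time.

def discard_to_fit(common_bits, bitmap_groups, coeff_groups, bits_per_coeff, d):
    """Groups are ordered from highest to lowest priority."""
    def report_bits():
        return (common_bits
                + sum(len(g) for g in bitmap_groups)                  # 1 bit per element
                + bits_per_coeff * sum(len(g) for g in coeff_groups))
    while report_bits() > d and (bitmap_groups or coeff_groups):
        if bitmap_groups:
            bitmap_groups.pop()   # lowest-priority bitmap information group
        if coeff_groups:
            coeff_groups.pop()    # lowest-priority quantized coefficient information group
    return report_bits()

if __name__ == "__main__":
    bitmap_groups = [[1] * 8 for _ in range(7)]             # information groups 0..6
    coeff_groups = [[0] * 4] + [[0] * 8 for _ in range(3)]  # information groups 0..3
    print(discard_to_fit(30, bitmap_groups, coeff_groups, 6, 200))  # fits after one round
```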
Example fifteen
In this embodiment, as shown in fig. 22, the mapping queue of the CSI report consists of, in order: the common part, the bitmap information groups, and the quantized coefficient information groups. The bitmap information groups comprise information groups 0 to 6, and their priority decreases in order from information group 0 to information group 6; the quantized coefficient information groups comprise information groups 0 to 3, and their priority decreases in order from information group 0 to information group 3. From the tail of the mapping queue of the CSI report, bitmap information groups and quantized coefficient information groups are alternately arranged, with the lowest-priority bitmap information group arranged before the lowest-priority quantized coefficient information group, the next-lowest-priority bitmap information group arranged before the next-lowest-priority quantized coefficient information group, and so on.
When information in the CSI report is discarded, that is, when the number of bits of the CSI report is greater than the number of bits D that can be carried by the uplink channel resource used for sending the CSI report, the lowest-priority bitmap information group and the lowest-priority quantized coefficient information group in the current mapping queue of the CSI report are discarded at the same time. As shown in fig. 22, if the number of bits of the CSI report is smaller than D after quantized coefficient information group 3 and bitmap information group 6 are discarded, the discarding stops; if the number of bits of the CSI report is still greater than D after quantized coefficient information group 3 and bitmap information group 6 are discarded, the lowest-priority bitmap information group and the lowest-priority quantized coefficient information group in the current mapping queue of the CSI report continue to be discarded.
Example sixteen
In this embodiment, as shown in fig. 23, the mapping queue of the CSI report consists of, in order: the common part, the bitmap information groups, and the quantized coefficient information groups. The bitmap information groups comprise information groups 0 to 6, and their priority decreases in order from information group 0 to information group 6; the quantized coefficient information groups comprise information groups 0 to 3, and their priority decreases in order from information group 0 to information group 3. When information in the CSI report is discarded, that is, when the number of bits of the CSI report is greater than the number of bits D that can be carried by the uplink channel resource used for sending the CSI report, the lowest-priority bitmap information group and the lowest-priority quantized coefficient information group in the current mapping queue of the CSI report are discarded at the same time. As shown in fig. 23, if the number of bits of the CSI report is smaller than D after quantized coefficient information group 3 and bitmap information group 6 are discarded, the discarding stops; if the number of bits of the CSI report is still greater than D after quantized coefficient information group 3 and bitmap information group 6 are discarded, the lowest-priority bitmap information group and the lowest-priority quantized coefficient information group in the current mapping queue of the CSI report continue to be discarded.
If the number of bits of the CSI report after the information is discarded is smaller than the number of bits D that can be carried by the uplink channel resource used for sending the CSI report, specific bits are padded into the CSI report after the discarding, so that the number of bits of the padded CSI report is equal to D; the specific bits may be all 0s or all 1s.
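A minimal sketch of this padding step follows; padding with 0s is an assumption chosen for illustration, since the example allows all 0s or all 1s.

```python
# Sketch of Example sixteen: pad the discarded CSI report with a specific bit
# up to the D bits that the uplink channel resource can carry.

def pad_report(report_bits, d, pad_bit=0):
    assert len(report_bits) <= d, "report still exceeds the uplink channel resource"
    return report_bits + [pad_bit] * (d - len(report_bits))

if __name__ == "__main__":
    print(len(pad_report([1, 0, 1, 1], 10)))  # 10 bits after padding
```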
Example seventeen
In this embodiment, as shown in fig. 24, the mapping queue of the CSI report consists of, in order: the common part, the bitmap information groups, and the quantized coefficient information groups. The bitmap information groups comprise information groups 0 to 6, and their priority decreases in order from information group 0 to information group 6; the quantized coefficient information groups comprise information groups 0 to 3, and their priority decreases in order from information group 0 to information group 3. From the tail of the mapping queue of the CSI report, bitmap information groups and quantized coefficient information groups are alternately arranged; starting from the tail of the mapping queue of the CSI report, the priority of the 1st quantized coefficient information group is the same as that of the 1st bitmap information group, the priority of the 2nd quantized coefficient information group is the same as that of the 2nd bitmap information group, and so on, and each quantized coefficient information group is arranged behind the bitmap information group with the same priority.
Example eighteen
In this embodiment, as shown in fig. 25, the mapping queue of the CSI report consists of, in order: the common part, the bitmap information groups, and the quantized coefficient information groups, where the bitmap information groups comprise information groups 0 to 6 and the quantized coefficient information groups comprise information groups 0 to 3. From the tail of the mapping queue of the CSI report, bitmap information groups and quantized coefficient information groups are alternately arranged; from the tail of the mapping queue of the CSI report, the priority of the 1st quantized coefficient information group is the same as that of the 1st bitmap information group, the priority of the 2nd quantized coefficient information group is the same as that of the 2nd bitmap information group, the priority of the 3rd quantized coefficient information group is the same as that of the 3rd bitmap information group, and the priority of the 4th quantized coefficient information group is the same as that of the 4th to 7th bitmap information groups.
The foregoing embodiments describe the CSI report transmission method in different scenarios; the terminal corresponding to the method is further described below with reference to the accompanying drawings.
As shown in fig. 3, a terminal 300 according to an embodiment of the present invention includes a CSI report transmission apparatus, which can send a CSI report to a network side device in the foregoing embodiment, and achieve the same effect, where the terminal 300 specifically includes the following functional modules:
the grouping module 310 is configured to group quantization coefficients of non-zero coefficients to be fed back in the polarization coefficient matrix according to a first grouping length to obtain a plurality of information groups;
a discarding module 320, configured to discard at least one of the following information in the CSI report according to the priority information of the information groups: the common part, the quantized coefficients, and the bitmap used to indicate the quantized coefficients, where the number of bits of the CSI report after the discarding is equal to the number of bits that can be carried by the uplink channel resource used for sending the CSI report, and the number of bits that can be carried by the uplink channel resource used for sending the CSI report is determined according to the code rate configured by the network side and the uplink channel resource allocated for the CSI report;
a sending module 330, configured to send the discarded CSI report on the uplink channel resource.
In this embodiment, quantization coefficients of non-zero coefficients to be fed back in a polarization coefficient matrix are grouped to obtain a plurality of information groups, information in a CSI report is discarded according to priority information of the information groups, so that the bit number of the CSI report after discarding the information is equal to the bit number that can be borne by an uplink channel resource used for sending the CSI report, and a network side device can receive and analyze the CSI report according to the priority information of the information groups to determine a part of discarded contents of a terminal, which is beneficial for the network side device to accurately know a channel state and optimize CSI feedback performance.
Optionally, the apparatus is further configured to: acquire an uplink channel resource used for transmitting the CSI report; calculate the uplink channel resource required for transmitting the CSI report; and determine that the acquired uplink channel resource is smaller than the transmission resource required by the CSI report.
The uplink channel resource includes, but is not limited to, a Physical Uplink Control Channel (PUCCH) and/or a Physical Uplink Shared Channel (PUSCH). Optionally, the uplink channel resource may be semi-statically configured for the terminal by the network side device through Radio Resource Control (RRC) signaling, or may be dynamically indicated to the terminal through a Physical Downlink Control Channel (PDCCH).
Optionally, the quantized coefficients comprise at least one of: an amplitude quantization value and a phase quantization value.
Optionally, the grouping module 310 is specifically configured to:
mapping the quantization coefficients indicated by the elements in the bitmap into a quantization coefficient mapping queue according to the bitmap mapping queue or the mapping sequence of the bitmap, wherein the bitmap mapping queue is obtained by mapping the elements in the bitmap;
grouping the quantized coefficient mapping queue according to a first grouping length and dividing it into E quantized coefficient information groups, where E-1 of the quantized coefficient information groups each comprise F quantized coefficients, F is the first grouping length, E is an integer greater than 1, and E is equal to ceil(BF/F) or floor(BF/F), where ceil denotes rounding up, floor denotes rounding down, and BF is the total number of quantized coefficients.
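As an illustration of the group count defined above, a short sketch is given below; it shows the ceil variant directly, while the reading that the floor variant leaves the remainder to be absorbed by an existing group is an interpretation rather than something stated in the text.

```python
# Number of quantized coefficient information groups for BF coefficients and
# first grouping length F, as in the text above: E = ceil(BF / F) or floor(BF / F).
import math

def num_coefficient_groups(bf, f, use_ceil=True):
    return math.ceil(bf / f) if use_ceil else math.floor(bf / f)

if __name__ == "__main__":
    print(num_coefficient_groups(28, 8))         # 4: E-1 groups of F = 8, one group of 4
    print(num_coefficient_groups(28, 8, False))  # 3 with the floor variant
```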
Optionally, the grouping module 310 is specifically configured to:
and dividing the quantized coefficient mapping queue, according to its ordering and from head to tail, into quantized coefficient information group 0, quantized coefficient information group 1, …, quantized coefficient information group E-2, and quantized coefficient information group E-1, where the priority of quantized coefficient information group e is higher than that of quantized coefficient information group e-1, or the priority of quantized coefficient information group e-1 is higher than that of quantized coefficient information group e, and e is an integer greater than or equal to 1 and less than or equal to E.
Optionally, the quantized coefficient information group 0 includes a smaller number of quantized coefficients than the other quantized coefficient information groups; or
The quantized coefficient information group 0 includes a larger number of quantized coefficients than the other quantized coefficient information groups; or
The quantized coefficient information group E-1 includes quantized coefficients whose number is smaller than the quantized coefficients included in the other quantized coefficient information groups; or
The quantized coefficient information group E-1 includes a larger number of quantized coefficients than the other quantized coefficient information groups.
Optionally, the grouping module 310 is further configured to:
and grouping the bitmap mapping queue according to a second grouping length and dividing it into G bitmap information groups, where G-1 of the bitmap information groups each comprise H elements, H is the second grouping length, G is an integer greater than 1, and G is equal to ceil(BG/H) or floor(BG/H), where ceil denotes rounding up, floor denotes rounding down, and BG is the total number of bitmap elements.
Optionally, the grouping module 310 is specifically configured to:
and dividing the bitmap mapping queue, according to its ordering and from head to tail, into bitmap information group 0, bitmap information group 1, …, bitmap information group G-2, and bitmap information group G-1, where, if the priority of quantized coefficient information group e is higher than that of quantized coefficient information group e-1, the priority of bitmap information group g is higher than that of bitmap information group g-1; if the priority of quantized coefficient information group e-1 is higher than that of quantized coefficient information group e, the priority of bitmap information group g-1 is higher than that of bitmap information group g; and g is an integer greater than or equal to 1 and less than or equal to G.
Optionally, bitmap information group 0 comprises fewer elements than the other bitmap information groups; or
bitmap information group 0 comprises more elements than the other bitmap information groups; or
bitmap information group G-1 comprises fewer elements than the other bitmap information groups; or
bitmap information group G-1 comprises more elements than the other bitmap information groups.
Optionally, the first grouping length adopts any one of the following:
greater than or equal to the second grouping length;
equal to the number of elements of the bitmap information group containing the most elements.
Optionally, the grouping module 310 is specifically configured to perform any one of the following:
mapping each row of elements in the bitmap of each layer into a sub-bitmap mapping queue, sorting the sub-bitmap mapping queues according to the order of the layer indexes, and sorting the sub-bitmap mapping queues with the same layer index according to the order of the row indexes to obtain the bitmap mapping queue;
mapping each row of elements in the bitmap of each layer into a sub-bitmap mapping queue, sorting the sub-bitmap mapping queues according to the order of the row indexes, and sorting the sub-bitmap mapping queues with the same row index according to the order of the layer indexes to obtain the bitmap mapping queue;
mapping each column of elements in the bitmap of each layer into a sub-bitmap mapping queue, sorting the sub-bitmap mapping queues according to the order of the layer indexes, and sorting the sub-bitmap mapping queues with the same layer index according to the order of the column indexes to obtain the bitmap mapping queue;
mapping each column of elements in the bitmap of each layer into a sub-bitmap mapping queue, sorting the sub-bitmap mapping queues according to the order of the column indexes, and sorting the sub-bitmap mapping queues with the same column index according to the order of the layer indexes to obtain the bitmap mapping queue.
Optionally, the order of the layer indexes adopts any one of the following:
according to the layer index from large to small;
from small to large according to the layer index;
according to the preset sequence of the layer indexes;
the order of the row indexes adopts any one of the following orders:
according to the row index from large to small;
from small to large according to the row index;
according to the preset sequence of the row indexes;
the order of the column indexes adopts any one of the following orders:
from large to small by column index;
from small to large in terms of column index;
in a preset order of column indices.
Optionally, the grouping module 310 is specifically configured to perform any one of the following:
mapping the elements of the row into the bitmap mapping queue according to the column indexes of the elements from large to small;
mapping the elements of the row into the bitmap mapping queue according to the column indexes of the elements from small to large;
and mapping the elements of the row into the bitmap mapping queue according to a preset order of the column indexes of the elements.
Optionally, the grouping module 310 is specifically configured to perform any one of the following:
mapping the elements of the column into the bitmap mapping queue according to the row indexes of the elements from large to small;
mapping the elements of the column into the bitmap mapping queue according to the row indexes of the elements from small to large;
and mapping the elements of the column into the bitmap mapping queue according to a preset order of the row indexes of the elements.
Optionally, the element in the bitmap corresponding to the strongest coefficient is not mapped to the bitmap mapping queue.
Optionally, the first grouping length and/or the second grouping length is determined in any one of the following manners:
specified by the protocol;
configured by the network side device;
set by the terminal and reported to the network side device;
determined according to the number of spatial domain beams configured by the network side device;
calculated from a grouping coefficient, where the grouping coefficient is greater than or equal to 0 and less than or equal to 1.
Optionally, the grouping coefficient is determined in any one of the following manners:
specified by the protocol;
configured by the network side device;
set by the terminal and reported to the network side device.
Optionally, the grouping coefficient comprises a first grouping coefficient and/or a second grouping coefficient. When the first grouping length is calculated from the first grouping coefficient, the first grouping length is calculated in any one of the following manners:
the first grouping length = floor(the first grouping coefficient × BL);
the first grouping length = ceil(the first grouping coefficient × BL), where BL is the number of non-zero coefficients to be fed back.
When the second grouping length is calculated from the second grouping coefficient, the second grouping length is calculated in any one of the following manners:
the second grouping length = ceil(the second grouping coefficient × dimension), where the dimension is equal to the length of a row or a column of the bitmap;
the second grouping length = floor(the second grouping coefficient × dimension), where the dimension is equal to the length of a row or a column of the bitmap;
the second grouping length = floor(the second grouping coefficient × BS);
the second grouping length = ceil(the second grouping coefficient × BS), where BS is the total number of elements of the bitmap.
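A short sketch of these grouping-length formulas follows; the multiplication between the grouping coefficient and BL, the bitmap dimension, or BS is as listed in the variants above, and the choice of floor versus ceil in the example calls is illustrative only.

```python
# Grouping lengths derived from a grouping coefficient in [0, 1], following the
# floor/ceil variants listed above.
import math

def first_grouping_length(first_coeff, bl, use_ceil=False):
    """bl: number of non-zero coefficients to be fed back."""
    value = first_coeff * bl
    return math.ceil(value) if use_ceil else math.floor(value)

def second_grouping_length(second_coeff, size, use_ceil=False):
    """size: a bitmap row/column length, or BS, the total number of bitmap elements."""
    value = second_coeff * size
    return math.ceil(value) if use_ceil else math.floor(value)

if __name__ == "__main__":
    print(first_grouping_length(0.25, 28, use_ceil=True))   # ceil(7.0)  = 7
    print(second_grouping_length(0.125, 56))                # floor(7.0) = 7
```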
Optionally, the priority of the common part is any one of:
higher than the priority of all bitmap information groups and/or higher than the priority of all quantized coefficient information groups;
equal to the priority of the highest-priority bitmap information group and/or equal to the priority of the highest-priority quantized coefficient information group.
Optionally, the mapping queue of the CSI report adopts any one of:
the mapping order of the common part precedes the mapping order of the bitmap information groups and the quantized coefficient information groups, and the mapping order of the bitmap information groups precedes the mapping order of the quantized coefficient information groups;
the mapping order of the common part precedes the mapping order of the bitmap information groups and the quantized coefficient information groups, and the bitmap information groups and the quantized coefficient information groups are alternately arranged from the tail of the mapping queue of the CSI report.
Optionally, if the mapping order of the bitmap information groups precedes the mapping order of the quantized coefficient information groups, then in the direction from the head of the mapping queue of the CSI report to the tail of the mapping queue of the CSI report, the priorities of the bitmap information groups decrease in sequence and the priorities of the quantized coefficient information groups decrease in sequence;
if the bitmap information groups and the quantized coefficient information groups are alternately arranged from the tail of the mapping queue of the CSI report, then in the direction from the head of the mapping queue of the CSI report to the tail of the mapping queue of the CSI report, the priorities of the bitmap information groups decrease in sequence or several consecutive bitmap information groups have the same priority, and the priority of a bitmap information group is the same as that of the adjacent quantized coefficient information group behind it.
Optionally, the number of bits of the common part is smaller than the number of bits D that can be carried by the uplink channel resource used for sending the CSI report, and the discarding module 320 is specifically configured to:
discard the lowest-priority quantized coefficient information group and the lowest-priority bitmap information group in the mapping queue of the CSI report until the number of bits of the CSI report after information discarding is less than or equal to the number of bits D that can be carried by the uplink channel resource used for sending the CSI report.
Optionally, the number of bits of the common part is greater than or equal to the number of bits D that can be carried by the uplink channel resource used for sending the CSI report, and the discarding module 320 is specifically configured to perform any one of the following:
discarding all quantized coefficient information groups and all bitmap information groups, and discarding the content of the common part bit by bit until the number of bits of the CSI report after information discarding is equal to the number of bits D that can be carried by the uplink channel resource used for sending the CSI report;
discarding all quantized coefficient information groups, all bitmap information groups, and the common part.
Optionally, the discarding module 320 is further configured to discard all bit map information groups after all quantized coefficient information groups are discarded.
Optionally, if the number of bits of the CSI report after information discarding is smaller than the number of bits D that can be carried by the uplink channel resource used for sending the CSI report, before the discarded CSI report is sent on the uplink channel resource, the apparatus is further configured to pad specific bits into the CSI report after information discarding, so that the number of bits of the padded CSI report is equal to D; or to adjust the code rate, so that the number of bits that can be carried by the uplink channel resource used for sending the CSI report is equal to the number of bits of the CSI report after information discarding.
Optionally, the specific bit is 0 or 1.
Optionally, the common part comprises at least one of the following information: spatial beam information, oversampling information, information of a strongest coefficient, indication information of a frequency domain orthogonal basis, and weak polarization reference amplitude quantization information.
To better achieve the above object, further, fig. 4 is a schematic diagram of a hardware structure of a terminal implementing various embodiments of the present invention, where the terminal 40 includes, but is not limited to: radio frequency unit 41, network module 42, audio output unit 43, input unit 44, sensor 45, display unit 46, user input unit 47, interface unit 48, memory 49, processor 410, and power supply 411. Those skilled in the art will appreciate that the terminal configuration shown in fig. 4 is not intended to be limiting, and that the terminal may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The radio frequency unit 41 is configured to send a channel state information CSI report to a network side device;
the processor 410 is configured to: group the quantized coefficients of the non-zero coefficients to be fed back in the polarization coefficient matrix according to a first grouping length to obtain a plurality of information groups; and discard at least one of the following information in the CSI report according to the priority information of the information groups: the common part, the quantized coefficients, and the bitmap used to indicate the quantized coefficients, where the number of bits of the discarded CSI report is equal to the number of bits that can be carried by the uplink channel resource used for sending the CSI report, and the number of bits that can be carried by the uplink channel resource used for sending the CSI report is determined according to the code rate configured by the network side and the uplink channel resource allocated for the CSI report.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 41 may be used for receiving and sending signals during a message sending and receiving process or a call process; specifically, after receiving downlink data from a base station, the radio frequency unit 41 sends the downlink data to the processor 410 for processing, and in addition, sends uplink data to the base station. In general, the radio frequency unit 41 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 41 can also communicate with a network and other devices through a wireless communication system.
The terminal provides wireless broadband internet access to the user via the network module 42, such as assisting the user in sending and receiving e-mails, browsing web pages, and accessing streaming media.
The audio output unit 43 may convert audio data received by the radio frequency unit 41 or the network module 42 or stored in the memory 49 into an audio signal and output as sound. Also, the audio output unit 43 may also provide audio output related to a specific function performed by the terminal 40 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 43 includes a speaker, a buzzer, a receiver, and the like.
The input unit 44 is used to receive an audio or video signal. The input unit 44 may include a Graphics Processing Unit (GPU) 441 and a microphone 442, and the graphics processor 441 processes image data of still pictures or videos obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 46. The image frames processed by the graphics processor 441 may be stored in the memory 49 (or other storage medium) or transmitted via the radio frequency unit 41 or the network module 42. The microphone 442 may receive sound and may be capable of processing such sound into audio data. In the phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 41 and then output.
The terminal 40 also includes at least one sensor 45, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that adjusts the brightness of the display panel 461 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 461 and/or a backlight when the terminal 40 moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 45 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 46 is used to display information input by the user or information provided to the user. The Display unit 46 may include a Display panel 461, and the Display panel 461 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 47 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the terminal. Specifically, the user input unit 47 includes a touch panel 471 and other input devices 472. The touch panel 471, also referred to as a touch screen, may collect touch operations by a user (e.g., operations by a user on or near the touch panel 471 using a finger, a stylus, or any other suitable object or accessory). The touch panel 471 can include two parts, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 410, receives a command from the processor 410, and executes the command. In addition, the touch panel 471 can be implemented by various types, such as resistive, capacitive, infrared, and surface acoustic wave. The user input unit 47 may include other input devices 472 in addition to the touch panel 471. Specifically, the other input devices 472 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a track ball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 471 can be overlaid on the display panel 461, and when the touch panel 471 detects a touch operation on or near the touch panel 471, the touch panel transmits the touch operation to the processor 410 to determine the type of the touch event, and then the processor 410 provides a corresponding visual output on the display panel 461 according to the type of the touch event. Although the touch panel 471 and the display panel 461 are shown as two separate components in fig. 4, in some embodiments, the touch panel 471 and the display panel 461 may be integrated to implement the input and output functions of the terminal, and are not limited herein.
The interface unit 48 is an interface for connecting an external device to the terminal 40. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 48 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the terminal 40 or may be used to transmit data between the terminal 40 and external devices.
The memory 49 may be used to store software programs as well as various data. The memory 49 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 49 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device.
The processor 410 is a control center of the terminal, connects various parts of the entire terminal using various interfaces and lines, performs various functions of the terminal and processes data by operating or executing software programs and/or modules stored in the memory 49 and calling data stored in the memory 49, thereby performing overall monitoring of the terminal. Processor 410 may include one or more processing units; preferably, the processor 410 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 410.
The terminal 40 may further include a power supply 411 (e.g., a battery) for supplying power to various components, and preferably, the power supply 411 may be logically connected to the processor 410 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the terminal 40 includes some functional modules that are not shown, and are not described in detail herein.
Preferably, an embodiment of the present invention further provides a terminal, including a processor 410, a memory 49, and a computer program stored in the memory 49 and capable of running on the processor 410, where the computer program, when executed by the processor 410, implements each process of the above CSI report transmission method and can achieve the same technical effect; to avoid repetition, details are not repeated here. A terminal may be a wireless terminal or a wired terminal. A wireless terminal may be a device providing voice and/or other service data connectivity to a user, a handheld device having a wireless connection function, or another processing device connected to a wireless modem. A wireless terminal may communicate with one or more core networks via a Radio Access Network (RAN), and may be a mobile terminal, such as a mobile telephone (or "cellular" telephone) or a computer having a mobile terminal, for example, a portable, pocket-sized, handheld, computer-built-in, or vehicle-mounted mobile device, which exchanges voice and/or data with the RAN. Examples include Personal Communication Service (PCS) phones, cordless phones, Session Initiation Protocol (SIP) phones, Wireless Local Loop (WLL) stations, Personal Digital Assistants (PDAs), and the like. A wireless terminal may also be referred to as a system, a Subscriber Unit, a Subscriber Station, a Mobile Station, a Remote Station, a Remote Terminal, an Access Terminal, a User Terminal, a User Agent, or User Equipment (UE), which is not limited herein.
An embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the foregoing method for transmitting a CSI report on a terminal side, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
The above embodiments describe the CSI report transmission method of the present invention from the terminal side; the following embodiments further describe the CSI report transmission method on the network side device side with reference to the accompanying drawings.
As shown in fig. 5, the method for transmitting a CSI report according to an embodiment of the present invention is applied to a network side device, and the method may include the following steps:
step 201: receiving a Channel State Information (CSI) report of a terminal, wherein quantized coefficients of nonzero coefficients to be fed back in a polarization coefficient matrix in the CSI report are divided into a plurality of information groups;
step 202: and demodulating the bit bitmap and the quantized coefficients of the CSI report according to the priority information of the information group.
In this embodiment, the network side device may receive and analyze the CSI report according to the priority information of the information group, and determine the content in the CSI report, which is beneficial for the network side device to accurately know the channel state and optimize the CSI feedback performance.
The CSI report received by the network side device in this embodiment is the CSI report sent by the terminal in the foregoing embodiment after the discarding processing.
After receiving the CSI report, the network side device decodes part 2 down to the bit level according to the configured uplink channel resource, the CSI parameters configured by the network side device, and the information carried by part 1 of the CSI report. The portion of part 2 other than the bitmap and the quantized coefficients can be obtained from part 1 and the CSI parameter information configured by the network side device; for the remaining portion, namely the bitmap and the quantized coefficients, the included bitmap and the corresponding quantized coefficients are determined in turn, in order of priority from high to low, according to the priority information of the corresponding information groups.
Optionally, after the bitmap and the quantized coefficients of the CSI report are demodulated, the method further includes: if the CSI report is padded with specific bits, restoring the codebook of the CSI report without using the specific bits.
Optionally, the specific bit is 0 or 1.
The quantized coefficients include at least one of: an amplitude quantization value and a phase quantization value.
The foregoing embodiments respectively describe in detail the transmission methods of CSI reports in different scenarios, and the following embodiments further describe the corresponding network side devices with reference to the accompanying drawings.
As shown in fig. 6, a network-side device 600 according to an embodiment of the present invention includes a CSI report transmission apparatus, and can receive a CSI report of channel state information in the foregoing embodiment, and achieve the same effect, where the network-side device 600 specifically includes the following functional modules:
a receiving module 610, configured to receive a CSI report of a terminal, where quantized coefficients of non-zero coefficients to be fed back in a polarization coefficient matrix in the CSI report are divided into multiple information groups;
and a processing module 620, configured to demodulate the bitmap and the quantized coefficients of the CSI report according to the priority information of the information group.
Optionally, after demodulating the bitmap and the quantized coefficients of the CSI report, the processing module 620 is further configured to: and if the CSI report is filled with specific bits, restoring the codebook of the CSI report without using the specific bits.
Optionally, the specific bit is 0 or 1.
The quantized coefficients include at least one of: an amplitude quantization value and a phase quantization value.
The CSI report received by the receiving module 610 in this embodiment is the CSI report sent by the terminal after the discard processing described in the foregoing embodiments.
It should be noted that the division of the network side device and the terminal into modules is only a division of logical functions; in an actual implementation, all or part of the modules may be integrated into one physical entity or may be physically separated. These modules may all be implemented in the form of software invoked by a processing element, or entirely in hardware, or some of the modules may be implemented in the form of software invoked by a processing element while the others are implemented in hardware. For example, the determining module may be a separately disposed processing element, may be integrated in a chip of the apparatus, or may be stored in a memory of the apparatus in the form of program code that is invoked and executed by a processing element of the apparatus to perform its function. The other modules are implemented similarly. In addition, all or part of these modules may be integrated together or implemented independently. The processing element described herein may be an integrated circuit with a signal processing capability. During implementation, the steps of the above method or the above modules may be completed by an integrated logic circuit of hardware in the processor element or by instructions in the form of software.
For example, the above modules may be one or more integrated circuits configured to implement the above method, such as one or more application specific integrated circuits (ASICs), one or more digital signal processors (DSPs), or one or more field programmable gate arrays (FPGAs). For another example, when one of the above modules is implemented in the form of program code scheduled by a processing element, the processing element may be a general purpose processor, such as a central processing unit (CPU) or another processor capable of invoking the program code. For still another example, these modules may be integrated together and implemented in the form of a system-on-chip (SoC).
In order to better achieve the above object, an embodiment of the present invention further provides a network side device, where the network side device includes a processor, a memory, and a computer program stored in the memory and capable of running on the processor, and when the processor executes the computer program, the steps in the transmission method for CSI report as described above are implemented, and the same technical effect can be achieved, and in order to avoid repetition, details are not repeated here.
An embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the transmission method applied to a CSI report of a network-side device as described above are implemented, and the same technical effects can be achieved, and are not described herein again to avoid repetition.
Specifically, the embodiment of the invention also provides a network side device. As shown in fig. 7, the network-side device 700 includes: an antenna 71, a radio frequency device 72, a baseband device 73. The antenna 71 is connected to a radio frequency device 72. In the uplink direction, the rf device 72 receives information via the antenna 71 and sends the received information to the baseband device 73 for processing. In the downlink direction, the baseband device 73 processes information to be transmitted and transmits the information to the rf device 72, and the rf device 72 processes the received information and transmits the processed information through the antenna 71.
The above-mentioned CSI report transmission apparatus may be located in the baseband device 73, and the method performed by the network side device in the above embodiments may be implemented in the baseband device 73, where the baseband device 73 includes a processor 74 and a memory 75.
The baseband device 73 may include, for example, at least one baseband board, on which a plurality of chips are disposed, as shown in fig. 7, where one of the chips, for example, the processor 74, is connected to the memory 75 to call up the program in the memory 75 to perform the network-side device operation shown in the above method embodiment.
The baseband device 73 may further include a network interface 76, such as a Common Public Radio Interface (CPRI), for exchanging information with the radio frequency device 72.
The processor may be a single processor or a combination of multiple processing elements, for example, the processor may be a CPU, an ASIC, or one or more integrated circuits configured to implement the method performed by the above network-side device, for example: one or more microprocessors DSP, or one or more field programmable gate arrays FPGA, or the like. The storage element may be a memory or a combination of a plurality of storage elements.
The memory 75 may be a volatile memory or a nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), which is used as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct Rambus RAM (DR RAM). The memory 75 described herein is intended to include, but is not limited to, these and any other suitable types of memory.
Specifically, the network side device of the embodiment of the present invention further includes: a computer program stored on the memory 75 and executable on the processor 74, the processor 74 calling the computer program in the memory 75 to execute the method performed by each module shown in fig. 6.
Specifically, the computer program, when invoked by the processor 74, may be configured to perform a CSI report of channel state information of a receiving terminal, where quantized coefficients of non-zero coefficients to be fed back in a polarization coefficient matrix in the CSI report are divided into a plurality of information groups; and demodulating the bit bitmap and the quantized coefficients of the CSI report according to the priority information of the information group.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention or a part thereof, which essentially contributes to the prior art, can be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network side device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a U disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
Furthermore, it is to be noted that in the device and method of the invention, it is obvious that the individual components or steps can be decomposed and/or recombined. These decompositions and/or recombinations are to be regarded as equivalents of the present invention. Also, the steps of performing the series of processes described above may naturally be performed chronologically in the order described, but need not necessarily be performed chronologically, and some steps may be performed in parallel or independently of each other. It will be understood by those skilled in the art that all or any of the steps or elements of the method and apparatus of the present invention may be implemented in any computing device (including processors, storage media, etc.) or network of computing devices, in hardware, firmware, software, or any combination thereof, which can be implemented by those skilled in the art using their basic programming skills after reading the description of the present invention.
Thus, the objects of the invention may also be achieved by running a program or a set of programs on any computing device, which may be a well-known general purpose device. The objects of the invention may also be achieved merely by providing a program product containing program code for implementing the method or apparatus. That is, such a program product also constitutes the present invention, and a storage medium storing such a program product also constitutes the present invention. It is to be understood that the storage medium may be any known storage medium or any storage medium developed in the future.
While the preferred embodiments of the present invention have been described, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the invention as defined in the following claims.

Claims (32)

1. A transmission method of a CSI report, applied to a terminal side, characterized by comprising the following steps:
grouping the quantization coefficients of the nonzero coefficients to be fed back in the polarization coefficient matrix according to the first grouping length to obtain a plurality of information groups;
according to the priority information of the information groups, discarding at least one of the following information in the CSI report: a common part, quantized coefficients, and a bitmap for indicating the quantized coefficients; wherein the bit number of the CSI report after the discarding is equal to the bit number that can be carried by the uplink channel resource used for sending the CSI report, and the bit number that can be carried by the uplink channel resource used for sending the CSI report is determined according to a code rate configured by the network side and the uplink channel resource allocated for the CSI report;
and sending the CSI report after the discarding processing on the uplink channel resource.
2. The method of claim 1, wherein grouping quantized coefficients of non-zero coefficients to be fed back in a polarization coefficient matrix according to a first packet length comprises:
mapping the quantization coefficients indicated by the elements in the bitmap into a quantization coefficient mapping queue according to the bitmap mapping queue or the mapping sequence of the bitmap, wherein the bitmap mapping queue is obtained by mapping the elements in the bitmap;
grouping the quantization coefficient mapping queue according to the first grouping length, and dividing the quantization coefficient mapping queue into E quantization coefficient information groups, wherein E-1 of the quantization coefficient information groups each comprise F quantization coefficients, F is the first grouping length, E is an integer greater than 1, and E is equal to ceil(BF/F) or floor(BF/F), wherein ceil denotes rounding up, floor denotes rounding down, and BF is the total number of quantization coefficients.
3. The method of claim 2, wherein the dividing the quantized coefficient map queue into E quantized coefficient information groups comprises:
and according to the sequencing of the quantization coefficient mapping queues, sequentially dividing the quantization coefficient mapping queues into a quantization coefficient information group 0, a quantization coefficient information group 1, …, a quantization coefficient information group E-2 and a quantization coefficient information group E-1 from beginning to end, wherein the priority of the quantization coefficient information group E is higher than that of the quantization coefficient information group E-1 or the priority of the quantization coefficient information group E-1 is higher than that of the quantization coefficient information group E, and E is an integer which is more than or equal to 1 and less than or equal to E.
4. The method for transmitting the CSI report according to claim 3,
the quantized coefficient information group 0 includes the number of quantized coefficients smaller than the number of quantized coefficients included in the other quantized coefficient information groups; or
The quantized coefficient information group 0 includes a larger number of quantized coefficients than the other quantized coefficient information groups; or
The quantized coefficient information group E-1 includes quantized coefficients whose number is smaller than the quantized coefficients included in the other quantized coefficient information groups; or
The quantized coefficient information group E-1 includes a larger number of quantized coefficients than the other quantized coefficient information groups.
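By way of a non-limiting illustration only, the grouping described in claims 2 to 4 can be sketched in Python as follows; the function name and the choice of ceil rather than floor are assumptions for illustration.

import math

def group_quantized_coefficients(coeff_queue, F):
    # coeff_queue: quantization coefficient mapping queue with BF entries;
    # F: first grouping length. Groups 0..E-1 are formed head to tail, so one
    # group (here the last) may hold fewer than F coefficients.
    BF = len(coeff_queue)
    E = math.ceil(BF / F)
    return [coeff_queue[i * F:(i + 1) * F] for i in range(E)]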
5. The method for transmitting the CSI report according to claim 3, further comprising:
and grouping the bitmap mapping queue according to a second packet length, and dividing the bitmap mapping queue into G bitmap information groups, wherein G-1 of the bitmap information groups each comprise H elements, H is the second packet length, G is an integer greater than 1, and G is equal to ceil(BG/H) or floor(BG/H), wherein ceil denotes rounding up, floor denotes rounding down, and BG is the total number of bitmap elements.
6. The method of claim 5, wherein dividing the bitmap mapping queue into G bitmap information groups comprises:
and according to the ordering of the bitmap mapping queue, sequentially dividing the bitmap mapping queue, from beginning to end, into a bitmap information group 0, a bitmap information group 1, …, a bitmap information group G-2 and a bitmap information group G-1, wherein if the priority of a quantization coefficient information group e is higher than that of a quantization coefficient information group e-1, the priority of a bitmap information group g is higher than that of a bitmap information group g-1; if the priority of the quantization coefficient information group e-1 is higher than that of the quantization coefficient information group e, the priority of the bitmap information group g-1 is higher than that of the bitmap information group g; and g is an integer greater than or equal to 1 and less than or equal to G.
7. The method for transmitting the CSI report according to claim 6,
The number of elements included in the bitmap information group 0 is smaller than the number of elements included in the other bitmap information groups; or
The number of elements included in the bitmap information group 0 is greater than the number of elements included in the other bitmap information groups; or
The number of elements included in the bitmap information group G-1 is smaller than the number of elements included in the other bitmap information groups; or
The number of elements included in the bitmap information group G-1 is greater than the number of elements included in the other bitmap information groups.
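By way of a non-limiting illustration only, the bitmap grouping of claims 5 to 7 can be sketched analogously; the function name and the convention that a lower group index means a higher priority are assumptions for illustration.

import math

def group_bitmap_elements(bitmap_queue, H):
    # bitmap_queue: bitmap mapping queue with BG elements; H: second packet length.
    BG = len(bitmap_queue)
    G = math.ceil(BG / H)
    groups = [bitmap_queue[i * H:(i + 1) * H] for i in range(G)]
    return list(enumerate(groups))  # (group index, group); the index doubles as an assumed priority rank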
8. The method for transmitting the CSI report according to claim 5, wherein the first packet length is any one of the following:
greater than or equal to the second packet length;
equal to the number of elements of the bitmap information group containing the most elements.
9. The method for transmitting the CSI report according to claim 2, wherein mapping the elements in the bitmap to a bitmap mapping queue comprises any one of:
mapping each row of elements in the bitmap of each layer into a sub-bitmap mapping queue, sequencing the sub-bitmap mapping queues according to the sequence of the layer indexes, and sequencing the sub-bitmap mapping queues with the same layer index according to the sequence of the row indexes to obtain the bitmap mapping queue;
mapping each row of elements in the bitmap of each layer into a sub-bitmap mapping queue, sequencing the sub-bitmap mapping queues according to the sequence of the row indexes, and sequencing the sub-bitmap mapping queues with the same row index according to the sequence of the layer indexes to obtain the bitmap mapping queue;
mapping each column of elements in the bitmap of each layer into a sub-bitmap mapping queue, sequencing the sub-bitmap mapping queues according to the sequence of the layer indexes, and sequencing the sub-bitmap mapping queues with the same layer index according to the sequence of the column indexes to obtain the bitmap mapping queue;
mapping each column of elements in the bitmap of each layer into a sub-bitmap mapping queue, sequencing the sub-bitmap mapping queues according to the sequence of the column indexes, and sequencing the sub-bitmap mapping queues with the same column index according to the sequence of the layer indexes to obtain the bitmap mapping queue.
10. The method for transmitting CSI report according to claim 9, wherein the order of the layer indexes is any of the following:
according to the layer index from large to small;
from small to large according to the layer index;
according to the preset sequence of the layer indexes;
the order of the row indexes adopts any one of the following orders:
according to the line index from large to small;
from small to large according to the row index;
according to the preset sequence of the line indexes;
the order of the column indexes adopts any one of the following orders:
from large to small by column index;
from small to large in terms of column index;
in a preset order of column indices.
11. The method for transmitting the CSI report according to claim 9, wherein mapping a row of elements in the bitmap to the bitmap mapping queue comprises any one of:
mapping the row of elements into a bitmap mapping queue according to the column indexes of the row elements from large to small;
mapping the row of elements into a bitmap mapping queue according to the column indexes of the row elements from small to large;
and mapping the row of elements into a bitmap mapping queue according to the preset sequence of the column indexes of the row elements.
12. The method for transmitting the CSI report according to claim 9, wherein mapping a column of elements in the bitmap to the bitmap mapping queue comprises any one of:
mapping the column of elements into a bitmap mapping queue according to the row indexes of the column elements from large to small;
mapping the column of elements into a bitmap mapping queue according to the row indexes of the column elements from small to large;
and mapping the column of elements into a bitmap mapping queue according to the preset sequence of the row indexes of the column elements.
13. The method of claim 9, wherein the elements in the bitmap corresponding to the strongest coefficients are not mapped to the bitmap mapping queue.
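By way of a non-limiting illustration only, one of the mapping orders of claims 9 to 13 (layer index first, then row index, then column index, each from small to large, with the element of the strongest coefficient skipped as in claim 13) can be sketched as follows; the function name, argument layout, and ordering choice are assumptions for illustration.

def build_bitmap_mapping_queue(bitmaps_per_layer, strongest=None):
    # bitmaps_per_layer: list indexed by layer, each entry a list of rows,
    # each row a list of 0/1 elements; strongest: optional (layer, row, column)
    # position of the strongest coefficient, which is not mapped into the queue.
    queue = []
    for l, layer_bitmap in enumerate(bitmaps_per_layer):   # layer index, small to large
        for r, row in enumerate(layer_bitmap):             # row index, small to large
            for c, element in enumerate(row):              # column index, small to large
                if strongest is not None and (l, r, c) == strongest:
                    continue                               # claim 13: strongest coefficient not mapped
                queue.append(element)
    return queue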
14. The method for transmitting the CSI report according to claim 5, wherein the first packet length and/or the second packet length is determined by any one of the following methods:
specified by a protocol;
configured by the network side equipment;
set by the terminal and reported to the network side equipment;
determined according to the number of spatial domain beams configured by the network side equipment;
calculated from a grouping coefficient, wherein the grouping coefficient is greater than or equal to 0 and less than or equal to 1.
15. The method for transmitting CSI reports according to claim 14, wherein the grouping coefficients are determined by any one of the following methods:
specified by a protocol;
configured by the network side equipment;
and set by the terminal and reported to the network side equipment.
16. The CSI report transmission method according to claim 14, wherein the grouping coefficients comprise a first grouping coefficient and/or a second grouping coefficient, and when the first grouping length is calculated by the first grouping coefficient, the first grouping length is calculated by any one of the following methods:
the first grouping length = floor(the first grouping coefficient × BL);
the first grouping length = ceil(the first grouping coefficient × BL), wherein BL is the number of non-zero coefficients to be fed back;
when the second packet length is calculated by the second grouping coefficient, the second packet length is calculated by any one of the following methods:
the second packet length = ceil(the second grouping coefficient × dimension), wherein the dimension is equal to the length of a row or column of the bitmap;
the second packet length = floor(the second grouping coefficient × dimension), wherein the dimension is equal to the length of a row or column of the bitmap;
the second packet length = floor(the second grouping coefficient × BS);
the second packet length = ceil(the second grouping coefficient × BS), wherein BS is the total number of elements of the bitmap.
17. The method for transmitting the CSI report according to claim 6, wherein the priority of the common part is any one of the following:
higher than the priority of all bitmap information groups and/or higher than the priority of all quantized coefficient information groups;
equal to the priority of the bitmap information group with the highest priority and/or equal to the priority of the quantized coefficient information group with the highest priority.
18. The method for transmitting CSI reports according to claim 17, wherein the mapping queue of CSI reports employs any one of:
the mapping sequence of the common part is prior to the mapping sequences of the bitmap information groups and the quantization coefficient information groups, and the mapping sequence of the bitmap information groups is prior to the mapping sequence of the quantization coefficient information groups;
the mapping order of the common part is prior to the mapping order of the bitmap information groups and the quantized coefficient information groups, and the bitmap information groups and the quantized coefficient information groups are alternately arranged from the tail of the mapping queue of the CSI report.
19. The method for transmitting a CSI report according to claim 18,
if the mapping sequence of the bitmap information groups is prior to the mapping sequence of the quantized coefficient information groups, the priorities of the bitmap information groups are sequentially reduced in the direction from the head of the mapping queue of the CSI report to the tail of the mapping queue of the CSI report, and the priorities of the quantized coefficient information groups are sequentially reduced;
if the bit bitmap information groups and the quantization coefficient information groups are alternately arranged from the tail of the mapping queue of the CSI report, the priority of the bit bitmap information groups is sequentially reduced or the priority of a plurality of continuous bit bitmap information groups is the same in the direction from the head of the mapping queue of the CSI report to the tail of the mapping queue of the CSI report, and the priority of the bit bitmap information groups is the same as the priority of the adjacent quantization coefficient information groups behind.
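By way of a non-limiting illustration only, the first layout of claims 18 and 19 (common part first, then all bitmap information groups, then all quantized coefficient information groups, each ordered from highest to lowest priority) can be sketched as follows; the function name and the bit-string representation are assumptions for illustration.

def build_report_mapping_queue(common_bits, bitmap_groups, coeff_groups):
    # bitmap_groups and coeff_groups: lists of bit strings already ordered
    # from highest to lowest priority, per claim 19.
    return common_bits + "".join(bitmap_groups) + "".join(coeff_groups)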
20. The method for transmitting the CSI report according to claim 19, wherein the bit number of the common part is smaller than the bit number D that can be carried by the uplink channel resource used to transmit the CSI report, and discarding at least one of the following information in the CSI report according to the priority information of the information group comprises:
and discarding the quantization coefficient information group with the lowest priority and the bitmap information group with the lowest priority in the mapping queue of the CSI report until the bit number of the CSI report after information discarding is less than or equal to the bit number D which can be borne by the uplink channel resource used for sending the CSI report.
21. The method for transmitting the CSI report according to claim 19, wherein the number of bits of the common part is equal to or greater than the number of bits D that can be carried by the uplink channel resource used to transmit the CSI report, and discarding at least one of the following information in the CSI report comprises any of:
discarding all quantized coefficient information groups and all bitmap information groups, and discarding the content of the common part bit by bit, until the bit number of the CSI report after the information discarding is equal to the bit number D that can be carried by the uplink channel resource used for sending the CSI report;
discarding all quantized coefficient information groups, all bitmap information groups, and the common part.
22. The method for transmitting the CSI report according to claim 21 or 20, wherein discarding at least one of the following information in the CSI report further comprises:
after all quantized coefficient information groups are discarded, all bitmap information groups are discarded.
23. The method according to claim 21 or 20, wherein if the number of bits of the discarded CSI report is smaller than the number of bits D that can be carried by the uplink channel resource used to send the CSI report, before sending the discarded CSI report on the uplink channel resource, the method further comprises any of:
supplementing specific bits in the CSI report after information discarding so that the bit number of the CSI report after bit supplementing is equal to D;
and adjusting the code rate to ensure that the bit number which can be borne by the uplink channel resource used for sending the CSI report is equal to the bit number of the CSI report after the information is discarded.
24. The method for transmitting the CSI report of claim 23, wherein the specific bit is 0 or 1.
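By way of a non-limiting illustration only, the discarding and padding behaviour of claims 20 to 24 can be sketched as follows; the sketch simplifies by treating bitmap and quantized coefficient groups uniformly, and the function name, the bit-string representation, and the convention that a smaller priority value means a higher priority are assumptions for illustration.

def fit_report_to_resource(common_bits, groups, D, pad_bit="0"):
    # common_bits: bit string of the common part; groups: list of
    # (priority, bit_string); D: number of bits the uplink resource can carry.
    kept = sorted(groups, key=lambda g: g[0])
    while kept and len(common_bits) + sum(len(b) for _, b in kept) > D:
        kept.pop()                      # discard the group with the lowest priority
    payload = common_bits + "".join(b for _, b in kept)
    payload = payload[:D]               # if even the common part exceeds D, drop its trailing bits
    return payload + pad_bit * (D - len(payload))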
25. The method for transmitting the CSI report according to claim 1, wherein the common part comprises at least one of the following information: spatial beam information, oversampling information, information of a strongest coefficient, indication information of a frequency domain orthogonal basis, and weak polarization reference amplitude quantization information.
26. A transmission method of CSI report is applied to a network side device side, and is characterized by comprising the following steps:
receiving a Channel State Information (CSI) report of a terminal, wherein quantized coefficients of nonzero coefficients to be fed back in a polarization coefficient matrix in the CSI report are divided into a plurality of information groups;
and demodulating the bit bitmap and the quantized coefficients of the CSI report according to the priority information of the information group.
27. The method for transmitting the CSI report according to claim 26, wherein after demodulating the bitmap and the quantized coefficients of the CSI report, the method further comprises any one of the following:
and if the CSI report is filled with specific bits, restoring the codebook of the CSI report without using the specific bits.
28. The method for transmitting the CSI report of claim 27, wherein the specific bit is 0 or 1.
29. A device for transmitting CSI reports, applied to a terminal, comprising:
the grouping module is used for grouping the quantization coefficients of the nonzero coefficients to be fed back in the polarization coefficient matrix according to the first grouping length to obtain a plurality of information groups;
a discarding module, configured to discard, according to the priority information of the information groups, at least one of the following information in the CSI report: a common part, quantized coefficients, and a bitmap for indicating the quantized coefficients; wherein the bit number of the CSI report after the discarding is equal to the bit number that can be carried by the uplink channel resource used for sending the CSI report, and the bit number that can be carried by the uplink channel resource used for sending the CSI report is determined according to a code rate configured by the network side and the uplink channel resource allocated for the CSI report;
and the sending module is used for sending the CSI report after the discarding processing on the uplink channel resource.
30. A transmission device of CSI report is applied to a network side device side, and is characterized by comprising:
the terminal comprises a receiving module, a feedback module and a feedback module, wherein the receiving module is used for receiving a Channel State Information (CSI) report of the terminal, and quantized coefficients of nonzero coefficients to be fed back in a polarization coefficient matrix in the CSI report are divided into a plurality of information groups;
and the processing module is used for demodulating the bit map and the quantized coefficients of the CSI report according to the priority information of the information group.
31. A communication device, characterized in that the communication device comprises a processor, a memory and a computer program stored on the memory and run on the processor, which when executed by the processor implements the steps of the method for transmission of a channel state information, CSI, report according to any of claims 1 to 28.
32. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a computer program which, when being executed by a processor, carries out the steps of the method for transmission of a channel state information, CSI, report according to any of the claims 1 to 28.
CN201910786073.0A 2019-08-23 2019-08-23 Transmission method, terminal and network side equipment for Channel State Information (CSI) report Active CN111835459B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910786073.0A CN111835459B (en) 2019-08-23 2019-08-23 Transmission method, terminal and network side equipment for Channel State Information (CSI) report

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910786073.0A CN111835459B (en) 2019-08-23 2019-08-23 Transmission method, terminal and network side equipment for Channel State Information (CSI) report

Publications (2)

Publication Number Publication Date
CN111835459A true CN111835459A (en) 2020-10-27
CN111835459B CN111835459B (en) 2023-12-01

Family

ID=72911601

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910786073.0A Active CN111835459B (en) 2019-08-23 2019-08-23 Transmission method, terminal and network side equipment for Channel State Information (CSI) report

Country Status (1)

Country Link
CN (1) CN111835459B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023011570A1 (en) * 2021-08-05 2023-02-09 华为技术有限公司 Channel information feedback method and communication apparatus
WO2023131042A1 (en) * 2022-01-04 2023-07-13 维沃移动通信有限公司 Channel state information (csi) feedback method and apparatus, terminal, and network side device
WO2024066168A1 (en) * 2022-09-27 2024-04-04 富士通株式会社 Method and apparatus for sending channel state information, method and apparatus for receiving channel state information, and communication system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120039252A1 (en) * 2010-08-16 2012-02-16 Qualcomm Incorporated Channel state information feedback for carrier aggregation
CN102938680A (en) * 2011-08-15 2013-02-20 华为技术有限公司 Method for transmitting channel state information, user equipment and base station
CN103262604A (en) * 2010-12-17 2013-08-21 三星电子株式会社 Apparatus and method for periodic channel state reporting in a wireless network
CN108111199A (en) * 2017-05-05 2018-06-01 中兴通讯股份有限公司 Feedback, method of reseptance and device, equipment, the storage medium of channel state information
CN109302272A (en) * 2018-02-13 2019-02-01 中兴通讯股份有限公司 Sending, receiving method and device, the electronic device of CSI report

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120039252A1 (en) * 2010-08-16 2012-02-16 Qualcomm Incorporated Channel state information feedback for carrier aggregation
CN103270714A (en) * 2010-08-16 2013-08-28 高通股份有限公司 Channel state information feedback for carrier aggregation
CN103262604A (en) * 2010-12-17 2013-08-21 三星电子株式会社 Apparatus and method for periodic channel state reporting in a wireless network
CN102938680A (en) * 2011-08-15 2013-02-20 华为技术有限公司 Method for transmitting channel state information, user equipment and base station
US20140169204A1 (en) * 2011-08-15 2014-06-19 Huawei Technologies Co., Ltd. Method for transmitting channel state information, user equipment, and base station
CN106411465A (en) * 2011-08-15 2017-02-15 华为技术有限公司 Method for transmitting channel state information, user equipment and base station
CN108111199A (en) * 2017-05-05 2018-06-01 中兴通讯股份有限公司 Feedback, method of reseptance and device, equipment, the storage medium of channel state information
CN109302272A (en) * 2018-02-13 2019-02-01 中兴通讯股份有限公司 Sending, receiving method and device, the electronic device of CSI report

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
ERICSSON: "R1-1907076, On CSI omission procedure", 3GPP TSG_RAN\WG1_RL1 *
ERICSSON: "R1-1907076, On CSI omission procedure", 3GPP TSG_RAN\WG1_RL1, 4 May 2019 (2019-05-04), pages 3-5 *
NIKOLAOS NOMIKOS; THEMISTOKLIS CHARALAMBOUS; DEMOSTHENES VOUYIOUKAS; GEORGE K. KARAGIANNIDIS: "Low-Complexity Buffer-Aided Link Selection With Outdated CSI and Feedback Errors", IEEE TRANSACTIONS ON COMMUNICATIONS *
刘文红 et al.: "D2D transmission strategy based on CSI and QSI" (基于CSI及QSI的D2D传输策略), Journal of Terahertz Science and Electronic Information Technology (太赫兹科学与电子信息学报) *
刘文红 et al.: "D2D transmission strategy based on CSI and QSI" (基于CSI及QSI的D2D传输策略), Journal of Terahertz Science and Electronic Information Technology (太赫兹科学与电子信息学报), no. 06, 25 December 2015 (2015-12-25) *
张一衡; 崔琪楣; 陶小峰: "Low-rate CSI feedback method and channel capacity analysis for multi-user MIMO-OFDM systems" (多用户MIMO-OFDM系统低速率CSI反馈方法及信道容量分析), Journal of Electronics & Information Technology (电子与信息学报), no. 09 *


Also Published As

Publication number Publication date
CN111835459B (en) 2023-12-01

Similar Documents

Publication Publication Date Title
CN111277360B (en) Transmission method, terminal and network equipment for CSI report
CN111278120B (en) Configuration method and transmission method of uplink channel, network side equipment and terminal
CN111614390B (en) Transmission method, terminal and network equipment for CSI report
CN110505040B (en) Information transmission method, terminal and network equipment
CN109587793B (en) TCI state updating method, base station and terminal
EP3926875A1 (en) Hybrid automatic repeat request (harq) feedback method, terminal, and network device
US11552686B2 (en) Beam reporting based on detection of a trigger event
EP3955673A1 (en) Beam information determination method and apparatus, and communication device
CN109391948B (en) Processing method of beam indication, mobile terminal and network side equipment
CN111435855B (en) Transmission method, terminal and network equipment for CSI report
CN111435862B (en) Transmission method, terminal and network equipment for Channel State Information (CSI) report
CN110474667B (en) Information processing method and device, terminal and communication equipment
CN111835459B (en) Transmission method, terminal and network side equipment for Channel State Information (CSI) report
CN111132314B (en) Aperiodic channel state information reference signal configuration method, network equipment and terminal
CN111836309B (en) Transmission method, terminal and network side equipment for Channel State Information (CSI) report
CN111615142B (en) Transmission method, terminal and network equipment for Channel State Information (CSI) report
CN110139390B (en) Resource scheduling indication method, terminal and network equipment
CN111614435B (en) Transmission method, terminal and network equipment for Channel State Information (CSI) report
CN111132216B (en) Information reporting method, terminal and network equipment
CN112887067B (en) Resource determination method and device and communication equipment
CN116095742A (en) Channel state information transmission method and device, terminal and network side equipment
CN111263400A (en) CSI report discarding method and terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant