WO2023245576A1 - AI model determination method, apparatus, communication device and storage medium - Google Patents

AI model determination method, apparatus, communication device and storage medium

Info

Publication number
WO2023245576A1
Authority
WO
WIPO (PCT)
Prior art keywords
model
information
capability
identification information
csi reporting
Prior art date
Application number
PCT/CN2022/100906
Other languages
English (en)
French (fr)
Inventor
朱亚军
Original Assignee
Beijing Xiaomi Mobile Software Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co., Ltd.
Priority to CN202280002321.8A priority Critical patent/CN115349279A/zh
Priority to PCT/CN2022/100906 priority patent/WO2023245576A1/zh
Publication of WO2023245576A1 publication Critical patent/WO2023245576A1/zh

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W52/00: Power management, e.g. TPC [Transmission Power Control], power saving or power classes
    • H04W52/02: Power saving arrangements
    • H04W52/0203: Power saving arrangements in the radio access network or backbone network of wireless communication networks
    • H04W52/0206: Power saving arrangements in the radio access network or backbone network of wireless communication networks in access points, e.g. base stations
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L5/00: Arrangements affording multiple use of the transmission path
    • H04L5/003: Arrangements for allocating sub-channels of the transmission path
    • H04L5/0053: Allocation of signaling, i.e. of overhead other than pilot signals
    • H04L5/0057: Physical resource allocation for CQI
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L69/00: Network arrangements, protocols or services independent of the application payload and not provided for in the other groups of this subclass
    • H04L69/04: Protocols for data compression, e.g. ROHC
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W52/00: Power management, e.g. TPC [Transmission Power Control], power saving or power classes
    • H04W52/02: Power saving arrangements
    • H04W52/0209: Power saving arrangements in terminal devices

Definitions

  • The present disclosure relates to, but is not limited to, the field of wireless communication technology, and in particular to an AI model determination method, apparatus, communication device and storage medium.
  • AI Artificial Intelligence
  • CSI Channel State Information
  • a mainstream approach is to adopt a "two-sided" AI structure: the AI-based CSI compression encoder resides on the user equipment (User Equipment, UE) side. The UE compresses the measured CSI with the AI CSI compression encoder and then sends the compressed CSI to the base station;
  • the AI-based CSI decompression decoder resides on the base station side, and the base station uses the CSI decoder corresponding to the AI CSI compression encoder to decompress and restore the compressed CSI sent by the UE.
  • Compression and decompression are two parts of the AI model and appear in pairs. Therefore, it is necessary to ensure that the AI model used by the UE and the base station is consistent.
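  • The disclosure does not specify any concrete model architecture. Purely as an illustration of the paired "two-sided" structure described above, the following Python sketch (all names and dimensions are hypothetical, not from the disclosure) shows a toy linear encoder on the UE side and its matching decoder on the base-station side.

```python
# Illustrative sketch only (not from the patent): a toy "two-sided" CSI model.
# The UE-side encoder and the base-station-side decoder are trained/derived as
# a pair, so both sides must agree on which model pair (model ID) is in use.
import numpy as np

rng = np.random.default_rng(0)

CSI_DIM, CODE_DIM = 64, 8                  # hypothetical dimensions
W_enc = rng.standard_normal((CODE_DIM, CSI_DIM)) / np.sqrt(CSI_DIM)
W_dec = np.linalg.pinv(W_enc)              # paired decoder for this model ID

def ue_compress(csi: np.ndarray) -> np.ndarray:
    """UE side: compress measured CSI before reporting it."""
    return W_enc @ csi

def gnb_decompress(code: np.ndarray) -> np.ndarray:
    """Base-station side: restore CSI with the decoder paired to the UE's encoder."""
    return W_dec @ code

csi = rng.standard_normal(CSI_DIM)         # stand-in for a measured CSI vector
restored = gnb_decompress(ue_compress(csi))
print("reconstruction error:", float(np.linalg.norm(csi - restored)))
```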
  • CSI compression is thus a use case in which the AI model can only be implemented jointly on the UE side and the base station side.
  • If the AI model is delivered to the UE by the base station, or the AI model is reported by the UE to the base station, the AI model needs to be exchanged between the UE and the base station, which results in increased signaling overhead and increased power consumption.
  • The embodiments of the present disclosure provide an AI model determination method, apparatus, communication device and storage medium.
  • an AI model determination method executed by a UE, including:
  • sending AI capability information of the UE, where the AI capability information is used by the base station to determine the AI model of the CSI used by the UE; where the AI capability information includes at least one of the following:
  • AI capability indication information used to indicate whether the UE supports AI capabilities
  • AI level indication information is used to indicate the level of the AI capabilities supported by the UE
  • the identification information of the AI model is used to indicate the AI model supported by the UE;
  • the identification information of the AI platform is used to indicate the AI platform supported by the UE;
  • AI reasoning indication information used to indicate whether the UE supports AI reasoning capabilities
  • AI training indication information used to indicate whether the UE supports the training capability of the AI model.
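  • As a reading aid only, the following Python sketch models the AI capability information as a container in which every field listed above is optional ("at least one of the following"); all field names are hypothetical and not taken from any specification.

```python
# Hypothetical container for the UE's reported AI capability information.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class AiCapabilityInfo:
    supports_ai: Optional[bool] = None            # AI capability indication information
    ai_level: Optional[str] = None                # AI level indication, e.g. "x", "y" or "z"
    supported_model_ids: List[str] = field(default_factory=list)     # AI model identifiers
    supported_platform_ids: List[str] = field(default_factory=list)  # AI platform identifiers
    supports_inference: Optional[bool] = None     # AI inference indication information
    supports_training: Optional[bool] = None      # AI training indication information

# Example report: a UE that supports inference with two models but no training.
report = AiCapabilityInfo(supports_ai=True,
                          ai_level="y",
                          supported_model_ids=["model-0", "model-1"],
                          supports_inference=True,
                          supports_training=False)
print(report)
```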
  • AI capability information is reported per UE.
  • AI capability information is optionally reported or conditionally mandatory.
  • In some embodiments, the method includes:
  • receiving CSI reporting configuration information, where the CSI reporting configuration information is determined by the base station based on the AI capability information;
  • determining, based on the CSI reporting configuration information, at least one AI model used by the UE.
  • In some embodiments, determining at least one AI model used by the UE based on the CSI reporting configuration information includes:
  • based on the CSI reporting configuration information including identification information of at least one CSI reporting configuration and identification information of the corresponding AI model, determining that different CSI reporting configurations use the corresponding AI models.
  • the method includes: sending AI model usage information; wherein the AI model usage information is used to determine the AI model used by the UE.
  • the AI model usage information includes at least one of the following:
  • the identification information of the AI model;
  • the identification information of the CSI reporting configuration corresponding to the identification information of the AI model.
  • sending AI model usage information includes:
  • based on receiving reporting request information, the AI model usage information is sent; wherein the reporting request information is used to request the AI model used by the UE.
  • an AI model determination method is provided, which is executed by a base station and includes:
  • receiving AI capability information of the UE, and determining, based on the AI capability information, the AI model of at least one CSI used by the UE; wherein the AI capability information includes at least one of the following:
  • AI capability indication information used to indicate whether the UE supports AI capabilities
  • AI level indication information is used to indicate the level of the AI capabilities supported by the UE
  • the identification information of the AI model is used to indicate the AI model supported by the UE;
  • the identification information of the AI platform is used to indicate the AI platform supported by the UE;
  • AI reasoning indication information used to indicate whether the UE supports AI reasoning capabilities
  • AI training indication information used to indicate whether the UE supports the training capability of the AI model.
  • the method includes: sending CSI reporting configuration information, where the CSI reporting configuration information is used to instruct the UE to use a supported AI model.
  • sending CSI reporting configuration information includes:
  • in response to determining that the UE supports one AI model, it is determined to send CSI reporting configuration information that does not include identification information of the AI model; wherein the CSI reporting configuration information is used to instruct the UE to use the AI model supported by the UE.
  • sending CSI reporting configuration information includes:
  • the CSI reporting configuration information is sent, where the CSI reporting configuration information includes: identification information of at least one CSI reporting configuration and identification information of the corresponding AI model.
  • In some embodiments, the method further includes: receiving AI model usage information sent by the UE;
  • based on the AI model usage information, the AI model of the CSI used by the base station is determined.
  • the AI model usage information includes at least one of the following:
  • the identification information of the AI model;
  • the identification information of the CSI reporting configuration corresponding to the identification information of the AI model.
  • the method includes: sending reporting request information; wherein the reporting request information is used to request an AI model used by the UE.
  • an AI model determination device including:
  • the first sending module is configured to send the AI capability information of the UE, where the AI capability information is used by the base station to determine the AI model of the CSI used by the UE; where the AI capability information includes at least one of the following:
  • AI capability indication information used to indicate whether the UE supports AI capabilities
  • AI level indication information is used to indicate the level of the AI capabilities supported by the UE
  • the identification information of the AI model is used to indicate the AI model supported by the UE;
  • the identification information of the AI platform is used to indicate the AI platform supported by the UE;
  • AI reasoning indication information used to indicate whether the UE supports AI reasoning capabilities
  • AI training indication information used to indicate whether the UE supports the training capability of the AI model.
  • AI capability information is reported per UE.
  • AI capability information is optionally reported or conditionally mandatory.
  • the device includes:
  • the first receiving module is configured to receive CSI reporting configuration information; wherein the CSI reporting configuration information is determined by the base station based on the AI capability information;
  • the first processing module is configured to determine at least one AI model used by the UE based on the CSI reporting configuration information.
  • the first processing module is configured to determine, based on the CSI reporting configuration information not including identification information of the AI model, that the UE uses an AI model supported by the UE; or,
  • the first processing module is configured to determine that different CSI reporting configurations use corresponding AI models based on the identification information of at least one CSI reporting configuration and the identification information of the corresponding AI model included in the CSI reporting configuration information.
  • the first sending module is configured to send AI model usage information; wherein the AI model usage information is used to determine the AI model used by the UE.
  • the AI model usage information includes at least one of the following:
  • the identification information of the AI model;
  • the identification information of the CSI reporting configuration corresponding to the identification information of the AI model.
  • the first sending module is configured to send AI model usage information based on receiving reporting request information from the network device; wherein the reporting request information is used to request the AI model used by the UE.
  • an AI model determination device including:
  • the second receiving module is configured to receive the AI capability information of the UE
  • the second processing module is configured to determine the AI model of at least one channel state information CSI used by the UE based on the AI capability information;
  • AI capability information includes at least one of the following:
  • AI capability indication information used to indicate whether the UE supports AI capabilities
  • AI level indication information is used to indicate the level of the AI capabilities supported by the UE
  • the identification information of the AI model is used to indicate the AI model supported by the UE;
  • the identification information of the AI platform is used to indicate the AI platform supported by the UE;
  • AI reasoning indication information used to indicate whether the UE supports AI reasoning capabilities
  • AI training indication information used to indicate whether the UE supports the training capability of the AI model.
  • the apparatus includes: a second sending module configured to send CSI reporting configuration information, where the CSI reporting configuration information is used to instruct the UE to use a supported AI model.
  • the second sending module is configured to, in response to determining that the UE supports one AI model, determine to send CSI reporting configuration information that does not include identification information of the AI model; wherein the CSI reporting configuration information is used to instruct the UE to use the AI model supported by the UE.
  • the second sending module is configured to send CSI reporting configuration information in response to determining that the UE supports multiple AI models, where the CSI reporting configuration information includes: identification information of at least one CSI reporting configuration and identification information of the corresponding AI model.
  • the second receiving module is configured to receive the AI model usage information sent by the UE;
  • the second processing module is configured to determine the AI model of the CSI used by the base station based on the AI model usage information.
  • the AI model usage information includes at least one of the following:
  • the identification information of the AI model;
  • the identification information of the CSI reporting configuration corresponding to the identification information of the AI model.
  • the second sending module is configured to send reporting request information; wherein the reporting request information is used to request the AI model used by the UE.
  • a communication device includes:
  • Memory used to store instructions executable by the processor
  • the processor is configured to implement the AI model determination method of any embodiment of the present disclosure when running executable instructions.
  • a computer storage medium stores a computer executable program.
  • When the executable program is executed by a processor, the AI model determination method of any embodiment of the present disclosure is implemented.
  • the UE sends the AI capability information of the UE, where the AI capability information is used by the base station to determine the AI model of the CSI used by the UE; where the AI capability information includes at least one of AI capability indication information, AI level indication information, identification information of the AI model, identification information of the AI platform, AI inference indication information and AI training indication information.
  • In this way, the base station can know which AI model the UE uses for CSI compression, which enables the base station to decompress the compressed CSI reported by the UE based on that AI model and ensures that the compressed CSI reported by the UE can be correctly decompressed.
  • In addition, since the AI model does not need to be exchanged between the base station and the UE, the signaling for exchanging the AI model can be reduced, and the power consumption of the UE and the base station can be reduced.
  • Figure 1 is a schematic structural diagram of a wireless communication system according to an exemplary embodiment.
  • Figure 2 is a schematic diagram of an AI model determination method according to an exemplary embodiment.
  • Figure 3 is a schematic diagram of an AI model determination method according to an exemplary embodiment.
  • Figure 4 is a schematic diagram of an AI model determination method according to an exemplary embodiment.
  • Figure 5 is a schematic diagram of an AI model determination method according to an exemplary embodiment.
  • Figure 6 is a schematic diagram of an AI model determination method according to an exemplary embodiment.
  • Figure 7 is a block diagram of an AI model determination device according to an exemplary embodiment.
  • Figure 8 is a block diagram of an AI model determination device according to an exemplary embodiment.
  • Figure 9 is a block diagram of a UE according to an exemplary embodiment.
  • Figure 10 is a block diagram of a base station according to an exemplary embodiment.
  • first, second, third, etc. may be used to describe various information in the embodiments of the present disclosure, the information should not be limited to these terms. These terms are only used to distinguish information of the same type from each other.
  • first information may also be called second information, and similarly, the second information may also be called first information.
  • the word "if" as used herein may be interpreted as "when", "upon" or "in response to determining".
  • FIG. 1 shows a schematic structural diagram of a wireless communication system provided by an embodiment of the present disclosure.
  • the wireless communication system is a communication system based on cellular mobile communication technology.
  • the wireless communication system may include several user equipments 110 and several base stations 120.
  • user equipment 110 may be a device that provides voice and/or data connectivity to a user.
  • the user equipment 110 can communicate with one or more core networks via a Radio Access Network (RAN).
  • RAN Radio Access Network
  • the user equipment 110 can be an Internet of Things user equipment, such as a sensor device, a mobile phone (or a "cellular" phone) or a computer with an IoT capability, which may be, for example, a fixed, portable, pocket-sized, handheld, computer-built-in or vehicle-mounted device.
  • the user equipment 110 may also be equipment of an unmanned aerial vehicle.
  • the user equipment 110 may also be a vehicle-mounted device, for example, it may be an on-board computer with a wireless communication function, or a wireless user equipment connected to an external on-board computer.
  • the user equipment 110 may also be a roadside device, for example, it may be a streetlight, a signal light or other roadside device with a wireless communication function.
  • the base station 120 may be a network-side device in a wireless communication system.
  • the wireless communication system can be the 4th generation mobile communication technology (the 4th generation mobile communication, 4G) system, also known as the Long Term Evolution (LTE) system; or the wireless communication system can also be a 5G system, Also called new air interface system or 5G NR system.
  • the wireless communication system may also be a next-generation system of the 5G system.
  • the access network in the 5G system can be called the New Generation-Radio Access Network (NG-RAN).
  • NG-RAN New Generation-Radio Access Network
  • the base station 120 may be an evolved base station (eNB) used in the 4G system.
  • the base station 120 may also be a base station (gNB) that adopts a centralized distributed architecture in the 5G system.
  • eNB evolved base station
  • gNB base station
  • When the base station 120 adopts a centralized-distributed architecture, it usually includes a centralized unit (central unit, CU) and at least two distributed units (distributed unit, DU).
  • the centralized unit is equipped with a protocol stack including the Packet Data Convergence Protocol (PDCP) layer, the Radio Link Control protocol (Radio Link Control, RLC) layer, and the Media Access Control (Medium Access Control, MAC) layer;
  • PDCP Packet Data Convergence Protocol
  • RLC Radio Link Control
  • MAC Media Access Control
  • the distributed unit is provided with a physical (Physical, PHY) layer protocol stack; the embodiment of the present disclosure does not limit the specific implementation of the base station 120.
  • a wireless connection may be established between the base station 120 and the user equipment 110 through a wireless air interface.
  • the wireless air interface is a wireless air interface based on the fourth generation mobile communication network technology (4G) standard; or, the wireless air interface is a wireless air interface based on the fifth generation mobile communication network technology (5G) standard, such as a new air interface; alternatively, the wireless air interface may also be a wireless air interface based on a next-generation mobile communication network technology standard beyond 5G.
  • an E2E (End to End, end-to-end) connection can also be established between user equipments 110 .
  • For example, scenarios such as vehicle-to-vehicle (V2V) communication, vehicle-to-infrastructure (V2I) communication and vehicle-to-pedestrian (V2P) communication in vehicle-to-everything (V2X) communication.
  • V2V vehicle-to-vehicle
  • V2I vehicle-to-roadside equipment
  • V2P vehicle-to-person communication in vehicle networking communication
  • V2X vehicle networking communication
  • the above user equipment can be considered as the terminal equipment of the following embodiments.
  • the above-mentioned wireless communication system may also include a network management device 130.
  • the network management device 130 may be a core network device in a wireless communication system.
  • the network management device 130 may be a Mobility Management Entity (MME) in an Evolved Packet Core (EPC) network.
  • the network management device can also be another core network device, such as a Serving Gateway (SGW), a Public Data Network Gateway (PGW), a Policy and Charging Rules Function (PCRF) or a Home Subscriber Server (HSS), etc.
  • serving gateway Serving GateWay, SGW
  • public data network gateway Public Data Network GateWay, PGW
  • PCRF Policy and Charging Rules Function
  • HSS Home Subscriber Server
  • the embodiments of the present disclosure enumerate multiple implementations to clearly describe the technical solutions of the embodiments of the present disclosure.
  • The multiple embodiments provided in the embodiments of the present disclosure can be executed alone or in combination with the methods of other embodiments of the present disclosure, and can also be executed together with some methods in other related technologies; the embodiments of the present disclosure do not limit this.
  • the complexity of AI soft implementation is much higher than the complexity of AI hard implementation.
  • the AI soft implementation can be some basic models built into the chip; some parameters are configured through the base station, and the UE side completes the model through C language.
  • AI hard implementation can be to directly build the AI model into the chip, and the AI model to be built into the chip can be solidified in the hardware through the Verilog language.
  • an embodiment of the present disclosure provides an AI model determination method, which is executed by the UE and includes:
  • Step S21 Send the AI capability information of the UE, where the AI capability information is used by the base station to determine the AI model of the CSI used by the UE;
  • AI capability information includes at least one of the following:
  • AI capability indication information used to indicate whether the UE supports AI capabilities
  • AI level indication information is used to indicate the level of the AI capabilities supported by the UE
  • the identification information of the AI model is used to indicate the AI model supported by the UE;
  • the identification information of the AI platform is used to indicate the AI platform supported by the UE;
  • AI reasoning indication information used to indicate whether the UE supports AI reasoning capabilities
  • AI training indication information used to indicate whether the UE supports the training capability of the AI model.
  • the UE may be various mobile terminals or fixed terminals.
  • the UE may be, but is not limited to, a mobile phone, a computer, a server, a wearable device, a vehicle-mounted terminal, a roadside unit (RSU, Road Side Unit), a game control platform or a multimedia device, etc.
  • RSU Road Side Unit
  • step S21 may be: sending the UE's AI capability information to the base station.
  • the base station may be various types of base stations.
  • the base station may be a 2G base station, a 3G base station, a 4G base station, a 5G base station or other evolved base stations.
  • step S21 may be: sending AI capability information to the network device.
  • the network device may be an access network device or a core network device.
  • the access network equipment can be various base stations.
  • the core network device may be various logical nodes or functions of the core network; for example, the core network device may be an Access and Mobility Management Function (AMF) or a Network Function (NF).
  • AMF Access and Mobility Management Function
  • NF Network Function
  • step S21 may be: reporting the AI capability information of the UE.
  • the UE reports the UE's AI capability information to the base station.
  • the AI capability indication information, AI level indication information, AI model identification information, AI platform identification information, AI inference indication information, and AI training instruction information may each be carried or indicated by at least one bit of the AI capability information.
  • The AI capability information includes but is not limited to at least one of the following:
  • AI capability indication information can be used to indicate whether the UE's chip supports AI capabilities
  • AI level indication information can be used to indicate the level of AI capabilities supported by the UE's chip.
  • When the AI capability indication information takes a first value, such as "0", it is used to indicate that the UE supports AI capability; when the AI capability indication information takes a second value, such as "1", it is used to indicate that the UE does not support AI capability.
  • the AI capability may refer to any kind of AI capability; for example, the AI capability may be, but is not limited to, CSI compression AI capability, or AI positioning capability, etc.
  • the level to which the AI capability belongs may be, but is not limited to, a level or series among x, y and z.
  • For example, the protocol stipulates that the UE's chip can support AI capability at one of three levels: x, y and z.
  • Alternatively, the base station and the UE negotiate which of the three levels x, y and z of AI capability the UE's chip supports.
  • The three levels x, y and z respectively correspond to different computing power capabilities and/or storage capabilities of the UE. In this way, based on the computing power capability and/or storage capability of the UE, it can be determined which of the three levels x, y and z the AI capability supported by the UE belongs to.
  • the AI level indication information can take different values to indicate the level to which the UE or the UE's chip supports the AI capability.
  • the AI model can be any kind of AI model.
  • the AI model can be, but is not limited to, a CNN model, an RNN model, a transformer model, etc.
  • the identification information of the AI model is used to uniquely identify the AI model.
  • the UE can report the identification information of the AI model to inform the base station of the AI model supported by the UE, thereby facilitating the base station to determine the AI model used by the UE.
  • the AI platform can be any AI platform.
  • the AI platform can be, but is not limited to, TensorFlow platform or Pytorch platform.
  • When the AI inference indication information takes a first value, such as "0", it is used to indicate that the UE supports AI inference capability; when the AI inference indication information takes a second value, such as "1", it is used to indicate that the UE does not support AI inference capability.
  • When the AI training indication information takes a first value, such as "0", it is used to indicate that the UE supports the training capability of the AI model; when the AI training indication information takes a second value, such as "1", it is used to indicate that the UE does not support the training capability of the AI model.
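  • A minimal sketch of the "first value 0 / second value 1" convention described above, assuming (hypothetically) that each indication is carried by a single bit; the field names are illustrative only.

```python
# Toy encoding of the indication bits: 0 = supported, 1 = not supported.
def encode_flag(supported: bool) -> int:
    return 0 if supported else 1

def decode_flag(bit: int) -> bool:
    return bit == 0

capability_bits = {
    "ai_capability": encode_flag(True),   # UE supports AI capability
    "ai_inference": encode_flag(True),    # UE supports AI inference
    "ai_training": encode_flag(False),    # UE does not support AI model training
}
print(capability_bits)   # {'ai_capability': 0, 'ai_inference': 0, 'ai_training': 1}
print(decode_flag(capability_bits["ai_training"]))  # False
```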
  • the AI capability information may be used to indicate the AI platform used by the AI model; and/or the AI capability information may be used to indicate that the UE or the UE's chip supports an AI platform corresponding to the AI capability level.
  • the AI capability information can also be used by the base station to determine the corresponding AI platform when the UE supports different AI models, and/or used by the base station to determine the corresponding AI platform when the UE or the chip of the UE supports different AI capability levels.
  • the AI capability information is used by the base station to determine the AI model for CSI compression used by the UE; and/or the AI capability information can also be used by the base station to determine the AI model for CSI decompression used.
  • the UE sends the AI capability information of the UE, where the AI capability information is used by the base station to determine the AI model of the CSI used by the UE; the AI capability information includes at least one of AI capability indication information, AI level indication information, identification information of the AI model, identification information of the AI platform, AI inference indication information and AI training indication information.
  • In this way, the base station can know which AI model the UE uses for CSI compression, which enables the base station to decompress the compressed CSI reported by the UE based on that AI model and ensures that the compressed CSI reported by the UE can be correctly decompressed.
  • In addition, since the AI model does not need to be exchanged between the base station and the UE, the signaling for exchanging the AI model can be reduced, and the power consumption of the UE and the base station can be reduced.
  • AI capability information is reported per UE or per feature.
  • reporting per UE means that no matter how many frequency bands the UE supports, the UE only needs to report the AI capability information once.
  • per-feature reporting means that the UE reports the AI capability information for each frequency band and each frequency band group.
  • AI capability information is optionally reported or conditionally mandatory.
  • Optional (non-mandatory) reporting of AI capability information means that, when reporting the UE capability, the UE is not required to report the AI capability information contained in the AI capability.
  • conditional mandatory reporting of AI capability information means that when the UE supports a specific AI capability, the UE needs to report the AI capability information corresponding to the AI capability supported by the UE.
  • For example, if the UE supports at least one of the following AI capabilities, the AI capability information of the corresponding AI capability is reported: the UE supports reporting of CSI compressed by the AI model, the UE supports AI inference capability, and the UE supports the training capability of the AI model.
  • In this case, the AI capability information of these AI capabilities is conditionally mandatorily reported.
  • this embodiment of the present disclosure provides an AI model determination method, which is executed by the UE, including:
  • Step S31 Receive CSI reporting configuration information; wherein, CSI reporting configuration information is determined by the base station based on AI capability information;
  • Step S32 Based on the CSI reporting configuration information, determine at least one AI model used by the UE.
  • the AI capability information may be the AI capability information in step S21; the AI model may be the AI model in step S21.
  • the AI capability information includes at least one of AI capability indication information, AI level indication information, AI model identification information, AI platform identification information, AI reasoning instruction information, and AI training instruction information.
  • receiving the CSI reporting configuration information in step S31 may be: receiving the CSI reporting configuration information sent by the base station.
  • the CSI reporting configuration can be any CSI reporting configuration.
  • the CSI reporting configuration may be a periodic CSI reporting configuration or an aperiodic CSI reporting configuration; for another example, the CSI reporting configuration may be a CSI reporting configuration based on a predetermined frequency band; for another example, the CSI reporting configuration may be a CSI reporting configuration based on a certain PUCCH resource; for another example, the CSI reporting configuration may be a CSI reporting configuration based on a certain beam; and so on.
  • the identification information of the CSI reporting configuration is used to uniquely identify the CSI reporting configuration. For example, when the identification information of the reported configuration is "00", it can be used to identify a specific CSI reporting configuration.
  • step S32 includes:
  • Embodiments of the present disclosure provide an AI model determination method, which is executed by a UE, including:
  • based on the CSI reporting configuration information including identification information of the AI model, determining that the UE uses the AI model indicated by the identification information of the AI model.
  • For example, the UE supports one AI model, and the UE receives the CSI reporting configuration information sent by the base station; if the UE determines that the CSI reporting configuration information does not include identification information of an AI model, it determines that the UE uses the AI model supported by the UE.
  • For another example, the UE supports one or more AI models, and the UE receives the CSI reporting configuration information sent by the base station; if the UE determines that the CSI reporting configuration information includes identification information of an AI model, it determines that the UE uses the AI model indicated by that identification information.
  • For another example, the UE supports multiple AI models, and the UE receives the CSI reporting configuration information sent by the base station. The CSI reporting configuration information includes: the identification information "00" of a CSI reporting configuration and the identification information of a first AI model, and the identification information "01" of another CSI reporting configuration and the identification information of a second AI model. If the UE determines that the CSI reporting configuration indicated by the identification information "00" is currently used, it determines that the AI model used by the UE is the first AI model; or, if the UE determines that the CSI reporting configuration indicated by the identification information "01" is currently used, it determines that the AI model used by the UE is the second AI model.
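  • The following Python sketch is offered only as an illustration of the example above: it shows how a UE could resolve which AI model to use from the CSI reporting configuration information. The mapping keys "00"/"01" mirror the example; all other names are hypothetical.

```python
# Hypothetical UE-side resolution of the AI model from CSI reporting configuration info.
from typing import Dict, List, Optional

def resolve_ue_model(csi_report_cfg_to_model: Dict[str, Optional[str]],
                     active_report_cfg_id: str,
                     ue_supported_models: List[str]) -> str:
    """Return the model ID the UE should use for the active CSI reporting configuration."""
    model_id = csi_report_cfg_to_model.get(active_report_cfg_id)
    if model_id is None:
        # Configuration carries no AI model ID: fall back to the (single) model the UE supports.
        return ue_supported_models[0]
    return model_id

# Mirrors the "00" -> first model, "01" -> second model example above.
cfg_map = {"00": "first-model", "01": "second-model"}
print(resolve_ue_model(cfg_map, "01", ["first-model"]))       # -> second-model
print(resolve_ue_model({"00": None}, "00", ["first-model"]))  # -> first-model
```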
  • In this way, by receiving the CSI reporting configuration information sent by the base station, the UE can accurately determine that the CSI compression AI model it uses corresponds to the base station's CSI decompression AI model; this ensures that the UE and the base station use corresponding AI models without exchanging the specific AI parameter information of the AI model between the UE and the base station.
  • the AI models between the base station and the UE may be corresponding.
  • the UE uses the AI compression model to compress the CSI and sends the compressed CSI to the base station;
  • the base station uses the AI decompression model corresponding to the UE to decompress the compressed CSI.
  • this embodiment of the present disclosure provides an AI model determination method, which is executed by the UE and includes:
  • Step S41 Send AI model usage information; where the AI model usage information is used to determine the AI model used by the UE.
  • Embodiments of the present disclosure provide an AI model determination method, executed by a UE, including: after sending AI capability information, sending AI model usage information.
  • the AI model usage information is used to indicate the AI model used by the UE.
  • the AI model usage information can be used by the base station to determine the AI model used by the UE; and/or the AI model usage information can be used by the base station to determine the AI model used by the base station.
  • the AI model usage information can be used by the UE to determine the AI model used by the UE.
  • the AI model usage information includes at least one of the following:
  • the identification information of the AI model;
  • the identification information of the CSI reporting configuration corresponding to the identification information of the AI model.
  • the UE sends AI model usage information to the base station, where the AI model usage information includes identification information of the AI model, and the AI model usage information is used to indicate the AI model used by the UE.
  • the base station can be directly informed of the AI model used by the UE.
  • For another example, the UE sends AI model usage information to the base station, where the AI model usage information includes: identification information of a CSI reporting configuration. After receiving the identification information of the CSI reporting configuration, the base station can determine the identification information of the AI model used by the UE based on the received identification information of the CSI reporting configuration and the stored correspondence between identification information of AI models and identification information of CSI reporting configurations. In this way, the AI model used by the UE can also be determined through the received identification information of the CSI reporting configuration.
  • In this way, the AI model usage information can be reported by the UE. If the AI model usage information carries the identification information of the AI model, the base station can be directly informed of the AI model used by the UE; and/or, if the AI model usage information carries the identification information of the CSI reporting configuration, the base station can determine the identification information of the AI model based on the identification information of the CSI reporting configuration, thereby also enabling the base station to determine the AI model used by the UE. In this way, the base station can be informed of the AI model used by the UE in various ways, so that the method can be applied to more application scenarios.
  • Embodiments of the present disclosure provide an AI model determination method, which is executed by a UE, including: determining the AI model used by the UE.
  • Embodiments of the present disclosure provide an AI model determination method, which is executed by a UE, including: determining the AI model used by the UE based on the scenario in which the UE is located.
  • Embodiments of the present disclosure provide an AI model determination method, which is executed by a UE and includes at least one of the following:
  • If the UE supports one AI model, it is determined that the AI model used by the UE is the AI model supported by the UE; or, if the UE supports multiple AI models, it is determined that the AI model used by the UE is any one of the multiple AI models; and so on.
  • In some embodiments, determining the AI model used by the UE includes:
  • based on the UE being in a first speed scenario and/or an urban scenario, determining that the UE uses a first-level AI model; or,
  • based on the UE being in a second speed scenario and/or a rural scenario, determining that the UE uses a second-level AI model; wherein the first speed is less than or equal to the second speed.
  • Embodiments of the present disclosure provide an AI model determination method, executed by a UE, including: determining that the UE uses a first-level AI model based on the UE being in a first speed scenario and/or an urban scenario; or, based on the UE being in a second speed scenario and/or in rural scenarios, determine that the UE uses the second level AI model; wherein the first speed is less than or equal to the second speed.
  • For example, the UE's AI capability can be divided into a first level and a second level; or, the UE's AI capability can be divided into the 1st to Nth levels, where N is an integer greater than 1; or, the UE's AI capability can be classified into levels x, y or z.
  • different levels correspond to different computing power capabilities and/or storage capabilities of the UE.
  • the AI capabilities of the UE are divided into first level or second level.
  • the computing power capability and/or storage capability corresponding to the first level is smaller than the computing power capability and/or storage capability corresponding to the second level.
  • the UE can select an appropriate AI model to perform CSI compression based on the scenario in which the UE is located.
  • the base station can be caused to perform CSI decompression based on the corresponding AI model.
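  • As an illustrative sketch of the scenario-based selection described above (the speed threshold, environment labels and level names are assumptions, not values from the disclosure):

```python
# Hypothetical scenario-based model-level selection on the UE side.
def select_model_level(speed_kmh: float, environment: str,
                       speed_threshold_kmh: float = 60.0) -> str:
    """Pick an AI model level from the UE's scenario (speed and environment)."""
    if speed_kmh <= speed_threshold_kmh or environment == "urban":
        return "level-1"   # smaller compute/storage footprint
    return "level-2"       # larger model for high-speed / rural channels

print(select_model_level(30.0, "urban"))   # -> level-1
print(select_model_level(120.0, "rural"))  # -> level-2
```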
  • sending AI model usage information in step S41 includes:
  • based on receiving reporting request information, the AI model usage information is sent; wherein the reporting request information is used to request the AI model used by the UE.
  • Embodiments of the present disclosure provide an AI model determination method, which is executed by a UE, including:
  • based on receiving reporting request information, the AI model usage information is sent; wherein the reporting request information is used to request the AI model used by the UE.
  • the UE receives the reporting request information sent by the base station, and sends the identification information of the AI model and/or the identification information of the CSI reporting configuration corresponding to the AI model identification information; the reporting request information is used to request the AI model used by the UE.
  • the UE can be triggered by the base station, that is, the reporting request information sent by the base station, to report the AI model usage information; this allows the base station to accurately determine the AI model used by the UE.
  • The following AI model determination method is performed by the base station and is similar to the above description of the AI model determination method performed by the UE; for technical details not disclosed in the embodiments of the AI model determination method performed by the base station, please refer to the description of the embodiments of the AI model determination method performed by the UE, which will not be repeated here.
  • this embodiment of the present disclosure provides an AI model determination method, which is executed by a base station and includes:
  • Step S51 Receive the AI capability information of the UE
  • Step S52 Based on the AI capability information, determine the AI model of at least one CSI used by the UE;
  • AI capability information includes at least one of the following:
  • AI capability indication information used to indicate whether the UE supports AI capabilities
  • AI level indication information is used to indicate the level of the AI capabilities supported by the UE
  • the identification information of the AI model is used to indicate the AI model supported by the UE;
  • the identification information of the AI platform is used to indicate the AI platform supported by the UE;
  • AI reasoning indication information used to indicate whether the UE supports AI reasoning capabilities
  • AI training indication information used to indicate whether the UE supports the training capability of the AI model.
  • the AI model determination method involved in the embodiments of the present disclosure can also be executed by a network device; the network device can be a core network device.
  • the AI capability information may be the AI capability information in step S21; the AI model may be the AI model in the above embodiments.
  • step S51 may be: receiving the AI capability information of the UE reported by the UE.
  • AI capability information includes but is not limited to at least one of the following:
  • AI capability indication information can be used to indicate whether the UE's chip supports AI capabilities
  • AI level indication information can be used to indicate the level of AI capabilities supported by the UE's chip.
  • step S52 may be: based on at least one of AI capability indication information, AI level indication information, AI model identification information, AI platform identification information, AI inference indication information, and AI training indication information, Determine at least one AI model of CSI used by the UE.
  • For example, the base station receives the AI capability information sent by the UE, where the AI capability information includes AI capability indication information and identification information of the AI model; if the base station determines that the AI capability indication information indicates that the UE supports AI capability, it can determine the AI model supported by the UE based on the identification information of the AI model; the base station selects one or more AI models from the AI models supported by the UE as the AI model used by the UE.
  • For another example, the base station receives the AI capability information sent by the UE, where the AI capability information includes AI capability indication information and AI level indication information; if the base station determines that the AI capability indication information indicates that the UE supports AI capability, it can determine the level of the AI capability supported by the UE based on the AI level indication information; the base station determines that the AI model used by the UE is one or more AI models included in that level of AI capability.
  • For another example, the base station receives the AI capability information sent by the UE, where the AI capability information includes the identification information of the AI platform; the base station can determine the AI platform supported by the UE based on the identification information of the AI platform, and determine that the AI model used by the UE is one or more AI models corresponding to the AI platform supported by the UE.
  • the AI model suitable for use by the UE can be determined through the AI capability information of the UE sent by the UE.
  • the AI model used by the UE can be determined in a variety of ways, so that it can be adapted to more application scenarios.
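  • A hypothetical sketch of the base-station-side selection logic implied by the examples above, using plain dictionaries; the model IDs, level names and platform names are illustrative only.

```python
# Hypothetical base-station choice of AI model(s) from reported capability information.
from typing import Dict, List

GNB_MODELS_BY_LEVEL: Dict[str, List[str]] = {"x": ["m-small"], "y": ["m-small", "m-medium"]}
GNB_MODELS_BY_PLATFORM: Dict[str, List[str]] = {"tensorflow": ["m-tf"], "pytorch": ["m-pt"]}

def select_models_for_ue(capability: Dict) -> List[str]:
    if not capability.get("supports_ai", False):
        return []                                   # no AI capability: no AI model configured
    if capability.get("supported_model_ids"):       # UE listed concrete models
        return capability["supported_model_ids"][:1]
    if capability.get("ai_level"):                  # fall back to the level's model set
        return GNB_MODELS_BY_LEVEL.get(capability["ai_level"], [])
    if capability.get("platform_ids"):              # or to models built for the UE's AI platform
        return GNB_MODELS_BY_PLATFORM.get(capability["platform_ids"][0], [])
    return []

print(select_models_for_ue({"supports_ai": True, "ai_level": "y"}))   # -> ['m-small', 'm-medium']
```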
  • this embodiment of the present disclosure provides an AI model determination method, which is executed by a base station and includes:
  • Step S61 Send CSI reporting configuration information, where the CSI reporting configuration information is used to instruct the UE to use a supported AI model.
  • the CSI reporting configuration information may be the CSI reporting configuration information in step S31; the identification information of the CSI reporting configuration information may be the identification information of the CSI reporting configuration information in the above embodiments.
  • the CSI reporting configuration information is determined based on the UE's AI capability information.
  • sending CSI reporting configuration information in step S61 includes:
  • in response to determining that the UE supports one AI model, it is determined to send CSI reporting configuration information that does not include identification information of the AI model; wherein the CSI reporting configuration information is used to instruct the UE to use the AI model supported by the UE.
  • Embodiments of the present disclosure provide an AI model determination method, executed by a base station, including: in response to determining that the UE supports one AI model, determining to send CSI reporting configuration information that does not include identification information of the AI model; wherein the CSI reporting configuration information is used to instruct the UE to use the AI model supported by the UE.
  • the base station may also carry the identification information of the AI model in the CSI report configuration information.
  • the base station determines that the UE supports multiple AI models, it selects an AI model from the multiple AI models, and carries the identification information of the AI model in the CSI reporting configuration information and sends it to the UE.
  • The base station sends the CSI reporting configuration information to the UE to instruct the UE that the reported CSI can be compressed using the AI model that the base station instructs the UE to use.
  • sending CSI reporting configuration information in step S61 includes:
  • the CSI reporting configuration information is sent, where the CSI reporting configuration information includes: identification information of at least one CSI reporting configuration and identification information of the corresponding AI model.
  • Embodiments of the present disclosure provide an AI model determination method, which is executed by a base station and includes:
  • the CSI reporting configuration information is sent, where the CSI reporting configuration information includes: identification information of at least one CSI reporting configuration and identification information of the corresponding AI model.
  • The identification information of the same CSI reporting configuration corresponds to the identification information of one AI model.
  • In this way, the base station can configure multiple AI models for the UE and configure different AI models for different CSI reporting configurations, so that the UE can compress the CSI of different CSI reporting configurations using appropriate AI models.
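  • The following sketch illustrates, under the same assumptions, how a base station could build the CSI reporting configuration information: omitting the model ID when the UE supports a single model, and pairing each CSI reporting configuration with a model ID when the UE supports several. All identifiers are hypothetical.

```python
# Hypothetical construction of CSI reporting configuration information at the base station.
from typing import Dict, List, Optional

def build_csi_report_cfg(report_cfg_ids: List[str],
                         ue_models: List[str]) -> Dict[str, Optional[str]]:
    if len(ue_models) <= 1:
        # Single supported model: omit model IDs, the UE falls back to its own model.
        return {cfg_id: None for cfg_id in report_cfg_ids}
    # Multiple models: pair each CSI reporting configuration with one model ID.
    return {cfg_id: ue_models[i % len(ue_models)]
            for i, cfg_id in enumerate(report_cfg_ids)}

print(build_csi_report_cfg(["00", "01"], ["first-model", "second-model"]))
print(build_csi_report_cfg(["00"], ["only-model"]))
```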
  • Embodiments of the present disclosure provide an AI model determination method, which is executed by a base station and includes: receiving AI model usage information sent by the UE;
  • based on the AI model usage information, the AI model of the CSI used by the base station is determined.
  • the AI model usage information may be the AI model usage information in step S41.
  • AI model usage information includes at least one of the following:
  • the identification information of the AI model;
  • the identification information of the CSI reporting configuration corresponding to the identification information of the AI model.
  • Embodiments of the present disclosure provide an AI model determination method, which is executed by a base station and includes: sending reporting request information; wherein the reporting request information is used to request the AI model used by the UE.
  • sending the reporting request information includes: sending the reporting request information to the UE.
  • the reporting request information is used to trigger the UE to report AI model usage information.
  • Embodiments of the present disclosure provide an AI model determination method, which is executed by a communication device.
  • the communication device includes a UE and a base station; the AI model determination method includes at least one of the following:
  • Step S71 The UE reports the AI capability information of the UE; wherein the AI capability information includes at least one of the following:
  • AI capability indication information used to indicate whether the UE supports AI capabilities
  • AI level indication information is used to indicate the level of the AI capabilities supported by the UE
  • the identification information of the AI model is used to indicate the AI model supported by the UE;
  • the identification information of the AI platform is used to indicate the AI platform supported by the UE;
  • AI reasoning indication information used to indicate whether the UE supports AI reasoning capabilities
  • AI training indication information is used to indicate whether the UE supports the training capability of the AI model.
  • AI capability information is reported per UE or per feature.
  • AI capability information is optionally reported or conditionally mandatory. For example, if the UE supports at least one of the following capabilities, the UE reports the AI capability information of at least one of the following capabilities: the UE supports the reporting of CSI compressed by the AI model, the UE supports the AI reasoning capability, and supports the AI model training capability.
  • Step S72 The base station receives the AI capability information and configures the AI model used by the UE; wherein step S72 includes steps S72a and S72b;
  • Step S72a The base station determines one or more AI models supported by the UE based on the AI capability information of the UE;
  • Step S72b Determine the AI model used by the UE based on one or more AI models supported by the UE; and send CSI reporting configuration information;
  • For example, if it is determined that the UE supports one AI model, the base station sends CSI reporting configuration information that does not include identification information of the AI model.
  • For another example, if it is determined that the UE supports multiple AI models, the base station sends CSI reporting configuration information; the CSI reporting configuration information includes: identification information of at least one CSI reporting configuration and identification information of the corresponding AI model.
  • the base station may indicate the AI model used by the UE according to the current channel state.
  • the UE can compress the CSI according to the AI model configured by the base station, and the base station can use the corresponding AI model to decompress the compressed CSI.
  • step S72 can also be replaced by step S73.
  • Step S73 The UE reports AI model usage information, where the AI model usage information is used to determine the AI model used by the UE.
  • the UE determines the AI model used by the UE based on the scenario in which the UE is located.
  • For example, the UE determines that the UE uses an x-level AI model based on the fact that the UE is in a first speed scenario and/or an urban scenario; or, based on the fact that the UE is in a second speed scenario and/or a rural scenario, determines that the UE uses a y-level AI model; where the first speed is less than or equal to the second speed.
  • the computing power capability and/or storage capacity corresponding to level x is smaller than the computing power capability and/or storage capacity corresponding to level y.
  • the AI model usage information sent by the UE includes at least one of the following: identification information of the AI model, and identification information of the CSI reporting configuration information corresponding to the identification information of the AI model.
  • The AI model usage information can be actively reported by the UE, or reported by the UE based on triggering by the base station.
  • Reporting triggered by the base station may be: the UE receives the reporting request information sent by the base station and then reports the AI model usage information.
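  • Putting steps S71 to S73 together, the following hypothetical end-to-end sketch shows one UE / base-station exchange; message contents and function names are illustrative, not taken from any specification.

```python
# Hypothetical end-to-end sketch of the exchange summarized in steps S71-S73.
def ue_report_capability():
    # Step S71: UE reports its AI capability information.
    return {"supports_ai": True, "model_ids": ["first-model", "second-model"]}

def gnb_configure(capability):
    # Step S72: base station picks models and returns CSI reporting configuration info.
    if len(capability["model_ids"]) > 1:
        return {"00": capability["model_ids"][0], "01": capability["model_ids"][1]}
    return {"00": None}   # single model: no model ID needed

def ue_report_model_usage(active_cfg_id, cfg):
    # Step S73 alternative: UE reports which model it actually uses.
    return {"csi_report_cfg_id": active_cfg_id, "model_id": cfg[active_cfg_id]}

cap = ue_report_capability()              # step S71
cfg = gnb_configure(cap)                  # step S72
print(ue_report_model_usage("01", cfg))   # step S73 (optional, on request)
```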
  • an AI model determination device including:
  • the first sending module 51 is configured to send the AI capability information of the UE, where the AI capability information is used by the base station to determine the AI model of the CSI used by the UE; where the AI capability information includes at least one of the following:
  • AI capability indication information, used to indicate whether the UE supports AI capability;
  • AI level indication information, used to indicate the level of the AI capability supported by the UE;
  • identification information of the AI model, used to indicate the AI model supported by the UE;
  • identification information of the AI platform, used to indicate the AI platform supported by the UE;
  • AI reasoning indication information, used to indicate whether the UE supports AI reasoning capability;
  • AI training indication information, used to indicate whether the UE supports the training capability of the AI model.
  • The AI model determination apparatus provided by the embodiments of the present disclosure can be applied in a UE.
  • AI capability information is reported per UE.
  • AI capability information is optionally reported or conditionally mandatory.
  • An embodiment of the present disclosure provides an AI model determination device, including:
  • the first receiving module is configured to receive CSI reporting configuration information; wherein the CSI reporting configuration information is determined by the base station based on the AI capability information;
  • the first processing module is configured to determine at least one AI model used by the UE based on the CSI reporting configuration information.
  • Embodiments of the present disclosure provide an AI model determination device, including: a first processing module configured to determine, based on the CSI reporting configuration information not including identification information of an AI model, that the UE uses the one AI model supported by the UE.
  • Embodiments of the present disclosure provide an AI model determination device, including: a first processing module configured to determine, based on the CSI reporting configuration information including the identification information of at least one CSI reporting configuration and the identification information of the corresponding AI model, that different CSI reporting configurations use the corresponding AI models. (A sketch of this resolution logic follows this item.)
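A minimal sketch of how such a first processing module could resolve the two cases above, reusing the hypothetical CsiReportConfig structure sketched earlier; the function name and types are illustrative assumptions.

```python
# Minimal sketch only: UE-side resolution of the AI model(s) to use from the received
# CSI reporting configuration information (CsiReportConfig is the hypothetical structure above).
from typing import Dict, List

def resolve_models(config: "CsiReportConfig", ue_supported_model_ids: List[str]) -> Dict[str, str]:
    if not config.model_id_by_report_config:
        # Case 1: no AI model identification information -> use the one model the UE supports.
        only_model = ue_supported_model_ids[0]
        return {cfg_id: only_model for cfg_id in config.report_config_ids}
    # Case 2: each CSI reporting configuration uses the AI model whose ID is paired with it.
    return dict(config.model_id_by_report_config)
```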
  • Embodiments of the present disclosure provide an AI model determination device, including: a first sending module 51 configured to send AI model usage information; where the AI model usage information is used to determine the AI model used by the UE.
  • the AI model usage information includes at least one of the following:
  • identification information of the AI model;
  • identification information of the CSI reporting configuration corresponding to the identification information of the AI model.
  • Embodiments of the present disclosure provide an AI model determination device, including: a first sending module 51 configured to send AI model usage information based on receiving reporting request information from a network device; wherein the reporting request information is used to request the AI model used by the UE.
  • an AI model determination device including:
  • the second receiving module 61 is configured to receive the AI capability information of the UE
  • the second processing module 62 is configured to determine the AI model of at least one channel state information CSI used by the UE based on the AI capability information;
  • AI capability information includes at least one of the following:
  • AI capability indication information, used to indicate whether the UE supports AI capability;
  • AI level indication information, used to indicate the level of the AI capability supported by the UE;
  • identification information of the AI model, used to indicate the AI model supported by the UE;
  • identification information of the AI platform, used to indicate the AI platform supported by the UE;
  • AI reasoning indication information, used to indicate whether the UE supports AI reasoning capability;
  • AI training indication information, used to indicate whether the UE supports the training capability of the AI model.
  • the AI model determination apparatus provided by the embodiments of the present disclosure can be applied in base stations.
  • Embodiments of the present disclosure provide an AI model determination device, including: a second sending module configured to send CSI reporting configuration information, where the CSI reporting configuration information is used to instruct the UE to use a supported AI model.
  • Embodiments of the present disclosure provide an AI model determination device, including: a second sending module configured to, in response to determining that the UE supports one AI model, send CSI reporting configuration information that does not include identification information of the AI model; wherein the CSI reporting configuration information is used to instruct the UE to use the one AI model supported by the UE.
  • Embodiments of the present disclosure provide an AI model determination device, including: a second sending module configured to send CSI reporting configuration information in response to determining that the UE supports multiple AI models, wherein the CSI reporting configuration information includes: the identification information of at least one CSI reporting configuration and the identification information of the corresponding AI model.
  • An embodiment of the present disclosure provides an AI model determination device, including:
  • the second receiving module 61 is configured to receive the AI model usage information sent by the UE;
  • the second processing module 62 is configured to determine the AI model of the CSI used by the base station based on the AI model usage information.
  • the AI model usage information includes at least one of the following:
  • identification information of the AI model;
  • identification information of the CSI reporting configuration corresponding to the identification information of the AI model.
  • Embodiments of the present disclosure provide an AI model determination device, including: a second sending module configured to send reporting request information; wherein the reporting request information is used to request the AI model used by the UE.
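A minimal sketch of the base-station side of this exchange: after sending reporting request information, the base station maps the reported usage information back to the CSI decompression model it should use. AiModelUsageInfo is the hypothetical structure sketched earlier, and the lookup table is an assumption, not something defined by this disclosure.

```python
# Minimal sketch only: base-station handling of reported AI model usage information.
from typing import Dict, Optional

def model_for_decompression(usage: "AiModelUsageInfo",
                            model_id_by_report_config: Dict[str, str]) -> Optional[str]:
    if usage.model_id is not None:
        # The UE reported the AI model identification information directly.
        return usage.model_id
    # Otherwise only a CSI reporting configuration ID was reported: look up the model ID
    # paired with it in the mapping the base station configured earlier.
    return model_id_by_report_config.get(usage.report_config_id)
```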
  • An embodiment of the present disclosure provides a communication device, including:
  • a processor;
  • a memory used to store instructions executable by the processor;
  • the processor is configured to implement the AI model determination method of any embodiment of the present disclosure when running executable instructions.
  • the communication device may include but is not limited to at least one of: a UE and a base station.
  • the memory may include various types of storage media, which are non-transitory computer storage media that can retain the information stored thereon after the user equipment is powered off.
  • the processor may be connected to the memory through a bus or the like, and is used to read the executable program stored in the memory, for example, to implement at least one of the methods shown in Figures 2 to 6.
  • Embodiments of the present disclosure also provide a computer storage medium.
  • the computer storage medium stores a computer executable program.
  • when the executable program is executed by a processor, the AI model determination method of any embodiment of the present disclosure is implemented, for example, at least one of the methods shown in Figures 2 to 6.
  • Figure 9 is a block diagram of a user equipment 800 according to an exemplary embodiment.
  • the user device 800 may be a mobile phone, a computer, a digital broadcast user device, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, or the like.
  • the user device 800 may include one or more of the following components: a processing component 802 , a memory 804 , a power supply component 806 , a multimedia component 808 , an audio component 810 , an input/output (I/O) interface 812 , and a sensor component 814 , and communication component 816.
  • Processing component 802 generally controls the overall operations of user device 800, such as operations associated with display, phone calls, data communications, camera operations, and recording operations.
  • the processing component 802 may include one or more processors 820 to execute instructions to complete all or part of the steps of the above method.
  • processing component 802 may include one or more modules that facilitate interaction between processing component 802 and other components.
  • processing component 802 may include a multimedia module to facilitate interaction between multimedia component 808 and processing component 802.
  • Memory 804 is configured to store various types of data to support operations at user device 800 . Examples of such data include instructions for any application or method operating on user device 800, contact data, phonebook data, messages, pictures, videos, etc.
  • Memory 804 may be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
  • Power supply component 806 provides power to various components of user equipment 800.
  • Power supply components 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power to user device 800 .
  • Multimedia component 808 includes a screen that provides an output interface between the user device 800 and the user.
  • the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user.
  • the touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide action.
  • multimedia component 808 includes a front-facing camera and/or a rear-facing camera.
  • the front camera and/or the rear camera may receive external multimedia data.
  • Each front-facing camera and rear-facing camera can be a fixed optical lens system or have focus and optical zoom capability.
  • Audio component 810 is configured to output and/or input audio signals.
  • audio component 810 includes a microphone (MIC) configured to receive external audio signals when user device 800 is in operating modes, such as call mode, recording mode, and speech recognition mode. The received audio signal may be further stored in memory 804 or sent via communication component 816 .
  • audio component 810 also includes a speaker for outputting audio signals.
  • the I/O interface 812 provides an interface between the processing component 802 and a peripheral interface module, which may be a keyboard, a click wheel, a button, etc. These buttons may include, but are not limited to: Home button, Volume buttons, Start button, and Lock button.
  • Sensor component 814 includes one or more sensors that provide various aspects of status assessment for user device 800 .
  • For example, the sensor component 814 can detect the open/closed state of the user device 800 and the relative positioning of components, such as the display and keypad of the user device 800. The sensor component 814 can also detect a position change of the user device 800 or of a component of the user device 800, the presence or absence of user contact with the user device 800, the orientation or acceleration/deceleration of the user device 800, and temperature changes of the user device 800.
  • Sensor assembly 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact.
  • Sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications.
  • the sensor component 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
  • Communication component 816 is configured to facilitate wired or wireless communication between user device 800 and other devices.
  • User equipment 800 may access a wireless network based on a communication standard, such as WiFi, 4G or 5G, or a combination thereof.
  • the communication component 816 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel.
  • the communications component 816 also includes a near field communications (NFC) module to facilitate short-range communications.
  • In exemplary embodiments, the user equipment 800 may be implemented by one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components to perform the above method.
  • A non-transitory computer-readable storage medium including instructions is also provided, such as the memory 804 including instructions, which can be executed by the processor 820 of the user device 800 to complete the above method.
  • the non-transitory computer-readable storage medium may be ROM, random access memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.
  • an embodiment of the present disclosure shows the structure of a base station.
  • the base station 900 may be provided as a network side device.
  • base station 900 includes a processing component 922, which further includes one or more processors, and memory resources represented by memory 932 for storing instructions, such as application programs, executable by processing component 922.
  • the application program stored in memory 932 may include one or more modules, each corresponding to a set of instructions.
  • the processing component 922 is configured to execute instructions to perform any of the foregoing methods applied to the base station.
  • Base station 900 may also include a power supply component 926 configured to perform power management of base station 900, a wired or wireless network interface 950 configured to connect base station 900 to a network, and an input/output (I/O) interface 958.
  • Base station 900 may operate based on an operating system stored in memory 932, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Security & Cryptography (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

Embodiments of the present disclosure provide an AI model determination method and apparatus, a communication device, and a storage medium. The AI model determination method, performed by a UE, includes: sending AI capability information of the UE, where the AI capability information is used by a base station to determine the AI model of the CSI used by the UE; the AI capability information includes at least one of AI capability indication information, AI level indication information, identification information of an AI model, identification information of an AI platform, AI reasoning indication information, and AI training indication information.

Description

AI模型确定方法、装置、通信设备及存储介质 技术领域
本公开涉及但不限于无线通信技术领域,尤其涉及一种AI模型确定方法、装置、通信设备及存储介质。
背景技术
目前,可以利用人工智能(Artificial Intelligence,AI)技术提高空口的性能,例如可利用AI技术降低信道状态信息(Channel State Information,CSI)反馈开销以及提高信道估计的准确性。一种主流的方式是采用‘two-sided’AI结构:基于AI的CSI压缩编码器在用户设备(User Equipment,UE)侧,UE根据AI的CSI压缩编码器将测量的CSI压缩并将压缩的后CSI信息发送给基站;基于AI的CSI解压缩编码器在基站侧,基站利用与AI的CSI压缩编码器相应的CSI解压缩编码器将UE发送的压缩后的CSI信息解压还原出来。压缩和解压缩是AI模型的两个部分,是成对出现的,因此需要保证UE和基站所使用的AI模型是由共识的。
然而,对于CSI压缩这种基于UE侧和基站侧才能实现的AI模型,如果AI模型由基站下发给UE,或者AI模型由UE上报给基站;则需要在UE与基站之间交互AI模型,导致信令增加及功耗增加。
发明内容
本公开实施例一种AI模型确定方法、装置、通信设备及存储介质。
根据本公开的第一方面,提供一种AI模型确定方法,由UE执行,包括:
发送UE的AI能力信息,其中,AI能力信息用于基站确定UE使用的CSI的AI模型;其中,AI能力信息包括以下至少之一:
AI能力指示信息,用于指示UE是否支持AI能力;
AI等级指示信息,用于指示UE支持的AI能力所属的等级;
AI模型的标识信息,用于指示UE支持的AI模型;
AI平台的标识信息,用于指示UE支持的AI平台;
AI推理指示信息,用于指示UE是否支持AI的推理能力;
以及AI训练指示信息,用于指示UE是否支持AI模型的训练能力。
在一些实施例中,AI能力信息是每UE上报的。
在一些实施例中,AI能力信息是非强制性上报的或者是有条件的强制性上报。
在一些实施例中,方法包括:
接收CSI上报配置信息;其中,CSI上报配置信息是基站基于AI能力信息确定的;
基于CSI上报配置信息,确定UE使用的至少一个AI模型。
在一些实施例中,基于CSI上报配置信息,确定UE使用的至少一个AI模型,包括:
基于CSI上报配置信息中未包括AI模型的标识信息,确定UE使用UE支持的一个AI模型;或者,
基于CSI上报配置信息中包括至少一个CSI上报配置的标识信息与对应的AI模型的标识信息,确定不同的CSI上报配置使用对应的AI模型。
在一些实施例中,方法包括:发送AI模型使用信息;其中,AI模型使用信息用于确定UE使用的AI模型。
在一些实施例中,AI模型使用信息,包括以下至少之一:
AI模型的标识信息;
与AI模型的标识信息对应的CSI上报配置的标识信息。
在一些实施例中,发送AI模型使用信息,包括:
基于接收到来自网络设备的上报请求信息,发送AI模型使用信息;其中,上报请求信息用于请求UE使用的AI模型。
根据本公开的第二方面,提供一种AI模型确定方法,由基站执行,包括:
接收UE的AI能力信息;
基于AI能力信息,确定UE使用的至少一种信道状态信息CSI的AI模型;
其中,AI能力信息包括以下至少之一:
AI能力指示信息,用于指示UE是否支持AI能力;
AI等级指示信息,用于指示UE支持的AI能力所属的等级;
AI模型的标识信息,用于指示UE支持的AI模型;
AI平台的标识信息,用于指示UE支持的AI平台;
AI推理指示信息,用于指示UE是否支持AI的推理能力;
以及AI训练指示信息,用于指示UE是否支持AI模型的训练能力。
在一些实施例中,方法包括:发送CSI上报配置信息,其中,CSI上报配置信息用于指示UE使用支持的一个AI模型。
在一些实施例中,发送CSI上报配置信息,包括:
响应于确定UE支持一个AI模型,确定发送未包括AI模型的标识信息的CSI上报配置信息;其中,CSI上报配置信息用于指示UE使用UE支持的一个AI模型。
在一些实施例中,发送CSI上报配置信息,包括:
响应于确定UE支持多种AI模型,发送CSI上报配置信息,其中,CSI上报配置信息包括:至少一个CSI上报配置的标识信息与对应的AI模型的标识信息。
在一些实施例中,方法包括:
接收UE发送的AI模型使用信息;
基于AI模型使用信息,确定基站使用的CSI的AI模型。
在一些实施例中,AI模型使用信息,包括以下至少之一:
AI模型的标识信息;
与AI模型的标识信息对应的CSI上报配置的标识信息。
在一些实施例中,方法包括:发送上报请求信息;其中,上报请求信息用于请求UE使用的AI模型。
根据本公开的第三方面,提供一种AI模型确定装置,包括:
第一发送模块,被配置为发送UE的AI能力信息,其中,AI能力信息用于基站确定UE使用的CSI的AI模型;其中,AI能力信息包括以下至少之一:
AI能力指示信息,用于指示UE是否支持AI能力;
AI等级指示信息,用于指示UE支持的AI能力所属的等级;
AI模型的标识信息,用于指示UE支持的AI模型;
AI平台的标识信息,用于指示UE支持的AI平台;
AI推理指示信息,用于指示UE是否支持AI的推理能力;
以及AI训练指示信息,用于指示UE是否支持AI模型的训练能力。
在一些实施例中,AI能力信息是每UE上报的。
在一些实施例中,AI能力信息是非强制性上报的或者是有条件的强制性上报。
在一些实施例中,装置包括:
第一接收模块,被配置为接收CSI上报配置信息;其中,CSI上报配置信息是基站基于AI能力信息确定的;
第一处理模块,被配置为基于CSI上报配置信息,确定UE使用的至少一个AI模型。
在一些实施例中,第一处理模块,配置为基于CSI上报配置信息中未包括AI模型的标识信息,确定UE使用UE支持的一个AI模型;或者,
第一处理模块,被配置为基于CSI上报配置信息中包括至少一个CSI上报配置的标识信息与对应的AI模型的标识信息,确定不同的CSI上报配置使用对应的AI模型。
在一些实施例中,第一发送模块,被配置为发送AI模型使用信息;其中,AI模型使用信息用于确定UE使用的AI模型。
在一些实施例中,AI模型使用信息,包括以下至少之一:
AI模型的标识信息;
与AI模型的标识信息对应的CSI上报配置的标识信息。
在一些实施例中,第一发送模块,被配置为基于接收到来自网络设备的上报请求信息,发送AI模型使用信息;其中,上报请求信息用于请求UE使用的AI模型。
根据本公开的第四方面,提供一种AI模型确定装置,包括:
第二接收模块,被配置为接收UE的AI能力信息;
第二处理模块,被配置为基于AI能力信息,确定UE使用的至少一种信道状态信息CSI的AI模型;
其中,AI能力信息包括以下至少之一:
AI能力指示信息,用于指示UE是否支持AI能力;
AI等级指示信息,用于指示UE支持的AI能力所属的等级;
AI模型的标识信息,用于指示UE支持的AI模型;
AI平台的标识信息,用于指示UE支持的AI平台;
AI推理指示信息,用于指示UE是否支持AI的推理能力;
以及AI训练指示信息,用于指示UE是否支持AI模型的训练能力。
在一些实施例中,装置包括:第二发送模块,被配置为发送CSI上报配置信息,其中,CSI上报配置信息用于指示UE使用支持的一个AI模型。
在一些实施例中,第二发送模块,被配置为响应于确定UE支持一个AI模型,确定发送未包括AI模型的标识信息的CSI上报配置信息;其中,CSI上报配置信息用于指示UE使用UE支持的一个AI模型。
在一些实施例中,第二发送模块,被配置为响应于确定UE支持多种AI模型,发送CSI上报配置信息,其中,CSI上报配置信息包括:至少一个CSI上报配置的标识信息与对应的AI模型的标识信息。
在一些实施例中,第二接收模块,被配置为接收UE发送的AI模型使用信息;
第二处理模块,被配置为基于AI模型使用信息,确定基站使用的CSI的AI模型。
在一些实施例中,AI模型使用信息,包括以下至少之一:
AI模型的标识信息;
与AI模型的标识信息对应的CSI上报配置的标识信息。
在一些实施例中,第二发送模块,被配置为发送上报请求信息;其中,上报请求信息用于请求UE使用的AI模型。
根据本公开的第五方面,提供一种通信设备,通信设备,包括:
处理器;
用于存储处理器可执行指令的存储器;
其中,处理器被配置为:用于运行可执行指令时,实现本公开任意实施例的AI模型确定方法。
根据本公开的第六方面,提供一种计算机存储介质,计算机存储介质存储有计算机可执行程序,可执行程序被处理器执行时实现本公开任意实施例的AI模型确定方法。
本公开实施例提供的技术方案可以包括以下有益效果:
在本公开实施例中,UE发送UE的AI能力信息,其中,AI能力信息用于基站确定UE使用的CSI的AI模型;其中,AI能力信息包括AI能力指示信息、AI等级指示信息、AI模型的标识信息、AI平台的标识信息、AI推理指示信息、以及AI训练指示信息的其中至少之一。如此,可以使得基站知晓UE进行CSI压缩的AI模型是哪一个,从而有利于基站也基于与该AI模型对UE上报的压缩后的CSI进行解压,以确保能够正确解压出UE上报的压缩后的CSI。并且,由于无需在基站和UE之间AI模型的交互,从而可以减少交互AI模型的信令,可以降低UE与基站的功耗等。
应当理解的是,以上的一般描述和后文的细节描述仅是示例性和解释性的,并不能限制本公开实施例。
附图说明
图1是根据一示例性实施例示出的一种无线通信系统的结构示意图。
图2是根据一示例性实施例示出的一种AI模型确定方法的示意图。
图3是根据一示例性实施例示出的一种AI模型确定方法的示意图。
图4是根据一示例性实施例示出的一种AI模型确定方法的示意图。
图5是根据一示例性实施例示出的一种AI模型确定方法的示意图。
图6是根据一示例性实施例示出的一种AI模型确定方法的示意图。
图7是根据一示例性实施例示出的一种AI模型确定装置的框图。
图8是根据一示例性实施例示出的一种AI模型确定装置的框图。
图9是根据一示例性实施例示出的一种UE的框图。
图10是根据一示例性实施例示出的一种基站的框图。
具体实施方式
这里将详细地对示例性实施例进行说明,其示例表示在附图中。下面的描述涉及附图时,除非另有表示,不同附图中的相同数字表示相同或相似的要素。以下示例性实施例中所描述的实施方式并不代表与本公开实施例相一致的所有实施方式。相反,它们仅是与如所附权利要求书中所详述的、本公开实施例的一些方面相一致的装置和方法的例子。
在本公开实施例使用的术语是仅仅出于描述特定实施例的目的,而非旨在限制本公开实施例。在本公开实施例和所附权利要求书中所使用的单数形式的“一种”和“该”也旨在包括多数形式,除非上下文清楚地表示其他含义。还应当理解,本文中使用的术语“和/或”是指并包含一个或多个相关联的列出项目的任何或所有可能组合。
应当理解,尽管在本公开实施例可能采用术语第一、第二、第三等来描述各种信息,但这些信息不应限于这些术语。这些术语仅用来将同一类型的信息彼此区分开。例如,在不脱离本公开实施例范围的情况下,第一信息也可以被称为第二信息,类似地,第二信息也可以被称为第一信息。取 决于语境,如在此所使用的词语“如果”可以被解释成为“在……时”或“当……时”或“响应于确定”。
请参考图1,其示出了本公开实施例提供的一种无线通信系统的结构示意图。如图1所示,无线通信系统是基于蜂窝移动通信技术的通信系统,该无线通信系统可以包括:若干个用户设备110以及若干个基站120。
其中,用户设备110可以是指向用户提供语音和/或数据连通性的设备。用户设备110可以经无线接入网(Radio Access Network,RAN)与一个或多个核心网进行通信,用户设备110可以是物联网用户设备,如传感器设备、移动电话(或称为“蜂窝”电话)和具有物联网用户设备的计算机,例如,可以是固定式、便携式、袖珍式、手持式、计算机内置的或者车载的装置。例如,站(Station,STA)、订户单元(subscriber unit)、订户站(subscriber station),移动站(mobile station)、移动台(mobile)、远程站(remote station)、接入点、远程用户设备(remote terminal)、接入用户设备(access terminal)、用户装置(user terminal)、用户代理(user agent)、用户设备(user device)、或用户设备(user equipment)。或者,用户设备110也可以是无人飞行器的设备。或者,用户设备110也可以是车载设备,比如,可以是具有无线通信功能的行车电脑,或者是外接行车电脑的无线用户设备。或者,用户设备110也可以是路边设备,比如,可以是具有无线通信功能的路灯、信号灯或者其它路边设备等。
基站120可以是无线通信系统中的网络侧设备。其中,该无线通信系统可以是第四代移动通信技术(the 4th generation mobile communication,4G)系统,又称长期演进(Long Term Evolution,LTE)系统;或者,该无线通信系统也可以是5G系统,又称新空口系统或5G NR系统。或者,该无线通信系统也可以是5G系统的再下一代系统。其中,5G系统中的接入网可以称为新一代无线接入网(New Generation-Radio Access Network,NG-RAN)。
其中,基站120可以是4G系统中采用的演进型基站(eNB)。或者,基站120也可以是5G系统中采用集中分布式架构的基站(gNB)。当基站120采用集中分布式架构时,通常包括集中单元(central unit,CU)和至少两个分布单元(distributed unit,DU)。集中单元中设置有分组数据汇聚协议(Packet Data Convergence Protocol,PDCP)层、无线链路层控制协议(Radio Link Control,RLC)层、媒体接入控制(Medium Access Control,MAC)层的协议栈;分布单元中设置有物理(Physical,PHY)层协议栈,本公开实施例对基站120的具体实现方式不加以限定。
基站120和用户设备110之间可以通过无线空口建立无线连接。在不同的实施方式中,该无线空口是基于第四代移动通信网络技术(4G)标准的无线空口;或者,该无线空口是基于第五代移动通信网络技术(5G)标准的无线空口,比如该无线空口是新空口;或者,该无线空口也可以是基于5G的更下一代移动通信网络技术标准的无线空口。
在一些实施例中,用户设备110之间还可以建立E2E(End to End,端到端)连接。比如车联网通信(vehicle to everything,V2X)中的车对车(vehicle to vehicle,V2V)通信、车对路边设备(vehicle to Infrastructure,V2I)通信和车对人(vehicle to pedestrian,V2P)通信等场景。
这里,上述用户设备可认为是下面实施例的终端设备。
在一些实施例中,上述无线通信系统还可以包含网络管理设备130。
若干个基站120分别与网络管理设备130相连。其中,网络管理设备130可以是无线通信系统中的核心网设备,比如,该网络管理设备130可以是演进的数据分组核心网(Evolved Packet Core,EPC)中的移动性管理实体(Mobility Management Entity,MME)。或者,该网络管理设备也可以是其它的核心网设备,比如服务网关(Serving GateWay,SGW)、公用数据网网关(Public Data Network GateWay,PGW)、策略与计费规则功能单元(Policy and Charging Rules Function,PCRF)或者归属签约用户服务器(Home Subscriber Server,HSS)等。对于网络管理设备130的实现形态,本公开实施例不做限定。
为了便于本领域内技术人员理解,本公开实施例列举了多个实施方式以对本公开实施例的技术方案进行清晰地说明。当然,本领域内技术人员可以理解,本公开实施例提供的多个实施例,可以被单独执行,也可以与本公开实施例中其他实施例的方法结合后一起被执行,还可以单独或结合后与其他相关技术中的一些方法一起被执行;本公开实施例并不对此作出限定。
为了更好地理解本公开任一个实施例所描述的技术方案,首先,对相关技术中进行部分说明:
在一个实施例中,AI软实现的复杂度要比AI硬实现的复杂度高很多。示例性的,AI软实现可以是在芯片内置一些基本模型;通过基站转变一些参数,UE侧通过C语言把模型完善。AI硬实现可以是直接把AI模型内置在芯片,将AI模型内置在芯片中可以是通过Verilog语言固化在硬件中的。
如图2所示,本公开实施例体提供一种AI模型确定方法,由UE执行,包括:
步骤S21:发送UE的AI能力信息,其中,AI能力信息用于基站确定UE使用的CSI的AI模型;
其中,AI能力信息包括以下至少之一:
AI能力指示信息,用于指示UE是否支持AI能力;
AI等级指示信息,用于指示UE支持的AI能力所属的等级;
AI模型的标识信息,用于指示UE支持的AI模型;
AI平台的标识信息,用于指示UE支持的AI平台;
AI推理指示信息,用于指示UE是否支持AI的推理能力;
以及AI训练指示信息,用于指示UE是否支持AI模型的训练能力。
这里,UE可以是各种移动终端或固定终端。例如,该UE可以是但不限于是手机、计算机、服务器、可穿戴设备、车载终端、路侧单元(RSU,Road Side Unit)、游戏控制平台或多媒体设备等。
在一个实施例中,步骤S21可以是:向基站发送UE的AI能力信息。
这里,基站可以是各种类型基站。例如,该基站可以是2G基站、3G基站、4G基站、5G基站或其它演进型基站。
在另一个实施例中,步骤S21可以是:向网络设备发送AI能力信息。
这里,网络设备可以是接入网设备或者核心网设备。该接入网设备可以是各种基站。该核心网设备可以是核心网的各种逻辑节点或者功能;例如,该核心网设备可以是接入与移动性管理功能(Access and Mobility Management Function,AMF)或者网络功能(Network Function,NF)。若UE向核心网设备发送UE的AI能力信息,可以是:UE向基站发送UE的AI能力信息,基站将UE的AI能力信息转发给核心网设备。
在一个实施例中,步骤S21可以是:上报UE的AI能力信息。示例性的,UE向基站上报UE的AI能力信息。
这里,AI能力指示信息、AI等级指示信息、AI模型的标识信息、AI平台的标识信息、AI推理指示信息及AI训练指示信息可分别通过AI能力信息的至少一个比特携带或者指示。
在另一些实施例中,A能力信息包括但不限于以下至少之一:
AI能力指示信息,可用于指示UE的芯片是否支持AI能力;
AI等级指示信息,可用于指示UE的芯片支持的AI能力所属的等级。
示例性的,AI能力信息指示取第一值,例如“0”时,用于指示UE支持AI能力;AI能力信息指示取第二值,例如“1”时,用于指示UE不支持AI能力。
这里,AI能力可以是指任意一种AI能力;例如,该AI能力可以是但不限于是CSI压缩的AI能力、或者AI定位能力等。
示例性的,AI能力所属等级可以是但不限于是x、y、及z的其中至少之一的等级或者系列。例如,协议规定了UE的芯片支持AI能力所属等级可以为x、y、及z三个等级。又如,基站和UE协商UE的芯片支持AI能力所属等级可以为x、y、及z三个等级。这里,x、y、及z三个等级分别与UE的不同算力能力和/或存储能力相对应。如此,基于UE的算力能力和/或存储能力等,可以确定出UE支持AI能力是属于x、y、及z三个等级中的哪个等级。
这里,可通过AI等级指示信息取不同的值,以指示UE或者UE的芯片支持AI能力所属的等级。
示例性的,AI模型可以是任意一种AI模型。例如,该AI模型可以是但不限于是CNN模型、RNN模型、或者transformer模型等。
这里,AI模型的标识信息用于唯一标识AI模型。如此,可以通过UE上报AI模型指示信息,以告知基站UE支持的AI模型,从而有利于基站确定出UE使用的AI模型。
示例性的,AI平台可以是任意一种AI平台。例如,该AI平台可以是但不限于是TensorFlow平台或者Pytorch平台等。
示例性的,AI推理指示信息取第一值,例如“0”时,用于指示UE支持AI的推理能力;AI推理指示信息取第二值,例如“1”时,用于指示UE不支持AI的推理能力。
示例性的,AI训练指示信息取第一值,例如“0”时,用于指示UE支持AI模型的训练能力;AI训练指示信息取第二值,例如“1”时,用于指示UE不支持AI模型的训练能力。
在一个实施例中,AI能力信息可用于指示AI模型使用的AI平台;和/或,AI能力信息可用于指示UE或者UE的芯片支持AI能力等级对应的AI平台。如此,在本公开实施例中,AI能力信息还可用于基站确定UE在支持不同AI模型时对应的AI平台,和/或用于基站确定UE或者UE的芯片支持不同AI能力所属等级时对应的AI平台。
如此,可以确保UE进行CSI压缩及基站进行CSI解压缩时使用不同的AI平台。
这里,AI能力信息用于基站确定UE使用的CSI压缩的AI模型;和/或,AI能力信息还可用于基站确定使用的CSI解压缩的AI模型。
在本公开实施例中,UE发送UE的AI能力信息,其中,AI能力信息用于基站确定UE使用的CSI的AI模型;该AI能力信息包括AI能力指示信息、AI等级指示信息、AI模型的标识信息、AI平台的标识信息、AI推理指示信息、以及AI训练指示信息的其中至少之一。如此,可以使得基站知晓UE进行CSI压缩的AI模型是哪一个,从而有利于基站也基于与该AI模型对UE上报的压缩后的CSI进行解压,以确保能够正确解压出UE上报的压缩后的CSI。并且,由于无需在基站和UE之间AI模型的交互,可以减少交互AI模型的信令,可以降低UE与基站的功耗等。
在一些实施例中,AI能力信息是每UE上报的或者每功能(per feature)上报的。例如,每UE上报是指不管UE支持的频段有多少,UE可以只需要上报一次该AI能力信息。例如,每功能上报是指UE针对每个频段每个频段组分别上报该AI能力信息。
在一些实施例中,AI能力信息是非强制性上报的或者是有条件的强制性上报。
这里,AI能力信息是非强制性上报是指:在进行UE能力上报时,强制UE上报该AI能力包含的AI能力信息。
这里,AI能力信息是有条件的强制性上报是指:在UE支持特定AI能力的情况下,则UE需要上报UE支持AI能力对应的AI能力信息。
示例性的,UE支持以下至少之一的AI能力,则上报以下至少之一的AI能力的AI能力信息:UE支持AI模型压缩的CSI的上报、UE支持AI的推理能力及支持AI模型的训练能力。
如此,对于UE支持的AI能力,可有条件的强制性上报该AI能力的AI能力信息。
需要说明的是,本领域内技术人员可以理解,本公开实施例提供的方法,可以被单独执行,也可以与本公开实施例中一些方法或相关技术中的一些方法一起被执行。
如图3所示,本公开实施例提供一种AI模型确定方法,由UE执行,包括:
步骤S31:接收CSI上报配置信息;其中,CSI上报配置信息是基站基于AI能力信息确定的;
步骤S32:基于CSI上报配置信息,确定UE使用的至少一个AI模型。
在本公开的一些实施例中,AI能力信息可以为步骤S21中AI能力信息;AI模型可以为步骤S21中AI模型。示例性的,AI能力信息包括AI能力指示信息、AI等级指示信息、AI模型的标识信息、AI平台的标识信息、AI推理指示信息以及AI训练指示信息的其中至少之一。
在一个实施例中,步骤S31中接收CSI上报配置信息,可以是:接收基站发送的CSI上报配置信息。
这里,CSI上报配置可以是任意一种CSI上报配置。例如,CSI上报配置可以是周期性CSI上报配置或者非周期CSI上报配置;又如,CSI上报配置可以是基于预定频段的CSI上报配置;再如,CSI上报配置基于某个PUCCH资源的上报CSI上报配置;再如,CSI上报配置可以是基于某个波束的CSI上报配置;等等。
这里,CSI上报配置的标识信息用于唯一标识CSI上报配置。例如,上报配置的标识信息为“00”时,可用于标识一种特定的CSI上报配置。
在一些实施例中,步骤S32,包括:
基于CSI上报配置信息中未包括AI模型的标识信息,确定UE使用UE支持的一个AI模型;或者,基于CSI上报配置信息中包括至少一个CSI上报配置的标识信息与对应的AI模型的标识信息,确定不同的CSI上报配置使用对应的AI模型。
本公开实施例提供一种AI模型确定方法,由UE执行,包括:
基于CSI上报配置信息中未包括AI模型的标识信息,确定UE使用UE支持的一个AI模型;或者,基于CSI上报配置信息中包括至少一个CSI上报配置的标识信息与对应的AI模型的标识信息,确定不同的CSI上报配置使用对应的AI模型。
本公开实施例提供一种AI模型确定方法,由UE执行,包括:
基于CSI上报配置信息中包括的AI模型的标识信息,确定UE使用AI模型的标识信息所指示的AI模型。
示例性的,UE支持一个AI模型,且UE接收到基站发送的CSI上报配置信息;UE若确定CSI上报配置信息中未包括AI模型的标识信息,则确定UE使用UE支持的该一个AI模型。
示例性的,UE支持一个或多个AI模型,且UE接收到基站发送的CSI上报配置信息;UE若确定CSI上报配置信息中包括一个AI模型的标识信息,确定UE使用该AI模型的标识信息所指示的AI模型。
示例性的,UE支持多个AI模型,且UE接收到基站发送的CSI上报配置信息;UE若确定CSI上报配置信息中包括:CSI上报配置的标识信息“00”与第一AI模型的标识信息、以及CSI上报配置第标识信息“01”与第二AI模型的标识信息。UE若确定当前使用CSI上报配置的标识信息“00”指示的CSI上报配置,则确定UE使用的AI模型为第一AI模型;或者若确定当前使用上报配置的的标识信息“01”指示的CSI上报配置,则确定UE使用的AI模型为第二AI模型。
在本公开实施例中,UE可以通过接收基站发送的CSI上报配置信息,准确确定出UE所使用的CSI压缩的AI模型为基站CSI解压缩的AI模型是相对应的;如此可以确保UE与基站之间采用对应的AI模型的同时无需UE与基站之间交互AI模型的具体AI参数信息。
这里,基站与UE之间的AI模型可以是相对应的。UE利用该AI压缩模型对CSI压缩,并将压缩后的CSI发送给基站;基站利用与UE相对应的AI解压缩模型对压缩后的CSI解压缩。
如图4所示,本公开实施例提供一种AI模型确定方法,由UE执行,包括:
步骤S41:发送AI模型使用信息;其中,AI模型使用信息用于确定UE使用的AI模型。
本公开实施例提供一种AI模型确定方法,由UE执行,包括:在发送AI能力信息之后,发送AI模型使用信息。
在一个实施例中,AI模型使用信息,用于指示UE使用的AI模型。
在一个实施例中,AI模型使用信息,可用于基站确定UE使用的AI模型;和/或,AI模型使用信息,可用于基站确定基站使用的AI模型。
在一个实施例中,AI模型使用信息,可用于UE确定UE使用的AI模型。
在一些实施例中,AI模型使用信息,包括以下至少之一:
AI模型的标识信息;
与AI模型的标识信息对应的CSI上报配置的标识信息。
示例性的,UE向基站发送AI模型使用信息,其中,AI模型使用信息包括AI模型的标识信息,AI模型使用信息用于指示UE使用的AI模型。如此,可以直接告知基站:UE所使用的AI模型。
示例性的,UE向基站发送AI模型使用信息,其中,AI模型使用信息包括:CSI上报配置的标识信息;则基站接收到该CSI上报配置的标识信息后,可以基于接收的CSI上报配置的标识信息与存储的AI模型的标识信息与CSI上报配置的标识信息的对应关系,确定出UE使用的AI模型的标识信息。如此,也可以通过接收的CSI上报配置的标识信息,确定出UE所使用的AI模型。
如此,在本公开实施例中,可以通过UE上报AI模型使用信息,若AI模型使用信息中携带AI模型的标识信息,可以直接告知基站UE所使用的AI模型;和/或若AI模型使用信息中携带CSI上报配置信息,则可以使得基站基于CSI上报配置信息的标识信息确定出AI模型的标识信息,从而也可以使得基站确定出UE使用的AI模型。如此,可以通过多种方式告知基站:UE所使用的AI模型,从而可以适用更多的应用场景。
本公开实施例提供一种AI模型确定方法,由UE执行,包括:确定UE使用的AI模型。
本公开实施例提供一种AI模型确定方法,由UE执行,包括:基于UE所处场景,确定UE使用的AI模型。
本公开实施例提供一种AI模型确定方法,由UE执行,包括以下至少之一:
基于UE的AI能力信息,确定UE使用的AI模型;
基于UE支持的AI模型,确定UE使用的AI模型。
示例性的,若UE支持一个AI模型,则确定UE使用的AI模型为UE支持的AI模型;或者,若UE支持多个AI模型,则确定UE使用的AI模型为多个AI模型中任意一个;等等。
在一些实施例中,基于UE所处场景,确定UE使用的AI模型,包括:
基于UE处于第一速度场景和/或城市场景,确定UE使用第一等级的AI模型;或者,
基于UE处于第二速度场景和/或乡村场景,确定UE使用第二等级的AI模型;其中,第一速度小于或等于第二速度。
本公开实施例提供一种AI模型确定方法,由UE执行,包括:基于UE处于第一速度场景和/或城市场景,确定UE使用第一等级的AI模型;或者,基于UE处于第二速度场景和/或乡村场景,确定UE使用第二等级的AI模型;其中,第一速度小于或等于第二速度。
这里,通过协议约定或者基站与UE协商,可以将UE的AI能力分为第一等级或者第二等级;或者,将UE的AI能力分为第1至第N等级,其中,N为大于1的整数;或者将UE的AI能力分为x、y或者z等级。这里,不同的等级所对应的UE的算力能力和/或存储能力等不同。例如,将UE的AI能力分为第一等级或者第二等级,第一等级所对应的算力能力和/或存储能力,小于第二等所对应的算力能力和/或存储能力。
如此,在本公开实施例中,UE可以基于UE所处的场景,选择合适的AI模型进行CSI压缩。如此当将该UE使用的AI模型发送给基站时,可以使得基站基于对应的AI模型进行CSI解压缩。
在一些实施例中,步骤S41中发送AI模型使用信息,包括:
基于接收到来自网络设备的上报请求信息,发送AI模型使用信息;其中,上报请求信息用于请求UE使用的AI模型。
本公开实施例提供一种AI模型确定方法,由UE执行,包括:
基于接收到来自网络设备的上报请求信息,发送AI模型使用信息;其中,上报请求信息用于请求UE使用的AI模型。
示例性的,UE接收基站发送的上报请求信息,发送AI模型的标识信息和/或与AI模型标识信息对应的CSI上报配置的标识信息;该上报请求信息用于请求UE使用的AI模型。
如此,在本公开实施例中,UE可以基站的触发,即基站发送的上报请求信息,以上报AI模型是使用信息;如此可以使得基站准确确定出UE使用的AI模型。
需要说明的是,本领域内技术人员可以理解,本公开实施例提供的方法,可以被单独执行,也可以与本公开实施例中一些方法或相关技术中的一些方法一起被执行。
以下一种基于AI模型确定方法,是由基站执行的,与上述由UE执行的AI模型确定方法的描述是类似的;且,对于由基站执行的AI模型确定方法实施例中未披露的技术细节,请参照由UE执行的AI模型确定方法示例的描述,在此不做详细描述说明。
如图5所示,本公开实施例提供一种AI模型确定方法,由基站执行,包括:
步骤S51:接收UE的AI能力信息;
步骤S52:基于AI能力信息,确定UE使用的至少一种CSI的AI模型;
其中,AI能力信息包括以下至少之一:
AI能力指示信息,用于指示UE是否支持AI能力;
AI等级指示信息,用于指示UE支持的AI能力所属的等级;
AI模型的标识信息,用于指示UE支持的AI模型;
AI平台的标识信息,用于指示UE支持的AI平台;
AI推理指示信息,用于指示UE是否支持AI的推理能力;
以及AI训练指示信息,用于指示UE是否支持AI模型的训练能力。
本公开实施例所涉及的AI模型确定方法,也可以由网络设备执行;该网络设备可以是核心网设备。
在本公开的一些实施例中,AI能力信息可以为步骤S21中AI能力信息;AI模型为上述实施例中AI模型。
在一个实施例中,步骤S51,可以是:接收UE上报的UE的AI能力信息。
在一些实施例中,AI能力信息包括但不限于以下至少之一:
AI能力指示信息,可用于指示UE的芯片是否支持AI能力;
AI等级指示信息,可用于指示UE的芯片支持的AI能力所属的等级。
在一些实施例中,步骤S52,可以是:基于AI能力指示信息、AI等级指示信息、AI模型的标识信息、AI平台的标识信息、AI推理指示信息及AI训练指示信息的其中至少之一,确定UE使用的至少一种CSI的AI模型。
示例性的,基站接收到UE发送的AI能力信息,其中,AI能力信息包括AI能力指示信息及AI模型的标识信息;基站若确定AI能力指示信息指示UE支持AI能力信息,则可以基于AI模型的标识信息确定出UE支持的AI模型;基站从UE支持的AI模型中选择一个或多个AI模型作为UE使用的AI模型。
示例性的,基站接收到UE发送的AI能力信息,其中,AI能力信息包括AI能力指示信息及AI等级指示信息;基站若确定AI能力指示信息指示UE支持AI能力信息,则可以基于AI等级指示信息确定出UE支持的AI能力所属等级;基站确定UE使用的AI模型为AI能力所属等级中包括的一个或多个AI模型。
示例性的,基站接收到UE发送的AI能力信息,其中,AI能力信息包括AI平台的标识信息;基站可以基于AI平台的标识信息,可以确定出UE支持的AI平台;基站确定UE使用的AI模型为与UE支持的AI平台对应的一个或多个AI模型。
如此,本公开实施例中可以通过UE发送的UE的AI能力信息,确定出UE合适使用的AI模型。并且,可以通过多种方式确定出UE使用的AI模型,如此可以适应更多的应用场景。
以上实施方式,具体可以参见UE侧的表述,在此不再赘述。
需要说明的是,本领域内技术人员可以理解,本公开实施例提供的方法,可以被单独执行,也 可以与本公开实施例中一些方法或相关技术中的一些方法一起被执行。
如图6所示,本公开实施例提供一种AI模型确定方法,由基站执行,包括:
步骤S61:发送CSI上报配置信息,其中,CSI上报配置信息用于指示UE使用支持的一个AI模型。
在本公开的一些实施例中,CSI上报配置信息可以为步骤S31中CSI上报配置信息;CSI上报配置信息的标识信息可以为上述实施例中CSI上报配置信息的标识信息。
在一个实施例中,CSI上报配置信息是基于UE的AI能力信确定的。
在一些实施例中,步骤S61中发送CSI上报配置信息,包括:
响应于确定UE支持一个AI模型,确定发送未包括AI模型的标识信息的CSI上报配置信息;其中,CSI上报配置信息用于指示UE使用UE支持的一个AI模型。
本公开实施例提供一种AI模型确定方法,由基站执行,包括:响应于确定UE支持一个AI模型,确定发送未包括AI模型的标识信息的CSI上报配置信息;其中,CSI上报配置信息用于指示UE使用UE支持的一个AI模型。
在其它实施例中,基站若确定UE支持一个AI模型,也可以在CSI上报配置信息中携带AI模型的标识信息。
在其它实施例中,基站若确定UE支持多个AI模型,从多个AI模型中选择一个AI模型,并将该一个AI模型的标识信息携带在CSI上报配置信息中发送给UE。
如此,在本公开实施例中,基站向UE发送CSI上报配置信息,以指示UE进行上报的CSI可以采用基站指示UE使用的AI模型进行压缩。
在一些实施例中,步骤S61中发送CSI上报配置信息,包括:
响应于确定UE支持多种AI模型,发送CSI上报配置信息,其中,CSI上报配置信息包括:至少一个CSI上报配置的标识信息与对应的AI模型的标识信息。
本公开实施例提供一种AI模型确定方法,由基站执行,包括:
响应于确定UE支持多种AI模型,发送CSI上报配置信息,其中,CSI上报配置信息包括:至少一个CSI上报配置的标识信息与对应的AI模型的标识信息。
这里,同一个CSI上报配置的标识信息对应一个AI模型的标识信息。
如此,在本公开实施例中,基站可以给UE配置多个使用的AI模型,对于不同的CSI上报配置,配置不同的AI模型,以使得UE可以对于不同的CSI上报配置的CSI采用合适的AI模型进行压缩。
以上实施方式,具体可以参见UE侧的表述,在此不再赘述。
需要说明的是,本领域内技术人员可以理解,本公开实施例提供的方法,可以被单独执行,也可以与本公开实施例中一些方法或相关技术中的一些方法一起被执行。
本公开实施例提供一种AI模型确定方法,由基站执行,包括:
接收UE发送的AI模型使用信息;
基于AI模型使用信息,确定基站使用的CSI的AI模型。
在本公开的一些实施例中,AI模型是使用信息可以为步骤S41中AI模型使用信息。
示例性的,AI模型使用信息,包括以下至少之一:
AI模型的标识信息;
与AI模型的标识信息对应的CSI上报配置的标识信息。
本公开实施例提供一种AI模型确定方法,由基站执行,包括:发送上报请求信息;其中,上报请求信息用于请求UE使用的AI模型。
这里,发送上报请求信息,包括:向UE发送上报请求信息。该上报请求信息用于触发UE上报AI模型使用信息。
以上实施方式,具体可以参见UE侧的表述,在此不再赘述。
需要说明的是,本领域内技术人员可以理解,本公开实施例提供的方法,可以被单独执行,也可以与本公开实施例中一些方法或相关技术中的一些方法一起被执行。
为了进一步解释本公开任意实施例,以下提供一个具体实施例。
本公开实施例提供一种AI模型确定方法,由通信设备执行,通信设备包括UE和基站;AI模型确定方法包括以下至少之一:
步骤S71:UE上报UE的AI能力信息;其中,AI能力信息包括以下至少之一:
AI能力指示信息,用于指示UE是否支持AI能力;
AI等级指示信息,用于指示UE支持的AI能力所属的等级;
AI模型的标识信息,用于指示UE支持的AI模型;
AI平台的标识信息,用于指示UE支持的AI平台;
AI推理指示信息,用于指示UE是否支持AI的推理能力;
AI训练指示信息,用于指示UE是否支持AI模型的训练能力。
在一个可选实施例中,AI能力信息是每UE上报的或者每功能(per feature)上报的。
在另一个可选实例例中,AI能力信息是非强制性上报的或者是有条件的强制性上报的。例如,若UE支持以下至少之一的能力,则UE上报该以下至少之一能力的AI能力信息:UE支持AI模型压缩的CSI的上报、UE支持AI的推理能力及支持AI模型的训练能力。
步骤S72:基站接收AI能力信息并配置UE使用的AI模型;其中,步骤S72,包括步骤S72a和S72b;
步骤S72a:基站基于UE的AI能力信息,确定UE支持的一个或多个AI模型;
步骤S72b:基于UE支持的一个或多个AI模型,确定UE使用的AI模型;并发送CSI上报配置信息;
在一个可选实施例中,若确定UE支持一个AI模型,基站发送未包括AI模型的标识信息的CSI上报配置信息。
在另一个可选实施例中,若确定UE支持多个AI模型,基站发送CSI上报配置信息;CSI上报 配置信息包括:至少一个CSI上报配置的标识信息与对应的AI模型的标识信息。
这里,基站可根据当前的信道状态指示UE使用的AI模型。
这里,UE可根据基站配置的使用的AI模型CSI压缩,基站可使用相应的AI模型对压缩后的CSI进行解压缩。
在一可选实施例中,步骤S72也可以由步骤S73代替。
步骤S73:UE上报AI模型使用信息,其中,AI模型使用信息用于确定UE使用的AI模型。
在一个可选实施例中,UE基于UE所处场景,确定UE使用的AI模型。
例如,UE基于UE处于第一速度场景和/或城市场景,确定UE使用x等级的AI模型;或者,基于UE处于第二速度场景和/或乡村场景,确定UE使用y等级的AI模型;其中,第一速度小于或等于第二速度。这里,x等级所对应的算力能力和/或存储能力,小于y等级所对应的算力能力和/或存储能力。
在一个可选实施例中,UE发送的AI模型使用信息包括以下至少之一:AI模型的标识信息、及与AI模型的标识信息对应的CSI上报配置信息的标识信息。
这里,AI模型是使用信息可以是UE主动上报,也可以是UE基于基站的触发上报。这里UE基于基站的触发上报,可以是:UE接收到基站发送上报请求信息而上报。
需要说明的是,本领域内技术人员可以理解,本公开实施例提供的方法,可以被单独执行,也可以与本公开实施例中一些方法或相关技术中的一些方法一起被执行。
如图7所示,本公开实施例提供一种AI模型确定装置,包括:
第一发送模块51,被配置为发送UE的AI能力信息,其中,AI能力信息用于基站确定UE使用的CSI的AI模型;其中,AI能力信息包括以下至少之一:
AI能力指示信息,用于指示UE是否支持AI能力;
AI等级指示信息,用于指示UE支持的AI能力所属的等级;
AI模型的标识信息,用于指示UE支持的AI模型;
AI平台的标识信息,用于指示UE支持的AI平台;
AI推理指示信息,用于指示UE是否支持AI的推理能力;
以及AI训练指示信息,用于指示UE是否支持AI模型的训练能力。
本公开实施例提供的AI模块确定装置可应用于UE中。
在一些实施例中,AI能力信息是每UE上报的。
在一些实施例中,AI能力信息是非强制性上报的或者是有条件的强制性上报。
本公开实施例提供一种AI模型确定装置,包括:
第一接收模块,被配置为接收CSI上报配置信息;其中,CSI上报配置信息是基站基于AI能力信息确定的;
第一处理模块,被配置为基于CSI上报配置信息,确定UE使用的至少一个AI模型。
本公开实施例提供一种AI模型确定装置,包括:第一处理模块,配置为基于CSI上报配置信息中未包括AI模型的标识信息,确定UE使用UE支持的一个AI模型。
本公开实施例提供一种AI模型确定装置,包括:第一处理模块,被配置为基于CSI上报配置信息中包括至少一个CSI上报配置的标识信息与对应的AI模型的标识信息,确定不同的CSI上报配置使用对应的AI模型。
本公开实施例提供一种AI模型确定装置,包括:第一发送模块51,被配置为发送AI模型使用信息;其中,AI模型使用信息用于确定UE使用的AI模型。
在一些实施例中,AI模型使用信息,包括以下至少之一:
AI模型的标识信息;
与AI模型的标识信息对应的CSI上报配置的标识信息。
本公开实施例提供一种AI模型确定装置,包括:第一发送模块51,被配置为基于接收到来自网络设备的上报请求信息,发送AI模型使用信息;其中,上报请求信息用于请求UE使用的AI模型。
如图8所示,本公开实施例提供一种AI模型确定装置,包括:
第二接收模块61,被配置为接收UE的AI能力信息;
第二处理模块62,被配置为基于AI能力信息,确定UE使用的至少一种信道状态信息CSI的AI模型;
其中,AI能力信息包括以下至少之一:
AI能力指示信息,用于指示UE是否支持AI能力;
AI等级指示信息,用于指示UE支持的AI能力所属的等级;
AI模型的标识信息,用于指示UE支持的AI模型;
AI平台的标识信息,用于指示UE支持的AI平台;
AI推理指示信息,用于指示UE是否支持AI的推理能力;
以及AI训练指示信息,用于指示UE是否支持AI模型的训练能力。
本公开实施例提供的AI模型确定装置可应用于基站中。
本公开实施例提供一种AI模型确定装置,包括:第二发送模块,被配置为发送CSI上报配置信息,其中,CSI上报配置信息用于指示UE使用支持的一个AI模型。
本公开实施例提供一种AI模型确定装置,包括:第二发送模块,被配置为响应于确定UE支持一个AI模型,确定发送未包括AI模型的标识信息的CSI上报配置信息;其中,CSI上报配置信息用于指示UE使用UE支持的一个AI模型。
本公开实施例提供一种AI模型确定装置,包括:第二发送模块,被配置为响应于确定UE支持多种AI模型,发送CSI上报配置信息,其中,CSI上报配置信息包括:至少一个CSI上报配置的标识信息与对应的AI模型的标识信息。
本公开实施例提供一种AI模型确定装置,包括:
第二接收模块61,被配置为接收UE发送的AI模型使用信息;
第二处理模块62,被配置为基于AI模型使用信息,确定基站使用的CSI的AI模型。
在一些实施例中,AI模型使用信息,包括以下至少之一:
AI模型的标识信息;
与AI模型的标识信息对应的CSI上报配置的标识信息。
本公开实施例提供一种AI模型确定装置,包括:第二发送模块,被配置为发送上报请求信息;其中,上报请求信息用于请求UE使用的AI模型。
需要说明的是,本领域内技术人员可以理解,本公开实施例提供的装置,可以被单独执行,也可以与本公开实施例中一些装置或相关技术中的一些装置一起被执行。
关于上述实施例中的装置,其中各个模块执行操作的具体方式已经在有关该方法的实施例中进行了详细描述,此处将不做详细阐述说明。
本公开实施例提供一种通信设备,包括:
处理器;
用于存储处理器可执行指令的存储器;
其中,处理器被配置为:用于运行可执行指令时,实现本公开任意实施例的AI模型确定方法。
在一个实施例中,通信设备可以包括但不限于至少之一:UE及基站。
其中,处理器可包括各种类型的存储介质,该存储介质为非临时性计算机存储介质,在用户设备掉电之后能够继续记忆存储其上的信息。
处理器可以通过总线等与存储器连接,用于读取存储器上存储的可执行程序,例如,如图2至图6示的方法的至少其中之一。
本公开实施例还提供一种计算机存储介质,计算机存储介质存储有计算机可执行程序,可执行程序被处理器执行时实现本公开任意实施例的AI模型确定方法。例如,如图2至图6所示的方法的至少其中之一。
关于上述实施例中的装置或者存储介质,其中各个模块执行操作的具体方式已经在有关该方法的实施例中进行了详细描述,此处将不做详细阐述说明。
图9是根据一示例性实施例示出的一种用户设备800的框图。例如,用户设备800可以是移动电话,计算机,数字广播用户设备,消息收发设备,游戏控制台,平板设备,医疗设备,健身设备,个人数字助理等。
参照图9,用户设备800可以包括以下一个或多个组件:处理组件802,存储器804,电源组件806,多媒体组件808,音频组件810,输入/输出(I/O)的接口812,传感器组件814,以及通信组件816。
处理组件802通常控制用户设备800的整体操作,诸如与显示,电话呼叫,数据通信,相机操 作和记录操作相关联的操作。处理组件802可以包括一个或多个处理器820来执行指令,以完成上述的方法的全部或部分步骤。此外,处理组件802可以包括一个或多个模块,便于处理组件802和其他组件之间的交互。例如,处理组件802可以包括多媒体模块,以方便多媒体组件808和处理组件802之间的交互。
存储器804被配置为存储各种类型的数据以支持在用户设备800的操作。这些数据的示例包括用于在用户设备800上操作的任何应用程序或方法的指令,联系人数据,电话簿数据,消息,图片,视频等。存储器804可以由任何类型的易失性或非易失性存储设备或者它们的组合实现,如静态随机存取存储器(SRAM),电可擦除可编程只读存储器(EEPROM),可擦除可编程只读存储器(EPROM),可编程只读存储器(PROM),只读存储器(ROM),磁存储器,快闪存储器,磁盘或光盘。
电源组件806为用户设备800的各种组件提供电力。电源组件806可以包括电源管理系统,一个或多个电源,及其他与为用户设备800生成、管理和分配电力相关联的组件。
多媒体组件808包括在所述用户设备800和用户之间的提供一个输出接口的屏幕。在一些实施例中,屏幕可以包括液晶显示器(LCD)和触摸面板(TP)。如果屏幕包括触摸面板,屏幕可以被实现为触摸屏,以接收来自用户的输入信号。触摸面板包括一个或多个触摸传感器以感测触摸、滑动和触摸面板上的手势。所述触摸传感器可以不仅感测触摸或滑动动作的边界,而且还检测与所述触摸或滑动操作相关的持续时间和压力。在一些实施例中,多媒体组件808包括一个前置摄像头和/或后置摄像头。当用户设备800处于操作模式,如拍摄模式或视频模式时,前置摄像头和/或后置摄像头可以接收外部的多媒体数据。每个前置摄像头和后置摄像头可以是一个固定的光学透镜系统或具有焦距和光学变焦能力。
音频组件810被配置为输出和/或输入音频信号。例如,音频组件810包括一个麦克风(MIC),当用户设备800处于操作模式,如呼叫模式、记录模式和语音识别模式时,麦克风被配置为接收外部音频信号。所接收的音频信号可以被进一步存储在存储器804或经由通信组件816发送。在一些实施例中,音频组件810还包括一个扬声器,用于输出音频信号。
I/O接口812为处理组件802和外围接口模块之间提供接口,上述外围接口模块可以是键盘,点击轮,按钮等。这些按钮可包括但不限于:主页按钮、音量按钮、启动按钮和锁定按钮。
传感器组件814包括一个或多个传感器,用于为用户设备800提供各个方面的状态评估。例如,传感器组件814可以检测到设备800的打开/关闭状态,组件的相对定位,例如所述组件为用户设备800的显示器和小键盘,传感器组件814还可以检测用户设备800或用户设备800一个组件的位置改变,用户与用户设备800接触的存在或不存在,用户设备800方位或加速/减速和用户设备800的温度变化。传感器组件814可以包括接近传感器,被配置用来在没有任何的物理接触时检测附近物体的存在。传感器组件814还可以包括光传感器,如CMOS或CCD图像传感器,用于在成像应用中使用。在一些实施例中,该传感器组件814还可以包括加速度传感器,陀螺仪传感器,磁传感器,压力传感器或温度传感器。
通信组件816被配置为便于用户设备800和其他设备之间有线或无线方式的通信。用户设备800可以接入基于通信标准的无线网络,如WiFi,4G或5G,或它们的组合。在一个示例性实施例中,通信组件816经由广播信道接收来自外部广播管理系统的广播信号或广播相关信息。在一个示例性实施例中,所述通信组件816还包括近场通信(NFC)模块,以促进短程通信。例如,在NFC模块可基于射频识别(RFID)技术,红外数据协会(IrDA)技术,超宽带(UWB)技术,蓝牙(BT)技术和其他技术来实现。
在示例性实施例中,用户设备800可以被一个或多个应用专用集成电路(ASIC)、数字信号处理器(DSP)、数字信号处理设备(DSPD)、可编程逻辑器件(PLD)、现场可编程门阵列(FPGA)、控制器、微控制器、微处理器或其他电子元件实现,用于执行上述方法。
在示例性实施例中,还提供了一种包括指令的非临时性计算机可读存储介质,例如包括指令的存储器804,上述指令可由用户设备800的处理器820执行以完成上述方法。例如,所述非临时性计算机可读存储介质可以是ROM、随机存取存储器(RAM)、CD-ROM、磁带、软盘和光数据存储设备等。
如图10所示,本公开一实施例示出一种基站的结构。例如,基站900可以被提供为一网络侧设备。参照图10,基站900包括处理组件922,其进一步包括一个或多个处理器,以及由存储器932所代表的存储器资源,用于存储可由处理组件922的执行的指令,例如应用程序。存储器932中存储的应用程序可以包括一个或一个以上的每一个对应于一组指令的模块。此外,处理组件922被配置为执行指令,以执行上述方法前述应用在所述基站的任意方法。
基站900还可以包括一个电源组件926被配置为执行基站900的电源管理,一个有线或无线网络接口950被配置为将基站900连接到网络,和一个输入输出(I/O)接口958。基站900可以操作基于存储在存储器932的操作系统,例如Windows Server TM,Mac OS XTM,UnixTM,LinuxTM,FreeBSDTM或类似。
本领域技术人员在考虑说明书及实践这里公开的发明后,将容易想到本发明的其它实施方案。本公开旨在涵盖本发明的任何变型、用途或者适应性变化,这些变型、用途或者适应性变化遵循本发明的一般性原理并包括本公开未公开的本技术领域中的公知常识或惯用技术手段。说明书和实施例仅被视为示例性的,本发明的真正范围和精神由下面的权利要求指出。
应当理解的是,本发明并不局限于上面已经描述并在附图中示出的精确结构,并且可以在不脱离其范围进行各种修改和改变。本发明的范围仅由所附的权利要求来限制。

Claims (19)

  1. 一种AI模型确定方法,其中,由用户设备UE执行,包括:
    发送UE的人工智能AI能力信息,其中,所述AI能力信息用于基站确定所述UE使用的信道状态信息CSI的AI模型;其中,所述AI能力信息包括以下至少之一:
    AI能力指示信息,用于指示UE是否支持AI能力;
    AI等级指示信息,用于指示UE支持的AI能力所属的等级;
    AI模型的标识信息,用于指示UE支持的AI模型;
    AI平台的标识信息,用于指示UE支持的AI平台;
    AI推理指示信息,用于指示UE是否支持AI的推理能力;
    以及AI训练指示信息,用于指示UE是否支持AI模型的训练能力。
  2. 根据权利要求1所述的方法,其中,所述AI能力信息是每UE上报的。
  3. 根据权利要求1所述的方法,其中,所述AI能力信息是非强制性上报的或者是有条件的强制性上报。
  4. 根据权利要求1至3任一项所述的方法,其中,所述方法还包括:
    接收CSI上报配置信息;其中,所述CSI上报配置信息是所述基站基于所述AI能力信息确定的;
    基于所述CSI上报配置信息,确定所述UE使用的至少一个所述AI模型。
  5. 根据权利要求4所述的方法,其中,所述基于所述CSI上报配置信息,确定所述UE使用的至少一个所述AI模型,包括:
    基于所述CSI上报配置信息中未包括所述AI模型的标识信息,确定所述UE使用所述UE支持的一个所述AI模型;
    或者,
    基于所述CSI上报配置信息中包括至少一个CSI上报配置的标识信息与对应的AI模型的标识信息,确定不同的CSI上报配置使用对应的所述AI模型。
  6. 根据权利要求4所述的方法,其中,所述方法还包括:
    发送AI模型使用信息;其中,所述AI模型使用信息用于确定UE使用的AI模型。
  7. 根据权利要求6所述的方法,其中,所述AI模型使用信息,包括以下至少之一:
    AI模型的标识信息;
    与所述AI模型的标识信息对应的CSI上报配置的标识信息。
  8. 根据权利要求6或7所述的方法,其中,所述发送AI模型使用信息,包括:
    基于接收到来自网络设备的上报请求信息,发送所述AI模型使用信息;其中,所述上报请求信息用于请求所述UE使用的AI模型。
  9. 一种AI模型确定方法,其中,由基站执行,包括:
    接收用户设备UE的人工智能AI能力信息;
    基于所述AI能力信息,确定所述UE使用的至少一种信道状态信息CSI的AI模型;
    其中,所述AI能力信息包括以下至少之一:
    AI能力指示信息,用于指示UE是否支持AI能力;
    AI等级指示信息,用于指示UE支持的AI能力所属的等级;
    AI模型的标识信息,用于指示UE支持的AI模型;
    AI平台的标识信息,用于指示UE支持的AI平台;
    AI推理指示信息,用于指示UE是否支持AI的推理能力;
    以及AI训练指示信息,用于指示UE是否支持AI模型的训练能力。
  10. 根据权利要求9所述的方法,其中,所述方法包括:
    发送CSI上报配置信息,其中,所述CSI上报配置信息用于指示所述UE使用指示一个所述AI模型。
  11. 根据权利要求9所述的方法,其中,所述发送所述CSI上报配置信息,包括:
    响应于确定所述UE支持一种AI模型,确定发送未包括AI模型的标识信息的CSI上报配置信息;其中,所述CSI上报配置信息用于指示所述UE使用所述UE支持的一个所述AI模型。
  12. 根据权利要求9所述的方法,其中,所述发送所述CSI上报配置信息,包括:
    响应于确定所述UE支持多种AI模型,发送CSI上报配置信息,其中,所述CSI上报配置信息包括:至少一个CSI上报配置的标识信息与对应的所述AI模型的标识信息。
  13. 根据权利要求10所述的方法,其中,所述方法包括:
    接收UE发送的AI模型使用信息;
    基于所述AI模型使用信息,确定基站使用的CSI的AI模型。
  14. 根据权利要求13所述的方法,其中,所述AI模型使用信息,包括以下至少之一:
    AI模型的标识信息;
    与所述AI模型的标识信息对应的CSI上报配置的标识信息。
  15. 根据权利要求14所述的方法,其中,所述方法包括:
    发送上报请求信息;其中,所述上报请求信息用于请求所述UE使用的AI模型。
  16. 一种AI模型确定装置,其中,包括:
    第一发送模块,被配置为发送用户设备UE的人工智能AI能力信息,其中,所述AI能力信息用于基站确定所述UE使用的信道状态信息CSI的AI模型;其中,所述AI能力信息包括以下至少之一:
    AI能力指示信息,用于指示UE是否支持AI能力;
    AI等级指示信息,用于指示UE支持的AI能力所属的等级;
    AI模型的标识信息,用于指示UE支持的AI模型;
    AI平台的标识信息,用于指示UE支持的AI平台;
    AI推理指示信息,用于指示UE是否支持AI的推理能力;
    以及AI训练指示信息,用于指示UE是否支持AI模型的训练能力。
  17. 一种AI模型确定装置,其中,包括:
    第二接收模块,被配置为接收用户设备UE的人工智能AI能力信息;
    第二处理模块,被配置为基于所述AI能力信息,确定所述UE使用的至少一种信道状态信息CSI的AI模型;
    其中,所述AI能力信息包括以下至少之一:
    AI能力指示信息,用于指示UE是否支持AI能力;
    AI等级指示信息,用于指示UE支持的AI能力所属的等级;
    AI模型的标识信息,用于指示UE支持的AI模型;
    AI平台的标识信息,用于指示UE支持的AI平台;
    AI推理指示信息,用于指示UE是否支持AI的推理能力;
    以及AI训练指示信息,用于指示UE是否支持AI模型的训练能力。
  18. 一种通信设备,其中,所述通信设备,包括:
    处理器;
    用于存储所述处理器可执行指令的存储器;
    其中,所述处理器被配置为:用于运行所述可执行指令时,实现权利要求1至8、或者权利要求9至15任一项所述的AI模型确定方法。
  19. 一种计算机存储介质,其中,所述计算机存储介质存储有计算机可执行程序,所述可执行程序被处理器执行时实现权利要求1至8、或者权利要求11至15任一项所述的AI模型确定方法。
PCT/CN2022/100906 2022-06-23 2022-06-23 Ai模型确定方法、装置、通信设备及存储介质 WO2023245576A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202280002321.8A CN115349279A (zh) 2022-06-23 2022-06-23 Ai模型确定方法、装置、通信设备及存储介质
PCT/CN2022/100906 WO2023245576A1 (zh) 2022-06-23 2022-06-23 Ai模型确定方法、装置、通信设备及存储介质

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/100906 WO2023245576A1 (zh) 2022-06-23 2022-06-23 Ai模型确定方法、装置、通信设备及存储介质

Publications (1)

Publication Number Publication Date
WO2023245576A1 true WO2023245576A1 (zh) 2023-12-28

Family

ID=83957702

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/100906 WO2023245576A1 (zh) 2022-06-23 2022-06-23 Ai模型确定方法、装置、通信设备及存储介质

Country Status (2)

Country Link
CN (1) CN115349279A (zh)
WO (1) WO2023245576A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024093057A1 (en) * 2023-02-24 2024-05-10 Lenovo (Beijing) Limited Devices, methods, and computer readable storage medium for communication
CN117856947A (zh) * 2024-02-21 2024-04-09 荣耀终端有限公司 Csi压缩模型指示方法及通信装置

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210160149A1 (en) * 2019-11-22 2021-05-27 Huawei Technologies Co., Ltd. Personalized tailored air interface
US20210195462A1 (en) * 2019-12-19 2021-06-24 Qualcomm Incorporated Configuration of artificial intelligence (ai) modules and compression ratios for user-equipment (ue) feedback
WO2022028450A1 (zh) * 2020-08-05 2022-02-10 展讯半导体(南京)有限公司 Ai网络模型支持能力上报、接收方法及装置、存储介质、用户设备、基站
CN114143799A (zh) * 2020-09-03 2022-03-04 华为技术有限公司 通信方法及装置

Also Published As

Publication number Publication date
CN115349279A (zh) 2022-11-15

Similar Documents

Publication Publication Date Title
WO2022000188A1 (zh) 用户设备辅助信息的上报方法及装置、用户设备、存储介质
US20230276430A1 (en) Resource scheduling method and apparatus, communication device and storage medium
WO2023245576A1 (zh) Ai模型确定方法、装置、通信设备及存储介质
WO2021030974A1 (zh) 寻呼配置方法、装置、通信设备及存储介质
US20230269047A1 (en) Positioning reference signaling configuration method and apparatus, user equipment, and storage medium
EP4319259A1 (en) Measurement gap processing method and apparatus, and communication device and storage medium
WO2022016466A1 (zh) 资源请求信息处理方法及装置、通信设备及存储介质
WO2023060490A1 (zh) 能力信息的上报方法、装置、通信设备及存储介质
WO2023065091A1 (zh) 寻呼过滤规则确定方法及装置、通信设备及存储介质
WO2022205341A1 (zh) 测量间隔预配置处理方法、装置、通信设备及存储介质
WO2022006759A1 (zh) 信息传输方法、装置、通信设备和存储介质
WO2022147662A1 (zh) 测量间隙调度方法及装置、通信设备及存储介质
WO2022016450A1 (zh) 逻辑信道复用方法及装置、通信设备及存储介质
US20220408469A1 (en) Downlink control information configuration method and apparatus, and communication device and storage medium
US20220295447A1 (en) Network access method, apparatus, communication device, and storage medium
WO2023155111A1 (zh) 信息处理方法、装置、通信设备及存储介质
WO2022267039A1 (zh) Bwp指示方法、装置、通信设备及存储介质
US20230224769A1 (en) Method and apparatus for controlling data transmission rate communication device, and storage medium
WO2023221025A1 (zh) 波束确定方法、装置、通信设备及存储介质
WO2023077524A1 (zh) 一种寻呼过滤的方法、装置、通信设备及存储介质
WO2024036632A1 (zh) Bsr增强调度方法以及装置、通信设备及存储介质
WO2023102701A1 (zh) 信息处理方法、装置、通信设备及存储介质
WO2023240643A1 (zh) 信息处理方法、装置、通信设备及存储介质
WO2023151055A1 (zh) 发送配置信息的方法、装置、通信设备及存储介质
WO2024020756A1 (zh) 无线通信方法、装置、通信设备及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22947347

Country of ref document: EP

Kind code of ref document: A1