WO2023116836A1 - Image frame acquisition method and apparatus, and communication device

Info

Publication number
WO2023116836A1
WO2023116836A1 · PCT/CN2022/141148 · CN2022141148W
Authority
WO
WIPO (PCT)
Prior art keywords
image frame
video service
communication device
target video
information
Prior art date
Application number
PCT/CN2022/141148
Other languages
English (en)
Chinese (zh)
Inventor
刘进华
Original Assignee
维沃移动通信有限公司
Priority date
Filing date
Publication date
Application filed by 维沃移动通信有限公司 (Vivo Mobile Communication Co., Ltd.)
Publication of WO2023116836A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W 36/00: Hand-off or reselection arrangements
    • H04W 36/0005: Control or signalling for completing the hand-off
    • H04W 36/0055: Transmission or use of information for re-establishing the radio link
    • H04W 36/0058: Transmission of hand-off measurement information, e.g. measurement reports
    • H04W 36/16: Performing reselection for specific purposes
    • H04W 36/165: Performing reselection for specific purposes for reducing network power consumption

Definitions

  • the present application belongs to the technical field of wireless communication, and in particular relates to an image frame acquisition method, device and communication equipment.
  • extended reality (Extended Reality, XR) services may include augmented reality (Augmented Reality, AR) services, virtual reality (Virtual Reality, VR) services and mixed reality (Mixed Reality, MR) services.
  • image data needs to be encoded to achieve image data compression.
  • image data is encoded into three types of image frames: intra-coded picture (I) frames, predictive-coded picture (P) frames and bidirectional predicted picture (B) frames.
  • an I frame, also called an instantaneous decoding refresh (IDR) frame, is a complete image frame that can be generated and presented independently of other frames.
  • a P frame only contains the image change information relative to the previous frame; the receiver needs to combine it with the previous frame to generate the current frame and complete the display on the receiving terminal.
  • a B frame indicates the change information of the current frame relative to both the previous frame and the subsequent frame; the receiver needs to combine both to generate the current frame.
  • the preceding and following frames are ordered by frame presentation time or by the image acquisition time at the source end, and the actual sending and receiving times may be adjusted according to the receiver's image decoding time. For example, the sender can send frames in the receiver's image frame decoding order.
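  The reordering mentioned above (sending in the receiver's decoding order rather than presentation order) can be sketched as follows. This is an illustrative toy model, not the patent's algorithm: it assumes each B frame references the previous and the next reference (I or P) frame, so the later reference must be decoded first.

```python
# Hypothetical sketch: reorder frames from presentation order to a valid
# decode order. A B frame references the previous AND the next reference
# frame, so the following reference frame must be decoded before the B frames.

def decode_order(frames):
    """Reorder (frame_type, presentation_index) tuples into decode order."""
    order, pending_b = [], []
    for frame in frames:
        ftype, _ = frame
        if ftype == "B":
            pending_b.append(frame)      # wait for the next reference frame
        else:                            # "I" or "P" reference frame
            order.append(frame)
            order.extend(pending_b)      # pending B frames can now be decoded
            pending_b.clear()
    order.extend(pending_b)
    return order

gop = [("I", 0), ("B", 1), ("B", 2), ("P", 3), ("B", 4), ("P", 5)]
print(decode_order(gop))
# → [('I', 0), ('P', 3), ('B', 1), ('B', 2), ('P', 5), ('B', 4)]
```

  In this sketch the P frame at presentation index 3 is sent before the B frames at indices 1 and 2, matching the observation that sending order follows decoding order rather than presentation order.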
  • XR service data can be transmitted end-to-end through the wireless network.
  • the transmission may be interrupted for some reason.
  • the wireless network may need to be switched or the wireless link may fail.
  • the subsequent wireless link re-establishment or the start of a measurement gap may interrupt the transmission, so that some image frames of the XR service cannot be successfully transmitted. No solution has yet been provided for this transmission-interruption scenario.
  • Embodiments of the present application provide a method, device, and communication device for acquiring image frames, and provide a solution for video service transmission interruption.
  • in a first aspect, a method for acquiring an image frame is provided, including: a first communication device determines data transmission interruption information of a target video service, where the data transmission interruption information indicates that the transmission of the target video service is interrupted; and the first communication device acquires a first image frame based on the data transmission interruption information. The data transmission interruption information includes at least one of the following: first indication information, used to indicate the transmission interruption of the target video service; time information of the transmission interruption, including at least one of the duration, the start time point and the end time point of the transmission interruption; the number of second image frames, where a second image frame is an image frame lost due to the transmission interruption; the type of the second image frame; the identification of the second image frame; information of a target image frame, where the target image frame is the last correct image frame of the target video service before the transmission interruption occurs; and target information, where the target information is the information carried in the header of the Internet Protocol (IP) data packet of an image frame of the target video service.
  • in a second aspect, an image frame acquisition apparatus is provided, including: a determining module, configured to determine data transmission interruption information of a target video service, where the data transmission interruption information indicates that the transmission of the target video service is interrupted; and an acquiring module, configured to acquire a first image frame based on the data transmission interruption information, where the data transmission interruption information includes at least one of the items listed for the first aspect.
  • in a third aspect, a communication device is provided, including a processor and a memory, where the memory stores a program or instructions runnable on the processor, and the program or instructions, when executed by the processor, implement the steps of the method according to the first aspect.
  • in a fourth aspect, a communication device is provided, including a processor and a communication interface, where the processor is configured to implement the steps of the method according to the first aspect, and the communication interface is used to communicate with an external device.
  • in a fifth aspect, a readable storage medium is provided, on which a program or instructions are stored; when the program or instructions are executed by a processor, the steps of the method according to the first aspect are implemented.
  • in a sixth aspect, a chip is provided, including a processor and a communication interface coupled to the processor; the processor is configured to run a program or instructions to implement the steps of the method according to the first aspect.
  • in a seventh aspect, a computer program/program product is provided, stored in a storage medium; the computer program/program product is executed by at least one processor to implement the steps of the method according to the first aspect.
  • in the embodiments of the present application, the first communication device determines the data transmission interruption information of the target video service and acquires the first image frame based on that information, so that the decoding of the acquired first image frame does not depend on the second image frame lost due to the transmission interruption. This avoids the useless data transmission that occurs when the decoding of image frames transmitted after the interruption depends on the second image frame, and saves wireless resources. Moreover, it avoids the situation in which the video playback interruption lasts longer than the data transmission interruption because of such useless transmission.
  • FIG. 1 shows a block diagram of a wireless communication system to which an embodiment of the present application is applicable;
  • FIG. 2 shows a schematic flowchart of an image frame acquisition method provided by an embodiment of the present application;
  • FIG. 3a shows a schematic diagram of video service transmission in an embodiment of the present application;
  • FIG. 3b shows a schematic diagram of another video service transmission in an embodiment of the present application;
  • FIG. 4a shows a schematic diagram of one way to regenerate an image frame in an embodiment of the present application;
  • FIG. 4b shows a schematic diagram of another way to regenerate image frames in an embodiment of the present application;
  • FIG. 5 shows a schematic diagram of a protocol layer structure in an embodiment of the present application;
  • FIG. 6 shows another schematic flowchart of the image frame acquisition method provided by an embodiment of the present application;
  • FIG. 7 shows a schematic structural diagram of an image frame acquisition device provided by an embodiment of the present application;
  • FIG. 8 shows a schematic structural diagram of a communication device provided by an embodiment of the present application;
  • FIG. 9 shows a schematic diagram of a hardware structure of a terminal provided by an embodiment of the present application;
  • FIG. 10 shows a schematic diagram of a hardware structure of a network side device provided by an embodiment of the present application.
  • the terms "first", "second" and the like in the specification and claims of the present application are used to distinguish similar objects, not to describe a specific order or sequence. It should be understood that the terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application can operate in sequences other than those illustrated or described herein. Objects distinguished by "first" and "second" are usually of one category, and the number of objects is not limited; for example, there may be one or more first objects.
  • "and/or" in the description and claims means at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the related objects.
  • LTE Long Term Evolution
  • LTE-A Long Term Evolution-Advanced
  • CDMA Code Division Multiple Access
  • TDMA Time Division Multiple Access
  • FDMA Frequency Division Multiple Access
  • OFDMA Orthogonal Frequency Division Multiple Access
  • SC-FDMA Single-carrier Frequency Division Multiple Access
  • "system" and "network" in the embodiments of the present application are often used interchangeably, and the described techniques can be used both for the above-mentioned systems and radio technologies and for other systems and radio technologies.
  • NR New Radio
  • the following description describes a New Radio (NR) system for illustrative purposes and uses NR terminology in most of the description, but these techniques can also be applied outside NR systems, for example to 6th-generation (6G) communication systems.
  • 6G 6th Generation
  • Fig. 1 shows a block diagram of a wireless communication system to which the embodiment of the present application is applicable.
  • the wireless communication system includes a terminal 11 and a network side device 12 .
  • the terminal 11 can be a mobile phone, a tablet personal computer, a laptop or notebook computer, a personal digital assistant (PDA), a palmtop computer, a netbook, an ultra-mobile personal computer (UMPC), a mobile Internet device (MID), an augmented reality (AR)/virtual reality (VR) device, a robot, a wearable device, vehicle user equipment (VUE), pedestrian user equipment (PUE), a smart-home device (a home device with wireless communication functions, such as a refrigerator, TV, washing machine or furniture), a game console, a personal computer (PC), a teller machine or a self-service machine; wearable devices include smart watches, smart bracelets, smart headphones, smart glasses, smart jewelry, etc.
  • the network side device 12 may include an access network device and/or a core network device, wherein the access network device 12 may also be called a radio access network device, a radio access network (Radio Access Network, RAN), a radio access network function or radio access network unit.
  • RAN Radio Access Network
  • the access network device 12 may include a base station, a WLAN access point, or a WiFi node, etc. The base station may be called a Node B, an evolved Node B (eNB), an access point, a base transceiver station (BTS), a radio base station, a radio transceiver, a basic service set (BSS), an extended service set (ESS), a home Node B, a home evolved Node B, a transmitting receiving point (TRP), or some other suitable term in the field; as long as the same technical effect is achieved, the base station is not limited to specific technical vocabulary. It should be noted that in the embodiments of the application, only the base station in the NR system is used as an example, and the specific type of the base station is not limited.
  • core network equipment may include, but is not limited to, at least one of the following: core network nodes, core network functions, a Mobility Management Entity (MME), an Access and Mobility Management Function (AMF), a Session Management Function (SMF), a User Plane Function (UPF), a Policy Control Function (PCF), a Policy and Charging Rules Function (PCRF), an Edge Application Server Discovery Function (EASDF), Unified Data Management (UDM), a Unified Data Repository (UDR), a Home Subscriber Server (HSS), Centralized Network Configuration (CNC), a Network Repository Function (NRF), a Network Exposure Function (NEF), a Local NEF (L-NEF), a Binding Support Function (BSF), an Application Function (AF), etc. It should be noted that, in the embodiments of the present application, only the core network device in the NR system is used as an example, and the specific type of the core network device is not limited.
  • FIG. 2 shows a schematic flowchart of a method for acquiring an image frame provided by an embodiment of the present application, and the method 200 may be executed by a first communication device.
  • the method may be performed by software or hardware installed on the first communication device.
  • the method may include the following steps.
  • the first communication device determines data transmission interruption information of the target video service, where the data transmission interruption information indicates transmission interruption of the target video service.
  • the first communication device may be a sending end device or a receiving end device of a target video service.
  • for example, an application program (APP) on the terminal obtains video service data from a content server over the wireless network through the modem of the UE; in this case, the first communication device can be the terminal or the content server.
  • as another example, the video service is transmitted between two terminals (that is, UE1 and UE2), and the modems of the UEs and the base station run a wireless transmission protocol to transmit the video service data; in this case, the first communication device can be UE1 or UE2.
  • the target video service includes but not limited to XR service.
  • the data transmission interruption information may include at least one of the following:
  • first indication information, where the first indication information is used to indicate that the transmission of the target video service is interrupted, for example because one or more of handover, a measurement gap, wireless link failure, beam failure, consistent listen-before-talk (Listen Before Talk, LBT) failure, or network congestion has interrupted the data transmission of the target video service being transmitted.
  • LBT Listen Before Talk
  • time information of the transmission interruption, where the time information includes at least one of the following: the duration, the start time point and the end time point of the transmission interruption; through this time information, the first communication device can determine the second image frames that were lost due to the interruption.
  • the wireless access protocol layer of the first communication device may obtain an identifier of an image frame lost due to transmission interruption, for example, a frame number.
  • the wireless access protocol layer of the first communication device may acquire frame numbers of image frames transmitted from the beginning of the measurement interval to the end of the measurement interval, that is, the lost image frames.
  • the lost image frame is a control frame (for example, an I frame), or the lost image frame is a non-control frame (for example, a B frame or a P frame).
  • an identifier of the second image frame, for example a frame number; with this identifier, image frames lost due to the transmission interruption can be identified.
  • the target image frame is the last correct image frame transmitted by the target video service before transmission interruption occurs.
  • the information of the target image frame includes but is not limited to at least one of the following: an identifier of the target image frame, and a type of the target image frame. Through the information of the target image frame, it can be determined that image frames subsequent to the target image frame are missing.
  • Target information is the information carried in the header of the IP data packet of the image frame of the target video service. Since the information carried in the packet headers of the image frames of the same target video service is the same, the service whose transmission is interrupted can be identified through the target information.
  • the target information includes at least one of the following: source IP address, target IP address, and service type.
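  The interruption-information fields listed above can be collected in a single structure. The following sketch is only one possible in-memory layout; all field names, units and the use of a Python dataclass are illustrative assumptions, not the patent's normative encoding.

```python
# Illustrative sketch (assumed names/units): the "data transmission
# interruption information" fields described in the embodiments.
from dataclasses import dataclass, field
from typing import Optional, List

@dataclass
class TransmissionInterruptionInfo:
    interrupted: bool = True                    # first indication information
    duration_ms: Optional[int] = None           # duration of the interruption
    start_ms: Optional[int] = None              # start time point
    end_ms: Optional[int] = None                # end time point
    lost_frame_count: Optional[int] = None      # number of second image frames
    lost_frame_types: List[str] = field(default_factory=list)  # e.g. ["I", "P"]
    lost_frame_ids: List[int] = field(default_factory=list)    # frame numbers
    last_correct_frame_id: Optional[int] = None  # target image frame identifier
    src_ip: Optional[str] = None                # target information: source IP
    dst_ip: Optional[str] = None                # target information: target IP
    service_type: Optional[str] = None          # target information: service type

# Example: a 40 ms interruption that lost frames 17-19; frame 16 was the
# last correctly transmitted image frame of the target video service.
info = TransmissionInterruptionInfo(duration_ms=40,
                                    lost_frame_ids=[17, 18, 19],
                                    last_correct_frame_id=16)
```

  Note that the patent makes each field optional ("at least one of the following"), which the `Optional` defaults reflect: a sender may report only the indication bit, only the time information, or any combination.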
  • the first communication device acquires a first image frame based on the data transmission interruption information.
  • the first communication device acquires the first image frame based on the data transmission interruption information, where the first image frame may be an image frame that has no dependency relationship with the second image frame; that is, the decoding of the first image frame does not depend on the second image frame. For example, the first image frame and the second image frame do not belong to the same video frame combination.
  • in this way, the first communication device can acquire, based on the data transmission interruption information, a first image frame that does not depend on the second image frame, thereby avoiding the useless data transmission that occurs when the decoding of newly generated image frames relies on the second image frame, and saving wireless resources.
  • the first communication device may control the image frame generation program of the target video service to regenerate the first image frame.
  • when the target video service is a downlink service, that is, when the first communication device is the receiving end of the target video service, the first communication device may notify the communication peer (that is, the sending end of the target video service); after receiving the notification, the communication peer can control the image frame generation program of the target video service, regenerate the first image frame, and send it to the first communication device.
  • for example, a transmission interruption of the first communication device occurs during transmission of an XR service.
  • because no data can be transmitted during the handover gap, some image frames of the XR service cannot be delivered within the delay budget and are lost; image frames whose successful decoding relies on these lost frames then become invalid data transmissions, and transmission only returns to normal once the next I frame is successfully transmitted.
  • for instance, if a frame that a B frame depends on is lost, the B frame cannot be successfully decoded by the receiver, so the transmission of that B frame is invalid.
  • as a result, multiple image frames in the subsequent transmission may all be invalid transmissions, which in turn lengthens the interruption of the XR service.
  • image frame transmission failures caused by measurement gaps, wireless re-establishment after wireless link failure, beam failure, or consistent LBT failure have an impact on XR services similar to that of image frame transmission failures during handover.
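  The cascade of invalid transmissions described above can be illustrated with a toy model. The assumption here is ours, not the patent's: every P/B frame depends (directly or transitively) on frames back to the most recent I frame, so every frame sent after a loss is undecodable until the next I frame arrives.

```python
# Toy illustration (assumed dependency model): frames sent after a loss are
# undecodable until the next I frame resets the dependency chain.

def undecodable_after_loss(stream, lost_index):
    """Return indices of frames that are transmitted but cannot be decoded."""
    wasted = []
    for i in range(lost_index + 1, len(stream)):
        if stream[i] == "I":        # next I frame restores decodability
            break
        wasted.append(i)
    return wasted

stream = ["I", "P", "B", "P", "B", "P", "I", "P", "B"]
print(undecodable_after_loss(stream, lost_index=1))
# → [2, 3, 4, 5]: four frames are transmitted in vain before the I frame at index 6
```

  In this model the wasted air-interface transmissions grow with the distance to the next I frame, which is why the embodiments instead regenerate a first image frame that does not depend on the lost frames.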
  • in an alternative solution in which the accumulated image frames are all transmitted after the interruption ends, the peer end is required to receive and process more image frames in a short period of time; this not only increases the peak throughput requirement on the air interface and the receiving end, but also imposes a multiplied requirement on the receiving end's wireless data reception processing and image decoding capabilities (the multiplier depends on the number and average size of all accumulated image frames).
  • if the receiving end is a UE, this obviously increases the cost of the UE and the difficulty and complexity of product design. Therefore, in the embodiments of the present application, when it is determined that the transmission of the target video service is interrupted, a first image frame whose decoding does not depend on the image frames lost due to the interruption is re-acquired; on the one hand, this avoids the transmission of invalid image frames, and on the other hand, it neither raises the peak throughput requirement of the air interface nor raises the peak requirements on wireless data reception processing and image decoding at the receiving end.
  • S212 may include: when the target video service is an uplink service, the first communication device performs at least one of the following:
  • the decoding of the first image frame does not depend on the second image frame
  • the image decoding of the newly generated first image frame does not depend on the lost second image frames, but may depend on correct image frames that were transmitted before the data transmission interruption of the target video service occurred.
  • the newly generated image frame includes one or more of I frame, P frame and B frame.
  • for example, in Figure 4a, four image frames are lost due to the transmission interruption during the handover window, and the newly generated image frames (that is, the image frames generated after the handover window) do not depend on the four lost image frames, but instead depend on correct image frames transmitted before the interruption.
  • the decoding of the first image frame does not depend on the third image frame
  • the third image frame is an image frame generated before the data transmission interruption information is determined.
  • the image decoding of the newly generated first image frame does not depend on the third image frames generated before the data transmission interruption information of the target video service was determined; the newly generated image frames include IDR frames, I frames, and P frames and B frames that depend on these I frames or IDR frames. That is to say, the decoding of the first image frame does not depend on any image frame generated before the transmission interruption.
  • a fourth image frame is an image frame whose decoding depends on the second image frame or on the third image frame.
  • the first communication device may stop generating image frames that depend on previously lost data packets or image frames, and may, for example, discard already-generated image frames that depend on lost frames or packets.
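  The uplink behaviour described above can be sketched as follows. This is a hedged illustration under assumed data structures (a frame queue of dicts with explicit dependency lists, which the patent does not specify): drop generated frames that depend on the lost frames, then restart with a fresh, independently decodable I frame.

```python
# Hedged sketch (assumed structures): on a transmission interruption, the
# uplink sender discards queued frames that depend on lost frames and
# regenerates a first image frame (here an I frame) with no dependencies.

def handle_uplink_interruption(encoder_queue, lost_ids):
    """Drop dependent frames from the queue and append a fresh I frame.

    `encoder_queue` is a list of dicts: {"id", "type", "deps": [frame ids]}.
    """
    lost = set(lost_ids)
    kept = [f for f in encoder_queue if not (set(f["deps"]) & lost)]
    next_id = max([f["id"] for f in encoder_queue] + list(lost)) + 1
    kept.append({"id": next_id, "type": "I", "deps": []})  # regenerated frame
    return kept

# Frames 20 and 21 both depend on lost frame 19, so both are discarded and
# replaced by a new, independently decodable I frame with id 22.
queue = [{"id": 20, "type": "P", "deps": [19]},
         {"id": 21, "type": "B", "deps": [18, 19]}]
cleaned = handle_uplink_interruption(queue, lost_ids=[19])
```

  A real encoder would restart its group-of-pictures structure rather than edit a queue, but the invariant is the same: nothing generated after the interruption may reference the second (lost) image frames.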
  • the application layer of the terminal can transmit the target video service with the content server or another terminal through the modem of the terminal through the wireless network, and the target video service is coded at the application layer to obtain the image frame to be transmitted.
  • the modem is used to transmit over the wireless network, and transmission interruptions are generally caused by the wireless network, so the modem of the terminal generally detects the interruption. Therefore, in a possible implementation of S210, the first communication device determining the data transmission interruption information of the target video service includes: the modem of the first communication device sends the data transmission interruption information to the application layer of the first communication device.
  • for example, when the modem of the first communication device determines that the terminal performs a wireless network handover, or that the terminal needs to enter a measurement gap, or that the terminal's wireless link fails and wireless re-establishment is performed, or that the terminal's beam fails, or that the terminal experiences consistent LBT failure, thereby causing the data transmission interruption of the target video service, the modem may obtain the above data transmission interruption information and send it to the application layer of the first communication device.
  • S212 may include: the application layer of the first communication device adjusts the generation program of the image frame of the target video service to obtain a newly generated first image frame.
  • for example, the application layer of the first communication device adjusts the encoding program with which the APP transmitting the target video service encodes it, re-encodes the target video service, and generates a new first image frame whose decoding does not depend on the second image frame.
  • the application layer of the first communication device may also stop generating the fourth image frame.
  • the application layer of the first communication device may also suspend the encoding of the target video service until the encoding is restarted after the transmission interruption ends.
  • the method may further include: the application layer of the first communication device sends feedback information to the modem of the first communication device, where the feedback information carries update information of the target video service; and the modem receives the feedback information.
  • through the feedback information, the modem can learn that the image frames to be sent subsequently do not depend on the second image frame.
  • the feedback information includes but is not limited to at least one of the following:
  • second indication information, where the second indication information is used to indicate that the image frame has been regenerated.
  • the method further includes at least one of the following:
  • discard untransmitted image frame data in the buffer; for example, the Packet Data Convergence Protocol (PDCP) layer and/or the Radio Link Control (RLC) layer discards untransmitted image frame data in its buffer.
  • packet data convergence protocol Packet Data Convergence Protocol
  • RLC Radio Link Control
  • cancel the data sending procedure of the target video service: if a data sending procedure of the target video service has been triggered or is in progress, cancel it. For example, cancel the Scheduling Request (SR) procedure and/or the Buffer Status Report (BSR) sending procedure triggered by XR data, and cancel the Hybrid Automatic Repeat Request (HARQ) transmission procedure used to transmit XR service data, etc.
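  The two cleanup actions above (flushing PDCP/RLC buffers, cancelling triggered SR/BSR/HARQ procedures) can be sketched together. The buffer and procedure representations below are invented for illustration; real PDCP/RLC entities and MAC procedures are far richer.

```python
# Illustrative sketch (assumed structures): flush stale image-frame data from
# the PDCP/RLC buffers and cancel pending SR/BSR/HARQ procedures tied to the
# interrupted video service.

def cleanup_after_regeneration(pdcp_buffer, rlc_buffer, pending_procedures):
    """Discard untransmitted frame data and cancel triggered send procedures."""
    dropped = len(pdcp_buffer) + len(rlc_buffer)
    pdcp_buffer.clear()                       # PDCP: drop untransmitted PDUs
    rlc_buffer.clear()                        # RLC: drop untransmitted segments
    cancellable = ("SR", "BSR", "HARQ")
    cancelled = [p for p in pending_procedures if p in cancellable]
    pending_procedures[:] = [p for p in pending_procedures
                             if p not in cancellable]
    return dropped, cancelled

pdcp, rlc = [b"frame17", b"frame18"], [b"frame19_seg"]
procs = ["SR", "BSR", "other"]
dropped, cancelled = cleanup_after_regeneration(pdcp, rlc, procs)
# dropped == 3; cancelled == ["SR", "BSR"]; procs == ["other"]
```

  The point of the sketch is the ordering: stale data is removed before the regenerated frames are queued, so no air-interface resources are spent on frames the receiver could never decode.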
  • the first communication device may determine the data transmission interruption information by receiving notification information sent by the second communication device. Therefore, in this possible implementation manner, S210 may include: the first communication device receives notification information sent by the second communication device, where the notification information indicates that the data transmission of the target video service is interrupted, and the The second communication device is the receiving end of the target video service. For example, the receiving end of the target video service may send the notification message to the first communication device if the data transmission of the target video service is interrupted due to switching of the wireless network.
  • since the application layer of the terminal transmits data over the wireless network through the terminal's modem, the above notification information may be sent to the originating end by the application layer (that is, the APP) of the receiving end via the relevant application-layer control messages for image-encoded transmission. Therefore, in a possible implementation manner, the first communication device receiving the notification information sent by the second communication device may include: the application layer of the first communication device receives the notification information sent by the application layer of the second communication device.
  • the acquisition of the newly generated first image frame by the first communication device based on the data transmission interruption information may include:
  • Step 1: when the target video service is a downlink service, the first communication device sends notification information to the second communication device, where the notification information indicates that the data transmission of the target video service is interrupted, and the second communication device is the sending end of the target video service;
  • Step 2: the first communication device receives the first image frame sent by the second communication device.
  • When the target video service is a downlink service, that is, when the first communication device is the receiving end of the target video service, if the first communication device determines that the transmission of the target video service is interrupted, it may notify the sending end of the target video service (that is, the second communication device), so that the sending end can generate a new first image frame whose decoding does not depend on the second image frame in the manner described above, and transmit it to the first communication device over the wireless network, thereby avoiding the transmission of invalid data.
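A minimal sketch of this downlink flow, with all names (`handle_downlink_interruption`, `Sender`) hypothetical: the receiving end notifies the sending end, which then generates a frame that depends on no lost frame (an I-frame is used here for simplicity, since its decoding references no other frame):

```python
# Hypothetical sketch of the downlink case: the receiver notifies the
# sender, and the sender returns a frame with no dependency on lost frames.
def handle_downlink_interruption(sender, interruption_info):
    notification = {"interrupted": True, **interruption_info}
    sender.on_notification(notification)        # step 1: notify the sender
    return sender.generate_independent_frame()  # step 2: receive the new frame

class Sender:
    def on_notification(self, note):
        # The sender records which frame was the last one received correctly.
        self.last_correct = note.get("last_correct_frame")

    def generate_independent_frame(self):
        # An I-frame references no other frame, so it is always decodable.
        return {"type": "I", "depends_on": []}

frame = handle_downlink_interruption(Sender(), {"last_correct_frame": 41})
print(frame["type"])  # I
```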
  • the notification information may include the foregoing data transmission interruption information, so that the second communication device may determine the second image frame.
  • Optionally, the first communication device sending the notification information to the second communication device may include: the application layer of the first communication device sends the notification information to the application layer of the second communication device.
  • The application layer of the first communication device may transmit the notification information using the existing application layer control messages for image coding transmission; the specific process is not repeated in this embodiment of the application.
  • Optionally, the network side device may configure and enable the first communication device to perform the above step of acquiring the newly generated first image frame when video service transmission is interrupted. Therefore, before the first communication device determines the data transmission interruption information of the target video service, the method may further include: the first communication device receives a configuration enable message sent by the network side device, where the configuration enable message is used to indicate that the transmission of the target video service is determined to be interrupted when a predetermined condition is met. That is to say, the network side device may configure the terminal to determine that the transmission of the target video service is interrupted when a predetermined condition is met, thereby triggering step S212.
  • the predetermined condition includes at least one of the following:
  • The interruption duration of the target video service transmission exceeds the first time threshold; that is, when the interruption duration exceeds the first time threshold, step S212 is triggered.
  • The lost image frames of the target video service exceed the second number threshold; that is, when the number of image frames lost by the target video service exceeds the second number threshold, step S212 is triggered.
  • The lost data packets of the target video service exceed the third threshold; that is, when the number of data packets lost by the target video service exceeds the third threshold, step S212 is triggered.
  • the transmission of the target image frame of the target video service fails, wherein the target image frame is an image frame of a predetermined type; for example, the type of the target image frame is an I frame, a P frame or a B frame.
  • a handover command is received; for example, a handover command sent by the source base station is received, and step S212 is triggered.
  • the handover is completed after the handover command is received; for example, after the handover command is received, the handover process is executed to complete the handover, and step S212 is triggered.
  • Network congestion is relieved after congestion occurs; for example, the network side device may send indication information to the first communication device when the congestion is relieved after network congestion occurs, and step S212 is triggered.
  • the measurement gap starts; that is, when the first communication device needs to start measuring, step S212 is triggered.
  • the transmission delay of the image frame exceeds the pre-configured maximum transmission delay budget. For example, if a certain image frame is transmitted, but the transmission delay of the image frame exceeds the preconfigured maximum transmission delay budget, it is considered that the image frame is not transmitted correctly, and a transmission interruption occurs, and step S212 is triggered.
  • A request for adjusting image coding sent by the network side device is received; for example, the base station requests the first communication device to adjust its image coding, and step S212 is triggered.
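The predetermined conditions above can be modelled as a simple disjunction; the field names and threshold values below are illustrative assumptions, not values defined by this application:

```python
# Hedged sketch: evaluating the configured predetermined conditions that
# trigger step S212 (determining that transmission is interrupted).
def transmission_interrupted(state, cfg):
    return any([
        state["interruption_duration"] > cfg["first_time_threshold"],
        state["lost_frames"] > cfg["second_number_threshold"],
        state["lost_packets"] > cfg["third_threshold"],
        state["target_frame_failed"],           # e.g. an I frame failed
        state["handover_command_received"],
        state["frame_delay"] > cfg["max_delay_budget"],
        state["coding_adjust_requested"],
    ])

cfg = {"first_time_threshold": 100, "second_number_threshold": 3,
       "third_threshold": 10, "max_delay_budget": 50}
state = {"interruption_duration": 0, "lost_frames": 5, "lost_packets": 0,
         "target_frame_failed": False, "handover_command_received": False,
         "frame_delay": 0, "coding_adjust_requested": False}
print(transmission_interrupted(state, cfg))  # True (5 lost frames > 3)
```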
  • Optionally, the first communication device receiving the configuration enable message sent by the network side device includes: the modem of the first communication device receives the configuration enable message sent by the network side device.
  • the network side device includes but is not limited to a base station.
  • An adaptation layer can be added between the Radio Resource Control (RRC) layer and the application layer; the adaptation layer can be located in the operating system of the UE, or serve as a sublayer of the application layer or of the wireless protocol stack.
  • The function of the adaptation layer is to rewrite the data transmission interruption information sent by the RRC layer of the modem to the application layer into a format that the application layer can parse, and to rewrite the target video service information sent by the application layer to the RRC layer into a format that the RRC layer can parse.
  • Alternatively, the adaptation layer provides functions for the application layer and the RRC layer to call: the application layer calls these functions to obtain data transmission interruption information from the RRC layer or to send adjusted target video service information to the RRC layer, and the RRC layer calls them to send data transmission interruption information to the application layer or to obtain adjusted target video service information from the application layer.
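As an illustration of the adaptation layer's translation role, the sketch below converts between a dictionary (standing in for the RRC layer's internal representation) and a key=value string (standing in for a format the application layer can parse); both formats are invented for the example:

```python
# Illustrative adaptation layer between the RRC layer and the application
# layer: it rewrites messages in each direction into the other side's format.
class AdaptationLayer:
    def rrc_to_app(self, rrc_info: dict) -> str:
        # The application layer consumes a simple key=value string.
        return ";".join(f"{k}={v}" for k, v in rrc_info.items())

    def app_to_rrc(self, app_msg: str) -> dict:
        # The RRC layer consumes a dictionary.
        return dict(pair.split("=", 1) for pair in app_msg.split(";"))

adapt = AdaptationLayer()
msg = adapt.rrc_to_app({"interrupted": "1", "last_frame": "41"})
print(adapt.app_to_rrc(msg))  # {'interrupted': '1', 'last_frame': '41'}
```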
  • The RRC layer of the radio access network protocol stack in the modem can be used to handle the interaction with the application layer. It is also possible to introduce another layer on top of the RRC layer, namely an APP Control layer, as a sublayer of layer 3, for handling interaction with the application layer.
  • In this way, the first image frame, whose decoding does not depend on the image frames lost due to the transmission interruption, can be re-acquired. On the one hand, this avoids the transmission of invalid image frames; on the other hand, it does not increase the peak throughput requirement of the air interface, nor the peak wireless data reception processing and image decoding capability requirements of the receiving end.
  • FIG. 6 shows another schematic flow chart of the method for acquiring an image frame provided by the embodiment of the present application. As shown in FIG. 6 , the method mainly has the following steps.
  • the base station configures and enables the UE to start a procedure of notifying an application layer to adjust image frame generation when XR service data transmission is interrupted.
  • the base station may send a configuration and enabling message to the UE, and the message may include at least one of the following:
  • A radio link failure occurs, or the radio link is successfully re-established after a radio link failure; that is, when a radio link failure occurs, or when the radio link is successfully re-established after a failure, the procedure for notifying the application layer to adjust image frame generation is started;
  • Network congestion is relieved after congestion occurs; that is, when the network congestion is relieved, the procedure for notifying the application layer to adjust image frame generation is started.
  • S601 is an optional step.
  • the UE modem determines the occurrence of XR service transmission interruption and related information according to the configuration or autonomously.
  • the modem of the UE sends the transmission interruption information of the XR service to the application layer through the interaction mechanism between the modem of the UE and the application (APP) layer.
  • the XR service data transmission interruption information sent by the modem of the UE to the application layer includes at least one of the following:
  • Time information of the XR service transmission interruption, including the duration, start time and/or end time of the XR service transmission interruption;
  • The sequence number of the last correctly transmitted image frame, where the sequence number may be a sequence number offset relative to a previous I-frame or to the next expected I-frame;
  • The IP header of the IP packet of the image frame, including the source IP address, destination IP address, service type and other information.
  • Optionally, the XR service data transmission interruption information can be defined/referred to as a reset image frame coding request, which is used to request the application layer to generate new image frames that do not depend on some or all of the image frames generated before the reset image frame coding request is received.
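The interruption information (reset image frame coding request) could be represented as a simple structure like the following; the field names are assumptions derived from the list above, not identifiers defined by this application:

```python
# Illustrative data structure for the XR service transmission-interruption
# information (a "reset image frame coding request").
from dataclasses import dataclass
from typing import Optional

@dataclass
class ResetImageFrameCodingRequest:
    duration_ms: Optional[int] = None            # duration of the interruption
    start_ms: Optional[int] = None               # interruption start time
    end_ms: Optional[int] = None                 # interruption end time
    last_correct_frame_sn: Optional[int] = None  # may be an offset from an I-frame
    src_ip: Optional[str] = None                 # from the IP header
    dst_ip: Optional[str] = None
    service_type: Optional[str] = None

req = ResetImageFrameCodingRequest(duration_ms=80, last_correct_frame_sn=41)
print(req.last_correct_frame_sn)  # 41
```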
  • the application layer of the UE adjusts the generation of the image frame according to the received XR service transmission interruption information, so that the newly generated image frame can be decoded by the receiving end without depending on the lost image frame.
  • the application layer of the UE may perform at least one of the following options:
  • Option a: regenerate the image frame; image decoding of the newly generated image frame does not depend on the image frames lost due to the XR service data transmission interruption, but may depend on image frames transmitted correctly before the interruption occurred; the newly generated image frames include one or more of I frames, P frames and B frames;
  • Option b: regenerate the image frame; image decoding of the newly generated image frame does not depend on any image frame generated before the XR service data transmission interruption message was received; the newly generated image frames include IDR frames, I frames, and P frames and B frames that depend on these I frames or IDR frames;
  • Option c: stop generating image frames that depend on previously lost data packets or image frames, including discarding already generated frames that depend on lost frames/packets;
  • Option d: suspend encoding until the transmission interruption ends, then restart encoding.
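The four options can be sketched against a toy encoder model; in a real codec, options a and b would map onto reference-picture management and requesting an I/IDR frame, and all names below are hypothetical:

```python
# Toy encoder model illustrating options a-d for adjusting frame generation.
class Encoder:
    def __init__(self):
        self.reference_frames = [10, 11, 12]  # frame ids usable as references
        self.paused = False

    def option_a(self, lost_ids):
        # Keep only references transmitted correctly before the interruption.
        self.reference_frames = [f for f in self.reference_frames
                                 if f not in lost_ids]
        return {"type": "P", "refs": self.reference_frames}

    def option_b(self):
        # Depend on nothing generated before the interruption message.
        self.reference_frames = []
        return {"type": "IDR", "refs": []}

    def option_c(self, lost_ids, generated):
        # Discard already generated frames that depend on lost frames/packets.
        return [g for g in generated if not (set(g["refs"]) & set(lost_ids))]

    def option_d(self):
        # Suspend encoding until the interruption ends.
        self.paused = True

enc = Encoder()
print(enc.option_a({12}))  # {'type': 'P', 'refs': [10, 11]}
```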
  • FIG. 4a and FIG. 4b show schematic diagrams of the dependency relationships between the image frames regenerated by the APP layer of the UE when the XR service data transmission interruption is caused by handover (interruptions caused by the other factors mentioned above are similar).
  • In one case, the newly generated image frame depends on XR data packets transmitted correctly before the handover occurred (corresponding to option a). This case applies when the modem of the UE can directly or indirectly notify the application layer of the sequence numbers of the lost image frames.
  • In the other case, the newly generated image frame does not depend on any image frame generated before the handover is completed (that is, before the APP receives the XR service data transmission interruption message), while the dependency relationships among the newly generated frames themselves are not restricted.
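The dependency rule illustrated by Fig. 4a can be checked mechanically: a regenerated frame may reference only frames transmitted correctly before the interruption, never a lost frame. A sketch under these assumptions (all names invented):

```python
# Check that a regenerated frame's references satisfy the option-a rule:
# disjoint from the lost frames and contained in the correctly sent frames.
def valid_regenerated_frame(frame_refs, lost_frames, correct_frames):
    refs = set(frame_refs)
    return refs.isdisjoint(lost_frames) and refs <= set(correct_frames)

correct = {1, 2, 3}   # frames transmitted correctly before the handover
lost = {4, 5}         # frames lost during the interruption
print(valid_regenerated_frame([2, 3], lost, correct))  # True
print(valid_regenerated_frame([3, 4], lost, correct))  # False
```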
  • the application layer of the UE sends the adjusted XR service information to the modem of the UE.
  • This step is optional.
  • The UE's application layer can send feedback information to the UE's modem carrying the update information of the XR service, including whether the image frame is regenerated and the dependency relationships between the newly generated frames and previously generated frames (for example, whether they depend on previously generated frames).
  • the modem of the UE reports all or part of the received XR service information to the base station.
  • The PDCP/RLC layer discards the untransmitted image frame data in the buffer, and/or cancels the uplink XR data transmission procedures, including the SR/BSR procedures, HARQ transmissions for transmitting XR service data, etc.
  • For the downlink XR service, after the application layer of the UE receives the XR service data transmission interruption information from the modem of the UE, in S604 the application layer of the UE needs to notify the peer end of the XR service (the peer XR transmitter for short) to regenerate the image frame.
  • the application layer of the UE can realize the related process according to the existing application layer control message of relevant image coding transmission, which will not be repeated here.
  • After receiving the response from the peer end, the application layer of the UE can execute the above S605 and S606.
  • In this way, when the modem of the UE determines that the transmission of the target video service is interrupted, it notifies the application layer to re-acquire the first image frame, whose decoding does not depend on the image frames lost due to the transmission interruption. On the one hand, this avoids the transmission of invalid image frames; on the other hand, it does not increase the peak throughput requirement of the air interface, nor the peak wireless data reception processing and image decoding capability requirements of the receiving end.
  • the image frame acquisition method provided in the embodiment of the present application may be executed by an image frame acquisition device.
  • the method for obtaining the image frame performed by the device for obtaining the image frame is taken as an example to describe the device for obtaining the image frame provided in the embodiment of the present application.
  • FIG. 7 shows a schematic structural diagram of an image frame acquisition device provided by an embodiment of the present application.
  • the device 700 mainly includes: a determination module 701 and an acquisition module 702 .
  • the determining module 701 is configured to determine the data transmission interruption information of the target video service, wherein the data transmission interruption information indicates that the transmission of the target video service is interrupted; the obtaining module 702 is configured to determine based on the The data transmission interruption information is to obtain the first image frame, wherein the data transmission interruption information includes at least one of the following:
  • First indication information, where the first indication information is used to indicate that the transmission of the target video service is interrupted;
  • Time information of the transmission interruption, where the time information includes at least one of the following: the duration of the transmission interruption, the start time point of the transmission interruption, and the end time point of the transmission interruption;
  • Information of a target image frame, where the target image frame is the last correct image frame transmitted by the target video service before the transmission interruption occurs;
  • Target information, where the target information is the information carried in the header of the IP data packet of the image frame of the target video service.
  • the information of the target image frame includes at least one of the following: an identifier of the target image frame, and a type of the target image frame.
  • the target information includes at least one of the following: a source IP address, a target IP address, and a service type.
  • Optionally, the acquisition module 702 acquiring the newly generated first image frame includes: the acquisition module 702 performs at least one of the following:
  • The decoding of the first image frame does not depend on the second image frame, or the decoding of the first image frame does not depend on the third image frame, where the third image frame is an image frame generated before the data transmission interruption information is determined;
  • Stop generating a fourth image frame; optionally, the decoding of the fourth image frame depends on the second image frame or on the third image frame;
  • the encoding of the target video service is suspended until the encoding is restarted after the transmission interruption ends.
  • the obtaining module 702 stops generating the fourth image frame, including: discarding the generated fourth image frame.
  • the determination module 701 determines the data transmission interruption information of the target video service, including:
  • the notification information sent by the second communication device is received, wherein the notification information indicates that the data transmission of the target video service is interrupted, and the second communication device is a receiving end of the target video service.
  • the acquisition module 702 acquires the newly generated first image frame based on the data transmission interruption information, including:
  • Notification information is sent to a second communication device, where the notification information indicates that the data transmission of the target video service is interrupted, and the second communication device is the sending end of the target video service;
  • Optionally, the apparatus further includes: a receiving module, configured to receive a configuration enable message sent by a network side device, where the configuration enable message is used to indicate that the data transmission of the target video service is determined to be interrupted when a predetermined condition is met.
  • the predetermined condition includes at least one of the following:
  • the interruption duration of the target video service transmission exceeds the first time threshold
  • the image frames lost by the target video service exceed the second number threshold
  • the lost data packets of the target video service exceed a third threshold
  • the transmission of the target image frame of the target video service fails, wherein the target image frame is an image frame of a predetermined type
  • the transmission delay of the image frame exceeds the pre-configured maximum transmission delay budget
  • a request for adjusting image encoding sent by the network side device is received.
  • the image frame acquisition apparatus in the embodiment of the present application may be an electronic device, such as an electronic device with an operating system, or a component in the electronic device, such as an integrated circuit or a chip.
  • the electronic device may be a terminal, or other devices other than the terminal.
  • the terminal may include, but not limited to, the types of terminal 11 listed above, and other devices may be servers, Network Attached Storage (NAS), etc., which are not specifically limited in this embodiment of the present application.
  • the image frame acquisition device provided in the embodiment of the present application can implement the various processes implemented by the first communication device in the method embodiments in FIG. 2 to FIG. 6 , and achieve the same technical effect. To avoid repetition, details are not repeated here.
  • This embodiment of the present application also provides a communication device 800, including a processor 801 and a memory 802, where the memory 802 stores programs or instructions that can run on the processor 801. For example, when the communication device 800 is a terminal or a content server and the program or instruction is executed by the processor 801, each step of the above method for obtaining an image frame can be implemented, with the same technical effect; to avoid repetition, details are not repeated here.
  • the embodiment of the present application also provides a terminal, including a processor and a communication interface, the processor is used to implement the steps of the above image frame acquisition method embodiment, and the communication interface is used to communicate with an external communication device.
  • This terminal embodiment corresponds to the above-mentioned method embodiment, and each implementation process and implementation mode of the above-mentioned method embodiment can be applied to this terminal embodiment, and can achieve the same technical effect.
  • FIG. 9 is a schematic diagram of a hardware structure of a terminal implementing an embodiment of the present application.
  • The terminal 900 includes, but is not limited to, at least some of the following components: a radio frequency unit 901, a network module 902, an audio output unit 903, an input unit 904, a sensor 905, a display unit 906, a user input unit 907, an interface unit 908, a memory 909, and a processor 910.
  • The terminal 900 can also include a power supply (such as a battery) for supplying power to the various components, and the power supply can be logically connected to the processor 910 through a power management system, so as to manage functions such as charging, discharging, and power consumption through the power management system.
  • the terminal structure shown in FIG. 9 does not constitute a limitation on the terminal, and the terminal may include more or fewer components than shown in the figure, or combine some components, or arrange different components, which will not be repeated here.
  • The input unit 904 may include a graphics processing unit (GPU) 9041 and a microphone 9042, where the graphics processor 9041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the display unit 906 may include a display panel 9061, and the display panel 9061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like.
  • the user input unit 907 includes at least one of a touch panel 9071 and other input devices 9072.
  • the touch panel 9071 is also called a touch screen.
  • the touch panel 9071 may include two parts, a touch detection device and a touch controller.
  • Other input devices 9072 may include, but are not limited to, physical keyboards, function keys (such as volume control buttons, switch buttons, etc.), trackballs, mice, and joysticks, which will not be repeated here.
  • the radio frequency unit 901 may transmit the downlink data from the network side device to the processor 910 for processing after receiving the downlink data; in addition, the radio frequency unit 901 may send the uplink data to the network side device.
  • the radio frequency unit 901 includes, but is not limited to, an antenna, an amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the memory 909 can be used to store software programs or instructions as well as various data.
  • the memory 909 may mainly include a first storage area for storing programs or instructions and a second storage area for storing data, wherein the first storage area may store an operating system, an application program or instructions required by at least one function (such as a sound playing function, image playback function, etc.), etc.
  • memory 909 may include volatile memory or nonvolatile memory, or, memory 909 may include both volatile and nonvolatile memory.
  • The non-volatile memory can be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM) or a flash memory.
  • The volatile memory can be a random access memory (RAM), a static random access memory (SRAM), a dynamic random access memory (DRAM), a synchronous dynamic random access memory (SDRAM), a double data rate synchronous dynamic random access memory (DDR SDRAM), an enhanced synchronous dynamic random access memory (ESDRAM), a synchlink dynamic random access memory (SLDRAM) or a direct rambus random access memory (DRRAM).
  • The processor 910 may include one or more processing units; optionally, the processor 910 integrates an application processor and a modem processor, where the application processor mainly handles operations related to the operating system, user interface, and application programs, and the modem processor mainly handles wireless communication signals, such as a baseband processor. It can be understood that the modem processor may not be integrated into the processor 910.
  • The processor 910 is configured to: determine data transmission interruption information of a target video service, where the data transmission interruption information indicates that the transmission of the target video service is interrupted; and acquire the first image frame based on the data transmission interruption information, where the data transmission interruption information includes at least one of the following:
  • First indication information, where the first indication information is used to indicate that the transmission of the target video service is interrupted;
  • Time information of the transmission interruption, where the time information includes at least one of the following: the duration of the transmission interruption, the start time point of the transmission interruption, and the end time point of the transmission interruption;
  • Information of a target image frame, where the target image frame is the last correct image frame transmitted by the target video service before the transmission interruption occurs;
  • Target information, where the target information is the information carried in the header of the Internet Protocol (IP) data packet of the image frame of the target video service.
  • Optionally, when the target video service is an uplink service, the processor 910 acquiring the newly generated first image frame includes performing at least one of the following:
  • The decoding of the first image frame does not depend on the second image frame, or the decoding of the first image frame does not depend on the third image frame, where the third image frame is an image frame generated before the data transmission interruption information is determined;
  • Stop generating a fourth image frame; optionally, the decoding of the fourth image frame depends on the second image frame or on the third image frame;
  • the encoding of the target video service is suspended until the encoding is restarted after the transmission interruption ends.
  • In this way, the newly generated first image frame can be obtained, and its decoding does not depend on the second image frame lost due to the interruption of data transmission, thereby avoiding the transmission of useless data that would result from a newly generated image frame whose decoding depends on the second image frame, and saving wireless resources.
  • the embodiment of the present application also provides a network side device, including a processor and a communication interface, the processor is used to implement the steps of the image frame acquisition method above, and the communication interface is used to communicate with an external communication device.
  • the network-side device embodiment corresponds to the above-mentioned method embodiment, and each implementation process and implementation mode of the above-mentioned method embodiment can be applied to this network-side device embodiment, and can achieve the same technical effect.
  • the embodiment of the present application also provides a network side device.
  • the network side device 1000 includes: a processor 1001 , a network interface 1002 and a memory 1003 .
  • the network interface 1002 is, for example, a common public radio interface (common public radio interface, CPRI).
  • The network-side device 1000 in the embodiment of the present application further includes: instructions or programs stored in the memory 1003 and executable on the processor 1001, and the processor 1001 invokes the instructions or programs in the memory 1003 to execute the methods performed by the modules described above, achieving the same technical effect; to avoid repetition, details are not repeated here.
  • the embodiment of the present application also provides a readable storage medium, the readable storage medium stores a program or an instruction, and when the program or instruction is executed by the processor, each process of the above-mentioned image frame acquisition method embodiment is implemented, and can To achieve the same technical effect, in order to avoid repetition, no more details are given here.
  • the processor is the processor in the terminal described in the foregoing embodiments.
  • the readable storage medium includes a computer-readable storage medium, such as a computer read-only memory ROM, a random access memory RAM, a magnetic disk or an optical disk, and the like.
  • The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is used to run programs or instructions to implement the method for acquiring image frames described above.
  • The chip mentioned in the embodiment of the present application may also be called a system-on-chip, a chip system, or a system-on-a-chip.
  • An embodiment of the present application further provides a computer program/program product, where the computer program/program product is stored in a storage medium and is executed by at least one processor to implement the processes of the above image frame acquisition method embodiment, achieving the same technical effect; to avoid repetition, details are not repeated here.
  • It should be noted that, in this document, the terms "comprising", "including" or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article or apparatus comprising a set of elements includes not only those elements but also other elements not expressly listed, or elements inherent in the process, method, article or apparatus. Without further limitation, an element defined by the phrase "comprising a..." does not preclude the presence of additional identical elements in the process, method, article, or apparatus comprising that element.
  • the scope of the methods and devices in the embodiments of the present application is not limited to performing functions in the order shown or discussed; depending on the functions involved, functions may also be performed in a substantially simultaneous manner or in reverse order. For example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
  • the methods of the above embodiments can be implemented by means of software plus a necessary general-purpose hardware platform, or of course by hardware alone, but in many cases the former is the better implementation.
  • the technical solution of the present application can be embodied in the form of a computer software product stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc), including several instructions to cause a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods described in the various embodiments of the present application.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

The present application belongs to the technical field of wireless communications. Disclosed are an image frame acquisition method and apparatus, and a communication device. The image frame acquisition method in the embodiments of the present application comprises the following steps: a first communication device determines data transmission interruption information of a target video service, the data transmission interruption information indicating that transmission of the target video service is interrupted; and the first communication device acquires a first image frame on the basis of the data transmission interruption information.
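The two claimed steps — determining data transmission interruption information of a target video service, then acquiring a first image frame based on that information — can be illustrated with a minimal sketch. This is not the application's implementation: the class and field names (`InterruptionInfo`, `FirstCommunicationDevice`, the link-quality threshold, the I-frame request) are hypothetical assumptions chosen only to make the two steps concrete.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class InterruptionInfo:
    """Data transmission interruption information for a target video service."""
    service_id: int
    interrupted: bool


class FirstCommunicationDevice:
    """Illustrative first communication device (all names hypothetical)."""

    def __init__(self, link_quality_threshold: float = 0.2):
        # Below this (assumed) link quality, the video service is
        # treated as interrupted.
        self.link_quality_threshold = link_quality_threshold

    def determine_interruption(self, service_id: int,
                               link_quality: float) -> InterruptionInfo:
        # Step 1: determine the data transmission interruption
        # information of the target video service.
        return InterruptionInfo(
            service_id=service_id,
            interrupted=link_quality < self.link_quality_threshold,
        )

    def acquire_first_image_frame(self,
                                  info: InterruptionInfo) -> Optional[dict]:
        # Step 2: acquire a first image frame based on the interruption
        # information — here sketched as requesting a full intra-coded
        # frame (I-frame), so decoding can resume without the lost
        # reference frames.
        if not info.interrupted:
            return None
        return {"service_id": info.service_id, "frame_type": "I-frame"}
```

Under these assumptions, a link quality of 0.1 (below the 0.2 threshold) would mark the service as interrupted and trigger acquisition of an I-frame, while a healthy link would acquire nothing.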
PCT/CN2022/141148 2021-12-24 2022-12-22 Image frame acquisition method and apparatus, and communication device WO2023116836A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111605205.9 2021-12-24
CN202111605205.9A CN116347536A (zh) 2021-12-24 2021-12-24 Image frame acquisition method and apparatus, and communication device

Publications (1)

Publication Number Publication Date
WO2023116836A1 true WO2023116836A1 (fr) 2023-06-29

Family

ID=86890303

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/141148 WO2023116836A1 (fr) 2021-12-24 2022-12-22 Image frame acquisition method and apparatus, and communication device

Country Status (2)

Country Link
CN (1) CN116347536A (fr)
WO (1) WO2023116836A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005260541A (ja) * 2004-03-11 2005-09-22 Matsushita Electric Ind Co Ltd Video display device and video display method
CN108924485A (zh) * 2018-06-29 2018-11-30 四川斐讯信息技术有限公司 Method and system for handling interruption of client real-time video stream, and monitoring system
CN111246306A (zh) * 2020-03-09 2020-06-05 歌尔科技有限公司 Playback control method, playback device switching method, playback device, and smart terminal
CN113613346A (zh) * 2021-07-12 2021-11-05 深圳Tcl新技术有限公司 Network connection method and apparatus, storage medium, and electronic device


Also Published As

Publication number Publication date
CN116347536A (zh) 2023-06-27

Similar Documents

Publication Publication Date Title
CN112753275B (zh) Nr-u lbt mac过程
CN107483710B (zh) 在无线通信系统中为终端提供服务的方法和设备
TWI477166B (zh) 於通信網路中用於節制協定引發後移之裝置與方法
US20190021134A1 (en) User plane optimization for narrowband internet of things
WO2018196855A1 (fr) Beam recovery processing method and terminal
WO2022184044A1 (fr) Multicast service receiving method, multicast service configuration method, terminal, and network-side device
WO2023040893A1 (fr) Consistent LBT failure processing method and apparatus, terminal, and network-side device
WO2021057526A1 (fr) Disaster recovery method for gateway device, and communication device
KR20230079275A (ko) 데이터 전송 방법, 장치, 단말, 네트워크 측 기기 및 저장 매체
WO2022206612A1 (fr) Communication path switching method and apparatus, and terminal
WO2022028473A1 (fr) Data transmission type configuration method and terminal
WO2023116836A1 (fr) Image frame acquisition method and apparatus, and communication device
WO2022262865A1 (fr) Gap processing method and apparatus, terminal, and readable storage medium
WO2023066107A1 (fr) Data transmission method and apparatus, and terminal
WO2022199482A1 (fr) Uplink transmission control method and apparatus, and terminal
WO2023280014A1 (fr) Unicast sidelink communication method and apparatus, and terminal
WO2022127703A1 (fr) Mobility procedure management method and related device
WO2022078394A1 (fr) Multicast service transmission method and apparatus, and communication device
US20230071861A1 (en) Data transmission methods and communication device
WO2023066106A1 (fr) Data discarding method and apparatus, terminal, and network-side device
WO2022152129A1 (fr) Method and apparatus for triggering uplink data transmission enhancement process, and terminal
WO2023098799A1 (fr) Information transmission method and apparatus, terminal, and network-side device
WO2024017134A1 (fr) Data packet processing method, terminal, and network-side device
WO2023066114A1 (fr) Data processing method and apparatus, and terminal
WO2023241445A1 (fr) Data packet set time delay processing method and apparatus, and communication device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22910147

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE