WO2023039890A1 - Method for video transmission and electronic device - Google Patents

Method for video transmission and electronic device

Info

Publication number
WO2023039890A1
Authority
WO
WIPO (PCT)
Prior art keywords
frame
decoder
encoder
target
target frame
Prior art date
Application number
PCT/CN2021/119379
Other languages
English (en)
French (fr)
Inventor
刘康
刘国恩
燕慧智
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to PCT/CN2021/119379
Publication of WO2023039890A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L1/00: Arrangements for detecting or preventing errors in the information received
    • H04L1/12: Arrangements for detecting or preventing errors in the information received by using return channel
    • H04L1/16: Arrangements for detecting or preventing errors in the information received by using return channel in which the return channel carries supervisory signals, e.g. repetition request signals

Definitions

  • the present application relates to the field of electronic technology, and more specifically, to a video transmission method and electronic equipment.
  • In the scenario of video signal transmission in the wireless communication field, feedback is based on the transmission control protocol/internet protocol (TCP/IP): the encoder accesses the channel and sends data in layers, and the decoder receives the data and, after a certain delay, accesses the channel to feed back its receiving status.
  • The encoder can initiate a short-frame exchange of request to send/clear to send (RTS/CTS), or null data/acknowledgment (NULL/ACK), to protect a transmission opportunity (TXOP) sequence. The encoder can then continuously transmit multiple frames of code-stream data within the TXOP, with the acknowledgment policy field of the medium access control (MAC) header of each data frame set to block ack. After the last frame of code-stream data, the encoder sends a block ack request (BAR) frame to request feedback information for all the code-stream data.
  • In this delayed-acknowledgment mode, the encoder can only send the BAR frame requesting feedback information to the decoder after it has successfully sent multiple frames of code-stream data, so the encoder cannot obtain the decoder's receiving status in real time.
  • the state between the encoder and the decoder cannot be synchronized.
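The baseline block-ack flow described above can be sketched as a toy model (all class and method names here are illustrative, not part of any real MAC stack):

```python
# Toy model of the baseline delayed-acknowledgment (block ack) flow.
# Class and method names are illustrative, not from a real MAC stack.

class Decoder:
    def __init__(self):
        self.received = []            # indices of code-stream frames received

    def on_data(self, frame_idx):
        # Ack policy is "block ack": no per-frame acknowledgment is sent.
        self.received.append(frame_idx)

    def on_bar(self):
        # Only when the BAR arrives does the decoder report its status.
        return set(self.received)     # block-ack bitmap, modeled as a set

class Encoder:
    def __init__(self, decoder):
        self.decoder = decoder

    def send_txop(self, n_frames):
        # An RTS/CTS (or NULL/ACK) exchange would protect the TXOP here.
        for i in range(n_frames):
            self.decoder.on_data(i)
        # Feedback is requested only after ALL frames: delayed confirmation.
        return self.decoder.on_bar()

dec = Decoder()
status = Encoder(dec).send_txop(4)    # status surfaces only at the very end
```

The point of the sketch is that the decoder's status surfaces only when the BAR arrives after the whole burst, which is exactly why the encoder cannot track the receiving status in real time.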
  • the present application provides a video transmission method and electronic equipment, which can reduce the delay of decoder feedback information and synchronize the states of the encoder and decoder.
  • the encoder generates a video frame, the video frame includes a plurality of image blocks, and each image block in the plurality of image blocks is sent within a respective specified period;
  • the encoder sends a target frame to the decoder, and the target frame is used to request feedback information from the decoder.
  • the fact that the first image block is not completely sent at the end of the corresponding specified period can be understood as: before the end of the specified period corresponding to the first image block, some of its data has not been sent to the decoder.
  • The encoder sends a target frame requesting feedback information to the decoder after determining that the first image block among the plurality of image blocks of the video frame is not completely sent at the end of the corresponding specified period. Because the encoder actively sends the target frame at that moment to request feedback from the decoder, it does not need to wait until multiple video frames have been sent before sending a BAR frame requesting feedback. This reduces the delay of the decoder's feedback, so that the encoder can synchronize the states of the encoder and decoder according to the received feedback information, or learn the decoder's synchronization state for the received video frames.
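The deadline check at the heart of this behavior might look like the following sketch (the function and the toy channel are hypothetical, introduced only for illustration):

```python
# Sketch of the deadline check: if an image block is not fully sent by the
# end of its specified period, the encoder emits a target frame requesting
# immediate feedback. All names here are illustrative.

def send_video_frame(blocks, send_within_period):
    """Send each image block within its specified period. If a block is not
    completely sent by the end of its period, trigger a target frame."""
    for idx, block in enumerate(blocks):
        sent = send_within_period(block)   # bytes the channel managed to send
        if sent < len(block):
            return idx, "target_frame"     # request feedback immediately
    return None, "done"

# Toy channel: at most 3 bytes fit into one specified period.
frame = [b"abc", b"defgh", b"ij"]          # the 5-byte block cannot finish
idx, action = send_video_frame(frame, lambda b: min(3, len(b)))
```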
  • the target frame is a detection frame
  • the detection frame is used to instruct the decoder to feed back the state of the image block of the video frame received by the decoder
  • the feedback information is used to characterize the state of the video frame received by the decoder
  • the target frame is a synchronization frame
  • the synchronization frame is used to instruct the decoder to perform synchronization processing on the received video frame
  • the feedback information is used to represent the synchronization status of the decoder on the received video frame.
  • Depending on the type of the target frame the encoder sends to the decoder, the content the target frame indicates differs, and so does the content represented by the feedback information the decoder sends back to the encoder.
  • the target frame is a detection frame used to instruct the decoder to feed back the state of the image block of the video frame received by the decoder
  • the feedback information is used to represent the state of the video frame received by the decoder;
  • the target frame is a synchronization frame used to instruct the decoder to perform synchronization processing on the received video frame
  • the feedback information is used to represent the synchronization state of the decoder on the received video frame.
  • In both cases the purpose is to synchronize the state of the video frames on the encoder side and the decoder side; the difference lies in which entity performs the synchronization (that is, if the target frame is a detection frame, the encoder synchronizes the state of the video frames on both sides; if the target frame is a synchronization frame, the decoder does). This improves the flexibility of synchronizing the state of the video frames between the encoder and the decoder.
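The split of responsibility can be modeled roughly as follows (a hypothetical dispatcher; the patent does not prescribe any such data structure):

```python
# Which side performs the synchronization depends on the target-frame type.
# This is an illustrative model, not a real MAC implementation.

def handle_target_frame(kind, decoder_state):
    if kind == "detection":
        # The decoder only reports its state; the ENCODER synchronizes.
        return {"feedback": decoder_state, "synchronized_by": "encoder"}
    elif kind == "synchronization":
        # The decoder synchronizes the received video frame itself, then
        # reports the synchronization status.
        return {"feedback": "sync_done", "synchronized_by": "decoder"}
    raise ValueError(f"unknown target frame kind: {kind}")

r1 = handle_target_frame("detection", {"blocks_received": 7})
r2 = handle_target_frame("synchronization", {"blocks_received": 7})
```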
  • the target frame is a medium access control MAC frame.
  • when the target frame is the detection frame, the target frame is any one of a control frame, a data frame, or a management frame;
  • when the target frame is the synchronization frame, the target frame is one of a data frame or a management frame.
  • The specific format of the target frame is related to the content it indicates: if the target frame is a detection frame used to instruct the decoder to feed back the state of the image blocks of the received video frame, the target frame may be any one of a control frame, a data frame, or a management frame; if the target frame is a synchronization frame used to instruct the decoder to perform synchronization processing on the received video frame, the target frame may be one of a data frame or a management frame. This is useful for synchronizing the state of the video frames on both the encoder and decoder sides.
  • the control frame includes a short detection frame
  • the data frame includes a quality of service (QoS) null frame or a QoS data frame;
  • the management frames include unsolicited acknowledgment frames or acknowledgment frames.
  • the frame header of the MAC frame carries a private value, and the private value is used to indicate that the encoder requests feedback information from the decoder.
  • the frame header of the MAC frame carries a private value used to instruct the encoder to request feedback information from the decoder.
  • Because the detection frame carries a private value indicating the request for feedback, the decoder can immediately send feedback information to the encoder representing the state of the video frame it has received, and the encoder can synchronize its state with the decoder according to this feedback, so the synchronization effect of image transmission can be guaranteed.
  • the target frame is the control frame
  • the control field of the frame header of the control frame is set to the private value
  • the target frame is the data frame, and the high throughput control (HTC) field of the frame header of the data frame is set to the private value; or,
  • the target frame is the management frame, and the HTC field of the frame header of the management frame is set to the private value, or,
  • the first field in the payload of the management frame is set to the private value.
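As an illustration only, placing a private value into the different header locations could be sketched like this; the byte offsets and type codes below are invented for the example, since the text names the fields but not their exact layout:

```python
import struct

# Illustrative packing of a "private value" into a MAC frame.
# The offsets and type codes are invented for the sketch; the text only
# says WHICH location carries the value (control field, HTC field, or the
# first field of a management frame's payload), not its exact layout.

PRIVATE_VALUE = 0x7F

def build_target_frame(frame_type, payload=b""):
    if frame_type == "control":
        # Control frame: private value in the control field of the header.
        return struct.pack("<BB", 0x01, PRIVATE_VALUE)
    if frame_type in ("data", "management_htc"):
        # Data or management frame: private value in a 4-byte HTC field.
        htc = struct.pack("<I", PRIVATE_VALUE)
        return struct.pack("<B", 0x02) + htc + payload
    if frame_type == "management_payload":
        # Management frame: first field of the payload carries the value.
        return struct.pack("<BB", 0x03, PRIVATE_VALUE) + payload
    raise ValueError(frame_type)

ctrl = build_target_frame("control")
data = build_target_frame("data", b"xyz")
```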
  • the data part of the MAC frame carries indication information, and the indication information is used to instruct the decoder to perform synchronization processing on the received video frames.
  • the data part of the MAC frame carries indication information for instructing the decoder to perform synchronization processing on the received video frame.
  • The synchronization frame carries indication information instructing the decoder to perform synchronization processing on the received video frame, so the decoder can synchronize with the state of the encoder according to the synchronization frame, and the synchronization effect of image transmission can be guaranteed.
  • the method further includes:
  • the encoder resends the target frame to the decoder.
  • When the encoder does not receive the feedback information sent by the decoder, it can resend the target frame requesting feedback to the decoder so as to eventually receive the feedback successfully. The encoder can then use the feedback information to synchronize the states of the encoder and decoder, or to learn the decoder's synchronization state for the received video frames.
  • the rate at which the encoder resends the target frame to the decoder is lower than a first rate, where the first rate is the rate at which the encoder originally sent the target frame to the decoder when the first image block was not completely sent at the end of the corresponding specified period.
  • Resending the target frame to the decoder at a rate lower than the first rate can increase the probability that the encoder successfully receives the feedback information. The encoder can then synchronize the states of the encoder and decoder according to the feedback information received in real time, or learn the decoder's synchronization state for the received video frames.
  • the rate at which the encoder sends the target frame to the decoder is set to at least one of the following:
  • the expected rate calculated by the system, the rate at which the target frame was most recently successfully sent, or the rate mapping value of the most recent successful transmission of the target frame.
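A minimal sketch of the rate policy, with illustrative rates in Mb/s (the halving step for retransmission is an assumption; the text only requires the resend rate to be lower than the first rate):

```python
# Sketch of the rate policy: the initial target frame goes out at a chosen
# rate; retransmissions use a rate LOWER than that first rate to raise the
# chance the feedback request gets through. Values are illustrative Mb/s.

def choose_initial_rate(expected_rate=None, last_success_rate=None,
                        last_success_rate_mapped=None):
    # Any of the three candidates named in the text may be used.
    for rate in (expected_rate, last_success_rate, last_success_rate_mapped):
        if rate is not None:
            return rate
    return 6.0   # illustrative basic-rate fallback (an assumption)

def resend_rate(first_rate, step=0.5):
    # Resend strictly below the first rate; the 0.5 step is an assumption.
    return first_rate * step

first = choose_initial_rate(expected_rate=None, last_success_rate=24.0)
retry = resend_rate(first)
```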
  • the image block includes a tile or a slice.
  • a method for video transmission is provided, the method is applied to a decoder, and the method includes:
  • the decoder receives a target frame, the target frame is used to request feedback information, and the target frame is sent by the encoder when the first image block among the plurality of image blocks is not completely sent at the end of the corresponding specified period;
  • In response to the target frame, the decoder sends the feedback information to the encoder.
  • the decoder sends feedback information to the encoder in response to receiving the target frame. Since the target frame is sent by the encoder when it is determined that the first image block is not completely sent at the end of the corresponding specified period, the delay of the feedback information of the decoder can be reduced, and the state synchronization between the encoder and the decoder can be facilitated.
  • the target frame is a detection frame
  • the detection frame is used to instruct the decoder to feed back the state of the image block of the video frame received by the decoder, and the feedback information is used to characterize the state of the video frame received by the decoder;
  • the target frame is a synchronization frame
  • the synchronization frame is used to instruct the decoder to perform synchronization processing on the received video frame
  • the feedback information is used to represent the synchronization status of the decoder on the received video frame.
  • Depending on the type of the target frame, the content the target frame indicates differs, and so does the content represented by the feedback information the decoder sends to the encoder.
  • the target frame is a detection frame used to instruct the decoder to feed back the state of the image block of the video frame received by the decoder
  • the feedback information is used to represent the state of the video frame received by the decoder;
  • the target frame is a synchronization frame used to instruct the decoder to perform synchronization processing on the received video frame
  • the feedback information is used to represent the synchronization state of the decoder on the received video frame.
  • In both cases the purpose is to synchronize the state of the video frames on the encoder side and the decoder side; the difference lies in which entity performs the synchronization (that is, if the target frame is a detection frame, the encoder synchronizes the state of the video frames on both sides; if the target frame is a synchronization frame, the decoder does). This improves the flexibility of synchronizing the state of the video frames between the encoder and the decoder.
  • the target frame is a medium access control MAC frame.
  • when the target frame is the detection frame, the target frame is any one of a control frame, a data frame, or a management frame;
  • when the target frame is the synchronization frame, the target frame is one of a data frame or a management frame.
  • The specific format of the target frame is related to the content it indicates: if the target frame is a detection frame used to instruct the decoder to feed back the state of the image blocks of the received video frame, the target frame may be any one of a control frame, a data frame, or a management frame; if the target frame is a synchronization frame used to instruct the decoder to perform synchronization processing on the received video frame, the target frame may be one of a data frame or a management frame. This is useful for synchronizing the state of the video frames on both the encoder and decoder sides.
  • the control frame includes a short detection frame
  • the data frame is a quality of service (QoS) null frame or a QoS data frame
  • the management frame is an unsolicited acknowledgment frame or an acknowledgment frame.
  • the frame header of the MAC frame carries a private value, and the private value is used to indicate that the encoder requests feedback information from the decoder.
  • the frame header of the MAC frame carries a private value used to instruct the encoder to request feedback information from the decoder.
  • Because the detection frame carries a private value indicating the request for feedback, the decoder can immediately send feedback information to the encoder representing the state of the video frame it has received, and the encoder can synchronize its state with the decoder according to this feedback, so the synchronization effect of image transmission can be guaranteed.
  • the target frame is the control frame
  • the control field of the frame header of the control frame is set to the private value
  • the target frame is the data frame, and the high throughput control (HTC) field of the frame header of the data frame is set to the private value; or,
  • the target frame is the management frame, and the HTC field of the frame header of the management frame is set to the private value, or,
  • the first field in the payload of the management frame is set to the private value.
  • the data part of the MAC frame carries indication information, and the indication information is used to instruct the decoder to perform synchronization processing on the received video frames.
  • the data part of the MAC frame carries indication information for instructing the decoder to perform synchronization processing on the received video frame.
  • The synchronization frame carries indication information instructing the decoder to perform synchronization processing on the received video frame, so the decoder can synchronize with the state of the encoder according to the synchronization frame, and the synchronization effect of image transmission can be guaranteed.
  • the decoder sending the feedback information to the encoder includes:
  • a data link layer in the decoder generates the feedback information according to the target frame
  • the decoder sends the feedback information to the encoder.
  • The data link layer in the decoder generates the feedback information based on the received target frame, and the decoder sends it to the encoder. Because the feedback information is generated at the data link layer of the decoder and sent to the encoder through the physical layer, the delay of carrying the feedback information through the network layer and the transport layer in both directions can be saved.
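The delay saving claimed here can be illustrated with a toy layer-cost model (the per-layer delays are invented numbers, used only to show where the saving comes from):

```python
# Why link-layer feedback saves delay: the decoder answers the target frame
# directly at the data link layer instead of passing it up through the
# network and transport layers and back down. Layer costs are illustrative.

LAYER_DELAY_MS = {"physical": 1, "data_link": 1, "network": 5, "transport": 5}

def feedback_delay(path):
    # Each layer contributes its cost once per traversal in the path.
    return sum(LAYER_DELAY_MS[layer] for layer in path)

# TCP/IP-style feedback: up and down the full stack on the decoder side.
tcp_ip_path = ["physical", "data_link", "network", "transport",
               "transport", "network", "data_link", "physical"]
# Link-layer feedback: generated at the data link layer, sent via physical.
link_layer_path = ["physical", "data_link", "data_link", "physical"]

saved = feedback_delay(tcp_ip_path) - feedback_delay(link_layer_path)
```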
  • the rate at which the decoder sends the feedback information to the encoder is set to at least one of the following:
  • the expected rate from system statistics, the rate at which a video frame sent by the encoder was most recently successfully received, or the rate mapping value of the most recently successfully received video frame.
  • An apparatus is provided, which is included in an electronic device and has the function of realizing the behavior of the electronic device in the above aspects and their possible implementations.
  • the functions may be implemented by hardware, or by hardware executing corresponding software.
  • Hardware or software includes one or more modules or units corresponding to the functions described above.
  • an electronic device including: one or more processors; memory; one or more application programs; and one or more computer programs. Wherein one or more computer programs are stored in the memory, the one or more computer programs comprising instructions.
  • When the instructions are executed, the electronic device executes the method in any possible implementation of the first aspect or the second aspect above.
  • A device for video transmission is provided, including at least one processor; when program instructions are executed in the at least one processor, the method in any possible implementation of the first aspect or the second aspect above is implemented.
  • a computer storage medium including computer instructions.
  • When the computer instructions run on an electronic device or a processor, the electronic device or the processor executes the method in any possible implementation of the first aspect or the second aspect above.
  • a computer program product is provided.
  • When the computer program product runs on an electronic device or a processor, the electronic device or the processor executes the method in any possible design of the first aspect or the second aspect above.
  • FIG. 1 is a schematic diagram of a hardware structure of an electronic device provided by an embodiment of the present application.
  • FIG. 2 is a schematic diagram of a software structure of an electronic device provided by an embodiment of the present application.
  • Fig. 3a is a schematic diagram of a scenario of the application of the embodiment of the present application.
  • Fig. 3b is a schematic diagram of another scenario of the application of the embodiment of the present application.
  • Fig. 3c is a schematic diagram of another scenario of the application of the embodiment of the present application.
  • FIG. 4 is a schematic diagram of a feedback method based on the TCP/IP communication protocol provided by the present application.
  • Fig. 5 is a schematic diagram of frame interaction between an encoder and a decoder provided in the present application.
  • Fig. 6 is a schematic diagram of a video transmission method provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of frame interaction between an encoder and a decoder provided in an embodiment of the present application.
  • FIG. 8 is a schematic diagram of a MAC frame format provided by an embodiment of the present application.
  • FIG. 9 is a schematic diagram of another frame interaction between an encoder and a decoder provided in an embodiment of the present application.
  • Fig. 10 is a schematic diagram of still another frame interaction between an encoder and a decoder provided in an embodiment of the present application.
  • FIG. 11 is a schematic diagram of still another frame interaction between an encoder and a decoder provided in an embodiment of the present application.
  • Fig. 12 is a schematic diagram of still another frame interaction between an encoder and a decoder provided by an embodiment of the present application.
  • FIG. 13 is a schematic diagram of still another frame interaction between an encoder and a decoder provided in an embodiment of the present application.
  • FIG. 14 is a schematic diagram of another feedback method based on the TCP/IP communication protocol provided by the embodiment of the present application.
  • Fig. 15 is a schematic block diagram of another electronic device provided by an embodiment of the present application.
  • Fig. 16 is a schematic block diagram of another electronic device provided by an embodiment of the present application.
  • Fig. 17 is a schematic block diagram of another electronic device provided by an embodiment of the present application.
  • The method in the embodiments of the present application can be applied to smart phones, tablet computers, wearable devices, vehicle-mounted devices, augmented reality (AR)/virtual reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPC), netbooks, personal digital assistants (PDA), and other electronic devices; the embodiments of the present application do not impose any restrictions on the specific type of electronic device.
  • The terms "first" and "second" are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly specifying the quantity of the indicated technical features. Thus, a feature defined with "first" or "second" may explicitly or implicitly include one or more of these features.
  • The video transmission method provided by the embodiments of the present application can be applied to electronic devices such as mobile phones, tablet computers, wearable devices, vehicle-mounted devices, augmented reality (AR)/virtual reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPC), netbooks, and personal digital assistants (PDA); the embodiments of the present application do not impose any restrictions on the specific type of electronic device.
  • FIG. 1 shows a schematic structural diagram of an electronic device 100 .
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown in the figure, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the processor 110 may include one or more processing units, for example: an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100 .
  • the controller can generate an operation control signal according to the instruction opcode and timing signal, and complete the control of fetching and executing the instruction.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated accesses, reduces the waiting time of the processor 110, and thereby improves system efficiency.
  • processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the interface connection relationship between the modules shown in the embodiment of the present application is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the wireless communication function of the electronic device 100 can be realized by the antenna 1 , the antenna 2 , the mobile communication module 150 , the wireless communication module 160 , a modem processor, a baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signals modulated by the modem processor, and convert them into electromagnetic waves through the antenna 1 for radiation.
  • at least part of the functional modules of the mobile communication module 150 may be set in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be set in the same device.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator sends the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is passed to the application processor after being processed by the baseband processor.
  • the application processor outputs sound signals through audio equipment (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent from the processor 110, and be set in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide solutions for wireless communication applied on the electronic device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), the global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC) technology, infrared (IR) technology, and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
• the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc.
• the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a Beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS) and/or satellite based augmentation systems (SBAS).
  • the electronic device 100 realizes the display function through the GPU, the display screen 194 , and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos and the like.
  • the electronic device 100 can realize the shooting function through the ISP, the camera 193 , the video codec, the GPU, the display screen 194 and the application processor.
  • the ISP is used for processing the data fed back by the camera 193 .
• light is transmitted through the lens to the photosensitive element of the camera, where the optical signal is converted into an electrical signal; the photosensitive element of the camera transmits the electrical signal to the ISP for processing, where it is converted into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin color.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be located in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other image signals.
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos in various encoding formats, for example: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4 and so on.
  • the sending of the target frame and the sending of the feedback information in response to the target frame may be respectively implemented by a video encoder and a video decoder.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be realized through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, so as to expand the storage capacity of the electronic device 100.
• the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function, such as saving music, video, and other files in the external memory card.
  • the internal memory 121 may be used to store computer-executable program codes including instructions.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 .
  • the internal memory 121 may include an area for storing programs and an area for storing data.
  • the stored program area can store an operating system, at least one application program required by a function (such as a sound playing function, an image playing function, etc.) and the like.
  • the storage data area can store data created during the use of the electronic device 100 (such as audio data, phonebook, etc.) and the like.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (universal flash storage, UFS) and the like.
• the electronic device 100 can implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signal.
  • the audio module 170 may also be used to encode and decode audio signals.
  • the audio module 170 may be set in the processor 110 , or some functional modules of the audio module 170 may be set in the processor 110 .
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture.
  • the embodiment of the present application takes the Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100 .
  • FIG. 2 is a block diagram of the software structure of the electronic device 100 according to the embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces.
• the Android system is divided into four layers, which are, from top to bottom: the application layer, the application framework layer, the Android runtime (Android runtime) and system libraries, and the kernel layer.
  • the application layer can consist of a series of application packages.
  • the application package may include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, and short message.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer can include window managers, content providers, view systems, phone managers, resource managers, notification managers, and so on.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • Content providers are used to store and retrieve data and make it accessible to applications.
  • Said data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebook, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on.
  • the view system can be used to build applications.
  • a display interface can consist of one or more views.
  • a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
• a system library can include multiple function modules, for example: a surface manager (surface manager), media libraries (media libraries), a 3D graphics processing library (e.g., OpenGL ES), a 2D graphics engine (e.g., SGL), etc.
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • Fig. 3a, Fig. 3b, and Fig. 3c are all schematic scene diagrams applicable to the embodiment of the application.
  • the first electronic device projects a screen to the second electronic device, or the first electronic device uses the camera of the second electronic device to implement operations such as video calling and shooting.
• the first electronic device can be in the same local area network as the second electronic device, for example by establishing a connection through WiFi-P2P, or the first electronic device can establish a connection with the second electronic device through Bluetooth, ultra wide band (UWB), a wired connection, or other connection methods.
  • the first electronic device may be a mobile phone 300a
  • the second electronic device may be a PC 300b.
  • the content of the display interface 310 of the mobile phone 300a is displayed in a window on the desktop of the PC.
• a title bar part may also be included above the display area; the title bar part may include the name of the application "Huawei Video" and controls for minimizing, maximizing, and closing the interface display.
  • the first electronic device may be a mobile phone 330a
  • the second electronic device may be a PC 330b.
• the video screen 340 in the mobile phone is transmitted to the PC 330b for full-screen display; for example, the video displayed in the mobile phone is a game screen.
• the game screen is displayed on the PC in real time. The user expects good real-time performance when projecting the game screen to the PC, that is, a low screen-projection delay.
  • the first electronic device can be a mobile phone 360a
  • the second electronic device can be a smart screen 360b.
  • the mobile phone 360a can use the camera of the smart screen 360b to make video calls and take pictures and so on.
• the mobile phone 360a makes a video call with the help of the camera of the smart screen 360b and displays the video-call picture on the screen of the smart screen 360b, for example in a window, or displays the video images of the local end and the peer end in split screens.
  • the second electronic device may feed back information to enable the first electronic device to adjust the transmission algorithm.
• the encoder (which can be understood as a device in the first electronic device in the above scenario) accesses the channel and sends data in layers, and the decoder (which can be understood as a device in the second electronic device in the above scenario) receives the data and, after a certain delay, accesses the channel to feed back the receiving status.
• FIG. 4 is a schematic diagram of a feedback method based on the TCP/IP communication protocol provided by the present application.
• the encoder and decoder mainly include the transport layer, the network layer, the data link layer, and the physical layer; each layer provides data communication services for the layer above it, and the transport layer serves the application layer. After the encoder wins the channel contention, it sends the code stream data; after the decoder receives the code stream data, it processes the upper-layer information and sends the processed feedback information down to the bottom layer on a newly contended channel.
• FIG. 5 is a schematic diagram of frame interaction between an encoder and a decoder provided in the present application.
  • the frame interaction shown in the diagram is fed back in the form of block acknowledgment.
• This method needs to maintain the cache status of a bitmap lookup table (bitmap look up table, bitmap LUT), and it is necessary to ensure that the receiving status of the data link layer on the decoding side matches the receiving status of the transport layer, to avoid erroneous feedback from the data link layer.
• the encoder can initiate a short-frame transaction of RTS/CTS or NULL/ACK to protect a TXOP sequence; the encoder can then send multiple frames of code stream data continuously in the TXOP, with the acknowledgment policy field of the data MAC frames set to block ack, and use a BAR frame to request the feedback information for all the code stream data (i.e., a BA frame) after sending the multiple frames of code stream data.
  • the entire interaction process can be interacted within the same TXOP, which saves the time of re-competing channels at both ends of the encoder and decoder.
• the encoder can only send the BAR frame requesting feedback information to the decoder after it has successfully sent multiple frames of code stream data, which is a delayed-acknowledgment mode, so the encoder cannot obtain the receiving status of the decoder in real time.
  • the state between the encoder and the decoder cannot be synchronized.
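The block-acknowledgment bookkeeping described above can be sketched as follows. This is a minimal illustration of how a decoder-side bitmap records which frames of a TXOP burst arrived, and how the encoder derives the frames to retransmit from the returned bitmap; the 8-bit width and the function names are assumptions for illustration (real 802.11 BA bitmaps are 64 or 256 bits wide).

```python
# Sketch of the bitmap LUT bookkeeping behind block acknowledgment: the decoder
# records which sequence numbers arrived, answers a BAR with a bitmap relative
# to a starting sequence number, and the encoder reads off the missing frames.

def build_ba_bitmap(start_seq, received_seqs, width=8):
    """Return an integer bitmap: bit i is set if frame start_seq+i was received."""
    bitmap = 0
    for seq in received_seqs:
        offset = seq - start_seq
        if 0 <= offset < width:
            bitmap |= 1 << offset
    return bitmap

def missing_frames(start_seq, bitmap, width=8):
    """List the sequence numbers the encoder must retransmit."""
    return [start_seq + i for i in range(width) if not (bitmap >> i) & 1]
```

For example, if frames 0, 1 and 3 of an 8-frame burst arrived, the bitmap is 0b00001011 and the encoder retransmits frames 2 and 4-7.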
  • the present application provides a method for video transmission, which can reduce the delay of the feedback information of the decoder and synchronize the states of the encoder and the decoder.
• FIG. 6 is a schematic diagram of a video transmission method 600 provided in the embodiment of the present application.
  • the method 600 may include steps S610-S640, and the method 600 may be implemented by an encoder and a decoder, wherein the steps S610-S620 may be implemented by the encoder, and steps S630-S640 may be implemented by the decoder.
• the video frame generated by the encoder can be divided into multiple image blocks, and the multiple image blocks can include tiles (tile) or slice segments (slice segment, SS) (which may be referred to as slices for short). Rectangular tiles can be obtained by horizontally or vertically dividing the video frame, and strip-shaped slice segments can be obtained by irregularly dividing the video frame.
  • video frames in this embodiment of the present application may or may not be evenly divided, and there is no limitation thereto.
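The rectangular-tile division just described can be sketched as follows. An even grid is used here purely for illustration (the text notes the division need not be even), and the function name is an assumption, not terminology from the application.

```python
# Minimal sketch: split a width x height video frame into a grid of
# rectangular tiles, each described as an (x, y, w, h) rectangle.

def split_into_tiles(width, height, cols, rows):
    """Return (x, y, w, h) rectangles covering the frame in row-major order."""
    tiles = []
    for r in range(rows):
        for c in range(cols):
            x0 = c * width // cols
            y0 = r * height // rows
            x1 = (c + 1) * width // cols
            y1 = (r + 1) * height // rows
            tiles.append((x0, y0, x1 - x0, y1 - y0))
    return tiles
```

Splitting a 1920x1080 frame into 3 vertical tiles, for instance, yields three 640x1080 rectangles, matching the 3-image-block example used below.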
• sending each image block of the plurality of image blocks within a respective specified cycle can be understood as follows: for each image block in the plurality of image blocks included in the video frame, the encoder configures a cycle, and the encoder can send each of these image blocks to the decoder during its corresponding configured cycle.
• FIG. 7 is a schematic diagram of frame interaction between an encoder and a decoder provided in an embodiment of the present application.
• assume the video frame includes 3 image blocks, namely image block 1, image block 2, and image block 3, and the encoder configures corresponding cycles for these 3 image blocks, for example the first cycle for image block 1, the second cycle for image block 2, and the third cycle for image block 3; the encoder can then send image block 1 in the first cycle, image block 2 in the second cycle, and image block 3 in the third cycle.
  • the encoder sends a target frame for requesting feedback information to the decoder.
  • the video frame includes 3 image blocks, which are respectively image block 1, image block 2 and image block 3.
• if image block 1 is completely sent to the decoder at the end of the first cycle, the encoder may not send the target frame to the decoder; assuming that image block 2 is not completely sent to the decoder at the end of the second cycle, the encoder may send the target frame to the decoder at the end of the second cycle.
• the fact that the first image block is not completely sent at the end of the corresponding specified cycle can be understood as follows: before the end of the specified cycle corresponding to the first image block, some data of the first image block has not been sent to the decoder.
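The per-cycle sending rule above can be sketched as a small simulation. The byte-based channel model (a fixed budget of bytes per cycle) and all names are illustrative assumptions, not taken from the application; the point is only the decision rule: a target frame is due whenever a block's cycle ends with data still unsent.

```python
# Hypothetical sketch of the per-cycle rule: each image block gets its own
# cycle; if the block is not fully sent when its cycle ends, the encoder
# marks a target frame (feedback request) as due.

def run_cycles(block_sizes, bytes_per_cycle):
    """block_sizes: bytes per image block.
    Returns (block_index, bytes_sent, target_frame_due) per cycle."""
    log = []
    for i, size in enumerate(block_sizes):
        sent = min(size, bytes_per_cycle)   # what fit into this block's cycle
        incomplete = sent < size            # data left over at cycle end?
        log.append((i, sent, incomplete))   # incomplete -> send target frame
    return log
```

With blocks of 100, 300 and 150 bytes and a 200-byte cycle budget, only block 2's cycle ends incomplete, so only that cycle triggers a target frame, mirroring the FIG. 7 example.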
• the encoder sends the target frame to the decoder; the delay with which the encoder sends the target frame to the decoder is not limited.
• S640: send the feedback information to the encoder in response to the target frame received by the decoder.
  • the decoder after the decoder receives the target frame for requesting feedback information, the decoder sends the feedback information to the encoder in response to the received target frame.
  • the encoder determines that the first image block among the multiple image blocks included in the video frame is not completely sent at the end of the corresponding specified period, and sends the target frame for requesting feedback information to the decoder, and the decoder In response to receiving the target frame, feedback information is sent to the encoder.
• Since the encoder actively sends the target frame to the decoder when it determines that the first image block has not been completely sent at the end of the corresponding specified cycle, so as to request the decoder to send feedback information, it does not need to wait until multiple video frames have been sent before sending a BAR frame requesting feedback information to the decoder. This can reduce the delay of the decoder's feedback information, so that the encoder can synchronize the states of the encoder and decoder according to the feedback information received in real time, or learn the decoder's synchronization status for the received video frames.
  • the decoder in response to the target frame received by the decoder, sends feedback information to the encoder, wherein the content represented by the feedback information is related to the content indicated by the target frame, please refer to the following for details.
• if the target frame is a detection frame, the detection frame is used to instruct the decoder to feed back the state of the image blocks of the video frame received by the decoder, and the feedback information is used to characterize the state of the video frame received by the decoder;
• if the target frame is a synchronization frame, the synchronization frame is used to instruct the decoder to perform synchronization processing on the received video frames, and the feedback information is used to represent the synchronization status of the decoder for the received video frames.
• if the target frame is a detection frame used to instruct the decoder to feed back the status of the image blocks of the video frame received by the decoder, the feedback information is used to indicate the state of the video frame received by the decoder.
• the encoder sends a detection frame to the decoder to request the status of the video frame received by the decoder at this moment. After receiving the detection frame, the decoder responds by sending the status of the video frame received at this moment to the encoder; for example, if the decoder has received part of the data of image block 2 at the current moment, it sends its received status to the encoder at the current moment, which makes it convenient for the encoder to synchronize with the state of the decoder. The encoder then no longer sends the unsent data of image block 2.
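The decoder's reply to a detection frame can be sketched as follows: it reports how much of each image block has arrived so far, so the encoder can align its state and stop sending the remainder of an abandoned block. The class and method names are illustrative assumptions.

```python
# Sketch of decoder-side state and its response to a detection (probe) frame:
# the feedback is simply a snapshot of per-block bytes received so far.

class DecoderState:
    def __init__(self):
        self.received = {}                  # block index -> bytes received

    def on_data(self, block, nbytes):
        """Accumulate received bytes for one image block."""
        self.received[block] = self.received.get(block, 0) + nbytes

    def on_detection_frame(self):
        """Feedback information: snapshot of the per-block receive state."""
        return dict(self.received)
```

On receiving the snapshot, the encoder knows, for example, that only part of image block 2 arrived, and simply skips the rest of that block rather than retransmitting it.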
  • the feedback information is used to represent the synchronization status of the video frame received by the decoder.
• the encoder sends a synchronization frame to the decoder to inform the decoder of the data of image block 2 that has been sent at the current moment, so as to facilitate synchronizing the decoder with the status of the encoder.
• when the decoder receives the synchronization frame sent by the encoder, it can send an acknowledgment of the received synchronization frame to the encoder.
  • the feedback information is used to characterize the synchronization state of the video frame received by the decoder.
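The synchronization-frame path can be sketched in the same style: the encoder pushes its sent-state to the decoder, and the decoder aligns to it and acknowledges. The class and field names are illustrative assumptions.

```python
# Sketch of the synchronization-frame path: the decoder records the encoder's
# declared sent-state for a block and returns an acknowledgment as feedback.

class SyncingDecoder:
    def __init__(self):
        self.synced = {}                     # block index -> bytes to treat as final

    def on_sync_frame(self, block, bytes_sent_by_encoder):
        """Align local state to the encoder's and acknowledge the sync frame."""
        self.synced[block] = bytes_sent_by_encoder
        return {"ack": True, "block": block}
```

Here the feedback is only an acknowledgment, because the state alignment itself is done on the decoder side.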
• when the content indicated by the target frame sent by the encoder to the decoder differs, the content represented by the feedback information sent by the decoder to the encoder differs accordingly.
• if the target frame is a detection frame used to instruct the decoder to feed back the state of the image blocks of the video frame received by the decoder, the feedback information is used to represent the state of the video frame received by the decoder;
• if the target frame is a synchronization frame used to instruct the decoder to perform synchronization processing on the received video frames, the feedback information is used to represent the synchronization state of the decoder for the received video frames.
• the purpose in both cases is to synchronize the state of the video frames on the encoder side and the decoder side; the difference lies in which side performs the action of synchronizing the states of the two sides (that is, if the target frame is a detection frame, the encoder synchronizes the state of the video frames on both sides; if the target frame is a synchronization frame, the decoder synchronizes the state of the video frames on both sides), which can improve the flexibility of synchronizing the state of the video frames on the encoder side and the decoder side.
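The two target-frame kinds differ only in which side performs the state synchronization, which can be captured in a minimal dispatch table (the names are illustrative, not from the application):

```python
# The side that aligns state depends on the target-frame kind:
#   detection frame -> decoder reports its state, the encoder aligns
#   sync frame      -> encoder pushes its state, the decoder aligns

SYNC_ACTOR = {
    "detection_frame": "encoder",
    "sync_frame": "decoder",
}

def who_synchronizes(target_frame_kind):
    return SYNC_ACTOR[target_frame_kind]
```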
• the target frame may also include both a detection frame and a synchronization frame.
• the encoder may request the status of the video frame received by the decoder and, at the same time, transmit the status information of the sent video frame to the decoder.
  • the encoder can estimate the total duration of sending the two frames based on the current channel state, so as to avoid occupying the duration of the next cycle.
  • the target frame is a MAC frame.
  • the target frame in the embodiment of the present application may be a MAC frame, as shown in FIG. 8 , which is a schematic diagram of a MAC frame format provided in the embodiment of the present application.
  • the MAC frame includes three parts: a frame header, a data part, and a frame trailer.
• the frame header and the frame trailer may include address information, etc.; the data part may include the data to be transmitted.
• the frame header of the MAC frame may include three fields, wherein the first two fields are respectively a 6-byte destination address field and a 6-byte source address field; the destination address field contains the destination MAC address information, and the source address field contains the source MAC address information.
  • the third field is a 2-byte type field, which contains information that can be used to indicate what protocol the upper layer uses, so that the receiving end will hand over the data part of the received MAC frame to the protocol of the upper layer.
  • the data part of the MAC frame includes a field with a length between 46 and 1500 bytes, which contains the data passed down from the network layer.
  • the frame trailer of the MAC frame also includes a field, which is 4 bytes long and contains a frame check sequence (FCS).
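The MAC frame layout described above (6-byte destination address, 6-byte source address, 2-byte type, 46-1500 byte data part, 4-byte FCS) can be sketched as a packing routine. CRC-32 is used here as a stand-in frame check sequence, and the padding rule and function name are illustrative assumptions, not details from the application.

```python
# Sketch of packing the MAC frame layout described in the text.
import struct
import zlib

def pack_mac_frame(dst, src, frame_type, payload):
    """dst/src: 6-byte addresses; frame_type: 2-byte type; payload: data part."""
    assert len(dst) == 6 and len(src) == 6
    if len(payload) < 46:                       # pad short payloads to 46 bytes
        payload = payload + b"\x00" * (46 - len(payload))
    assert len(payload) <= 1500
    header = dst + src + struct.pack("!H", frame_type)
    fcs = struct.pack("!I", zlib.crc32(header + payload) & 0xFFFFFFFF)
    return header + payload + fcs               # header + data + 4-byte FCS
```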
• the target frame in the embodiment of the present application may be a frame other than the MAC frame exemplified above; as long as a frame can be used to instruct the decoder to feed back the status of the image blocks of the video frame received by the decoder, or to instruct the decoder to perform synchronization processing on received video frames, this application can be applied to it.
• when the target frame is the detection frame, the target frame is any one of a control frame, a data frame, or a management frame; when the target frame is the synchronization frame, the target frame is one of a data frame or a management frame.
  • the target frame may include a control frame, a data frame, or a management frame.
  • the control frame can assist in the transmission of information, but cannot carry service data;
  • the data frame and management frame can carry service data.
• target frames indicating different contents have different formats.
  • the target frame can be any one of control frame, data frame or management frame, that is, the target frame can be a control frame, a data frame, or a management frame.
• the target frame can be one of a data frame or a management frame, that is, the target frame can be either a data frame or a management frame.
• the specific format of the target frame is related to the content indicated by the target frame: if the target frame is a detection frame used to instruct the decoder to feed back the state of the image blocks of the video frame received by the decoder, the target frame may be any one of a control frame, a data frame, or a management frame; if the target frame is a synchronization frame used to instruct the decoder to perform synchronization processing on the received video frames, the target frame may be one of a data frame or a management frame. This is useful for synchronizing the state of the video frames on the encoder side and the decoder side.
• the control frame includes a short detection frame;
• the data frame includes a quality of service (quality of service, QoS) null frame or a QoS data frame;
  • the management frames include unsolicited acknowledgment frames or acknowledgment frames.
• the control frame may include a short detection frame;
• the data frame may include a QoS null frame or a QoS data frame;
• the management frame may include an unsolicited acknowledgment frame or an acknowledgment frame.
  • the target frame may be any of a control frame, a data frame or a management frame.
• if the target frame is a control frame, the target frame can be a short detection frame; if the target frame is a data frame, the target frame can be a QoS null frame; if the target frame is a management frame, the target frame can be an unsolicited acknowledgment frame.
• the target frame may be a data frame or a management frame; specifically, if the target frame is a data frame, the target frame may be a QoS data frame, and if the target frame is a management frame, the target frame may be an acknowledgment frame.
• the target frame indicates different content; when the target frame is a MAC frame, the corresponding indication is made through the frame header or the data part of the MAC frame.
• if the target frame is a detection frame, the frame header of the MAC frame carries a private value, and the private value is used to indicate that the encoder requests feedback information from the decoder.
• when the target frame is the control frame, the control field of the frame header of the control frame is set to the private value; or,
• when the target frame is the data frame, the high-throughput control (high-throughput control, HTC) field of the frame header of the data frame is set to the private value; or,
• when the target frame is the management frame, the HTC field of the frame header of the management frame is set to the private value, or the first field in the payload of the management frame is set to the private value.
• if the target frame is a detection frame used to instruct the decoder to feed back the status of the image blocks of the video frame received by the decoder, the frame header of the MAC frame can carry a private value, and this private value is used to indicate that the encoder requests feedback information from the decoder.
• if the target frame is a control frame, the control field of the frame header of the control frame is set to a private value.
• FIG. 9 is a schematic diagram of another frame interaction between an encoder and a decoder provided in the embodiment of the present application.
• if image block 2 is not completely sent to the decoder at the end of the second cycle, the encoder sends a short detection frame to the decoder; after receiving the short detection frame, the decoder responds by sending feedback information to the encoder, and the feedback information is used to characterize the state of the video frame received by the decoder.
• the delay for the encoder to receive the feedback information sent by the decoder includes: 2 × short interframe space (SIFS) + the time for sending the short detection frame.
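The feedback delay quoted above (2 × SIFS plus the transmission time of the short detection frame) can be worked as a back-of-the-envelope calculation. The SIFS value and frame parameters below are illustrative assumptions (16 µs SIFS, as in 5 GHz Wi-Fi), not figures from the application.

```python
# Feedback delay = 2 x SIFS + time to transmit the short detection frame.

def feedback_delay_us(sifs_us, frame_bits, phy_rate_mbps):
    tx_time_us = frame_bits / phy_rate_mbps   # bits / (Mbit/s) -> microseconds
    return 2 * sifs_us + tx_time_us
```

For a hypothetical 20-byte (160-bit) detection frame at 10 Mbit/s with a 16 µs SIFS, the delay is 2×16 + 16 = 48 µs.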
  • the encoder may use the control frame as the detection frame to request feedback information.
• the control field of the frame header of the control frame can be set to a private value; for example, the combination of the three bits multi-TID (multi thread identifier, Multi-TID), compressed bitmap (compressed bitmap), and groupcast with retries (groupcast with retries, GCR) is set to a reserved value (reserved), as shown in Table 1.
• when the decoder parses that the control field of the frame header of the control frame is set to a reserved value (that is, the private value in this application), it can directly respond with the feedback information.
• the block acknowledgment request frame variant can be set to a reserved value, that is, the control field of the frame header of the control frame is set to a reserved value.
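Checking whether the three BAR-variant bits form a reserved combination can be sketched as below. The bit positions and the set of defined variants are illustrative assumptions (the real list of BAR variants is in the 802.11 standard's Table of BAR Control values); the point is only that any combination outside the defined set is reserved and thus usable as the private value.

```python
# Sketch: the three variant bits of the BAR Control field; combinations not
# defined by the standard are reserved, and the private value reuses one.

MULTI_TID, COMPRESSED_BITMAP, GCR = 1 << 0, 1 << 1, 1 << 2

# Illustrative subset of defined variants; anything else is reserved.
DEFINED_VARIANTS = {
    0,                              # basic
    COMPRESSED_BITMAP,              # compressed
    MULTI_TID | COMPRESSED_BITMAP,  # multi-TID
    GCR | COMPRESSED_BITMAP,        # GCR
}

def is_private_value(bar_control_bits):
    """True if the 3-bit combination is reserved (usable as the private value)."""
    return (bar_control_bits & 0b111) not in DEFINED_VARIANTS
```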
  • the target frame is a data frame
  • the high throughput control (HTC) field of the frame header of the data frame is set to a private value.
  • FIG. 10 is a schematic diagram of yet another frame interaction between an encoder and a decoder provided in an embodiment of the present application.
  • the image block 2 is not completely sent to the decoder at the end of the second period, and the encoder sends a QoS null frame to the decoder; after receiving the QoS null frame, the decoder sends feedback information to the encoder in response, where the feedback information is used to characterize the state of the video frame received by the decoder.
  • the delay time for the encoder to receive the feedback information sent by the decoder includes: 2 × short interframe space (SIFS) + the time to send the QoS null frame
  • the encoder may use the QoS null frame as the probe frame to request feedback information.
  • the HTC field of the frame header of the QoS null frame can be set to a private value; for example, control identifier (ID) values 7-14 are set to a reserved value, as shown in Table 2 and Table 3.
  • the target frame is a management frame
  • the HTC field of the frame header of the management frame is set to a private value or the first field in the payload of the management frame is set to a private value.
  • FIG. 11 is a schematic diagram of yet another frame interaction between an encoder and a decoder provided in an embodiment of the present application.
  • the encoder sends an Action No Ack frame to the decoder; after receiving the Action No Ack frame, the decoder sends feedback information to the encoder in response, where the feedback information is used to characterize the state of the video frame received by the decoder.
  • the delay time for the encoder to receive the feedback information sent by the decoder includes: 2 × short interframe space (SIFS) + the time to send the Action No Ack frame
  • the encoder may use the Action No Ack frame as the probe frame to request feedback information.
  • the HTC field of the frame header of the Action No Ack frame may be set to a private value, or the first field in the payload of the Action No Ack frame may be set to a private value.
  • the control ID values 7-14 in the A-control field of the HE format can be set to a reserved value.
  • the fields (or codes) 21-125 in the payload of the Action No Ack frame can be set to private values, as shown in Table 4.
  • the frame header of the MAC frame carries a private value used to instruct the encoder to request feedback information from the decoder.
  • the probe frame carries a private value used to indicate that the decoder should feed back information; the decoder can immediately send the encoder feedback information characterizing the state of the video frame it has received, and the encoder can synchronize its state with the decoder according to the feedback information, so that the synchronization effect of image transmission can be guaranteed.
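The probe-and-feedback behaviour summarized above can be sketched as a small simulation. The classes, the per-period byte budget, and the feedback format are all illustrative assumptions, not an implementation of the embodiment:

```python
# Sketch: the encoder sends image blocks within fixed periods; when a block
# is not completely sent by the end of its period, the encoder sends a probe
# and the decoder immediately answers with the state of what it received.
# All names and the feedback format are illustrative.

class Decoder:
    def __init__(self):
        self.received = {}  # block id -> bytes received so far

    def on_data(self, block_id, nbytes):
        self.received[block_id] = self.received.get(block_id, 0) + nbytes

    def on_probe(self, _probe):
        # respond immediately (no waiting for a trailing BAR):
        # report the per-block receive state
        return dict(self.received)

class Encoder:
    def __init__(self, decoder, link_bytes_per_period):
        self.decoder = decoder
        self.budget = link_bytes_per_period  # what fits in one period

    def send_block(self, block_id, size):
        sent = min(size, self.budget)
        self.decoder.on_data(block_id, sent)
        if sent < size:  # block incomplete at the end of its period
            return self.decoder.on_probe("probe")  # request feedback now
        return None  # block finished within its period, no probe needed

dec = Decoder()
enc = Encoder(dec, link_bytes_per_period=1000)
feedback = enc.send_block(2, size=1500)  # block 2 cannot finish in one period
```

The point of the sketch is the control flow: feedback is requested at the moment a block misses its period deadline, not after all blocks have been sent.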
  • the target frame is a sync frame
  • the data part of the MAC frame carries indication information, and the indication information is used to instruct the decoder to perform synchronization processing on the received video frames.
  • a MAC frame includes three parts: a frame header, a data part, and a frame trailer.
  • the frame header of the MAC frame does not need to carry a private value; the data part of the MAC frame only needs to carry indication information, which is used to instruct the decoder to perform synchronization processing on the received video frames.
  • the target frame is a data frame
  • the encoder may use the QoS data frame as a synchronization frame to transmit the status of the video frame sent by the encoder.
  • FIG. 12 is a schematic diagram of yet another frame interaction between an encoder and a decoder provided in an embodiment of the present application.
  • the image block 2 is not completely sent to the decoder at the end of the second period, and the encoder sends a QoS data frame to the decoder, where the data part of the QoS data frame carries indication information (the indication information may include the code stream number of image block 2 sent by the encoder); after receiving the QoS data frame, the decoder may synchronize with the state of the encoder according to the received QoS data frame. At the same time, in response to receiving the QoS data frame, the decoder may send feedback information to the encoder, where the feedback information is used to characterize the synchronization state of the decoder for the received video frames.
  • the target frame is a management frame
  • the encoder may use the acknowledgment frame as a synchronization frame to transmit the status of the video frame sent by the encoder.
  • FIG. 13 is a schematic diagram of yet another frame interaction between an encoder and a decoder provided in an embodiment of the present application.
  • the encoder sends an acknowledgment frame to the decoder, where the data part of the acknowledgment frame carries indication information (the indication information may include the code stream number of image block 2 sent by the encoder); after the decoder receives the acknowledgment frame, it can synchronize with the state of the encoder according to the received acknowledgment frame.
  • the decoder may send feedback information to the encoder, where the feedback information is used to characterize the synchronization state of the decoder for the received video frame.
  • the data part of the MAC frame carries indication information for instructing the decoder to perform synchronization processing on the received video frame.
  • the synchronization frame carries indication information instructing the decoder to perform synchronization processing on the received video frames, and the decoder can synchronize with the state of the encoder according to the synchronization frame, so that the synchronization effect of image transmission can be guaranteed.
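The decoder-side synchronization step can be sketched as follows. The idea that the indication information carries the last code stream number the encoder sent, and that the decoder advances past anything the encoder will no longer send, is an illustrative reading of the embodiment; the field names are assumptions:

```python
# Sketch: a synchronization frame whose data part carries the code stream
# number the encoder reached for the truncated block; the decoder aligns
# its own expected stream number instead of waiting for data that will
# never arrive. Field names are illustrative.

def decoder_sync(expected_stream_no: int, sync_indication: dict) -> int:
    """Return the decoder's new expected stream number after a sync frame.

    sync_indication stands for the data part of the sync frame, assumed to
    carry the last stream number the encoder sent for the truncated block.
    """
    encoder_stream_no = sync_indication["stream_no"]
    # skip past anything the encoder will no longer send
    return max(expected_stream_no, encoder_stream_no + 1)

new_expected = decoder_sync(expected_stream_no=7, sync_indication={"stream_no": 9})
```

After this step the decoder would send feedback characterizing its synchronization state, as described above.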
  • the encoder may resend the target frame to the decoder, see below for details.
  • the method further includes:
  • the encoder resends the target frame to the decoder.
  • the encoder can send the decoder a target frame for requesting feedback information, and after receiving the target frame, the decoder sends feedback information to the encoder in response to the received target frame.
  • it is possible that the encoder does not receive the feedback information sent by the decoder, for example, because of channel congestion.
  • the encoder can resend the target frame for requesting feedback information to the decoder; after the decoder receives the target frame again, it can send the feedback information to the encoder again.
  • it should be noted that the content represented by the feedback information sent again by the decoder is consistent with the feedback information sent the first time (it can be understood that the decoder sends both in response to the same target frame, that is, the target frame the encoder sent to the decoder when the first image block was not completely sent at the end of the corresponding specified period). Since the first image block is not completely sent at the end of the corresponding specified period, the encoder will no longer send the unsent data in the first image block, and the feedback information represents the state or synchronization state of the video frame received by the decoder; therefore, the content represented by the feedback information sent again is consistent with the content represented by the feedback information sent the first time.
  • when the encoder does not receive the feedback information sent by the decoder, it can resend the target frame for requesting feedback information to the decoder, in order to successfully receive the feedback information, so that the encoder can synchronize the states of the encoder and decoder according to the feedback information, or learn the synchronization state of the decoder for the received video frames.
  • the rate at which the encoder resends the target frame to the decoder is lower than a first rate, where the first rate is the rate at which the encoder sends the target frame to the decoder when the first image block is not completely sent at the end of the corresponding specified period. For example, if the rate at which the encoder sends the target frame to the decoder is 12 Mbps when the first image block is not completely sent at the end of the corresponding specified period, the rate at which the encoder resends the target frame to the decoder may be 6 Mbps.
  • in other words, when resending, the rate at which the encoder sends the target frame to the decoder decreases.
  • because the rate at which the encoder resends the target frame to the decoder is lower than the first rate, the probability that the encoder successfully receives the feedback information can be increased. Further, the encoder can synchronize the states of the encoder and decoder according to the feedback information received in real time, or learn the synchronization state of the decoder for the received video frames.
  • the rate at which the encoder sends the target frame to the decoder is set to at least one of the following:
  • the expected rate calculated by the system, the rate at which the target frame was most recently successfully sent, or the rate mapping value of the most recently successfully sent target frame.
  • the rate at which the encoder sends the target frame to the decoder can be set as follows:
  • v1 can be set as the expected rate calculated by the system. For example, if the expected rate calculated by the system from multiple transmissions is 12 Mbps, the rate at which the encoder sends the target frame to the decoder may be set to 12 Mbps.
  • v2 can be set as the rate at which the encoder most recently successfully sent the target frame. For example, if the rate at which the encoder most recently successfully sent the target frame is 9 Mbps, the rate at which the encoder sends the target frame may be set to 9 Mbps.
  • v3 can be set as the rate mapping value of the target frame most recently successfully sent by the encoder. For example, if the rate at which the encoder most recently successfully sent the target frame in one protocol format (such as the HE format) is 9 Mbps, and the rate mapping value of 9 Mbps in another protocol format (such as the non-high-throughput (non-HT) format) is 6 Mbps, the rate at which the encoder sends the target frame may be set to 6 Mbps.
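The three options v1/v2/v3 above can be sketched as a selection function. The HE → non-HT mapping table below is an assumed example for illustration, not a table taken from any standard:

```python
# Sketch: choosing the target-frame send rate from the three options named
# above: system expected rate (v1), last successful send rate (v2), or the
# mapping of that rate into another protocol format (v3). The mapping table
# is an illustrative assumption.

HE_TO_NON_HT_MBPS = {9.0: 6.0, 12.0: 9.0}  # assumed rate mapping

def target_frame_rate(policy: str, expected_mbps: float, last_ok_mbps: float) -> float:
    if policy == "expected":      # v1: system-computed expected rate
        return expected_mbps
    if policy == "last_success":  # v2: most recent successful send rate
        return last_ok_mbps
    if policy == "mapped":        # v3: mapped value of the last success
        return HE_TO_NON_HT_MBPS.get(last_ok_mbps, last_ok_mbps)
    raise ValueError(policy)
```

The decoder-side rate choice for feedback frames (v1'/v2'/v3' below) has the same shape, with "last successful send" replaced by "last successful receive".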
  • in step S640, in response to the target frame received by the decoder, the decoder sends feedback information to the encoder, where the generation of the feedback information can be implemented at a certain layer in the decoder; see below for details.
  • the decoder sends the feedback information to the encoder, including:
  • a data link layer in the decoder generates the feedback information according to the target frame
  • the decoder sends the feedback information to the encoder.
  • after the decoder receives the target frame sent by the encoder, in response to the target frame, the data link layer in the decoder generates feedback information according to the target frame and sends the feedback information to the encoder.
  • FIG. 14 is a schematic diagram of another feedback method based on the TCP/IP communication protocol provided in an embodiment of the present application.
  • the encoder sends the target frame to the decoder after contending for the channel; the data link layer in the decoder receives the target frame through the physical layer, generates feedback information according to the target frame, and sends the feedback information to the encoder through the physical layer. Since the feedback information is generated at the data link layer in the decoder and sent to the encoder through the physical layer, the delay of transmitting the feedback information through the bidirectional network layer and bidirectional transport layer can be saved.
  • the data link layer in the decoder can still pass the target frame up to the network layer, and the network layer passes it to the transport layer, which facilitates processing such as video display services and sound signal transmission at the transport layer in the decoder.
  • the data link layer in the decoder generates feedback information based on the received target frame and sends it to the encoder; because the feedback information is generated at the data link layer in the decoder and sent to the encoder through the physical layer, the delay of transmitting the feedback information through the bidirectional network layer and bidirectional transport layer can be saved.
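The delay saving can be made concrete with a toy layer model. The per-layer delays below are placeholder numbers chosen only to show the structural comparison, not measured values:

```python
# Sketch: generating feedback at the data link layer skips the network and
# transport layers in both directions. Per-layer delays are placeholder
# assumptions; only the comparison matters.

LAYER_DELAY_US = {"data_link": 10.0, "network": 30.0, "transport": 50.0}

def feedback_path_delay(generate_at: str) -> float:
    """Round-trip layer-processing delay on the decoder side for feedback
    generated at the given layer (physical-layer time ignored)."""
    stack = ["data_link", "network", "transport"]
    used = stack[: stack.index(generate_at) + 1]
    return 2 * sum(LAYER_DELAY_US[layer] for layer in used)  # up + down

saved = feedback_path_delay("transport") - feedback_path_delay("data_link")
```

With these placeholder numbers, data-link-layer generation avoids the two network-layer and two transport-layer traversals entirely.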
  • the rate at which the decoder sends the feedback information to the encoder is set to at least one of the following:
  • the expected rate calculated by the system, the rate at which a video frame sent by the encoder was most recently successfully received, or the rate mapping value of the most recently successfully received video frame sent by the encoder.
  • the rate at which the decoder sends feedback information to the encoder can be set as follows:
  • v1' can be set as the expected rate calculated by the system. For example, if the expected rate calculated by the system from the rates of multiple feedback transmissions is 12 Mbps, the rate at which the decoder sends feedback information to the encoder can be set to 12 Mbps.
  • v2' can be set as the rate at which the decoder most recently successfully received a video frame sent by the encoder. For example, if that rate is 9 Mbps, the rate at which the decoder sends feedback information to the encoder may be set to 9 Mbps.
  • v3' can be set as the rate mapping value of the video frame most recently successfully received from the encoder. For example, if the rate at which the decoder most recently successfully received a video frame sent by the encoder in one protocol format (such as the HE format) is 9 Mbps, and the rate mapping value of 9 Mbps in another protocol format (such as the non-HT format) is 6 Mbps, the rate at which the decoder sends feedback information to the encoder can be set to 6 Mbps.
  • the electronic device includes hardware and/or software modules corresponding to each function.
  • the present application can be implemented in the form of hardware or a combination of hardware and computer software. Whether a certain function is executed by hardware or by computer software driving hardware depends on the specific application and design constraints of the technical solution. Those skilled in the art may use different methods to implement the described functions in combination with the embodiments for each specific application, but such implementation should not be regarded as exceeding the scope of the present application.
  • the functional modules of the electronic device may be divided according to the above method example.
  • each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module.
  • the above integrated modules may be implemented in the form of hardware. It should be noted that the division of modules in this embodiment is schematic, and is only a logical function division, and there may be other division methods in actual implementation.
  • FIG. 15 shows a possible composition diagram of the electronic device 1500 involved in the above embodiment.
  • the electronic device 1500 may include: a generation module 1510 and communication module 1520.
  • the generation module 1510 may be used to support the electronic device 1500 to execute the above step S610, etc., and/or be used in other processes of the technologies described herein.
  • the communication module 1520 may be used to support the electronic device 1500 to execute the above step S620, etc., and/or other processes for the technologies described herein.
  • the electronic device provided in this embodiment is used to execute the above-mentioned method of the present application, so the same effect as the above-mentioned implementation method can be achieved.
  • FIG. 16 shows a schematic diagram of a possible composition of an electronic device 1600 involved in the foregoing embodiments.
  • the electronic device 1600 may include: a communication module 1610 .
  • the communication module 1610 may be used to support the electronic device 1600 to execute the above step S630 or S640, etc., and/or other processes for the technologies described herein.
  • the electronic device may include a processing module, a memory module and a communication module.
  • the processing module can be used to control and manage the actions of the electronic device, for example, it can be used to support the electronic device to execute the steps performed by the above-mentioned units.
  • the memory module can be used to support electronic devices to execute stored program codes and data, and the like.
  • the communication module can be used to support the communication between the electronic device and other devices.
  • the processing module may be a processor or a controller. It can implement or execute the various illustrative logical blocks, modules and circuits described in connection with the present disclosure.
  • the processor can also be a combination of computing functions, such as a combination of one or more microprocessors, a combination of digital signal processing (digital signal processing, DSP) and a microprocessor, and the like.
  • the storage module may be a memory.
  • the communication module may be a device that interacts with other electronic devices, such as a radio frequency circuit, a Bluetooth chip, and a Wi-Fi chip.
  • the electronic device involved in this embodiment may be a device having the structure shown in FIG. 1 .
  • FIG. 17 shows a schematic diagram of another possible composition of the electronic device 800 involved in the above embodiments. As shown in FIG. 17, the electronic device 800 may include a communication unit 810, an input unit 820, a processing unit 830, an output unit (which may also be referred to as a display unit) 840, a peripheral interface 850, a storage unit 860, a power supply 870, a video encoder/decoder 880, and an audio encoder/decoder 890.
  • the communication unit 810 is used to establish a communication channel, so that the electronic device 800 can connect to a remote server through the communication channel, and download media data from the remote server.
  • the communication unit 810 may include communication modules such as a WLAN module, a Bluetooth module, an NFC module, and a baseband module, and a radio frequency (Radio Frequency, referred to as RF) circuit corresponding to the communication module, for performing wireless local area network communication, Bluetooth communication, NFC communication, infrared communication and/or cellular communication system communication, such as wideband code division multiple access (W-CDMA) and/or high speed downlink packet access (HSDPA).
  • the communication module 810 is used to control the communication of various components in the electronic device, and can support direct memory access.
  • the input unit 820 may be used to implement user interaction with the electronic device and/or input information into the electronic device.
  • the input unit may be a touch panel, or other human-computer interaction interface, such as physical input keys, a microphone, etc., or other external information acquisition devices, such as a camera.
  • the processing unit 830 is the control center of the electronic device, which can use various interfaces and lines to connect various parts of the entire electronic device, by running or executing software programs and/or modules stored in the storage unit, and calling the stored in the storage unit data to perform various functions of electronic devices and/or to process data.
  • the output unit 840 includes but not limited to an image output unit and a sound output unit.
  • the image output unit is used for outputting text, pictures and/or videos.
  • the touch panel adopted by the input unit 820 may also serve as the display panel of the output unit 840 at the same time. For example, when the touch panel detects a touch or approach gesture operation on it, it transmits to the processing unit to determine the type of the touch event, and then the processing unit provides corresponding visual output on the display panel according to the type of the touch event.
  • the input unit 820 and the output unit 840 are used as two independent components to realize the input and output functions of the electronic device, in some embodiments, the touch panel and the display panel can be integrated to realize Input and output functions of electronic devices.
  • the image output unit can display various graphical user interfaces as virtual control components, including but not limited to windows, scrolling shafts, icons, and scrapbooks, for users to operate by touch.
  • the storage unit 860 can be used to store software programs and modules, and the processing unit executes various functional applications of the electronic device and implements data processing by running the software programs and modules stored in the storage unit.
  • Video encoder/decoder 880 and audio encoder/decoder 890 can encode or decode files, so as to implement the methods in the above-mentioned embodiments.
  • This embodiment also provides a computer storage medium, in which computer instructions are stored, and when the computer instructions are run on an electronic device or a processor, the electronic device or the processor executes the above-mentioned related method steps to realize the above-mentioned embodiment method in .
  • This embodiment also provides a computer program product, which, when running on a computer or a processor, causes the computer or processor to execute the above-mentioned related steps, so as to implement the method in the above-mentioned embodiment.
  • an embodiment of the present application also provides a device, which may specifically be a chip, a component or a module, and the device may include a connected processor and a memory; wherein the memory is used to store computer-executable instructions, and when the device is running, The processor can execute the computer-executable instructions stored in the memory, so that the chip executes the methods in the foregoing method embodiments.
  • the electronic device, computer storage medium, computer program product or chip provided in this embodiment is all used to execute the corresponding method provided above, therefore, the beneficial effects it can achieve can refer to the corresponding method provided above The beneficial effects in the method will not be repeated here.
  • the disclosed devices and methods may be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of modules or units is only a logical function division; in actual implementation, there may be other division methods. For example, multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.
  • a unit described as a separate component may or may not be physically separated, and a component shown as a unit may be one physical unit or multiple physical units, which may be located in one place or distributed to multiple different places. Part or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units can be implemented in the form of hardware or in the form of software functional units.
  • an integrated unit is realized in the form of a software function unit and sold or used as an independent product, it can be stored in a readable storage medium.
  • the technical solution of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, can be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions to cause a device (which may be a single-chip microcomputer, a chip, etc.) or a processor (processor) to execute all or part of the steps of the methods in the various embodiments of the present application.
  • the aforementioned storage medium includes: various media that can store program codes such as U disk, mobile hard disk, read-only memory (read-only memory, ROM), random access memory (random access memory, RAM), magnetic disk or optical disk.

Abstract

The present application provides a video transmission method and an electronic device. The method is applied to an encoder and includes: the encoder generates a video frame, where the video frame includes multiple image blocks, and each of the multiple image blocks is sent within its own specified period; if a first image block among the multiple image blocks is not completely sent at the end of its corresponding specified period, the encoder sends a target frame to the decoder, where the target frame is used to request feedback information from the decoder. The solution provided by the present application can reduce the delay of the decoder's feedback information and synchronize the states of the encoder and the decoder.

Description

Video transmission method and electronic device

Technical Field

The present application relates to the field of electronic technology, and more specifically, to a video transmission method and an electronic device.

Background

In video signal transmission scenarios in the wireless communication field, feedback information can be based on the transmission control protocol/internet protocol (TCP/IP): after accessing the channel, the encoder sends data in layers; the decoder receives the data and, after a certain delay, accesses the channel to feed back the reception status.

The decoder's feedback of the reception status can be implemented in the following way, which uses block acknowledgment (block ack) for feedback. First, the encoder can initiate a request to send/clear to send (RTS/CTS), or NULL/(action, ACK), short-frame transaction to protect a transmission opportunity (TXOP) sequence. Then, within the TXOP, the encoder can send multiple frames of code stream data consecutively, setting the acknowledgment policy field of the medium access control (MAC) frames of the data to block ack, and after sending the multiple frames of code stream data, request the feedback information for all the code stream data with one block ack request (BAR) frame. The entire interaction can take place within the same TXOP, saving the time for the encoder and decoder to re-contend for the channel. However, in this approach the encoder can send the BAR frame requesting feedback information to the decoder only after successfully sending the multiple frames of code stream data, which is a delayed acknowledgment; the encoder therefore cannot obtain the decoder's reception status in real time, so the states of the encoder and the decoder cannot be synchronized.
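The delayed nature of the block-ack scheme, compared with the per-incomplete-block probe of this application, can be sketched in terms of how many frames go out before the first feedback opportunity. This is a purely illustrative count, ignoring RTS/CTS and interframe spacing:

```python
# Sketch: frames sent before the first feedback opportunity, contrasting
# the block-ack scheme (one BAR after N data frames) with a probe sent as
# soon as a block misses its period deadline. Purely illustrative.

def frames_before_feedback_block_ack(n_data_frames: int) -> int:
    # all N data frames go out first, then one BAR requests the feedback
    return n_data_frames + 1

def frames_before_feedback_probe(frames_until_incomplete: int) -> int:
    # a probe frame is sent the moment a block misses its period deadline
    return frames_until_incomplete + 1
```

The smaller the number of frames before feedback, the sooner the encoder learns the decoder's state and can resynchronize.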
Summary

The present application provides a video transmission method and an electronic device, which can reduce the delay of the decoder's feedback information and synchronize the states of the encoder and the decoder.

In a first aspect, a video transmission method is provided. The method is applied to an encoder and includes:

the encoder generates a video frame, where the video frame includes multiple image blocks, and each of the multiple image blocks is sent within its own specified period;

if a first image block among the multiple image blocks is not completely sent at the end of its corresponding specified period, the encoder sends a target frame to the decoder, where the target frame is used to request feedback information from the decoder.

It should be noted that, in the embodiments of the present application, a first image block being not completely sent at the end of its corresponding specified period can be understood as: by the end of the specified period corresponding to the first image block, part of the data in the first image block has not been sent to the decoder.

In the solution provided by the present application, when the encoder determines that a first image block among the multiple image blocks of the video frame is not completely sent at the end of its corresponding specified period, it sends the decoder a target frame for requesting feedback information. Because the encoder actively sends the target frame to the decoder to request feedback information when it makes this determination, without waiting until multiple video frames have been completely sent before sending a BAR frame requesting feedback information, the delay of the decoder's feedback information can be reduced; the encoder can then synchronize the states of the encoder and decoder according to the received feedback information, or learn the decoder's synchronization state for the received video frames.
With reference to the first aspect, in some possible implementations, the target frame is a probe frame used to instruct the decoder to feed back the state of the image blocks of the video frame received by the decoder, and the feedback information is used to characterize the state of the video frame received by the decoder; or,

the target frame is a synchronization frame used to instruct the decoder to perform synchronization processing on the received video frames, and the feedback information is used to characterize the decoder's synchronization state for the received video frames.

In the solution provided by the present application, different contents indicated by the target frame sent by the encoder correspond to different contents characterized by the feedback information sent by the decoder. Specifically, if the target frame is a probe frame used to instruct the decoder to feed back the state of the image blocks of the received video frame, the feedback information characterizes the state of the video frame received by the decoder; if the target frame is a synchronization frame used to instruct the decoder to perform synchronization processing on the received video frames, the feedback information characterizes the decoder's synchronization state for the received video frames. Whatever the target frame indicates and the feedback information characterizes, the purpose is to synchronize the states of the video frames on the encoder and decoder sides; the difference lies in which entity performs the synchronization (if the target frame is a probe frame, the encoder synchronizes the video frame states on both sides; if the target frame is a synchronization frame, the decoder does), which improves the flexibility of synchronizing the video frame states on both sides.
With reference to the first aspect, in some possible implementations, the target frame is a medium access control (MAC) frame.

With reference to the first aspect, in some possible implementations, when the target frame is the probe frame, the target frame is any one of a control frame, a data frame, or a management frame;

when the target frame is the synchronization frame, the target frame is one of a data frame or a management frame.

In the solution provided by the present application, the specific format of the target frame depends on what the target frame indicates: if the target frame is a probe frame used to instruct the decoder to feed back the state of the image blocks of the received video frame, it may be any one of a control frame, a data frame, or a management frame; if the target frame is a synchronization frame used to instruct the decoder to perform synchronization processing on the received video frames, it may be one of a data frame or a management frame. This facilitates synchronizing the states of the video frames on the encoder and decoder sides.
With reference to the first aspect, in some possible implementations, the control frame includes a short probe frame;

the data frame includes a quality-of-service (QoS) null frame or a QoS data frame;

the management frame includes an Action No Ack frame or an acknowledgment frame.

With reference to the first aspect, in some possible implementations, when the target frame is the probe frame, the frame header of the MAC frame carries a private value, and the private value is used to instruct the encoder to request feedback information from the decoder.

In the solution provided by the present application, when the target frame is a probe frame, the frame header of the MAC frame carries a private value used to instruct the encoder to request feedback information from the decoder. After receiving the probe frame, because the probe frame carries the private value instructing the decoder to feed back information, the decoder can immediately send the encoder feedback information characterizing the state of the received video frames, and the encoder can synchronize its state with the decoder according to the feedback information, thereby ensuring the synchronization effect of image transmission.
With reference to the first aspect, in some possible implementations, the target frame is the control frame, and the control field of the frame header of the control frame is set to the private value; or,

the target frame is the data frame, and the high throughput control (HTC) field of the frame header of the data frame is set to the private value; or,

the target frame is the management frame, and the HTC field of the frame header of the management frame is set to the private value, or,

the first field in the payload of the management frame is set to the private value.

With reference to the first aspect, in some possible implementations, when the target frame is the synchronization frame, the data part of the MAC frame carries indication information, and the indication information is used to instruct the decoder to perform synchronization processing on the received video frames.

In the solution provided by the present application, when the target frame is a synchronization frame, the data part of the MAC frame carries indication information used to instruct the decoder to perform synchronization processing on the received video frames. After receiving the synchronization frame, because the synchronization frame carries this indication information, the decoder can synchronize with the state of the encoder according to the synchronization frame, thereby ensuring the synchronization effect of image transmission.
With reference to the first aspect, in some possible implementations, if the encoder does not receive the feedback information, the method further includes:

the encoder resends the target frame to the decoder.

In the solution provided by the present application, when the encoder does not receive the feedback information sent by the decoder, it can resend the target frame for requesting feedback information to the decoder, in order to successfully receive the feedback information sent by the decoder, so that the encoder can synchronize the states of the encoder and decoder according to the feedback information, or learn the decoder's synchronization state for the received video frames.

With reference to the first aspect, in some possible implementations, the rate at which the encoder resends the target frame to the decoder is lower than a first rate, where the first rate is the rate at which the encoder sends the target frame to the decoder when the first image block is not completely sent at the end of its corresponding specified period.

In the solution provided by the present application, because the rate at which the encoder resends the target frame to the decoder is lower than the first rate, the probability that the encoder successfully receives the feedback information can be increased; further, the encoder can synchronize the states of the encoder and decoder according to the feedback information received in real time, or learn the decoder's synchronization state for the received video frames.
With reference to the first aspect, in some possible implementations, the rate at which the encoder sends the target frame to the decoder is set to at least one of the following:

the expected rate calculated by the system, the rate at which the target frame was most recently successfully sent, or the rate mapping value of the most recently successfully sent target frame.

With reference to the first aspect, in some possible implementations, the image block includes a tile or a slice.
In a second aspect, a video transmission method is provided. The method is applied to a decoder and includes:

the decoder receives a target frame, where the target frame is used to request feedback information and is sent by the encoder when a first image block among multiple image blocks is not completely sent at the end of its corresponding specified period;

in response to the target frame, the decoder sends the feedback information to the encoder.

In the solution provided by the present application, the decoder sends feedback information to the encoder in response to receiving the target frame. Because the target frame is sent when the encoder determines that the first image block is not completely sent at the end of its corresponding specified period, the delay of the decoder's feedback information can be reduced, facilitating synchronization of the states on the encoder and decoder sides.
With reference to the second aspect, in some possible implementations, the target frame is a probe frame used to instruct the decoder to feed back the state of the image blocks of the received video frame, and the feedback information is used to characterize the state of the video frame received by the decoder; or,

the target frame is a synchronization frame used to instruct the decoder to perform synchronization processing on the received video frames, and the feedback information is used to characterize the decoder's synchronization state for the received video frames.

In the solution provided by the present application, different contents indicated by the target frame correspond to different contents characterized by the feedback information sent by the decoder. Specifically, if the target frame is a probe frame used to instruct the decoder to feed back the state of the image blocks of the received video frame, the feedback information characterizes the state of the video frame received by the decoder; if the target frame is a synchronization frame used to instruct the decoder to perform synchronization processing on the received video frames, the feedback information characterizes the decoder's synchronization state for the received video frames. Whatever the target frame indicates and the feedback information characterizes, the purpose is to synchronize the states of the video frames on the encoder and decoder sides; the difference lies in which entity performs the synchronization (if the target frame is a probe frame, the encoder synchronizes the video frame states on both sides; if the target frame is a synchronization frame, the decoder does), which improves the flexibility of synchronizing the video frame states on both sides.
With reference to the second aspect, in some possible implementations, the target frame is a medium access control (MAC) frame.

With reference to the second aspect, in some possible implementations, when the target frame is the probe frame, the target frame is any one of a control frame, a data frame, or a management frame;

when the target frame is the synchronization frame, the target frame is one of a data frame or a management frame.

In the solution provided by the present application, the specific format of the target frame depends on what the target frame indicates: if the target frame is a probe frame used to instruct the decoder to feed back the state of the image blocks of the received video frame, it may be any one of a control frame, a data frame, or a management frame; if the target frame is a synchronization frame used to instruct the decoder to perform synchronization processing on the received video frames, it may be one of a data frame or a management frame. This facilitates synchronizing the states of the video frames on the encoder and decoder sides.

With reference to the second aspect, in some possible implementations, the control frame includes a short probe frame;

the data frame is a quality-of-service (QoS) null frame or a QoS data frame;

the management frame is an Action No Ack frame or an acknowledgment frame.
With reference to the second aspect, in some possible implementations, when the target frame is the probe frame, the frame header of the MAC frame carries a private value, and the private value is used to instruct the encoder to request feedback information from the decoder.

In the solution provided by the present application, when the target frame is a probe frame, the frame header of the MAC frame carries a private value used to instruct the encoder to request feedback information from the decoder. After receiving the probe frame, because the probe frame carries the private value instructing the decoder to feed back information, the decoder can immediately send the encoder feedback information characterizing the state of the received video frames, and the encoder can synchronize its state with the decoder according to the feedback information, thereby ensuring the synchronization effect of image transmission.

With reference to the second aspect, in some possible implementations, the target frame is the control frame, and the control field of the frame header of the control frame is set to the private value; or,

the target frame is the data frame, and the high throughput control (HTC) field of the frame header of the data frame is set to the private value; or,

the target frame is the management frame, and the HTC field of the frame header of the management frame is set to the private value, or,

the first field in the payload of the management frame is set to the private value.

With reference to the second aspect, in some possible implementations, when the target frame is the synchronization frame, the data part of the MAC frame carries indication information, and the indication information is used to instruct the decoder to perform synchronization processing on the received video frames.

In the solution provided by the present application, when the target frame is a synchronization frame, the data part of the MAC frame carries indication information used to instruct the decoder to perform synchronization processing on the received video frames. After receiving the synchronization frame, because the synchronization frame carries this indication information, the decoder can synchronize with the state of the encoder according to the synchronization frame, thereby ensuring the synchronization effect of image transmission.
With reference to the second aspect, in some possible implementations, the decoder sending the feedback information to the encoder includes:

the data link layer in the decoder generates the feedback information according to the target frame;

the decoder sends the feedback information to the encoder.

In the solution provided by the present application, the data link layer in the decoder generates the feedback information according to the received target frame and sends it to the encoder. Because the feedback information is generated at the data link layer in the decoder and sent to the encoder through the physical layer, the delay of transmitting the feedback information through the bidirectional network layer and bidirectional transport layer can be saved.

With reference to the second aspect, in some possible implementations, the rate at which the decoder sends the feedback information to the encoder is set to at least one of the following:

the expected rate calculated by the system, the rate at which a video frame sent by the encoder was most recently successfully received, or the rate mapping value of the most recently successfully received video frame sent by the encoder.
In a third aspect, an apparatus is provided. The apparatus is included in an electronic device and has the functionality to implement the behavior of the electronic device in the above aspects and their possible implementations. The functionality may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules or units corresponding to the above functionality.

In a fourth aspect, an electronic device is provided, including: one or more processors; a memory; one or more application programs; and one or more computer programs. The one or more computer programs are stored in the memory and include instructions. When the instructions are executed by the electronic device, the electronic device is caused to perform the method in any possible implementation of the first aspect or the second aspect.

In a fifth aspect, a video transmission apparatus is provided. The apparatus includes at least one processor; when program instructions are executed in the at least one processor, the method in any possible implementation of the first aspect or the second aspect is implemented.

In a sixth aspect, a computer storage medium is provided, including computer instructions. When the computer instructions run on an electronic device or a processor, the electronic device or the processor is caused to perform the method in any possible implementation of the first aspect or the second aspect.

In a seventh aspect, a computer program product is provided. When the computer program product runs on an electronic device or a processor, the electronic device or the processor is caused to perform the method in any possible design of the first aspect or the second aspect.
附图说明
图1是本申请实施例提供的一种电子设备的硬件结构示意图。
图2是本申请实施例提供的一种电子设备的软件结构示意图。
图3a是本申请实施例应用的一种场景示意图。
图3b是本申请实施例应用的另一种场景示意图。
图3c是本申请实施例应用的再一种场景示意图。
图4是本申请提供的一种基于TCP/IP通信协议的反馈方式的示意图。
图5是本申请提供的一种编码器和解码器的帧交互的示意图。
图6是本申请实施例提供的一种视频传输的方法的示意图。
图7是本申请实施例提供的一种编码器和解码器的帧交互的示意图。
图8是本申请实施例提供的一种MAC帧格式的示意图。
图9是本申请实施例提供的另一种编码器和解码器的帧交互的示意图。
图10是本申请实施例提供的再一种编码器和解码器的帧交互的示意图。
图11是本申请实施例提供的又一种编码器和解码器的帧交互的示意图。
图12是本申请实施例提供的又一种编码器和解码器的帧交互的示意图。
图13是本申请实施例提供的又一种编码器和解码器的帧交互的示意图。
图14是本申请实施例提供的另一种基于TCP/IP通信协议的反馈方式的示意图。
图15是本申请实施例提供的另一种电子设备的示意性框图。
图16是本申请实施例提供的再一种电子设备的示意性框图。
图17是本申请实施例提供的又一种电子设备的示意性框图。
具体实施方式
下面将结合附图,对本申请中的技术方案进行描述。
本申请实施例中的方法可以应用于智能手机、平板电脑、可穿戴设备、车载设备、增强现实(augmented reality,AR)/虚拟现实(virtual reality,VR)设备、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本、个人数字助理(personal digital assistant,PDA)等电子设备上,本申请实施例对电子设备的具体类型不作任何限制。
以下,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征。
本申请实施例提供的视频传输的方法可以应用于手机、平板电脑、可穿戴设备、车载设备、增强现实(augmented reality,AR)/虚拟现实(virtual reality,VR)设备、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本、个人数字助理(personal digital assistant,PDA)等电子设备上,本申请实施例对电子设备的具体类型不作任何限制。
示例性的,图1示出了电子设备100的结构示意图。电子设备100可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。其中传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M等。
可以理解的是,本申请实施例示意的结构并不构成对电子设备100的具体限定。在本申请另一些实施例中,电子设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器 (application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
其中,控制器可以是电子设备100的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
可以理解的是,本申请实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备100的结构限定。在本申请另一些实施例中,电子设备100也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
电子设备100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。电子设备100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在电子设备100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器170A,受话器170B等)输出声音信号,或通过显示屏194显示图像或视频。在一些实施例中,调制解调处理器可以 是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器110,与移动通信模块150或其他功能模块设置在同一个器件中。
无线通信模块160可以提供应用在电子设备100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,电子设备100的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得电子设备100可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)。
电子设备100通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。显示屏194用于显示图像,视频等。
电子设备100可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,电子设备100可以包括1个或N个摄像头193,N为大于1的正整数。
数字信号处理器用于处理数字信号，除了可以处理数字图像信号，还可以处理其他数字信号。例如，当电子设备100在频点选择时，数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。电子设备100可以支持一种或多种视频编解码器。这样，电子设备100可以播放或录制多种编码格式的视频，例如：动态图像专家组(moving picture experts group,MPEG)1，MPEG2，MPEG3，MPEG4等。本申请实施例中的发送目标帧和响应于目标帧发送反馈信息，可分别通过视频编码器和视频解码器实现。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备100的智能认知等应用,例如:图像识别,人脸识别,语音识别,文本理解等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令,从而执行电子设备100的各种功能应用以及数据处理。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储电子设备100使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
电子设备100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
电子设备100的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本申请实施例以分层架构的Android系统为例,示例性说明电子设备100的软件结构。
图2是本申请实施例的电子设备100的软件结构框图。分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统分为四层,从上至下分别为应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,以及内核层。应用程序层可以包括一系列应用程序包。
如图2所示,应用程序包可以包括相机,图库,日历,通话,地图,导航,WLAN,蓝牙,音乐,视频,短信息等应用程序。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
如图2所示,应用程序框架层可以包括窗口管理器,内容提供器,视图系统,电话管理器,资源管理器,通知管理器等。
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。
内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。所述数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成的。例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(media libraries),三维图形处理库(例如:OpenGL ES),2D图形引擎(例如:SGL)等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。
2D图形引擎是2D绘图的绘图引擎。
图3a、图3b、图3c均是本申请实施例可以适用的示意性场景图。本申请实施例中，可以是第一电子设备向第二电子设备投屏，或者第一电子设备借助第二电子设备的摄像头实现视频通话、拍摄等操作。其中，第一电子设备可以和第二电子设备处于同一个局域网内，如通过WiFi-P2P建立连接，或者第一电子设备可以和第二电子设备通过蓝牙、超宽带(ultra wide band,UWB)、有线连接等连接方式建立连接。
参见图3a,第一电子设备可以是手机300a,第二电子设备可以是PC300b,当手机300a向PC300b投屏时,将手机300a的显示界面310的内容显示在PC的桌面的窗口中。可选地,在显示区域的上方还可以包括标题栏部分,该标题栏部分可以包括应用的名称“华为视频”,以及用于控制界面显示的最小化、最大化和关闭的控件等。
为了保证用户体验,一般会保证手机中的画面可以实时和流畅的显示在PC的屏幕中,即对投屏的时延要求较高。
参见图3b,第一电子设备可以是手机330a,第二电子设备可以是PC330b,当手机330a向PC330b投屏时,手机中的视频画面340被传输至PC330b中全屏显示,例如,手机中显示的为游戏画面,当手机向PC投屏时,将该游戏的画面实时在PC中显示,则用户希望该游戏画面投屏至PC中显示的实时性较好,即希望投屏的时延较低。
参见图3c,第一电子设备可以是手机360a,第二电子设备可以是智慧屏360b,在手机360a与智慧屏360b建立连接的情况下,手机360a可以借助智慧屏360b的摄像头进行视频通话、拍照等操作。
例如,手机360a借助智慧屏360b的摄像头进行视频通话,并将视频通话的画面显示在智慧屏360b的屏幕中,如智慧屏全屏显示本端摄像头拍摄的画面371,将对端的显示画面372以悬浮窗口的形式显示,或者,将本端和对端的视频画面分屏显示等。
在上述视频传输的场景下，当处在同一个局域网下的多个设备同时进行高吞吐率的操作时，如多个设备同时进行投屏操作，可能会导致卡顿、时延高等问题，第二电子设备可以通过反馈信息以使得第一电子设备调整传输算法。
在无线通信领域中视频信号传输的场景中,可以基于TCP/IP反馈信息,编码器(可以理解为上述场景中的第一电子设备中的器件)接入信道后分层发送数据,解码器(可以理解为上述场景中的第二电子设备中的器件)接收数据,并在一定延时后接入信道反馈接收状态。
如图4所示,为本申请提供的一种基于TCP/IP通信协议的反馈方式的示意图。
参考图4,编码器和解码器主要包括运输层、网络层、数据链路层、物理层等部分,每一层级均为其上一层级提供数据通信服务,运输层服务于应用层。编码器竞争到信道后发送码流数据,解码器收到码流数据后,做上层信息处理,并在重新竞争到的信道上向底层下发处理后的反馈信息。
如图5所示,为本申请提供的一种编码器和解码器的帧交互的示意图。该示意图所示的帧交互采用的是块确认的方式反馈的。该方式需维护位映射查找表(bit map look up table,bitmap LUT)表缓存状态,且需保证解码侧的数据链路层的接收状态和运输层的接收状态匹配,避免数据链路层的误反馈。
参考图5,首先编码器可以发起RTS/CTS或NULL/ACK的短帧交易,保护一段TXOP序列,之后编码器可在该TXOP内连发多帧码流数据,将数据的MAC帧的确认策略字段设为block ack,在发完多帧码流数据后用一个BAR帧请求所有码流数据的反馈信息(即BA帧)。整个交互过程可以在同一TXOP内交互,节省了编码器和解码器两端重新竞争信道的时间。然而该方式下,编码器在成功发送完多帧码流数据后才可向解码器发送用于请求反馈信息的BAR帧,属于延时确认方式,从而编码器无法实时获取解码器的接收状态,导致编码器与解码器之间的状态无法同步。
因此,本申请提供一种视频传输的方法,可以减少解码器反馈信息的延时,同步编码器和解码器的状态。
如图6所示,为本申请实施例提供的一种视频传输的方法600的示意图。该方法600可以包括步骤S610-S640,该方法600可以由编码器和解码器实现,其中,步骤S610-S620可以由编码器实现,步骤S630-S640可以由解码器实现。
S610,生成视频帧,所述视频帧包括多个图像块,所述多个图像块中的每一个图像块在各自指定周期内发送。
本申请实施例中，编码器生成的视频帧可以被划分为多个图像块，可选地，在一些实施例中，这多个图像块可以包括瓦片(tile)或条带片段(slice segment,SS)(可简称为slice)。其中，通过对视频帧进行水平或垂直划分可以得到呈矩形的瓦片，对视频帧进行不规则划分可以得到呈条带形的条带片段。
需要说明的是,本申请实施例中的视频帧可以被均匀划分,也可以不被均匀划分,不予限制。
此外,这多个图像块中的每一个图像块在各自指定周期内发送可以理解为:对于视频帧包括的这多个图像块中的每一个图像块,编码器均配置了周期,编码器可以在配置的对应周期内将这每一个图像块发送至解码器。如图7所示,为本申请实施例提供的一种编码器和解码器的帧交互的示意图。
参考图7，该视频帧包括3个图像块，分别为图像块1、图像块2和图像块3，且编码器分别为这3个图像块配置了相应的周期，如为图像块1配置的是第一周期，为图像块2配置的是第二周期，为图像块3配置的是第三周期，则编码器可以在第一周期内发送图像块1，在第二周期内发送图像块2，在第三周期内发送图像块3。
应理解,上述视频帧包括的图像块的数量仅为示例性说明,还可以为其它数值,不应对本申请造成特别限定。
S620,若所述多个图像块中的第一图像块在对应指定周期的结束时刻未完全发送,向所述解码器发送目标帧,所述目标帧用于向所述解码器请求反馈信息。
本申请实施例中,若多个图像块中的第一图像块在对应指定周期的结束时刻未完全发送,则编码器向解码器发送用于请求反馈信息的目标帧。仍然参考上述图7,该视频帧包括3个图像块,分别为图像块1、图像块2和图像块3,假设在第一周期的结束时刻图像块1被完全发送至解码器,则编码器可以不用向解码器发送目标帧;假设在第二周期的结束时刻图像块2未被完全发送至解码器,则编码器可以在第二周期的结束时刻向解码器发送目标帧。
需要说明的是,本申请实施例中的第一图像块在对应指定周期的结束时刻未完全发送可以理解为:在第一图像块对应的指定周期的结束时刻为止,该第一图像块中的部分数据并未发送至解码器。
此外,本申请实施例中,若多个图像块中的第一图像块在对应指定周期的结束时刻未完全发送时,编码器向解码器发送目标帧,在一些可能的实施例中,编码器向解码器发送目标帧的时刻可能会有延迟,不予限制。
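上述"在各自指定周期内发送图像块、并在某一图像块于周期结束时刻未完全发送时触发目标帧"的流程，可以用如下示意性代码概括（这只是一个假设性的简化模拟：周期长度、图像块大小、每周期可发送的数据量等数值均为假设值，并非本申请的实际实现）：

```python
def schedule_blocks(block_sizes, period_budget):
    """block_sizes: 各图像块的字节数; period_budget: 假设的每周期可发送字节数。
    返回每个图像块在其指定周期结束时刻是否完全发送；
    未完全发送(False)的图像块即触发编码器向解码器发送目标帧。"""
    results = []
    for size in block_sizes:
        sent = min(size, period_budget)   # 周期内最多只能发送 period_budget 字节
        results.append(sent == size)      # True 表示该图像块在周期内完全发送
    return results

# 对应图7的例子：图像块2过大，在第二周期的结束时刻未完全发送
blocks = [1000, 2500, 800]                # 假设的图像块大小（字节）
flags = schedule_blocks(blocks, period_budget=2000)
print(flags)  # [True, False, True] -> 图像块2需触发目标帧
```

其中图像块2对应的False即对应上文"在第二周期的结束时刻向解码器发送目标帧"的情形。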
S630,接收目标帧。
S640,响应于所述解码器接收的所述目标帧,向编码器发送所述反馈信息。
本申请实施例中,在解码器接收到用于向其请求反馈信息的目标帧后,响应于接收的目标帧,解码器向编码器发送反馈信息。
本申请提供的方案,编码器在确定视频帧包括的多个图像块中的第一图像块在对应指定周期的结束时刻未完全发送,向解码器发送用于请求反馈信息的目标帧,解码器响应于接收该目标帧,向编码器发送反馈信息。由于编码器在确定第一图像块在对应指定周期的结束时刻未完全发送时主动向解码器发送目标帧,以请求解码器发送反馈信息,无需等到多个视频帧发送完成后才可向解码器发送用于请求反馈信息的BAR帧,可以减少解码器反馈信息的延时,从而编码器可以根据实时接收的反馈信息同步编码器和解码器的状态,或,了解解码器对接收的视频帧的同步状态。
上文步骤S640中指出,响应于解码器接收的目标帧,解码器向编码器发送反馈信息,其中,反馈信息所表征的内容与目标帧所指示的内容有关,具体请参考下文。
可选地,在一些实施例中,所述目标帧为探测帧,所述探测帧用于指示所述解码器反馈所述解码器接收所述视频帧的图像块的状态,所述反馈信息用于表征所述解码器接收的所述视频帧的状态;或者,
所述目标帧为同步帧,所述同步帧用于指示所述解码器对接收的视频帧进行同步处理,所述反馈信息用于表征所述解码器对接收的视频帧的同步状态。
本申请实施例中，若所述目标帧为用于指示所述解码器反馈所述解码器接收所述视频帧的图像块的状态的探测帧，所述反馈信息用于表征所述解码器接收的所述视频帧的状态。示例性地，仍然以上述图7中视频帧包括3个图像块为例，假设图像块2在第二周期内的结束时刻未完全发送至解码器，则编码器向解码器发送探测帧，以向解码器请求解码器此时所接收到的视频帧的状态，解码器在接收到探测帧后，响应于接收该探测帧，向编码器发送此时所接收到的视频帧的状态，如解码器在当前时刻接收到了图像块2中的部分数据，便向编码器发送当前时刻已接收的数据的情况，便于编码器同步与解码器的状态。对于图像块2中未发送的数据，编码器不再发送。
若所述目标帧为用于指示所述解码器对接收的视频帧进行同步处理的同步帧,所述反馈信息用于表征所述解码器接收的视频帧的同步状态。示例性地,仍然以上述图7中的视频帧包括3个图像块为例,假设图像块2在第二周期内的结束时刻未完全发送至解码器,则编码器向解码器发送同步帧,通知解码器在当前时刻已发送的图像块2的数据,便于解码器同步与编码器的状态。在解码器接收到编码器发送的同步帧,可向编码器发送确认接收到该同步帧,如解码器在当前时刻接收到了编码器发送的同步帧,则可以向编码器发送已接收到同步帧的反馈信息,该反馈信息用于表征解码器接收的视频帧的同步状态。
本申请提供的方案,编码器向解码器发送的目标帧所指示的内容不同,解码器向编码器发送的反馈信息所表征的内容不同。具体地,若目标帧为用于指示所述解码器反馈所述解码器接收所述视频帧的图像块的状态的探测帧,反馈信息用于表征所述解码器接收的所述视频帧的状态;若目标帧为用于指示所述解码器对接收的视频帧进行同步处理的同步帧,反馈信息用于表征所述解码器对接收的视频帧的同步状态。不管目标帧所指示的和反馈信息所表征的内容是什么,其目的均是为了同步编码器和解码器两侧的视频帧的状态,区别在于同步两侧的视频帧的状态这一动作的执行主体的不同(即若目标帧为探测帧,编码器同步两侧的视频帧的状态;若目标帧为同步帧,解码器同步两侧的视频帧的状态),从而可以提升同步编码器和解码器两侧的视频帧的状态的灵活性。
在一些可能的实现方式中,目标帧也可以包括探测帧和同步帧,编码器既可以请求解码器接收的所述视频帧的状态,同时也向解码器传递了已发送的视频帧的状态信息。在这种情况下,编码器可以基于当前信道状态预估发送这两种帧的总时长,避免占用下一个周期的时长。
可选地,在一些实施例中,所述目标帧为MAC帧。
本申请实施例中的目标帧可以为MAC帧,如图8所示,为本申请实施例提供的一种MAC帧格式的示意图。参考图8,该MAC帧包括三部分:帧头、数据部分、帧尾。其中,帧头和帧尾可以包括地址信息等;数据部分可以包括待传输的数据。
具体地,MAC帧的帧头可以包括三个字段,其中,前两个字段分别为6字节长的目的地址字段和源地址字段,目的地址字段包含目的MAC地址信息,源地址字段包含源MAC地址信息。第三个字段为2字节的类型字段,包含的信息可以用来标志上一层使用的是什么协议,以便接收端将接收到的MAC帧的数据部分上交给上一层的这个协议。
MAC帧的数据部分包括一个字段,其长度在46到1500字节之间,包含网络层传下来的数据。
MAC帧的帧尾也包括一个字段,为4字节长,包含帧校验序列(frame check sequence,FCS)。
需要说明的是,本申请实施例中的目标帧除了可以为上述示例出的MAC帧之外,还 可以为其它帧,只要可以用于指示解码器反馈解码器接收视频帧的图像块的状态或用于指示解码器对接收的视频帧进行同步处理的帧均可应用本申请。
可选地,在一些实施例中,所述目标帧为所述探测帧时,所述目标帧为控制帧、数据帧或管理帧中的任一种;
所述目标帧为所述同步帧时,所述目标帧为数据帧或管理帧中的一种。
本申请实施例中,目标帧可以包括控制帧、数据帧或管理帧。其中,控制帧可以协助信息的传递,不能携带业务数据;数据帧和管理帧可以携带业务数据。
本申请实施例中,对于指示不同内容的目标帧,其包括的格式不同。
示例性地,若目标帧为探测帧,由于该探测帧用于指示解码器反馈解码器接收视频帧的图像块的状态,即,该探测帧可以不用携带当前已发送的视频帧的图像块的状态,因此,目标帧可以为控制帧、数据帧或管理帧中的任一种,即,目标帧可以为控制帧,也可以为数据帧,还可以为管理帧。
若目标帧为同步帧,由于同步帧用于指示解码器对接收的视频帧进行同步处理,即,该同步帧中需要携带当前已发送的视频帧的图像块的状态,因此,目标帧可以为数据帧或管理帧中的一种,即,目标帧可以为数据帧,也可以为管理帧。
本申请提供的方案,目标帧的具体格式与目标帧所指示的内容有关,若目标帧为用于指示所述解码器反馈所述解码器接收所述视频帧的图像块的状态的探测帧,该目标帧可以为控制帧、数据帧或管理帧中的任一种;若目标帧为用于指示所述解码器对接收的视频帧进行同步处理的同步帧,该目标帧可以为数据帧或管理帧中的一种;有利于同步编码器和解码器两侧的视频帧的状态。
可选地,在一些实施例中,所述控制帧包括短探测帧;
所述数据帧包括服务质量(quality of service,QoS)空指帧或QoS数据帧;
所述管理帧包括无请求确认帧或确认帧。
本申请实施例中,控制帧可以包括短探测帧,数据帧可以包括QoS空指帧或QoS数据帧,管理帧可以包括无请求确认帧或确认帧。对于这多种格式包括的不同类型的帧,当目标帧所指示的内容不同时,其包括的帧的类型的不同。
示例性地,若目标帧为用于指示所述解码器反馈所述解码器接收所述视频帧的图像块的状态的探测帧,则目标帧可以为控制帧、数据帧或管理帧中的任一种;具体地,若目标帧为控制帧,则目标帧可以为短探测帧,若目标帧为数据帧,则目标帧可以为QoS空指帧,若目标帧为管理帧,则目标帧可以为无请求确认帧。
若目标帧为用于指示所述解码器对接收的视频帧进行同步处理的同步帧,则目标帧可以为数据帧或管理帧的一种;具体地,若目标帧为数据帧,则目标帧可以为QoS数据帧,若目标帧为管理帧,则目标帧可以为确认帧。
上文说明了目标帧指示不同内容时的帧类型,对于指示不同内容的目标帧,当目标帧为MAC帧时,通过MAC帧的帧头或数据部分进行相应指示,具体请参考下文。
情况一:目标帧为探测帧
可选地,在一些实施例中,所述目标帧为所述探测帧时,所述MAC帧的帧头携带私有值,所述私有值用于指示所述编码器向所述解码器请求反馈信息。
可选地，在一些实施例中，所述目标帧为所述控制帧，所述控制帧的帧头的控制域设置为所述私有值；或，
所述目标帧为所述数据帧,所述数据帧的帧头的高吞吐量控制(high-throughput control,HTC)域设置为所述私有值;或,
所述目标帧为所述管理帧,所述管理帧的帧头的HTC域设置为所述私有值,或,
所述管理帧的负载中首个字段设置为所述私有值。
本申请实施例中,若目标帧为用于指示所述解码器反馈所述解码器接收所述视频帧的图像块的状态的探测帧,则目标帧为MAC帧时,该MAC帧的帧头可以携带私有值,且该私有值用于指示编码器向解码器请求反馈信息。
下文分别介绍在目标帧为不同帧格式时,其对应的帧头的域设置为私有值。
(1)、目标帧为控制帧
本申请实施例中,目标帧为控制帧时,该控制帧的帧头的控制域设置为私有值。如图9所示,为本申请实施例提供的另一种编码器和解码器的帧交互的示意图。
参考图9,可以看出,在第二周期内的结束时刻图像块2未完全发送至解码器,则编码器向解码器发送短探测帧,解码器在接收到短探测帧后,响应于接收该短探测帧,向编码器发送反馈信息,该反馈信息用于表征解码器所接收的视频帧的状态。
在上述图9的示例中,编码器接收到解码器发送的反馈信息的延时时长包括:2个短帧间隙(short interframe space,SIFS)+发送短探测帧的时长
若编码器向解码器发送的视频帧采用的是正交频分复用技术(orthogonal frequency division multiplexing,OFDM)帧,则该SIFS=16us(该16us包括物理层收发延时15.6us加上调度延时0.4us);若MAC帧的帧长固定24字节,在速率为11a/g 24m时,发送短探测帧的时长为32us。因此,编码器接收到解码器发送的反馈信息的延时时长=2*16+32=64us。
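上述延时的计算方式（2个SIFS加上发送目标帧的时长）可以用如下示意性代码表示，其中的数值取自上文的例子：

```python
def feedback_delay_us(sifs_us: float, frame_tx_us: float) -> float:
    """编码器收到解码器反馈信息的延时 = 2个短帧间隙SIFS + 发送目标帧的时长。"""
    return 2 * sifs_us + frame_tx_us

print(feedback_delay_us(16, 32))  # 短探测帧的例子: 2*16+32 = 64 (us)
print(feedback_delay_us(16, 36))  # QoS空指帧的例子: 2*16+36 = 68 (us)
```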
本申请实施例中，编码器可以利用控制帧作为探测帧用于请求反馈信息。具体地，可以将控制帧的帧头的控制域设置为私有值，示例性地，如将多业务标识(multi-traffic identifier,Multi-TID)、压缩位图(compressed bitmap)、重试组播(groupcast with retries,GCR)这三个位的组合设为预留值(reserved)，如表1所示。当解码器解析到控制帧的帧头的控制域设置为预留值（即本申请中的私有值）时，可以直接响应反馈信息。
表1
（表1的内容以图片形式给出，文本中未能还原。）
参考上述表1,可以看出,当Multi-TID、压缩位图、GCR这三个位的子字段值分别为001或101或111时,可以将块确认请求帧变体设置为预留值,即,将控制帧的帧头的控制域设置为预留值。
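按照上文对表1的说明，解码器判断控制帧帧头的控制域是否为该私有值（预留值）的逻辑可以用如下示意性代码表示（三个位的排列顺序为一种假设，仅用于说明组合001、101、111被视为预留值）：

```python
def is_private_probe(multi_tid: int, compressed_bitmap: int, gcr: int) -> bool:
    """判断Multi-TID、压缩位图、GCR三个位的组合是否为预留值001/101/111，
    即本申请中用于指示探测帧的私有值；位顺序为假设。"""
    return (multi_tid, compressed_bitmap, gcr) in {(0, 0, 1), (1, 0, 1), (1, 1, 1)}

print(is_private_probe(0, 0, 1))  # True：组合001为预留值，解码器可直接响应反馈信息
print(is_private_probe(0, 1, 0))  # False：组合010对应已定义的块确认请求帧变体
```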
(2)、目标帧为数据帧
本申请实施例中,目标帧为数据帧时,该数据帧的帧头的HTC域设置为私有值。如图10所示,为本申请实施例提供的再一种编码器和解码器的帧交互的示意图。
参考图10,可以看出,在第二周期内的结束时刻图像块2未完全发送至解码器,则编码器向解码器发送QoS空指帧,解码器在接收到QoS空指帧后,响应于接收该QoS空指帧,向编码器发送反馈信息,该反馈信息用于表征解码器所接收的视频帧的状态。
在上述图10的示例中,编码器接收到解码器发送的反馈信息的延时时长包括:2个短帧间隙(short interframe space,SIFS)+发送QoS空指帧的时长
若编码器向解码器发送的视频帧采用的是OFDM帧,则该SIFS=16us;若MAC帧的帧长固定34字节,在速率为11a/g 24m时,发送QoS空指帧的时长为36us。因此,编码器接收到解码器发送的反馈信息的延时时长=2*16+36=68us。
本申请实施例中,编码器可以利用QoS空指帧作为探测帧用于请求反馈信息。具体地,可以将QoS空指帧的帧头的HTC域设置为私有值,示例性地,将高效吞吐(high efficiency,HE)格式的A-控制域中的控制标识(identity,ID)7-14设为预留值,如表2和表3所示。解码器收到指定HTC域的Qos空指帧后,可以直接响应反馈信息。
表2
（表2的内容以图片形式给出，文本中未能还原。）
表3
（表3的内容以图片形式给出，文本中未能还原。）
参考上述表2和表3,可以看出,当变体为HE时,B2-B31为A-控制域;并且可以将A-控制域的控制ID值7-14设置为预留值。
(3)目标帧为管理帧
本申请实施例中,目标帧为管理帧时,该管理帧的帧头的HTC域设置为私有值或该管理帧的负载中首个字段设置为私有值。如图11所示,为本申请实施例提供的又一种编码器和解码器的帧交互的示意图。
参考图11,可以看出,在第二周期内的结束时刻图像块2未完全发送至解码器,则编码器向解码器发送无请求确认(action no ack)帧,解码器在接收到无请求确认后,响应于接收该无请求确认帧,向编码器发送反馈信息,该反馈信息用于表征解码器所接收的视频帧的状态。
在上述图11的示例中,编码器接收到解码器发送的反馈信息的延时时长包括:2个短帧间隙(short interframe space,SIFS)+发送无请求确认帧的时长
若编码器向解码器发送的视频帧采用的是OFDM帧,则该SIFS=16us;若MAC帧的帧长固定29字节,在速率为11a/g 24m时,发送无请求确认帧的时长为32us。因此,编码器接收到解码器发送的反馈信息的延时时长=2*16+32=64us。
本申请实施例中,编码器可以利用无请求确认帧作为探测帧用于请求反馈信息。具体地,可以将无请求确认帧的帧头的HTC域设置为私有值,或者,将无请求确认帧的负载中首个字段设置为私有值。
若将无请求确认帧的帧头的HTC域设置为私有值,可以将HE格式的A-控制(A-control)域中控制ID值7-14设为预留值,具体请参见上述表2和表3所示的内容。
若将无请求确认帧的负载中首个字段设置为私有值,可以将无请求确认帧的负载中的字段(或者称为代码)21-125设置为私有值,如表4所示。
表4
（表4的内容以图片形式给出，文本中未能还原。）
本申请提供的方案,当目标帧为探测帧时,MAC帧的帧头携带用于指示所述编码器向所述解码器请求反馈信息的私有值,解码器接收到该探测帧后,由于该探测帧中携带用于指示所述解码器反馈信息的私有值,可以立即向编码器发送用于表征解码器接收的视频帧的状态的反馈信息,编码器可以根据该反馈信息同步与解码器的状态,从而可以保证图像传输的同步效果。
情况二:目标帧为同步帧
可选地,在一些实施例中,所述目标帧为所述同步帧时,所述MAC帧的数据部分携带指示信息,所述指示信息用于指示所述解码器对接收的视频帧进行同步处理。
上文指出,MAC帧包括三部分:帧头、数据部分、帧尾。当目标帧为同步帧时,MAC帧的帧头无需携带私有值,该MAC帧的数据部分携带指示信息即可,该指示信息用于指示解码器对接收的视频帧进行同步处理。
(1)目标帧为数据帧
本申请实施例中,编码器可以利用QoS数据帧作为同步帧用于传输编码器发送的视频帧的状态。如图12所示,为本申请实施例提供的又一种编码器和解码器的帧交互的示意图。
参考图12,可以看出,在第二周期内的结束时刻图像块2未完全发送至解码器,则编码器向解码器发送QoS数据帧,该QoS数据帧的数据部分携带指示信息(该指示信息中可以包括编码器已发送的图像块2的码流编号),解码器在接收到QoS数据帧后,可以根据接收的QoS数据帧同步与编码器的状态。同时,响应于接收该QoS数据帧,解码器可以向编码器发送反馈信息,该反馈信息用于表征解码器对接收的视频帧的同步状态。
(2)目标帧为管理帧
本申请实施例中,编码器可以利用确认帧作为同步帧用于传输编码器发送的视频帧的状态。如图13所示,为本申请实施例提供的又一种编码器和解码器的帧交互的示意图。
参考图13,可以看出,在第二周期内的结束时刻图像块2未完全发送至解码器,则编码器向解码器发送确认帧,该确认帧的数据部分携带指示信息(该指示信息中可以包括编码器已发送的图像块2的码流编号),解码器在接收到确认帧后,可以根据接收的确认帧同步与编码器的状态。同时,响应于接收该确认帧,解码器可以向编码器发送反馈信息,该反馈信息用于表征解码器对接收的视频帧的同步状态。
本申请提供的方案,当目标帧为同步帧时,MAC帧的数据部分携带用于指示所述解码器对接收的视频帧进行同步处理的指示信息,解码器接收到该同步帧后,由于该同步帧中携带用于指示所述解码器对接收的视频帧进行同步处理的指示信息,可以根据该同步帧同步与编码器的状态,从而可以保证图像传输的同步效果。
上文介绍了目标帧分别为探测帧和同步帧时的相关内容,在一些实施例中,编码器可以重新向解码器发送目标帧,详见下文。
可选地,在一些实施例中,若所述编码器未接收到所述反馈信息,所述方法还包括:
所述编码器重新向所述解码器发送所述目标帧。
本申请实施例中,若第一图像块在对应指定周期的结束时刻未完全发送时,编码器可以向解码器发送用于请求反馈信息的目标帧,解码器在接收到该目标帧后,响应于接收到的目标帧,便向编码器发送反馈信息。在一些可能的实施例中,编码器并未接收到解码器发送的反馈信息,如由于信道堵塞等原因导致编码器并未接收到反馈信息,此时编码器可以重新向解码器发送用于请求反馈信息的目标帧,解码器再次接收到目标帧后,可以再次向编码器发送反馈信息。
可以理解的是，解码器再次发送的反馈信息所表征的内容与解码器初次发送的反馈信息（可以理解为解码器响应于接收的目标帧（该目标帧为第一图像块在对应指定周期的结束时刻未完全发送时，编码器向解码器发送的目标帧），向编码器发送的反馈信息）所表征的内容是一致的。由于第一图像块在对应指定周期的结束时刻未完全发送时，对于该第一图像块中未发送的数据，编码器不再发送，不管目标帧是探测帧还是同步帧，反馈信息表征的是解码器已接收的视频帧的状态或同步状态，因此，解码器再次发送的反馈信息所表征的内容与解码器初次发送的反馈信息所表征的内容是一致的。
本申请提供的方案,在编码器未接收到解码器发送的反馈信息时,可以重新向解码器发送用于请求反馈信息的目标帧,以期成功接收解码器发送的反馈信息,便于编码器根据反馈信息同步编码器和解码器的状态,或,了解解码器对接收的视频帧的同步状态。
可选地,在一些实施例中,所述编码器重新向所述解码器发送所述目标帧的速率小于第一速率,所述第一速率为所述第一图像块在对应指定周期的结束时刻未完全发送时,所述编码器向所述解码器发送所述目标帧的速率。
本申请实施例中,编码器重新向解码器发送目标帧的速率小于第一速率,该第一速率即为上述第一图像块在对应指定周期的结束时刻未完全发送时,编码器向解码器发送目标帧的速率。示例性地,若第一图像块在对应指定周期的结束时刻未完全发送时,编码器向解码器发送目标帧的速率为12Mbps,则编码器重新向解码器发送目标帧的速率可以为6Mbps。
需要说明的是,编码器向解码器发送目标帧的速率越小,则解码器成功接收该目标帧的概率越大,因此,本申请中编码器重新向解码器发送目标帧的速率降低。
应理解,上述数值仅为举例说明,编码器向解码器发送目标帧的速率还可以为其它数值,不应对本申请造成特别限定。
本申请提供的方案,编码器重新向解码器发送目标帧的速率小于第一速率,可以提升编码器接收成功反馈信息的概率,进一步地,编码器可以根据实时接收的反馈信息同步编码器和解码器的状态,或,了解解码器对接收的视频帧的同步状态。
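上述"重发目标帧时降低速率"的策略可以用如下示意性代码概括（减半策略和6Mbps下限均为假设的参数取值，并非本申请限定的实现；数值对应上文12Mbps降为6Mbps的例子）：

```python
def next_tx_rate(current_rate_mbps: float, got_feedback: bool,
                 min_rate_mbps: float = 6.0) -> float:
    """未收到反馈信息时降低重发目标帧的速率（此处简单减半，下限为假设的基础速率），
    以提高解码器成功接收目标帧的概率；收到反馈则保持当前速率。"""
    if got_feedback:
        return current_rate_mbps
    return max(min_rate_mbps, current_rate_mbps / 2)

print(next_tx_rate(12.0, got_feedback=False))  # 6.0，对应文中 12Mbps -> 6Mbps 的例子
```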
可选地,在一些实施例中,所述编码器向所述解码器发送所述目标帧的速率设置为以下至少一种:
系统统计的期望速率、最近成功发送所述目标帧的速率或最近成功发送所述目标帧的速率映射值。
本申请实施例中,编码器向解码器发送目标帧的速率可以基于以下设置:
(1)系统统计的期望速率
若系统根据多次发送目标帧的速率统计出的期望速率为v1,则可以将v1设置为编码器向解码器发送目标帧的速率。示例性地,若系统根据多次发送目标帧的速率统计出的期望速率为12Mbps,则可以设置编码器向解码器发送目标帧的速率为12Mbps。
(2)最近成功发送所述目标帧的速率
若编码器最近成功发送目标帧的速率为v2，则可以将v2设置为编码器发送目标帧的速率。示例性地，若编码器最近成功发送目标帧的速率为9Mbps，则可以设置编码器发送目标帧的速率为9Mbps。
(3)最近成功发送目标帧的速率映射值
若编码器最近成功发送的目标帧的速率为v2，该速率v2在其它协议格式下所对应的速率映射值为v3，则可以将v3设置为编码器发送目标帧的速率。若编码器在某一协议格式（如he格式）最近成功发送的目标帧的速率为9Mbps，该速率9Mbps在另一协议格式下（如非高吞吐量(non-high throughput,non-ht)格式）所对应的速率映射值可以为6Mbps，则可以设置编码器发送目标帧的速率为6Mbps。
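上述三种速率设置方式可以用如下示意性代码表示（rate_map为假设的协议格式间速率映射表，数值取自上文示例，并非本申请限定的取值）：

```python
def pick_target_frame_rate(expected_rate=None, last_success_rate=None, rate_map=None):
    """按文中三种方式之一选择发送目标帧的速率（Mbps）：
    1) 系统统计的期望速率 expected_rate；
    2) 最近成功发送目标帧的速率 last_success_rate；
    3) 最近成功速率在另一协议格式下的映射值（rate_map为假设的映射表）。"""
    if rate_map is not None and last_success_rate is not None:
        return rate_map.get(last_success_rate, last_success_rate)
    if last_success_rate is not None:
        return last_success_rate
    return expected_rate

he_to_nonht = {9: 6}   # 假设的映射：he格式下9Mbps -> non-ht格式下6Mbps
print(pick_target_frame_rate(last_success_rate=9, rate_map=he_to_nonht))  # 6
print(pick_target_frame_rate(expected_rate=12))                           # 12
```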
上文步骤S640中指出,响应于解码器接收的目标帧,解码器向编码器发送反馈信息,其中,该反馈信息的生成过程可以在解码器中的某一层实现,具体请参见下文。
可选地,在一些实施例中,所述解码器向所述编码器发送所述反馈信息,包括:
所述解码器中的数据链路层根据所述目标帧生成所述反馈信息;
所述解码器向所述编码器发送所述反馈信息。
本申请实施例中,在解码器接收到编码器发送的目标帧后,响应于该目标帧,解码器中的数据链路层根据该目标帧生成反馈信息并通过物理层将该反馈信息发送至编码器。
如图14所示,为本申请实施例提供的另一种基于TCP/IP通信协议的反馈方式的示意图。参考图14,编码器竞争到信道后向解码器发送目标帧,解码器中的数据链路层通过物理层接收到该目标帧后,根据该目标帧生成反馈信息,并通过物理层将该反馈信息发送至编码器。由于该反馈信息是在解码器中的数据链路层生成并通过物理层发送至编码器的,因此,可以节省双向网络层、双向运输层传输该反馈信息的延时。
应理解,虽然反馈信息是在解码器中的数据链路层中生成,但对于接收到的目标帧,解码器中的数据链路层仍然可以将该目标帧传输至网络层,并由网络层传输至运输层,便于解码器中的运输层做视频显示业务、声音信号传输等处理。
本申请提供的方案，解码器中的数据链路层根据接收的目标帧生成反馈信息并发送至编码器，由于该反馈信息是在解码器中的数据链路层生成并通过物理层发送至编码器的，因此，可以节省双向网络层、双向运输层传输该反馈信息的延时。
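解码器中数据链路层直接生成反馈信息、同时仍将目标帧上交上层的处理方式，可以用如下示意性代码概括（类名、字段名均为假设，仅用于说明该分层处理思路，并非真实协议栈实现）：

```python
class DecoderLinkLayer:
    """示意：数据链路层收到目标帧后直接生成反馈信息（经物理层发回编码器），
    同时仍将帧上交网络层，省去双向网络层/运输层往返的延时。"""
    def __init__(self):
        self.sent_feedback = []   # 已通过物理层发回编码器的反馈信息
        self.passed_up = []       # 仍上交网络层/运输层处理的帧

    def on_frame(self, frame: dict):
        if frame.get("is_target"):
            # 在数据链路层立即生成反馈信息，无需经过网络层、运输层
            self.sent_feedback.append({"ack_for": frame["seq"]})
        self.passed_up.append(frame)   # 目标帧仍上交上层做视频显示等处理

link = DecoderLinkLayer()
link.on_frame({"seq": 7, "is_target": True})
print(len(link.sent_feedback), len(link.passed_up))  # 1 1
```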
可选地,在一些实施例中,所述解码器向所述编码器发送所述反馈信息的速率设置为以下至少一种:
系统统计的期望速率、最近成功接收所述编码器发送的视频帧的速率或最近成功接收所述编码器发送的视频帧的速率映射值。
本申请实施例中,解码器向编码器发送反馈信息的速率可以基于以下设置:
(1)系统统计的期望速率
若系统根据多次发送反馈信息的速率统计出的期望速率为v1'，则可以将v1'设置为解码器向编码器发送反馈信息的速率。示例性地，若系统根据多次发送反馈信息的速率统计出的期望速率为12Mbps，则可以设置解码器向编码器发送反馈信息的速率为12Mbps。
(2)最近成功接收编码器发送的视频帧的速率
若解码器最近成功接收编码器发送的视频帧的速率为v2’,则可以将v2’设置为解码器向编码器发送反馈信息的速率。示例性地,若解码器最近成功接收编码器发送的视频帧的速率为9Mbps,则可以设置解码器向编码器发送反馈信息的速率为9Mbps。
(3)最近成功接收所述编码器发送的视频帧的速率映射值
若解码器最近成功接收编码器发送的视频帧的速率为v2’,该速率v2’在其它协议格式下所对应的速率映射值为v3’,则可以将v3’设置为解码器向编码器发送反馈信息的速率。若解码器在某一协议格式(如he格式)最近成功接收编码器发送的视频帧的速率为9Mbps,该速率9Mbps在另一协议格式下(如non-ht格式)所对应的速率映射值可以为6Mbps,则可以设置解码器向编码器发送反馈信息的速率为6Mbps。
可以理解的是,电子设备为了实现上述功能,其包含了执行各个功能相应的硬件和/或软件模块。结合本文中所公开的实施例描述的各示例的算法步骤,本申请能够以硬件或硬件和计算机软件的结合形式来实现。某个功能究竟以硬件还是计算机软件驱动硬件的方式来执行,取决于技术方案的特定应用和设计约束条件。本领域技术人员可以结合实施例对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
本实施例可以根据上述方法示例对电子设备进行功能模块的划分,例如,可以对应各个功能划分各个功能模块,也可以将两个或两个以上的功能集成在一个处理模块中。上述集成的模块可以采用硬件的形式实现。需要说明的是,本实施例中对模块的划分是示意性的,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式。
在采用对应各个功能划分各个功能模块的情况下,图15示出了上述实施例中涉及的电子设备1500的一种可能的组成示意图,如图15所示,该电子设备1500可以包括:生成模块1510和通信模块1520。
其中,生成模块1510可以用于支持电子设备1500执行上述步骤S610等,和/或用于本文所描述的技术的其他过程。
通信模块1520可以用于支持电子设备1500执行上述步骤S620等,和/或用于本文所描述的技术的其他过程。
需要说明的是,上述方法实施例涉及的各步骤的所有相关内容均可以援引到对应功能模块的功能描述,在此不再赘述。
本实施例提供的电子设备,用于执行上述本申请的方法,因此可以达到与上述实现方法相同的效果。
图16示出了上述实施例中涉及的电子设备1600的一种可能的组成示意图,如图16所示,该电子设备1600可以包括:通信模块1610。
其中,通信模块1610可以用于支持电子设备1600执行上述步骤S630或S640等,和/或用于本文所描述的技术的其他过程。
需要说明的是,上述方法实施例涉及的各步骤的所有相关内容均可以援引到对应功能模块的功能描述,在此不再赘述。
在采用集成的单元的情况下，电子设备可以包括处理模块、存储模块和通信模块。其中，处理模块可以用于对电子设备的动作进行控制管理，例如，可以用于支持电子设备执行上述各个单元执行的步骤。存储模块可以用于支持电子设备执行存储程序代码和数据等。通信模块，可以用于支持电子设备与其他设备的通信。
其中,处理模块可以是处理器或控制器。其可以实现或执行结合本申请公开内容所描述的各种示例性的逻辑方框,模块和电路。处理器也可以是实现计算功能的组合,例如包含一个或多个微处理器组合,数字信号处理(digital signal processing,DSP)和微处理器的组合等等。存储模块可以是存储器。通信模块具体可以为射频电路、蓝牙芯片、Wi-Fi芯片等与其他电子设备交互的设备。
在一个实施例中,当处理模块为处理器,存储模块为存储器时,本实施例所涉及的电子设备可以为具有图1所示结构的设备。
图17示出了上述实施例涉及的电子设备800的另一种可能的组成示意图,如图17所示,该电子设备800可以包括通信单元810、输入单元820、处理单元830、输出单元(或也可以称为显示单元)840、外设接口850、存储单元860、电源870、视频编/解码器880以及音频编/解码器890。
通信单元810用于建立通信信道，使电子设备800通过所述通信信道以连接至远程服务器，并从所述远程服务器下载媒体数据。所述通信单元810可以包括WLAN模块、蓝牙模块、NFC模块、基带模块等通信模块，以及所述通信模块对应的射频(Radio Frequency，简称RF)电路，用于进行无线局域网络通信、蓝牙通信、NFC通信、红外线通信及/或蜂窝式通信系统通信，例如宽带码分多重接入(wideband code division multiple access,W-CDMA)及/或高速下行封包存取(high speed downlink packet access,HSDPA)。所述通信单元810用于控制电子设备中的各组件的通信，并且可以支持直接内存存取。
输入单元820可以用于实现用户与电子设备的交互和/或信息输入到电子设备中。在本发明具体实施方式中,输入单元可以是触控面板,也可以是其他人机交互界面,例如实体输入键、麦克风等,还可是其他外部信息撷取装置,例如摄像头等。
处理单元830为电子设备的控制中心,可以利用各种接口和线路连接整个电子设备的各个部分,通过运行或执行存储在存储单元内的软件程序和/或模块,以及调用存储在存储单元内的数据,以执行电子设备的各种功能和/或处理数据。
输出单元840包括但不限于影像输出单元和声音输出单元。影像输出单元用于输出文字、图片和/或视频。在本发明的具体实施方式中,上述输入单元820所采用的触控面板亦可同时作为输出单元840的显示面板。例如,当触控面板检测到在其上的触摸或接近的手势操作后,传送给处理单元以确定触摸事件的类型,随后处理单元根据触摸事件的类型在显示面板上提供相应的视觉输出。虽然在图17中,输入单元820与输出单元840是作为两个独立的部件来实现电子设备的输入和输出功能,但是在某些实施例中,可以将触控面板与显示面板集成一体而实现电子设备的输入和输出功能。例如,所述影像输出单元可以显示各种图形化用户接口以作为虚拟控制组件,包括但不限于窗口、卷动轴、图标及剪贴簿,以供用户通过触控方式进行操作。
存储单元860可用于存储软件程序以及模块,处理单元通过运行存储在存储单元的软件程序以及模块,从而执行电子设备的各种功能应用以及实现数据处理。
视频编/解码器880和音频编/解码器890可以对文件进行编码或解码，以便于实现上述实施例中的方法。
本实施例还提供一种计算机存储介质,该计算机存储介质中存储有计算机指令,当该计算机指令在电子设备或处理器上运行时,使得电子设备或处理器执行上述相关方法步骤实现上述实施例中的方法。
本实施例还提供了一种计算机程序产品,当该计算机程序产品在计算机或处理器上运行时,使得计算机或处理器执行上述相关步骤,以实现上述实施例中的方法。
另外,本申请的实施例还提供一种装置,这个装置具体可以是芯片,组件或模块,该装置可包括相连的处理器和存储器;其中,存储器用于存储计算机执行指令,当装置运行时,处理器可执行存储器存储的计算机执行指令,以使芯片执行上述各方法实施例中的方法。
其中,本实施例提供的电子设备、计算机存储介质、计算机程序产品或芯片均用于执行上文所提供的对应的方法,因此,其所能达到的有益效果可参考上文所提供的对应的方法中的有益效果,此处不再赘述。
通过以上实施方式的描述,所属领域的技术人员可以了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。
在本申请所提供的几个实施例中,应该理解到,所揭露的装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,模块或单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个装置,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是一个物理单元或多个物理单元,即可以位于一个地方,或者也可以分布到多个不同地方。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个可读取存储介质中。基于这样的理解,本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该软件产品存储在一个存储介质中,包括若干指令用以使得一个设备(可以是单片机,芯片等)或处理器(processor)执行本申请各个实施例方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(read-only memory,ROM)、随机存储器(random access memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
以上内容,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以权利要求的保护范围为准。

Claims (26)

  1. 一种视频传输的方法,其特征在于,所述方法应用于编码器,所述方法包括:
    所述编码器生成视频帧,所述视频帧包括多个图像块,所述多个图像块中的每一个图像块在各自指定周期内发送;
    若所述多个图像块中的第一图像块在对应指定周期的结束时刻未完全发送，所述编码器向解码器发送目标帧，所述目标帧用于向所述解码器请求反馈信息。
  2. 如权利要求1所述的方法,其特征在于,所述目标帧为探测帧,所述探测帧用于指示所述解码器反馈所述解码器接收所述视频帧的图像块的状态,所述反馈信息用于表征所述解码器接收的所述视频帧的状态;或者,
    所述目标帧为同步帧,所述同步帧用于指示所述解码器对接收的视频帧进行同步处理,所述反馈信息用于表征所述解码器对接收的视频帧的同步状态。
  3. 如权利要求1或2所述的方法,其特征在于,所述目标帧为媒体接入控制MAC帧。
  4. 如权利要求1至3中任一项所述的方法,其特征在于,所述目标帧为所述探测帧时,所述目标帧为控制帧、数据帧或管理帧中的任一种;
    所述目标帧为所述同步帧时,所述目标帧为数据帧或管理帧中的一种。
  5. 如权利要求4所述的方法,其特征在于,所述控制帧包括短探测帧;
    所述数据帧包括服务质量QoS空指帧或QoS数据帧;
    所述管理帧包括无请求确认帧或确认帧。
  6. 如权利要求3至5中任一项所述的方法,其特征在于,所述目标帧为所述探测帧时,所述MAC帧的帧头携带私有值,所述私有值用于指示所述编码器向所述解码器请求反馈信息。
  7. 如权利要求6所述的方法,其特征在于,所述目标帧为所述控制帧,所述控制帧的帧头的控制域设置为所述私有值;或,
    所述目标帧为所述数据帧,所述数据帧的帧头的高吞吐量控制HTC域设置为所述私有值;或,
    所述目标帧为所述管理帧,所述管理帧的帧头的HTC域设置为所述私有值,或,
    所述管理帧的负载中首个字段设置为所述私有值。
  8. 如权利要求3至5中任一项所述的方法,其特征在于,所述目标帧为所述同步帧时,所述MAC帧的数据部分携带指示信息,所述指示信息用于指示所述解码器对接收的视频帧进行同步处理。
  9. 如权利要求1至8中任一项所述的方法,其特征在于,若所述编码器未接收到所述反馈信息,所述方法还包括:
    所述编码器重新向所述解码器发送所述目标帧。
  10. 如权利要求9所述的方法,其特征在于,所述编码器重新向所述解码器发送所述目标帧的速率小于第一速率,所述第一速率为所述第一图像块在对应指定周期的结束时刻未完全发送时,所述编码器向所述解码器发送所述目标帧的速率。
  11. 如权利要求1至10中任一项所述的方法,其特征在于,所述编码器向所述解码器发送所述目标帧的速率设置为以下至少一种:
    系统统计的期望速率、最近成功发送所述目标帧的速率或最近成功发送所述目标帧的速率映射值。
  12. 如权利要求1至11中任一项所述的方法,其特征在于,所述图像块包括瓦片tile或条带片段slice。
  13. 一种视频传输的方法,其特征在于,所述方法应用于解码器,所述方法包括:
    所述解码器接收目标帧，所述目标帧用于请求反馈信息，所述目标帧是编码器在多个图像块中的第一图像块在对应指定周期的结束时刻未完全发送时发送的；
    响应于所述目标帧,所述解码器向所述编码器发送所述反馈信息。
  14. 如权利要求13所述的方法,其特征在于,所述目标帧为探测帧,所述探测帧用于指示所述解码器反馈所述解码器接收视频帧的图像块的状态,所述反馈信息用于表征所述解码器接收的所述视频帧的状态;或,
    所述目标帧为同步帧,所述同步帧用于指示所述解码器对接收的视频帧进行同步处理,所述反馈信息用于表征所述解码器对接收的视频帧的同步状态。
  15. 如权利要求13或14所述的方法,其特征在于,所述目标帧为媒体接入控制MAC帧。
  16. 如权利要求15所述的方法,其特征在于,所述目标帧为所述探测帧时,所述目标帧为控制帧、数据帧或管理帧中的任一种;
    所述目标帧为所述同步帧时,所述目标帧为数据帧或管理帧中的一种。
  17. 如权利要求16所述的方法,其特征在于,所述控制帧包括短探测帧;
    所述数据帧为服务质量QoS空指帧或QoS数据帧;
    所述管理帧为无请求确认帧或确认帧。
  18. 如权利要求15至17中任一项所述的方法,其特征在于,所述目标帧为所述探测帧时,所述MAC帧的帧头携带私有值,所述私有值用于指示所述编码器向所述解码器请求反馈信息。
  19. 如权利要求18所述的方法,其特征在于,所述目标帧为所述控制帧,所述控制帧的帧头的控制域设置为所述私有值;或,
    所述目标帧为所述数据帧,所述数据帧的帧头的高吞吐量控制HTC域设置为所述私有值;或,
    所述目标帧为所述管理帧,所述管理帧的帧头的HTC域设置为所述私有值,或,
    所述管理帧的负载中首个字段设置为所述私有值。
  20. 如权利要求15至17中任一项所述的方法,其特征在于,所述目标帧为所述同步帧时,所述MAC帧的数据部分携带指示信息,所述指示信息用于指示所述解码器对接收的视频帧进行同步处理。
  21. 如权利要求13至20中任一项所述的方法,其特征在于,所述解码器向所述编码器发送所述反馈信息,包括:
    所述解码器中的数据链路层根据所述目标帧生成所述反馈信息;
    所述解码器向所述编码器发送所述反馈信息。
  22. 如权利要求13至21中任一项所述的方法,其特征在于,所述解码器向所述编码器发送所述反馈信息的速率设置为以下至少一种:
    系统统计的期望速率、最近成功接收所述编码器发送的视频帧的速率或最近成功接收所述编码器发送的视频帧的速率映射值。
  23. 一种电子设备,其特征在于,包括:
    一个或多个处理器;
    一个或多个存储器;
    所述一个或多个存储器存储有一个或多个计算机程序,所述一个或多个计算机程序包括指令,当所述指令被所述一个或多个处理器执行时,使得所述电子设备执行如权利要求1至12或13至22中任一项所述的方法。
  24. 一种视频传输的装置,其特征在于,所述装置包括至少一个处理器,当程序指令在所述至少一个处理器中执行时,使得如权利要求1至12或13至22中任一所述的方法得以实现。
  25. 一种计算机存储介质,其特征在于,包括计算机指令,当所述计算机指令在电子设备或处理器上运行时,使得所述电子设备或所述处理器执行如权利要求1至12或13至22中任一项所述的方法。
  26. 一种计算机程序产品,其特征在于,当所述计算机程序产品在计算机或处理器上运行时,使得所述计算机或所述处理器执行如权利要求1至12或13至22中任一项所述的方法。
PCT/CN2021/119379 2021-09-18 2021-09-18 视频传输的方法和电子设备 WO2023039890A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/119379 WO2023039890A1 (zh) 2021-09-18 2021-09-18 视频传输的方法和电子设备


Publications (1)

Publication Number Publication Date
WO2023039890A1 (zh) 2023-03-23

Family

ID=85602355


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101073218A (zh) * 2004-10-05 2007-11-14 高通股份有限公司 增强的块确认
US20090232044A1 (en) * 2008-03-14 2009-09-17 Samsung Electronics Co. Ltd. Apparatus and method for retransmitting of data in a wireless communication system using relay
CN112136285A (zh) * 2019-04-18 2020-12-25 北京小米移动软件有限公司 一种数据传输方法、装置及存储介质



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21957167

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021957167

Country of ref document: EP

Effective date: 20240312