CN113747237A - Data processing method and device, electronic equipment and storage medium


Info

Publication number
CN113747237A
Authority
CN
China
Prior art keywords
data frame
duration
data
time length
frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111016496.8A
Other languages
Chinese (zh)
Other versions
CN113747237B (en)
Inventor
邓得敏 (Deng Demin)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sangfor Technologies Co Ltd
Original Assignee
Sangfor Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sangfor Technologies Co Ltd filed Critical Sangfor Technologies Co Ltd
Priority to CN202111016496.8A priority Critical patent/CN113747237B/en
Publication of CN113747237A publication Critical patent/CN113747237A/en
Application granted granted Critical
Publication of CN113747237B publication Critical patent/CN113747237B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/44008 Processing of video elementary streams involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/439 Processing of audio elementary streams
    • H04N21/4394 Processing of audio elementary streams involving operations for analysing the audio stream, e.g. detecting features or characteristics in audio streams

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Data Exchanges In Wide-Area Networks (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The application discloses a data processing method and apparatus, an electronic device, and a storage medium. The data processing method includes: determining a third duration based on a first duration corresponding to a received first data frame and a set second duration, where the first duration characterizes the reception time interval between the first data frame and a second data frame, and the second data frame is the data frame preceding the first data frame; and transmitting the first data frame to an application layer after the second data frame has been transmitted and the third duration has elapsed. With the scheme provided by the embodiments of the application, when data is transmitted over an unstable network and the application layer renders the data frames in the order in which they are received, a smooth audio and video playing effect can be obtained without additionally adjusting the rendering time of each data frame.

Description

Data processing method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of network technologies, and in particular, to a data processing method and apparatus, an electronic device, and a storage medium.
Background
With the development of network transmission technology, users' demand for audio and video data is growing day by day. After receiving data through the transport layer and the layers below, an electronic device passes the data to the application layer running on it, and the application layer provides common network application services. Because unstable networks such as weak networks exist, when the application layer renders audio or video in the order in which audio frames or image frames are received, the rendering effect is poor.
Disclosure of Invention
In view of this, embodiments of the present application provide a data processing method and apparatus, an electronic device, and a storage medium, so as to at least solve the problem of the poor rendering effect of audio or video in the related art.
The technical solutions of the embodiments of the present application are implemented as follows:
An embodiment of the present application provides a data processing method, which includes the following steps:
determining a third duration based on a first duration corresponding to a received first data frame and a set second duration, where the first duration characterizes the reception time interval between the first data frame and a second data frame, and the second data frame is the data frame preceding the first data frame; and
transmitting the first data frame to an application layer after the second data frame has been transmitted and the third duration has elapsed.
In the foregoing solution, determining the third duration based on the first duration corresponding to the received first data frame and the set second duration includes:
determining the third duration based on the second duration in the case that the first data frame is the initial data frame received in the current data transmission process; and
determining the third duration based on the set frame rate corresponding to the current data transmission process, the first duration corresponding to the first data frame, and the second duration in the case that the first data frame is not the initial data frame received in the current data transmission process.
In the foregoing solution, determining the third duration based on the set frame rate corresponding to the current data transmission process, the first duration corresponding to the first data frame, and the second duration includes:
determining the third duration based on the set frame rate and a first difference in the case that the first duration corresponding to the first data frame is greater than the second duration, where the first difference characterizes the difference between the first duration corresponding to the first data frame and the second duration; and
determining the third duration based on the set frame rate in the case that the first duration corresponding to the first data frame is less than or equal to the second duration.
In the foregoing solution, the method further includes:
updating the second duration according to the reception time interval between every two adjacent data frames among the received data frames.
In the foregoing solution, updating the second duration according to the reception time interval between every two adjacent data frames among the received data frames includes:
updating the second duration and resetting the lifetime of the second duration in the case that the first duration corresponding to at least one first data frame received within the current lifetime of the second duration is greater than or equal to the second duration; and/or
updating the second duration and resetting the lifetime of the second duration upon expiry of the current lifetime, in the case that the first duration corresponding to every first data frame received within the current lifetime of the second duration is less than the second duration.
In the foregoing solution, updating the second duration includes:
updating the second duration according to the largest first duration corresponding to the at least one first data frame received within the current lifetime.
In the foregoing solution, updating the second duration upon expiry of the current lifetime includes:
updating the second duration based on the average of the first durations corresponding to the first data frames received within the current lifetime.
In the above scheme, the method is applied to a Virtual Desktop Infrastructure (VDI) terminal.
An embodiment of the present application further provides a data processing apparatus, including:
a processing unit, configured to determine a third duration based on a first duration corresponding to a received first data frame and a set second duration, where the first duration characterizes the reception time interval between the first data frame and a second data frame, and the second data frame is the data frame preceding the first data frame; and
a transmitting unit, configured to transmit the first data frame to an application layer after the second data frame has been transmitted and the third duration has elapsed.
An embodiment of the present application further provides an electronic device, including a processor and a memory for storing a computer program capable of running on the processor,
where the processor is configured to perform the steps of the data processing method when running the computer program.
An embodiment of the present application further provides a storage medium on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the data processing method.
In the embodiments of the present application, a third duration is determined based on a first duration corresponding to a received first data frame and a set second duration, and the first data frame is transmitted to an application layer after the second data frame has been transmitted and the third duration has elapsed, where the first duration characterizes the reception time interval between the received first data frame and the previous data frame. In the embodiments of the present application, the moment at which each data frame is transmitted to the application layer is determined, and the data frame is transmitted to the application layer at that moment. As a result, even when data is transmitted over an unstable network and the application layer renders the data frames in the order in which they are received, a smooth audio and video playing effect can be obtained without additionally adjusting the rendering time of each data frame.
Drawings
FIG. 1 is a schematic illustration of video image frame transmission;
fig. 2 is a schematic flowchart of a data processing method according to an embodiment of the present application;
FIG. 3 is a schematic diagram of video image frame transmission provided by an embodiment of the present application;
FIG. 4 is a schematic diagram of a video image frame transmission according to another embodiment of the present application;
FIG. 5 is a schematic diagram of peer-to-peer communication provided in an embodiment of the present application;
fig. 6 is a schematic processing flow diagram of a sending end according to an embodiment of the present application;
fig. 7 is a schematic processing flow diagram of a receiving end according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
With the development of network transmission technology, users' demand for audio and video data is growing day by day; in particular, users need to transmit audio and video data over the network in live-streaming scenarios such as competition broadcasts and live shopping, and in conference scenarios such as remote online meetings. Some types of data, such as audio data and video data, depend on the stability of the transmission. Taking video data as an example, a video is in fact a continuous sequence of image frames, and once the frame rate of the video is determined, the playing interval of each image frame is determined. If the frame rate of the video is 30 fps, i.e. 30 image frames are played per second, the playing interval of each image frame is approximately 33 ms.
A cloud desktop is a computer solution in which a cloud terminal device operates a remote server desktop over the network. Unlike a traditional Personal Computer (PC), the computation and storage of a cloud desktop are centralized on the remote server, while the cloud desktop terminal is mainly responsible for the input and output of peripherals such as the keyboard, mouse, and display. VDI is the mainstream cloud desktop technology on the market and consists of VDI terminals and a remote server. VDI has advantages such as strong mobility and high data security, but it depends on the network environment, places high demands on bandwidth and latency, and cannot be used once the network is disconnected.
When video data is transmitted in a VDI scenario over an unstable network, a weak network introduces a large amount of network jitter and packet loss. If the application layer renders the video in the order in which image frames are received, playback is not smooth; that is, the rendering effect of the video is poor.
In the schematic diagram of video image frame transmission shown in fig. 1, each numbered rectangle (1, 2, 3, ...) represents an image frame, and the length of each rectangle represents the time the corresponding image frame spends in transmission over the network. The frame rate of the image frames at the sending end is fixed; at 30 fps the playing interval of each image frame is 33 milliseconds. The time at which the receiving end actually receives each image frame, however, varies with the network state. In a weak network state with high delay, high packet loss, jitter, and other instabilities, the reception intervals of the image frames differ greatly, so if the application layer renders the data frames in the order in which they are received, the playing intervals of the image frames also differ greatly. Taking the image frames received by the receiving end under the weak network shown in fig. 1 as an example: when the first image frame is played, the playing interval is normal; when the second image frame is played, the playing interval is longer and the user perceives the video as stuttering; and when the third and fourth image frames are played, the playing intervals are shorter and the user perceives the video as fast-forwarding.
If the receiving end transmits the image frames to the application layer in the order in which they are received, the application layer renders the video based on the times of the received image frames, and the playing interval of each frame equals the interval at which the receiving end received the two corresponding adjacent data frames. The larger the reception interval, the longer the corresponding frame stays on screen during playback. If the intervals of consecutive frames vary too much, playback is not smooth; that is, the video rendering effect is poor.
Based on this, embodiments of the present application provide a data processing method and apparatus, an electronic device, and a storage medium, in which a third duration is determined based on a first duration corresponding to a received first data frame and a set second duration, and the first data frame is transmitted to an application layer after the second data frame has been transmitted and the third duration has elapsed, where the first duration characterizes the reception time interval between the received first data frame and the previous data frame. In the embodiments of the present application, the moment at which each data frame is transmitted to the application layer is determined, and the data frame is transmitted to the application layer at that moment, so that even when data is transmitted over an unstable network and the application layer renders the data frames in the order in which they are received, a smooth audio and video playing effect can be obtained without additionally adjusting the rendering time of each data frame.
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
Fig. 2 is a schematic flowchart of a data processing method provided in an embodiment of the present application. The execution subject of the flow is an electronic device that receives and renders audio/video data, including but not limited to electronic devices such as terminals and servers.
The electronic device may acquire the audio and video data transmitted by an external electronic device via network transmission, a serial port, a Universal Serial Bus (USB), and the like.
As shown in fig. 2, the data processing method includes:
step 201: and determining a third time length based on the first time length corresponding to the received first data frame and the set second time length.
The first duration characterizes the reception time interval between the first data frame and a second data frame; the second data frame is the data frame preceding the first data frame, that is, the data frame the electronic device received immediately before the first data frame.
Here, the first duration is the difference between a first time and a second time, where the first time is the time at which the electronic device completely receives the first data frame, and the second time is the time at which the electronic device completely receives the second data frame.
Here, the transport layer and the layers below it provide a complete communication service, while the application layer sits at the top of the computer network and faces the user. After receiving data through the transport layer and the layers below, the electronic device passes the data to the application layer running on it, and the application layer provides network application services.
The set second duration characterizes a time interval determined in a preset manner when the data processing method is executed, unlike the first duration, which is determined from the reception interval of the data frames. In some embodiments, the second duration may be set according to the network state required by the application scenario, and the current network state is reflected by the relationship between the first duration and the second duration; for example, the current network state may be considered poor if the first duration is longer than the second duration.
In addition, the second duration may serve as an overall delay applied to all data frames in the current data transmission process; that is, before the receiving end transmits the first data frame to the application layer, the transmission of every data frame is delayed based on the second duration. The second duration therefore also provides buffering time for the current data transmission process and mitigates the poor rendering effect caused by the heavy network jitter of an unstable network.
In an embodiment, during data transmission the receiving end may obtain information about the data frames from the header data; the obtainable information includes, but is not limited to, the set frame rate and the size of each data frame. The set frame rate of a video is the number of frames displayed per unit time; if the set frame rate is 30 fps, 30 images are played every second.
In practical applications, data frames that occupy a large amount of memory are usually transmitted in fragments. After receiving the fragments, the receiving end can judge from the header data of the fragmented data messages whether the received data can be assembled into a complete data frame. Typically, data frames occupying a large amount of memory include, but are not limited to, high-resolution video frames and high-quality audio.
Step 202: after transmitting the second data frame and waiting for the third duration, transmitting the first data frame to an application layer.
Here, the first data frame is transmitted to the application layer once the third duration corresponding to it has elapsed, counted from the moment the receiving end transmitted the second data frame to the application layer.
In the embodiments of the present application, the application layer renders the data frames in the order in which they are received, so that even in an unstable network a smooth audio and video playing effect can be obtained without additionally adjusting the rendering time of each data frame.
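For illustration only (this sketch is not part of the original disclosure), the hand-off pacing of step 202 can be expressed as follows in Python; deliver_paced, compute_third_duration, and deliver are assumed names, and compute_third_duration stands in for the step 201 logic sketched after the later embodiments:

    import time

    def deliver_paced(frames, compute_third_duration, deliver):
        """Pace the hand-off of data frames to the application layer.

        frames yields (frame, first_duration) pairs in arrival order;
        compute_third_duration maps a first duration to the waiting time
        (the third duration of step 201); deliver hands one frame up.
        """
        last_handoff = time.monotonic()
        for frame, first_duration in frames:
            wait = compute_third_duration(first_duration)
            # Step 202: wait until the third duration has elapsed since the
            # previous hand-off, then transmit the frame to the application layer.
            time.sleep(max(0.0, last_handoff + wait - time.monotonic()))
            last_handoff = time.monotonic()
            deliver(frame)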
In particular, for scenarios built on technologies that depend on the network environment, such as VDI cloud desktops, where data is transmitted over an unstable network, the method provided by the embodiments of the present application can improve the audio and video playing effect under weak network conditions.
In an embodiment, determining the third duration based on the first duration corresponding to the received first data frame and the set second duration includes:
determining the third duration based on the second duration in the case that the first data frame is the initial data frame received in the current data transmission process; and
determining the third duration based on the set frame rate corresponding to the current data transmission process, the first duration corresponding to the first data frame, and the second duration in the case that the first data frame is not the initial data frame received in the current data transmission process.
Here, the receiving end determines the third duration corresponding to the first data frame in different ways, depending on whether the received first data frame is the initial data frame received in the current data transmission process. For the initial data frame there is no previous data frame, so the first duration may be considered zero or null, and the third duration is determined based on the second duration. Moreover, because the application layer renders the data frames in the order in which they are received, the rendering time of the initial data frame is not affected by the rendering time of any previous data frame.
In practical applications, the initial data frame received in the current data transmission process includes, but is not limited to, the following cases: the application layer requests playback of video A and the electronic device requests the media stream of video A from the corresponding server, in which case the first image frame of video A can be understood as the initial data frame received in the current data transmission process; or video A is paused halfway through playback and the application layer requests that it continue, in which case the first image frame from the paused position can be understood as the initial data frame received in the current data transmission process.
When the first data frame received by the receiving end is not the initial data frame of the current data transmission process, that is, it is the second or a later data frame, the third duration is determined based on the set frame rate, the first duration, and the second duration, because the rendering time of such a data frame is affected by the rendering time of the previous data frame.
In the embodiments of the present application, when the first data frame received by the receiving end is the initial data frame of the current data transmission process, the third duration for transmitting it to the application layer is determined based on the second duration; when it is the second or a later data frame, the corresponding third duration is determined based on the set frame rate, the first duration, and the second duration. In this way the moment for transmitting each data frame to the application layer is determined, and the application layer, rendering the data frames in the order in which they are received, can obtain a smooth audio and video playing effect without additionally adjusting the rendering time of each data frame.
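Read this way, the step 201 branch can be sketched as follows (an illustrative Python sketch; the function and parameter names are assumptions, and the frame-rate branch anticipates the embodiment described next):

    def third_duration(is_initial_frame: bool,
                       first_duration: float,
                       second_duration: float,
                       frame_interval: float) -> float:
        """Return how long to wait after the previous hand-off (step 201).

        frame_interval is the playing interval implied by the set frame
        rate, e.g. 1 / 30 of a second for 30 fps.
        """
        if is_initial_frame:
            # No previous frame: the wait is derived from the set second duration.
            return second_duration
        if first_duration > second_duration:
            # Unstable network: pace at the frame interval plus the excess
            # (the first difference); this is one consistent reading of the
            # worked example given below.
            return frame_interval + (first_duration - second_duration)
        # Stable network: pace delivery at the set frame rate.
        return frame_interval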
In an embodiment, determining the third duration based on the set frame rate corresponding to the current data transmission process, the first duration corresponding to the first data frame, and the second duration includes:
determining the third duration based on the set frame rate and a first difference in the case that the first duration corresponding to the first data frame is greater than the second duration, where the first difference characterizes the difference between the first duration corresponding to the first data frame and the second duration; and
determining the third duration based on the set frame rate in the case that the first duration corresponding to the first data frame is less than or equal to the second duration.
Here, when the first data frame received by the receiving end is not the initial data frame of the current data transmission process, the third duration is determined in different ways according to the relationship between the first duration corresponding to the first data frame and the second duration.
In an unstable network, if the receiving end transmitted data frames to the application layer strictly at the set frame rate, it might not yet have received a complete data frame at the moment the frame was due, so having the application layer render audio or video data based on the times at which audio frames or image frames are received could cause playback to stutter. In some embodiments, the second duration may be set according to the network state required by the application scenario, and the current network state is reflected by the relationship between the first duration and the second duration: when the determined first duration is less than or equal to the second duration, the current network may be considered stable; when it is greater than the second duration, the current network may be considered unstable, e.g. a weak network.
Here, the first difference is the absolute value of the difference between the first duration corresponding to the first data frame and the second duration; in other words, the first difference is obtained by subtracting the second duration from the first duration corresponding to the first data frame. The larger the first difference, the worse the current network state. Further, a threshold on the first difference may be set as a condition for deciding whether transmission continues: when the first difference is greater than the threshold, the current network is not sufficient to support the current data transmission process, and the receiving end stops transmitting data frames to the application layer.
In the embodiments of the present application, the third duration is determined in different ways according to the relationship between the first duration corresponding to the first data frame and the second duration. The receiving end determines the moment for transmitting each data frame to the application layer in light of the current network state, so that the application layer, rendering the data frames in the order in which they are received, can obtain a smoother audio and video playing effect without additionally adjusting the rendering time of each data frame.
For example, suppose the current data transmission starts at 19:00:00, the set second duration is 2 seconds, and the set frame rate is 1 fps. The receiving end receives the first data frame of the current data transmission process at 19:00:01, the second at 19:00:02, and the third at 19:00:05. For the first data frame the third duration is determined from the second duration to be 2 seconds, and the receiving end transmits it to the application layer at 19:00:02. For the second data frame the first duration is 1 second, which is less than the second duration of 2 seconds, so the third duration is determined from the set frame rate to be 1 second, and the receiving end transmits it to the application layer at 19:00:03. For the third data frame the first duration is 3 seconds, which is greater than the second duration of 2 seconds, so the third duration is determined from the set frame rate, the first duration, and the second duration to be 2 seconds, and the receiving end transmits it to the application layer at 19:00:05.
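Replaying this worked example with the third_duration sketch above reproduces the stated hand-off times (again, an illustration rather than the patent's own computation):

    second_duration = 2.0       # set second duration, in seconds
    frame_interval = 1.0        # set frame rate of 1 fps
    arrivals = [1.0, 2.0, 5.0]  # reception times, seconds after 19:00:00

    deliver_at = 0.0            # hand-off time, measured from 19:00:00
    prev_arrival = None
    for i, arrival in enumerate(arrivals, start=1):
        first = 0.0 if prev_arrival is None else arrival - prev_arrival
        deliver_at += third_duration(prev_arrival is None, first,
                                     second_duration, frame_interval)
        print(f"frame {i}: delivered at 19:00:{deliver_at:02.0f}")
        prev_arrival = arrival

    # frame 1: delivered at 19:00:02
    # frame 2: delivered at 19:00:03
    # frame 3: delivered at 19:00:05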
In an embodiment, the method further includes:
updating the second duration according to the reception time interval between every two adjacent data frames among the received data frames.
In this way, before the sending end and the receiving end start the current data transmission process, the receiving end can set the second duration according to the application scenario and the data type. As the transmission proceeds, the second duration is updated in real time according to the network state, so that it stays dynamically associated with the network state. The updated second duration is used to determine the third duration, and hence the moment at which each subsequent data frame is transmitted to the application layer. The third duration is thus regulated in real time according to the network state, the transmission moment of each first data frame is adjusted accordingly, and the application layer, rendering the data frames based on their reception times, can obtain a smoother audio and video playing effect.
In an embodiment, updating the second duration according to the reception time interval between every two adjacent data frames among the received data frames includes:
updating the second duration and resetting the lifetime of the second duration in the case that the first duration corresponding to at least one first data frame received within the current lifetime of the second duration is greater than or equal to the second duration; and/or
updating the second duration and resetting the lifetime of the second duration upon expiry of the current lifetime, in the case that the first duration corresponding to every first data frame received within the current lifetime of the second duration is less than the second duration.
Here, within the current lifetime of the second duration, the receiving end determines whether it has received at least one first data frame whose corresponding first duration is greater than or equal to the second duration. If so, it updates the second duration and resets the lifetime of the second duration. If the first duration corresponding to every first data frame received within the current lifetime is less than the second duration, it further judges whether the lifetime has expired, and if it has, updates the second duration and resets the lifetime. The first durations of the first data frames received within the current lifetime reflect the network state during that lifetime.
When the first duration corresponding to at least one first data frame received within the current lifetime of the second duration is greater than or equal to the second duration, the network state was poor during some period of the current lifetime; the network during that period can be considered unstable, and the second duration is adjusted accordingly.
Here, when the first duration corresponding to some first data frame received within the current lifetime is greater than or equal to the second duration, the current lifetime may be ended immediately, with the second duration updated and its lifetime reset at once; alternatively, the update and reset may be deferred until the current lifetime expires. That is, the receiving end may update the second duration at the moment it determines that some first duration within the current lifetime is greater than or equal to the second duration, or it may wait until the current lifetime expires.
When the first duration corresponding to every first data frame received within the current lifetime of the second duration is less than the second duration, the network was stable throughout the current lifetime, and the receiving end adjusts the second duration according to the network state once the current lifetime expires.
In this way, the moment for adjusting the second duration is decided from the relationship between the first durations and the second duration, keeping the second duration dynamically associated with the network state. The updated second duration is used to determine the third duration in the current data transmission process, and hence the moment at which each subsequent data frame is transmitted to the application layer, so the third duration is regulated in real time according to the network state, the transmission moment of each first data frame is adjusted, and the application layer, rendering the data frames based on their reception times, can obtain a smoother audio and video playing effect.
In an embodiment, updating the second duration includes:
updating the second duration based on the largest first duration corresponding to the at least one first data frame received within the current lifetime.
Here, according to the corresponding first durations, the receiving end selects, from the at least one first data frame received within the current lifetime, the data frame with the largest first duration, and updates the second duration according to that first duration.
Here, updating the second duration from the largest first duration received within the current lifetime is applicable to every scheme that requires updating the second duration. In other words, for the scheme in which some first duration within the current lifetime is greater than or equal to the second duration, the second duration may be updated with the largest first duration; and for the scheme in which every first duration within the current lifetime is less than the second duration, the second duration may likewise be updated with the largest first duration.
In this way, the second duration is updated to the largest first duration of the current lifetime, keeping it dynamically associated with the network state; the updated second duration reflects the worst network state of the previous lifetime. The updated second duration is used to determine the third duration, and hence the moment at which each subsequent data frame is transmitted to the application layer, so the third duration is regulated in real time according to the network state, the transmission moment of each first data frame is adjusted, and the application layer, rendering the data frames based on their reception times, can obtain a smoother audio and video playing effect.
In an embodiment, updating the second duration upon expiry of the current lifetime includes:
updating the second duration based on the average of the first durations corresponding to the first data frames received within the current lifetime.
Here, updating the second duration from the average first duration of the current lifetime is applicable to the scheme in which the first duration corresponding to every first data frame received within the current lifetime is less than the second duration and the current lifetime has expired.
In this way, the second duration is updated to the average first duration of the current lifetime, so that it represents the average network state of the previous lifetime. The updated second duration is used to determine the third duration, and hence the moment at which each subsequent data frame is transmitted to the application layer, so the third duration is regulated in real time according to the network state, the transmission moment of each first data frame is adjusted, and the application layer, rendering the data frames based on their reception times, can obtain a smoother audio and video playing effect.
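Taken together, the lifetime embodiments suggest bookkeeping along the following lines; this Python sketch is an assumption-laden illustration (the class, its fields, and the use of a monotonic clock are not specified by the patent):

    import time

    class SecondDurationTracker:
        """Tracks and updates the second duration over successive lifetimes."""

        def __init__(self, initial_second_duration: float, lifetime: float):
            self.second_duration = initial_second_duration
            self.lifetime = lifetime                # lifetime length, seconds
            self.cycle_start = time.monotonic()
            self.intervals: list[float] = []        # first durations this lifetime

        def observe(self, first_duration: float) -> None:
            """Feed one observed first duration (a reception interval)."""
            self.intervals.append(first_duration)
            if first_duration >= self.second_duration:
                # Unstable period: update at once from the worst interval seen.
                self.second_duration = max(self.intervals)
                self._reset_cycle()
            elif time.monotonic() - self.cycle_start >= self.lifetime:
                # Stable lifetime expired: update from the average interval.
                self.second_duration = sum(self.intervals) / len(self.intervals)
                self._reset_cycle()

        def _reset_cycle(self) -> None:
            self.cycle_start = time.monotonic()
            self.intervals = []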
In one embodiment, the method is applied to a VDI terminal.
Here, a VDI architecture includes VDI terminals and a remote server, where a VDI terminal may be a PC, a mobile phone, a tablet, a thin client, or a similar terminal device. In a VDI environment, the remote server sends data frames to the VDI terminal; the VDI terminal determines a third duration based on the first duration corresponding to each received first data frame and the set second duration, and transmits the first data frame to the application layer of the VDI terminal after waiting for the third duration. The moment for transmitting each data frame to the application layer is thus determined, and each data frame is transmitted at that moment, so that even when data is transmitted over an unstable network, the application layer, rendering the data frames in the order in which they are received, can obtain a smooth audio and video playing effect without additionally adjusting the rendering time of each data frame. In particular, for scenarios built on technologies that depend on the network environment, such as VDI cloud desktops, the method provided by the embodiments of the present application can improve the audio and video playing effect under weak network conditions.
The present application will be described in further detail with reference to the following application examples.
Because unstable networks exist, when network data is transmitted over a weak network there is a large amount of network jitter and packet loss; if the application layer renders video in the order in which video frames are received, the rendered video is of poor quality.
Here, data transmission is performed over a network transmission protocol capable of carrying network data, including but not limited to Transmission Control Protocol/Internet Protocol (TCP/IP) and User Datagram Protocol (UDP). IP is the internet-layer protocol of the TCP/IP suite and is designed to improve the scalability of the network: first, it solves the internetworking problem, achieving interconnection of large-scale, heterogeneous networks; second, it decouples top-level network applications from the underlying network technology, allowing both to evolve independently. Following the end-to-end design principle, IP provides hosts with a connectionless, unreliable, best-effort packet transport service. TCP is a connection-oriented, reliable, byte-stream transport-layer communication protocol encapsulated over IP. UDP is a connectionless transport protocol that gives applications a way to send encapsulated IP packets without establishing a connection.
The application embodiment of the present application provides a data processing method that keeps the playing interval of each image frame of a video as consistent as possible with the interval at which the sending end sends the data frames.
The second duration is updated in real time so that, after receiving a data frame, the receiving end waits a certain time before transmitting it to the application layer. The waiting time is related to the network state: if the network is stable the waiting time is short, and if the network is unstable the waiting time is correspondingly long. Here, the waiting time is determined by the worst network state.
In the schematic diagram of video image frame transmission shown in fig. 3, when network data is transmitted over a weak network, extra transmission time is needed because of network jitter and retransmission. The playing interval between the first and second image frames corresponds, for example, to the length of rectangle 301; if the length of rectangle 302 (corresponding to the second duration) is greater than the length of rectangle 301 (corresponding to the first duration), the second duration is not adjusted.
In the video image frame transmission diagram of fig. 4, likewise, extra transmission time is needed because of network jitter and retransmission. The playing interval between the first and second image frames corresponds, for example, to the length of rectangle 401; if the length of rectangle 402 (corresponding to the second duration) is less than the length of rectangle 401 (corresponding to the first duration), the second duration is dynamically adjusted to the first duration, and the difference between the first duration and the second duration is inserted before the second image frame is transmitted to the application layer. In this way the receiving end can acquire the data normally even in the worst network state, with the data strictly ordered in time. The network state changes in real time, and the second duration is dynamically adjusted according to the first duration, which in turn adjusts the third duration and determines the moment at which the corresponding image frame is transmitted to the application layer. In addition, by setting a lifetime and adjusting the second duration per lifetime, the second duration can follow the network characteristics of each lifetime, avoiding excessively large differences in the playing intervals of the images as far as possible.
Therefore, after the second duration is updated in real time, the application layer renders the data frames in the order in which they are received, a smooth audio and video playing effect is obtained without additionally adjusting the rendering time of each data frame, and the playing interval of each image frame of the video stays as consistent as possible with the interval at which the sending end sent the data frames.
In the end-to-end communication diagram shown in fig. 5, the sending end processes the application data in a smoothing layer, records information such as the set frame rate of the data frames and the size of each data frame in a smoothing header, and sends the result to the receiving end through the network layer, which then passes the data frames up to the application. Here, the smoothing layer may be implemented at either the network layer or the transport layer.
Fig. 6 shows a processing flow of the sending end, which includes: the sending end obtains the application data to be sent, calculates the time interval of the data frame, fills in the smoothing header at the smoothing layer, and calls the network sending interface to send the data frame to the receiver. If sending fails, for example because the network is disconnected, the corresponding failure information is returned to the application layer.
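As an illustration of the kind of header described (the field set and encoding are assumptions; the patent only says the header records information such as the set frame rate and the size of each data frame), a smoothing header might be packed as follows:

    import struct

    # Assumed layout: set frame rate (fps), payload size (bytes), and send
    # interval (microseconds), all in network byte order.
    SMOOTH_HEADER = struct.Struct("!IIQ")

    def pack_frame(frame_rate_fps: int, send_interval_us: int, payload: bytes) -> bytes:
        """Prepend an assumed smoothing header to one data frame."""
        header = SMOOTH_HEADER.pack(frame_rate_fps, len(payload), send_interval_us)
        return header + payload

    def unpack_frame(data: bytes) -> tuple[int, int, int, bytes]:
        """Split one received message back into header fields and payload."""
        fps, size, interval_us = SMOOTH_HEADER.unpack_from(data)
        start = SMOOTH_HEADER.size
        return fps, size, interval_us, data[start:start + size]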
Fig. 7 shows a processing flow diagram of the receiving end, which includes:
1: the receiving end receives data by the network layer.
2: and assembling the sliced data to obtain a complete data frame.
3: it is determined whether the fragmented data is sufficient to assemble into a complete data frame.
If the fragmented data can not be assembled into a complete data frame, executing step 4; if the fragmented data can be assembled into one complete data frame, 5 is performed.
4: and finishing the data reception, and saving the received data so as to assemble the data when the data is received next time.
5: and calculating the time interval between the time when the current composition obtains a complete data frame and the time when the previous composition obtains a complete data frame.
Here, the interval is calculated from the actual arrival time of the data of the current data frame and the actual arrival time of the data of the previous data frame, together with the frame rate recorded in the header.
6: and judging the size relation between the calculated time interval and the second duration.
If the calculated time interval is less than the second duration, executing step 7; if the calculated time interval is equal to the second duration, executing 11; if the calculated time interval is greater than the second duration, 9 is executed.
7: and judging whether the life cycle of the second duration expires.
If the life cycle of the second duration expires, execute 10; if the lifetime of the second duration has not expired 8 is executed.
8: and recording the time interval of the calculation, and storing the time interval to a time interval list of the life cycle.
9: the second duration is updated based on the calculated time interval.
10: the current lifetime has expired and the maximum value of the time interval of the current lifetime is taken as the new second duration.
11: the lifetime of the second duration is reset and the time interval list of the lifetime is reset.
12: it is determined whether the received data frame can be read by the application.
If it is determined that the received data frame can be read by the application, 13 is performed; if it is determined that the received data frame cannot be read by the application, 4 is performed.
Here, by determining whether the third duration is waited after the previous data frame is transmitted, the corresponding data frame is transmitted to the application layer only when the determination result represents that the third duration is waited.
13: the corresponding data frame is transmitted to the application layer.
Compared with the related art, in which the application layer must perform additional processing on top of the receiving end, the data processing scheme provided by this application embodiment completes the smoothing below the application layer, so that the application layer obtains a smooth audio and video playing effect simply by rendering the data frames in the order in which they are received, without additionally adjusting the rendering time of each data frame.
In order to implement the method of the embodiments of the present application, an embodiment of the present application further provides a data processing apparatus. As shown in fig. 8, the data processing apparatus includes:
a processing unit 801, configured to determine a third duration based on a first duration corresponding to a received first data frame and a set second duration, where the first duration characterizes the reception time interval between the first data frame and a second data frame, and the second data frame is the data frame preceding the first data frame; and
a transmitting unit 802, configured to transmit the first data frame to an application layer after the second data frame has been transmitted and the third duration has elapsed.
In one embodiment, the processing unit 801 is configured to:
determine the third duration based on the second duration in the case that the first data frame is the initial data frame received in the current data transmission process; and
determine the third duration based on the set frame rate corresponding to the current data transmission process, the first duration corresponding to the first data frame, and the second duration in the case that the first data frame is not the initial data frame received in the current data transmission process.
In one embodiment, the processing unit 801 is configured to:
determine the third duration based on the set frame rate and a first difference in the case that the first duration corresponding to the first data frame is greater than the second duration, where the first difference characterizes the difference between the first duration corresponding to the first data frame and the second duration; and
determine the third duration based on the set frame rate in the case that the first duration corresponding to the first data frame is less than or equal to the second duration.
In one embodiment, the apparatus further comprises:
an updating unit, configured to update the second duration according to the reception time interval between every two adjacent data frames among the received data frames.
In one embodiment, the updating unit is configured to:
update the second duration and reset the lifetime of the second duration in the case that the first duration corresponding to at least one first data frame received within the current lifetime of the second duration is greater than or equal to the second duration; and/or
update the second duration and reset the lifetime of the second duration upon expiry of the current lifetime, in the case that the first duration corresponding to every first data frame received within the current lifetime of the second duration is less than the second duration.
In one embodiment, the updating unit is configured to:
and updating the second time length according to the maximum first time length corresponding to at least one first data frame received in the current life cycle.
In one embodiment, the updating unit is configured to:
and updating the second time length based on the average value of the first time length corresponding to each first data frame received in the current life cycle.
In one embodiment, the apparatus is applied to a VDI terminal.
In practical applications, the transmitting unit 802 may be implemented by a communication interface in a data processing apparatus, and the processing unit 801 and the updating unit may be implemented by a processor in the data processing apparatus.
It should be noted that the division into program modules illustrated for the data processing apparatus in the above embodiment is only an example; in practical applications, the processing may be distributed among different program modules as needed, that is, the internal structure of the apparatus may be divided into different program modules to complete all or part of the processing described above. In addition, the data processing apparatus provided in the above embodiment belongs to the same concept as the data processing method embodiments; its specific implementation is detailed in the method embodiments and is not repeated here.
Based on the hardware implementation of the program module, and in order to implement the data processing method in the embodiment of the present application, an embodiment of the present application further provides an electronic device, as shown in fig. 9, where the electronic device 900 includes:
a communication interface 910 capable of information interaction with other devices such as network devices and the like;
and a processor 920, connected to the communication interface 910 to exchange information with other devices, and configured to execute the method provided by one or more of the above technical solutions when running a computer program, the computer program being stored in the memory 930.
Of course, in practice, the various components in the electronic device 900 are coupled together by a bus system 940. It will be appreciated that the bus system 940 is used to enable communication among these components. In addition to a data bus, the bus system 940 includes a power bus, a control bus, and a status signal bus. For clarity of illustration, however, the various buses are all labeled as the bus system 940 in fig. 9.
The memory 930 in embodiments of the present application is used to store various types of data to support the operation of the electronic device 900. Examples of such data include: any computer program for operating on the electronic device 900.
It will be appreciated that the memory 930 can be either volatile memory or non-volatile memory, and can include both volatile and non-volatile memory. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Ferroelectric Random Access Memory (FRAM), a Flash Memory, a magnetic surface memory, an optical disc, or a Compact Disc Read-Only Memory (CD-ROM); the magnetic surface memory may be disk storage or tape storage. The volatile memory may be a Random Access Memory (RAM), which serves as an external cache. By way of illustration and not limitation, many forms of RAM are available, such as Static Random Access Memory (SRAM), Synchronous Static Random Access Memory (SSRAM), Dynamic Random Access Memory (DRAM), Synchronous Dynamic Random Access Memory (SDRAM), Double Data Rate Synchronous Dynamic Random Access Memory (DDR SDRAM), Enhanced Synchronous Dynamic Random Access Memory (ESDRAM), SyncLink Dynamic Random Access Memory (SLDRAM), and Direct Rambus Random Access Memory (DRRAM). The memory 930 described in the embodiments of the present application is intended to comprise, without being limited to, these and any other suitable types of memory.
The method disclosed in the embodiments of the present application may be applied to, or implemented by, the processor 920. The processor 920 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be completed by integrated logic circuits of hardware in the processor 920 or by instructions in the form of software. The processor 920 may be a general-purpose processor, a DSP, or another programmable logic device, discrete gate or transistor logic device, discrete hardware component, or the like. The processor 920 may implement or perform the methods, steps, and logical blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, any conventional processor, or the like. The steps of the methods disclosed in the embodiments of the present application may be directly embodied as being executed by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium, which in turn is located in the memory 930; the processor 920 reads the program in the memory 930 and, in combination with its hardware, completes the steps of the foregoing methods.
Optionally, when the processor 920 executes the program, it carries out the corresponding flow of each method of the embodiments of the present application as implemented by the electronic device; for brevity, details are not repeated here.
In an exemplary embodiment, the present application further provides a storage medium, specifically a computer storage medium, for example a memory 930 storing a computer program, where the computer program is executable by the processor 920 of the electronic device to complete the steps of the foregoing methods. The computer-readable storage medium may be a memory such as FRAM, ROM, PROM, EPROM, EEPROM, Flash Memory, magnetic surface memory, optical disc, or CD-ROM.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus, electronic device, and method may be implemented in other ways. The device embodiments described above are merely illustrative; for example, the division of the units is only a logical functional division, and there may be other divisions in actual implementation, such as: multiple units or components may be combined, or integrated into another system, or some features may be omitted or not implemented. In addition, the coupling, direct coupling, or communication connection between the components shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may be separately regarded as one unit, or two or more units may be integrated into one unit; the integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that all or part of the steps of the above method embodiments may be completed by hardware under the instruction of a program, the program being storable in a computer-readable storage medium; when executed, the program performs the steps of the above method embodiments. The aforementioned storage medium includes: a removable storage device, a ROM, a RAM, a magnetic disk or an optical disk, or various other media that can store program code.
Alternatively, if the above integrated unit of the present application is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the methods described in the embodiments of the present application. The aforementioned storage medium includes: a removable storage device, a ROM, a RAM, a magnetic disk or an optical disk, or various other media that can store program code.
The technical means described in the embodiments of the present application may be combined arbitrarily provided there is no conflict. Unless otherwise explicitly specified and limited, the term "coupled" is to be construed broadly, for example, as an electrical connection, or as communication between two elements, whether directly or indirectly through an intermediate medium; the specific meaning of the term can be understood by those skilled in the art according to the specific situation.
In addition, in the examples of the present application, "first", "second", and the like are used to distinguish between similar objects and do not necessarily describe a specific order or sequence. It should be understood that objects so distinguished may be interchanged where appropriate, so that the embodiments of the application described herein can be implemented in an order other than that illustrated or described herein.
The above description covers only specific embodiments of the present application, but the protection scope of the present application is not limited thereto; any changes or substitutions that a person skilled in the art can readily conceive of within the technical scope disclosed in the present application shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
The specific features described in the embodiments of the detailed description may be combined in any manner without contradiction; for example, different embodiments may be formed by different combinations of the specific features. To avoid unnecessary repetition, the various possible combinations are not described separately in the present application.

Claims (11)

1. A method of data processing, the method comprising:
determining a third duration based on a first duration corresponding to a received first data frame and a set second duration; wherein the first duration represents the reception time interval between the first data frame and a second data frame, and the second data frame is the data frame preceding the first data frame;
and after transmitting the second data frame and waiting for the third duration, transmitting the first data frame to an application layer.
2. The data processing method according to claim 1, wherein the determining a third duration based on the first duration corresponding to the received first data frame and the set second duration comprises:
determining the third duration based on the second duration when the first data frame is the first frame received in the current data transmission process;
and determining the third duration based on a set frame rate corresponding to the current data transmission process, the first duration corresponding to the first data frame, and the second duration, when the first data frame is not the first frame received in the current data transmission process.
3. The data processing method of claim 2, wherein determining the third duration based on the set frame rate corresponding to the current data transmission process, the first duration corresponding to the first data frame, and the second duration comprises:
determining the third duration based on the set frame rate and a first difference when the first duration corresponding to the first data frame is greater than the second duration, wherein the first difference represents the difference between the first duration corresponding to the first data frame and the second duration;
and determining the third duration based on the set frame rate when the first duration corresponding to the first data frame is less than or equal to the second duration.
4. The data processing method of claim 1, wherein the method further comprises:
updating the second duration according to the reception time interval between every two adjacent data frames among the received data frames.
5. The data processing method of claim 4, wherein updating the second duration according to the reception time interval between every two adjacent data frames among the received data frames comprises:
updating the second duration and resetting the life cycle of the second duration when the first duration corresponding to at least one first data frame received within the current life cycle of the second duration is greater than or equal to the second duration; and/or,
updating the second duration and resetting the life cycle of the second duration upon expiry of the current life cycle, when the first duration corresponding to every first data frame received within the current life cycle of the second duration is smaller than the second duration.
6. The data processing method of claim 5, wherein the updating the second duration comprises:
updating the second duration according to the maximum of the first durations corresponding to the at least one first data frame received within the current life cycle.
7. The data processing method of claim 5, wherein, in the case that the current life cycle expires, the updating the second duration comprises:
updating the second duration based on the average of the first durations corresponding to the first data frames received within the current life cycle.
8. The data processing method according to any of claims 1 to 7, wherein the method is applied to a Virtual Desktop Infrastructure (VDI) terminal.
9. A data processing apparatus, comprising:
a processing unit configured to determine a third duration based on a first duration corresponding to a received first data frame and a set second duration; wherein the first duration represents the reception time interval between the first data frame and a second data frame, and the second data frame is the data frame preceding the first data frame;
and a transmitting unit configured to transmit the first data frame to an application layer after the second data frame has been transmitted and the third duration has elapsed.
10. An electronic device, comprising: a processor and a memory for storing a computer program capable of running on the processor,
wherein the processor is adapted to perform the steps of the data processing method of any of claims 1 to 8 when running the computer program.
11. A storage medium on which a computer program is stored, which computer program, when being executed by a processor, carries out the steps of the data processing method according to any one of claims 1 to 8.
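Taken together, claims 1 to 8 describe a receive-side pacing loop at the terminal. A minimal end-to-end sketch, reusing the third_duration function and SecondDurationUpdater class sketched earlier, follows; receive_frame and deliver_to_application_layer are hypothetical hooks standing in for the real transport and application-layer interfaces, which the claims do not specify.

```python
import time

def pacing_loop(receive_frame, deliver_to_application_layer,
                updater: "SecondDurationUpdater", frame_rate_fps: float) -> None:
    """Receive data frames, wait the computed third duration after the
    previous delivery, then pass each frame to the application layer."""
    prev_recv_ms = None
    while True:
        frame = receive_frame()  # hypothetical transport hook; None ends the stream
        if frame is None:
            break
        now_ms = time.monotonic() * 1000.0
        is_first = prev_recv_ms is None
        first_duration_ms = 0.0 if is_first else now_ms - prev_recv_ms
        wait_ms = third_duration(first_duration_ms, updater.second_ms,
                                 frame_rate_fps, is_first)
        if not is_first:
            # Feed the observed interval into the second-duration update.
            updater.on_frame(now_ms, first_duration_ms)
        prev_recv_ms = now_ms
        time.sleep(wait_ms / 1000.0)  # the third duration
        deliver_to_application_layer(frame)  # hypothetical application hook
```

Smoothing jittery receive intervals this way trades a bounded amount of added latency for an even delivery cadence, which is the behavior the claims target for playback at a VDI terminal.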
CN202111016496.8A 2021-08-31 2021-08-31 Data processing method and device, electronic equipment and storage medium Active CN113747237B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111016496.8A CN113747237B (en) 2021-08-31 2021-08-31 Data processing method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113747237A true CN113747237A (en) 2021-12-03
CN113747237B CN113747237B (en) 2023-03-17

Family

ID=78734466

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111016496.8A Active CN113747237B (en) 2021-08-31 2021-08-31 Data processing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113747237B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109343997A (en) * 2018-10-31 2019-02-15 Oppo广东移动通信有限公司 Caton detection method, device, terminal and storage medium
CN110381316A (en) * 2019-07-17 2019-10-25 腾讯科技(深圳)有限公司 A kind of method for controlling video transmission, device, equipment and storage medium
CN110418170A (en) * 2019-07-03 2019-11-05 腾讯科技(深圳)有限公司 Detection method and device, storage medium and electronic device
CN112437336A (en) * 2020-11-19 2021-03-02 维沃移动通信有限公司 Audio and video playing method and device, electronic equipment and storage medium
CN112601127A (en) * 2020-11-30 2021-04-02 Oppo(重庆)智能科技有限公司 Video display method and device, electronic equipment and computer readable storage medium
CN112929741A (en) * 2021-01-21 2021-06-08 杭州雾联科技有限公司 Video frame rendering method and device, electronic equipment and storage medium
CN112954402A (en) * 2021-03-11 2021-06-11 北京字节跳动网络技术有限公司 Video display method, device, storage medium and computer program product
CN113079421A (en) * 2020-01-03 2021-07-06 阿里巴巴集团控股有限公司 Information processing method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN113747237B (en) 2023-03-17

Similar Documents

Publication Publication Date Title
US10123183B2 (en) Voice messaging method and mobile terminal supporting voice messaging in mobile messenger service
US20190089760A1 (en) Systems and methods for real-time content creation and sharing in a decentralized network
CN109889543B (en) Video transmission method, root node, child node, P2P server and system
CN100587681C (en) System and method for communicating images between intercommunicating users
CN106686438B (en) method, device and system for synchronously playing audio images across equipment
RU2392753C2 (en) Method for sending instructions to device not to carryout synchronisation or delay synchronisation of multimedia streams
EP3840394A1 (en) Video screen projection method, device, computer equipment and storage medium
KR101301434B1 (en) Voice instant messaging between mobile and computing devices
CN107819809B (en) Method and device for synchronizing content
WO2021082642A1 (en) Video playing control method and system
JP2006501744A (en) Media communication method and apparatus
US11792130B2 (en) Audio/video communication method, terminal, server, computer device, and storage medium
WO2014054325A1 (en) Encoding control device and encoding control method
US20220014574A1 (en) Data distribution method and network device
JP2010136220A (en) Communication terminal device, communication volume control method, and integrated circuit
CN113747237B (en) Data processing method and device, electronic equipment and storage medium
CN111803924B (en) Multi-terminal synchronous display method and device for cloud game and readable storage medium
US10925014B2 (en) Method and apparatus for synchronization in a network
CN112771875A (en) Video bit rate enhancement while maintaining video quality
US20220311812A1 (en) Method and system for integrating video content in a video conference session
US20210069590A1 (en) Method for playing back applications from a cloud, telecommunication network for streaming and for replaying applications (apps) via a specific telecommunication system, and use of a telecommunication network for streaming and replaying applications (apps)
US9277177B2 (en) High definition (HD) video conferencing system
WO2024093284A1 (en) Data transmission method and apparatus, and device
JPH11163934A (en) System and device for transmission, reception device, real-time dynamic picture, system and device for sound transmission, control method for the same and storage device
JP2013201593A (en) Communication terminal apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant