CN108235136B - Data playing method and terminal equipment - Google Patents

Data playing method and terminal equipment

Info

Publication number
CN108235136B
CN108235136B (application CN201810215225.7A)
Authority
CN
China
Prior art keywords
data
playing
live
terminal equipment
time
Prior art date
Legal status
Active
Application number
CN201810215225.7A
Other languages
Chinese (zh)
Other versions
CN108235136A (en
Inventor
周立辉
Current Assignee
Qingdao Hisense Media Network Technology Co Ltd
Original Assignee
Qingdao Hisense Media Network Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Qingdao Hisense Media Network Technology Co Ltd
Priority to CN201810215225.7A
Publication of CN108235136A
Application granted
Publication of CN108235136B

Classifications

    • H: ELECTRICITY
      • H04: ELECTRIC COMMUNICATION TECHNIQUE
        • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
            • H04N 21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
              • H04N 21/45: Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
                • H04N 21/462: Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
              • H04N 21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
                • H04N 21/433: Content storage operation, e.g. storage operation in response to a pause request, caching operations
                  • H04N 21/4331: Caching operations, e.g. of an advertisement for later insertion during playback
                • H04N 21/438: Interfacing the downstream path of the transmission network originating from a server, e.g. retrieving encoded video stream packets from an IP network
                • H04N 21/44: Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
                  • H04N 21/4402: involving reformatting operations of video signals for household redistribution, storage or real-time display
                    • H04N 21/440263: by altering the spatial resolution, e.g. for displaying on a connected PDA
            • H04N 21/80: Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
              • H04N 21/85: Assembly of content; Generation of multimedia applications
                • H04N 21/854: Content authoring
                  • H04N 21/8547: Content authoring involving timestamps for synchronizing content

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

Embodiments of the invention disclose a data playing method and a terminal device in the field of audio and video processing, which can keep the delay between the data played on the terminal device and the live data stored on the server from growing, reducing or avoiding live-playback failures. The method comprises the following steps: the terminal device loads, into a cache, live data sent by the server; the terminal device reads the currently played data from the cache and plays it; the terminal device determines the delay between the currently played data and the live data from the current live time of the live data and the play time of the currently played data; and the terminal device adjusts the play speed of the currently played data according to the delay.

Description

Data playing method and terminal equipment
Technical Field
Embodiments of the invention relate to the field of audio and video processing, and in particular to a data playing method and a terminal device.
Background
DASH (Dynamic Adaptive Streaming over HTTP) is an internet-based video transmission protocol that dynamically adapts to changes in network bandwidth: according to the available bandwidth, the player selects video data encoded at different bitrates for playback. The DASH protocol can transmit both Video on Demand (VOD) programs and live (Live) programs.
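The bandwidth-driven selection that DASH players perform can be sketched as follows. This is a hypothetical illustration, not part of the patent: the representation names and bitrate values are invented, and a real DASH client would parse its representations from the MPD manifest.

```python
# Hypothetical sketch of DASH-style adaptive bitrate selection.
# A real client reads representations from the MPD manifest; the
# names and bitrate values below are invented for illustration.

def select_representation(representations, measured_bps):
    """Pick the highest-bitrate representation that fits measured_bps.

    representations: list of (name, required_bps) tuples, any order.
    Falls back to the lowest-bitrate representation when none fits.
    """
    ordered = sorted(representations, key=lambda r: r[1])
    best = ordered[0]  # lowest bitrate is the fallback
    for rep in ordered:
        if rep[1] <= measured_bps:
            best = rep  # keep upgrading while bandwidth allows
    return best

reps = [("1080p", 5_000_000), ("720p", 2_500_000), ("480p", 1_000_000)]
print(select_representation(reps, 3_000_000))  # -> ('720p', 2500000)
```

With 3 Mbit/s of measured bandwidth the 720p representation is chosen; when the bandwidth falls below even the 480p requirement, the client stays on the lowest representation, which is exactly the situation the Background below describes.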
Because of the nature of live video, live data is generated in real time and stored on the server. Since server space is limited, each segment of live data is retained only for a period of time (usually one minute to tens of minutes) after it is generated; when that period expires, the server deletes it. If a user watches a live video for a long time and the network stalls repeatedly during playback, the delay between the data played on the terminal device and the live data stored on the server grows larger and larger, and it does not shrink back after the network recovers. Once this delay exceeds the range the server allows (the one-to-tens-of-minutes retention period mentioned above, which depends on the storage space the server allocates for live data), playback fails and the video must be restarted. Under an unstable network, the live video can therefore ultimately fail to play even when bandwidth is high.
Disclosure of Invention
Embodiments of the present invention provide a data playing method and a terminal device, which can keep the delay between the data played on the terminal device and the live data stored on the server from growing, and reduce or avoid live-video playback failures.
In a first aspect, a data playing method is provided, comprising: the terminal device loads, into a cache, live data sent by the server; the terminal device reads the currently played data from the cache and plays it; the terminal device determines the delay between the currently played data and the live data from the current live time of the live data and the play time of the currently played data; and the terminal device adjusts the play speed of the currently played data according to the delay. In this scheme, while playing, the terminal device can determine the delay between the currently played data and the live data from the current live time and the play time, and then adjust the play speed accordingly, for example speeding up when the delay is large and keeping normal speed when it is small. This keeps the delay between the data played on the terminal device and the live data stored on the server from growing, reducing or avoiding live-playback failures.
In a second aspect, a terminal device is provided, which includes:
a loading module, configured to load, into a cache, live data sent by the server;
a playing module, configured to read the currently played data from the cache and play it;
a processing module, configured to determine the delay between the currently played data and the live data from the current live time of the live data and the play time of the currently played data;
the playing module being further configured to adjust the play speed of the currently played data according to the delay determined by the processing module.
In a third aspect, a terminal device is provided, comprising: a communication interface, a processor, a memory, and a bus. The memory stores computer-executable instructions, the processor is connected to the memory through the bus, and when the terminal device runs, the processor executes the computer-executable instructions stored in the memory so that the terminal device performs the method described above.
In a fourth aspect, there is provided a computer storage medium comprising instructions which, when run on a computer, cause the computer to perform the method as described above.
It is to be understood that each of the terminal devices and the computer storage medium provided above is used to execute the method according to the first aspect; for the beneficial effects achieved, refer to the beneficial effects of the method of the first aspect and of the solutions in the following detailed description, which are not repeated here.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a schematic diagram of a data playing method according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a data playing method according to another embodiment of the present invention;
fig. 3 is a schematic diagram illustrating a data playing method according to another embodiment of the present invention;
fig. 4 is a schematic diagram of a data playing method according to yet another embodiment of the present invention;
fig. 5 is a flowchart illustrating a data playing method according to an embodiment of the present invention;
fig. 6 is a schematic diagram illustrating a mapping relationship between a playing speed and a delay factor according to an embodiment of the present invention;
fig. 7 is a schematic diagram of a data playing method according to another embodiment of the present invention;
fig. 8 is a schematic diagram illustrating a data playing method according to another embodiment of the present invention;
fig. 9 is a schematic structural diagram of a terminal device according to an embodiment of the present invention;
fig. 10 is a schematic structural diagram of a terminal device according to another embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
First, taking the DASH video transmission protocol as an example, the playing of live content by a player on a terminal device is described below:
referring to fig. 1, referring to the clock axis, if the current time is T1, the content of the live video is content a starting at time point T1. As shown in fig. 1, after the content a is processed, live data is generated and stored synchronously on the server. Because the space allocated by the server for storing the live broadcast data is limited, only the live broadcast data with the time length of B can be stored on the server, and the live broadcast data with the time length of B is always live broadcast content of the latest time period, namely, after a section of live broadcast data is self-synchronized to the server, the server can delete the section of live broadcast data after the time length of B is timed, namely, the section of live broadcast data only can keep the time length of B in the server.
In addition, to meet the terminal device's playback requirements and to cope with network instability, when the player starts it first buffers a small amount of live data from the server as play data, for example buffered data of duration C, typically several seconds to tens of seconds. The player actually starts playing at time T2, after the buffering of duration C ends. The played content is therefore slightly delayed (denoted δT) relative to the true live time point T1.
When the network is normal, the player's video playback speed equals the live speed, so δT stays constant, and the content the user sees lags the real content slightly (several seconds to tens of seconds, i.e. the buffering duration C). If the network suffers short-term interference, data transmission between the server and the terminal device may be interrupted or become too slow, and the bandwidth is no longer sufficient for normal playback. Under the DASH standard, the existing practice is for the terminal device to take measures such as switching the player to a lower resolution and loading lower-definition video. If, however, the bandwidth falls below the lower limit required even by the lowest-definition video, the player can only play the data already in the cache while waiting for the network to recover, and once the cached data is exhausted the player automatically pauses, as shown in fig. 2. After the player pauses, the time difference (δT) between the time (T2) of the video being played and the real live time (T1) increases.
When the network recovers, the terminal device continues to load data after T2 for the player and resumes playback at the same time, until all data is loaded or the player's buffer is full, as shown in fig. 3. T3 is the time position up to which data has been buffered. With the network normal, the terminal device plays the live video at the same speed at which the server generates live data, so the playback delay (δT) stays constant. If the network is disturbed many times while the live video plays, the network pause shown in fig. 2 occurs repeatedly, and the time difference (δT) between the video time played by the terminal device (T2) and the real live time (T1) is pulled larger and larger, until the situation shown in fig. 4 occurs.
Referring to fig. 4, as δT grows there is eventually a point at which the data following the cached data has already been deleted on the server. At that point the terminal device cannot request from the server the data the player needs to play. Under the DASH protocol, the player then reports the playback failure to the application and stops playing. The current common workaround is to enlarge the memory used for buffering, but this typically wastes a large amount of memory on the terminal device, and the video the user watches still lags further and further behind the real video.
To keep the delay between the data played on the terminal device and the live data on the server from growing, and to reduce or avoid live-playback failures, an embodiment of the present application provides a data playing method, as shown in fig. 5, comprising the following steps:
101. The terminal device loads, into a cache, live data sent by the server.
102. The terminal device reads the currently played data from the cache and plays it.
Specifically, for video data the terminal device first sends the data to a decoder for decoding, then renders and outputs the decoded video; for audio data it likewise first sends the data to a decoder for decoding, then renders and outputs the decoded audio.
103. The terminal device determines the delay between the currently played data and the live data from the current live time of the live data and the play time of the currently played data.
104. The terminal device adjusts the play speed of the currently played data according to the delay.
For video data, the play speed must be adjusted before the data is sent to the decoder: specifically, the terminal device modifies the timestamps of the currently played data before decoding, according to the delay, and thereby adjusts the play speed. For audio data, the play speed is adjusted after decoding: specifically, the terminal device time-stretches the decoded currently played data according to the delay, for example using the Waveform Similarity Overlap-Add (WSOLA) method. Of course, there are many other ways to modify the play speed, which are not detailed here.
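The pre-decode timestamp modification for video described above can be sketched as a simple rescale of presentation timestamps (PTS). This is an illustrative assumption about how such an adjustment might look, not the patent's concrete algorithm; the function name and the millisecond units are invented.

```python
# Illustrative sketch: speeding up video playback by rescaling
# presentation timestamps (PTS) before they reach the decoder.
# Units (milliseconds) and the function name are assumptions.

def rescale_timestamps(pts_ms, speed):
    """Compress PTS intervals so playback runs `speed` times faster.

    Dividing each interval from the first timestamp by `speed` makes
    every frame due earlier, so the renderer presents frames at
    `speed` times the normal rate.
    """
    if speed <= 0:
        raise ValueError("speed must be positive")
    base = pts_ms[0]
    return [base + (t - base) / speed for t in pts_ms]

# Frames originally 40 ms apart (25 fps) become 20 ms apart at 2x speed.
print(rescale_timestamps([0, 40, 80, 120], 2.0))  # -> [0.0, 20.0, 40.0, 60.0]
```

Audio cannot be handled the same way, since compressing its timeline raises the pitch; that is why the description applies time-stretching such as WSOLA after decoding instead.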
In addition, step 104 may be implemented as follows:
the first method is as follows: and the terminal equipment adjusts the playing speed of the current playing data according to the delay delta T and the buffering time length C. Before this, the terminal device needs to obtain the buffer duration C of the currently played data, which is determined by the attribute of the player application installed in the terminal device, and the length of the buffer duration C is default or user-adjustable. Specifically, the delay factor F may be calculated according to the formula F ═ δ T-C)/C, and the playing speed of the current playing data may be adjusted according to the delay factor F.
The relationship between the adjusted play speed V and the delay factor F can follow the mapping shown in fig. 6: when F ≤ 1, play at the original speed; when F > 1, speed up according to the delay factor F, with a larger F giving a faster play speed V, and the maximum speed capped by a user option Vmax.
For user experience, the adjusted play speed generally does not exceed 1.3 times the normal play speed. Various mapping functions can be defined based on empirical play-speed data; fig. 6 is merely an example.
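The delay-factor computation and a fig. 6 style speed mapping can be sketched as below. The formula F = (δT - C)/C comes from the description; the linear ramp between normal speed and Vmax is an invented example, since the text fixes only the endpoints (normal speed at F ≤ 1, a cap around 1.3x).

```python
# Sketch of mode one: delay factor F = (dT - C) / C, then a speed map
# in the style of fig. 6. The linear ramp's slope is an assumption;
# only the endpoints (1.0 at F <= 1, cap at v_max) come from the text.

def delay_factor(delta_t, buffer_c):
    """F = (dT - C) / C, with dT the delay and C the buffering duration."""
    return (delta_t - buffer_c) / buffer_c

def play_speed(f, v_max=1.3):
    """Map delay factor F to play speed V: normal speed at F <= 1,
    then a speed growing with F, capped at v_max for user experience."""
    if f <= 1.0:
        return 1.0
    return min(1.0 + 0.1 * (f - 1.0), v_max)

# A 30 s delay against a 10 s buffer gives F = 2, so play slightly faster.
f = delay_factor(30, 10)
print(f, play_speed(f))  # -> 2.0 1.1
```

Any monotone mapping with these endpoints would serve; the 0.1 slope here merely makes the speed-up gentle near F = 1 while still reaching the cap for large delays.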
After the play speed is adjusted, because the terminal device plays faster than normal, the delay δT gradually shrinks back into an acceptable range after a period of time; the delay factor F then returns to 1, the play speed returns to normal, and the state shown in fig. 7 is finally reached.
Mode two: the terminal device adjusts the play speed of the currently played data according to the delay, the buffering duration, and the retention duration, with the play speed satisfying δT - C ≤ B, where δT is the delay, C is the buffering duration, and B is the retention duration. Beforehand, the terminal device obtains the buffering duration C of the currently played data; C is determined by a property of the player application installed on the terminal device, and its length is a default or user-adjustable. The terminal device also needs to determine, from the data transmission protocol the server uses, how long the server retains live data; an example is the retention duration specified by the DASH protocol.
Referring to fig. 8, as long as δT - C is not greater than B, the data buffered in the cache is guaranteed to connect to data not yet deleted on the server. After the speed adjustment, the situation where the cached data cannot reach data already deleted on the server therefore never arises, avoiding delay-induced live-playback failure.
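Mode two's constraint δT - C ≤ B amounts to a simple check that the player never needs data the server has already deleted. A minimal sketch, with invented function names:

```python
# Sketch of mode two's safety condition: playback must satisfy
# dT - C <= B, where dT is the delay, C the buffering duration, and
# B the server's retention duration. Function names are invented.

def delay_within_retention(delta_t, buffer_c, retention_b):
    """True iff dT - C <= B, i.e. the oldest segment the player still
    needs has not yet been deleted from the server."""
    return delta_t - buffer_c <= retention_b

# With a 60 s retention window and a 10 s buffer, a 70 s delay is the
# largest delay that is still safe; one more second and playback fails.
print(delay_within_retention(70, 10, 60))  # -> True
print(delay_within_retention(71, 10, 60))  # -> False
```

In practice the player would evaluate this condition while choosing the play speed, speeding up before the inequality is in danger of being violated.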
In the above scheme, during playback the terminal device can determine the delay between the currently played data and the live data from the current live time of the live data and the play time of the currently played data, and then adjust the play speed accordingly, for example speeding up when the delay is large and keeping normal speed when it is small. This keeps the delay between the data played on the terminal device and the server's live data from growing, reducing or avoiding live-playback failures.
An embodiment of the present invention provides a terminal device for implementing the data playing method described above. Referring to fig. 9, it includes:
the loading module 91 is configured to load live data sent by the server in the cache;
the playing module 92 is configured to read currently played data from the cache and play the currently played data;
a processing module 93, configured to determine, according to a current live time of the live data and a play time of the current play data, a delay between the current play data and the live data;
and the playing module 92 is configured to adjust a playing speed of the currently played data according to the delay determined by the processing module 93.
In an exemplary scheme, the processing module 93 is further configured to obtain the buffering duration of the currently played data; the playing module 92 is specifically configured to adjust the play speed of the currently played data according to the delay determined by the processing module 93 and the buffering duration.
In an exemplary scheme, the playing module 92 is specifically configured to calculate a delay factor F by the formula F = (δT - C)/C, where δT is the delay and C is the buffering duration, and to adjust the play speed of the currently played data according to the delay factor.
In an exemplary scheme, the processing module 93 is further configured to determine, from the data transmission protocol the server uses, how long the server retains the live data; the processing module 93 is further configured to obtain the buffering duration of the currently played data; and the playing module 92 is specifically configured to adjust the play speed of the currently played data according to the delay, the buffering duration, and the retention duration, with the play speed satisfying δT - C ≤ B, where δT is the delay, C is the buffering duration, and B is the retention duration.
In one exemplary arrangement, the currently playing data includes video data; the processing module 93 is specifically configured to modify the timestamp of the current playing data according to the delay.
In one exemplary arrangement, the currently playing data includes audio data; the processing module 93 is specifically configured to perform time stretching processing on the decoded current playing data according to the delay.
All relevant contents of each step related to the above method embodiment may be referred to the functional description of the corresponding functional module, and the function thereof is not described herein again.
In the case of an integrated module, the terminal device comprises a storage unit, a processing unit, and an interface unit. The processing unit controls and manages the actions of the terminal device; for example, it supports the terminal device in performing steps 101-104 in fig. 5. The interface unit supports the terminal device in acquiring live data from the server. The storage unit stores the program code and data of the terminal device.
For example, the processing unit is a processor, the storage unit is a memory, and the interface unit is a communication interface. The terminal device shown in fig. 10 includes a communication interface 1001, a processor 1002, a memory 1003, and a bus 1004, and the communication interface 1001 and the processor 1002 are connected to the memory 1003 via the bus 1004.
The processor 1002 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to control execution of the programs of the present disclosure.
The memory 1003 may be, but is not limited to, a read-only memory (ROM) or another type of static storage device that can store static information and instructions, a random access memory (RAM) or another type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact discs, laser discs, digital versatile discs, Blu-ray discs, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory may be self-contained and coupled to the processor via the bus. The memory may also be integrated with the processor.
The memory 1003 is used for storing application program codes for executing the scheme of the application, and the processor 1002 controls the execution. The communication interface 1001 is used for receiving content input by an external device, such as live data sent by a server. The processor 1002 is configured to execute the application program code stored in the memory 1003, thereby implementing the methods described in the embodiments of the present application.
Further, a computer storage medium (or media) is provided, comprising instructions which, when executed, perform the operations of the method in the above embodiments.
Additionally, a computer program product is provided, comprising the above computer storage medium (or media).
It should be understood that, in various embodiments of the present invention, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation on the implementation process of the embodiments of the present invention.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the units is only one logical functional division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (4)

1. A data playback method, comprising:
the terminal equipment loads live broadcast data sent by the server in a cache;
the terminal equipment reads and plays the current playing data in the cache;
the terminal equipment determines the delay between the current playing data and the live data according to the current live time of the live data and the playing time of the current playing data;
the terminal equipment determines the retention duration of the server on the live broadcast data according to a data transmission protocol used by the server;
the terminal equipment acquires the buffer duration of the current playing data;
and the terminal equipment adjusts the playing speed of the current playing data according to the delay, the buffering duration and the retention duration, wherein the playing speed satisfies ΔT − C ≤ B, where ΔT is the delay, C is the buffering duration, and B is the retention duration.
2. The data playback method of claim 1, wherein the currently played data includes video data;
wherein the terminal equipment adjusting the playing speed of the current playing data according to the delay comprises:
the terminal equipment modifying the timestamp of the current playing data before decoding according to the delay.
3. The data playback method of claim 1, wherein the currently played data includes audio data;
wherein the terminal equipment adjusting the playing speed of the current playing data according to the delay comprises:
the terminal equipment performing time-stretching processing on the decoded current playing data according to the delay.
4. A terminal device, comprising:
the loading module is used for loading the live broadcast data sent by the server in the cache;
the playing module is used for reading and playing the current playing data in the cache;
the processing module is used for determining the delay between the current playing data and the live data according to the current live time of the live data and the playing time of the current playing data;
the processing module is further used for determining the retention time of the server on the live broadcast data according to a data transmission protocol used by the server;
the processing module is further configured to obtain a buffer duration of the current playing data;
the playing module is configured to adjust a playing speed of the current playing data according to the delay, the buffering duration and the retention duration determined by the processing module, wherein the playing speed satisfies ΔT − C ≤ B, where ΔT is the delay, C is the buffering duration, and B is the retention duration.
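Purely as an illustration of the claimed mechanism (not part of the claims themselves), the claim-1 condition ΔT − C ≤ B and the claim-2 style pre-decode timestamp adjustment can be sketched as below. The function names, the catch-up rate of 1.05×, and the timestamp units are assumptions made for this sketch, not details taken from the patent:

```python
def choose_playback_speed(delay_s: float, buffer_s: float, retention_s: float,
                          speed_up: float = 1.05) -> float:
    """Return a playback speed so that the claim-1 condition holds.

    delay_s     -- lag between live time and current play time (ΔT)
    buffer_s    -- duration of data already buffered on the terminal (C)
    retention_s -- how long the server retains live data, per its protocol (B)
    speed_up    -- illustrative faster-than-real-time rate used to catch up
    """
    if delay_s - buffer_s > retention_s:
        # The unbuffered part of the lag exceeds what the server still holds,
        # so play faster than real time to catch up before data expires.
        return speed_up
    # Condition ΔT − C ≤ B already satisfied: play at normal speed.
    return 1.0


def rescale_timestamps(pts_list, speed):
    """Scale presentation timestamps before decoding (claim-2 style, video).

    Compressing each timestamp interval by the speed factor makes the
    renderer present frames sooner, effectively speeding up playback.
    """
    base = pts_list[0]
    return [base + (p - base) / speed for p in pts_list]


# 12 s behind live, 2 s buffered, server retains 8 s -> must speed up.
print(choose_playback_speed(12.0, 2.0, 8.0))          # 1.05
print(choose_playback_speed(5.0, 2.0, 8.0))           # 1.0
print(rescale_timestamps([0, 40, 80], 2.0))           # [0.0, 20.0, 40.0]
```

For the audio path (claim 3), the same delay would instead drive a time-stretching step on decoded samples, so that pitch is preserved while the audio plays faster; that processing is omitted here because it requires a dedicated DSP routine.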
CN201810215225.7A 2018-03-15 2018-03-15 Data playing method and terminal equipment Active CN108235136B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810215225.7A CN108235136B (en) 2018-03-15 2018-03-15 Data playing method and terminal equipment

Publications (2)

Publication Number Publication Date
CN108235136A CN108235136A (en) 2018-06-29
CN108235136B (en) 2021-06-01

Family

ID=62659451

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810215225.7A Active CN108235136B (en) 2018-03-15 2018-03-15 Data playing method and terminal equipment

Country Status (1)

Country Link
CN (1) CN108235136B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109640160B (en) * 2018-12-28 2022-01-11 鸿视线科技(北京)有限公司 Sustainable synchronous control system and method based on buffer control structure
WO2021009255A1 (en) * 2019-07-17 2021-01-21 Dolby International Ab Latency management for content delivery
CN112752109B (en) 2019-10-30 2022-05-17 上海哔哩哔哩科技有限公司 Video playing control method and system
CN111601152A (en) * 2020-05-11 2020-08-28 青岛海信传媒网络技术有限公司 Live broadcast processing method and display device
CN111583952B (en) * 2020-05-19 2024-05-07 北京达佳互联信息技术有限公司 Audio processing method, device, electronic equipment and storage medium
CN112019927B (en) * 2020-09-23 2023-01-06 Oppo广东移动通信有限公司 Video live broadcast method, microphone connecting equipment, live broadcast system and storage medium
CN114885209B (en) * 2022-04-08 2023-06-16 车智互联(北京)科技有限公司 Live broadcast data processing method, computing device and readable storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102387401A (en) * 2011-10-17 2012-03-21 上海聚力传媒技术有限公司 Method, device and equipment for adjusting play time point of audio and video information
CN104079955A (en) * 2013-03-26 2014-10-01 华为技术有限公司 Method, device and system of OTT (Over The Top) live broadcast
WO2016125655A1 (en) * 2015-02-03 2016-08-11 日立マクセル株式会社 Broadcast receiver, broadcast receiving method and content output method
CN107396171A (en) * 2017-07-24 2017-11-24 广州酷狗计算机科技有限公司 Live network broadcast method, device and storage medium

Also Published As

Publication number Publication date
CN108235136A (en) 2018-06-29

Similar Documents

Publication Publication Date Title
CN108235136B (en) Data playing method and terminal equipment
US7630612B2 (en) Video stream adaptive frame rate scheme
US20230336600A1 (en) Method for broadcasting streaming contents in a peer-to-peer network
US9781183B2 (en) Accelerated playback of streaming media
US8782276B2 (en) Method and system for selecting a delivery method for media on demand
CN110022498B (en) Method and device for realizing code rate switching
JP2015511783A (en) Improved DASH client and receiver with playback rate selection
JP2015515173A (en) Control of HTTP streaming between source and receiver via multiple TCP connections
US11496536B2 (en) Method of requesting video, computing device, and computer-program product
CN105072480A (en) Video live broadcast method and device
JP2016059037A (en) Method and client terminal for receiving multimedia content split into at least two successive segments, and corresponding computer program product and computer readable medium
CN106851337B (en) Video buffering control method and device
US20160309210A1 (en) Video playback method, apparatus and intelligent terminal
KR102521753B1 (en) A method for playing content streamed over a network in a player on a client device
CN114040245A (en) Video playing method and device, computer storage medium and electronic equipment
US10965607B2 (en) Arbitration of competing flows
US20190387271A1 (en) Image processing apparatus, image processing method, and program
US20080285937A1 (en) Video/audio display system and method
CN114416013A (en) Data transmission method, data transmission device, electronic equipment and computer-readable storage medium
US11546675B2 (en) Methods, systems, and media for streaming video content using adaptive buffers
JP2016126037A (en) Signal processing device, signal processing method, and program
JP2022039452A (en) Reception terminal, distribution server, reception method and reception program
US20230421821A1 (en) Method for Managing Playback of Multimedia Content
US10938939B2 (en) Client-side quality-of-service (QOS) for viewing of adaptive bitrate (ABR) streams
CN118337630A (en) Bandwidth allocation method, device, computer equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant