CN109348251B - Method and device for video playing, computer readable medium and electronic equipment - Google Patents

Method and device for video playing, computer readable medium and electronic equipment

Info

Publication number
CN109348251B
CN109348251B (application CN201811166174.XA)
Authority
CN
China
Prior art keywords
video
file
target
frame
timestamp
Prior art date
Legal status
Active
Application number
CN201811166174.XA
Other languages
Chinese (zh)
Other versions
CN109348251A (en)
Inventor
翁名为
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN201811166174.XA
Publication of CN109348251A
Application granted
Publication of CN109348251B
Legal status: Active
Anticipated expiration

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/44008Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N21/23418Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/234309Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by transcoding between formats or standards, e.g. from MPEG-2 to MPEG-4 or from Quicktime to Realvideo
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/235Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/238Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
    • H04N21/2387Stream processing in response to a playback request from an end-user, e.g. for trick-play
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/239Interfacing the upstream path of the transmission network, e.g. prioritizing client content requests
    • H04N21/2393Interfacing the upstream path of the transmission network, e.g. prioritizing client content requests involving handling client requests
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/435Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440218Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by transcoding between formats or standards, e.g. from MPEG-2 to MPEG-4
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440281Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the temporal resolution, e.g. by frame skipping
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47217End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/845Structuring of content, e.g. decomposing content into time segments
    • H04N21/8456Structuring of content, e.g. decomposing content into time segments by decomposing the content in the time domain, e.g. in time segments
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85Assembly of content; Generation of multimedia applications
    • H04N21/854Content authoring
    • H04N21/8547Content authoring involving timestamps for synchronizing content

Abstract

The embodiment of the invention provides a method and a device for video playing, a computer readable medium and electronic equipment. The method comprises the following steps: acquiring current dragging position information of a video file; determining a target video clip of the video file according to the current dragging position information; and searching, according to the index file of the video file, a target instantaneous decoding refresh frame which is closest to the current dragging position in the target video segment, wherein the index file includes the timestamps of the instantaneous decoding refresh frames within each video segment in the video file. According to the technical scheme of the embodiment of the invention, the target instantaneous decoding refresh frame that is closest to, and precedes, the current dragging position in the target video segment can be found through the index file including the timestamps of the instantaneous decoding refresh frames in each video segment in the video file, so that the positioning precision is improved and the video watching experience is optimized.

Description

Method and device for video playing, computer readable medium and electronic equipment
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and an apparatus for video playing, a computer-readable medium, and an electronic device.
Background
Commonly used streaming media protocols mainly include HTTP (HyperText Transfer Protocol) progressive download and real-time streaming protocols based on RTSP (Real-Time Streaming Protocol)/RTP (Real-time Transport Protocol); at present, the HTTP progressive download method is the more convenient and therefore preferred approach. Among HTTP progressive download schemes, Apple's HLS (HTTP Live Streaming) is representative. It is mainly used for audio and video services on PCs (Personal Computers) and Apple terminals, and consists of an m3u8 index file, TS (Transport Stream, a sub-protocol of the MPEG-2 (Moving Picture Experts Group 2) standard) media slice files, and a key encryption string file.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present invention and therefore may include information that does not constitute prior art known to a person of ordinary skill in the art.
Disclosure of Invention
The embodiment of the invention provides a method and a device for video playing, a computer readable medium and electronic equipment, which can improve the positioning precision of an instant decoding refresh frame.
According to an aspect of an embodiment of the present invention, there is provided a method for video playback, including: acquiring current dragging position information of a video file; determining a target video clip of the video file according to the current dragging position information; searching a target instant decoding refresh frame which is closest to the current dragging position in the target video segment according to the index file of the video file; wherein the index file includes timestamps for the instant decode refresh frames within each video segment in the video file.
In an exemplary embodiment of the present disclosure, searching for a target instant decoding refresh frame closest to a current dragging position in the target video segment according to an index file of the video file includes: obtaining a timestamp of the current drag position within the target video segment; and determining the target instant decoding refresh frame according to the time stamp of the instant decoding refresh frame in the target video segment and the time stamp of the current dragging position in the target video segment.
In an exemplary embodiment of the present disclosure, the index file further includes offset positions of the instantaneous decoding refresh frames within each video segment in the video file; wherein the method further comprises: acquiring the offset position of the target instant decoding refresh frame according to the index file; sending a video continuous playing request according to the offset position of the target instant decoding refresh frame; and receiving current video data returned in response to the video continuous playing request.
In an exemplary embodiment of the present disclosure, further comprising: decoding the current video data; and playing the decoded current video data.
In an exemplary embodiment of the present disclosure, further comprising: traversing video frames in each video segment in the video file, and acquiring the frame type, the timestamp and the offset position of each video frame, wherein the frame type comprises an instant decoding refresh frame; and storing the acquired time stamp and offset position of the instant decoding refresh frame in each video segment in the video file into the index file.
In an exemplary embodiment of the present disclosure, the index file further includes a standard tag for each video clip; storing the time stamp and the offset position of the instant decoding refresh frame in each video segment in the video file to the index file, including: inserting a private label in front of a standard label of each video clip in the index file; wherein the private label includes a timestamp and an offset position of an instantaneous decode refresh frame within a corresponding video segment.
In an exemplary embodiment of the present disclosure, the standard tag includes a duration of the video clip.
In an exemplary embodiment of the present disclosure, further comprising: sending a playing request of the video file; receiving an index file returned in response to the playing request; and analyzing the index file to obtain the time stamp and the offset position of the instant decoding refresh frame in each video segment in the video file.
According to an aspect of the present disclosure, there is provided an apparatus for video playback, including: a dragging information acquisition module configured to acquire current dragging position information of a video file; a video clip determining module configured to determine a target video clip of the video file according to the current dragging position information; and a target frame searching module configured to search, according to the index file of the video file, a target instant decoding refresh frame which is closest to the current dragging position in the target video segment; wherein the index file includes timestamps of the instant decoding refresh frames within each video segment in the video file.
In an exemplary embodiment of the present disclosure, the target frame lookup module includes: a drag timestamp obtaining unit configured to obtain a timestamp of the current drag position within the target video segment; and the target frame determining unit is configured to determine the target instant decoding refresh frame according to the timestamp of the instant decoding refresh frame in the target video segment and the timestamp of the current dragging position in the target video segment.
In an exemplary embodiment of the present disclosure, the index file further includes offset positions of the instantaneous decoding refresh frames within each video segment in the video file; wherein the apparatus further comprises: the offset position acquisition module is configured to acquire the offset position of the target instant decoding refresh frame according to the index file; the playing request sending module is configured to send a video continuous playing request according to the offset position of the target instant decoding refresh frame; and the video data receiving module is configured to receive the current video data returned in response to the video continuous playing request.
In an exemplary embodiment of the present disclosure, further comprising: a decoding module configured to decode the current video data; and the playing module is configured to play the decoded current video data.
In an exemplary embodiment of the present disclosure, further comprising: the traversal module is configured to traverse video frames in each video segment in the video file and acquire a frame type, a timestamp and an offset position of each video frame, wherein the frame type comprises an instant decoding refresh frame; and the frame information storage module is configured to store the time stamp and the offset position of the instant decoding refresh frame in each video segment in the acquired video file into the index file.
According to an aspect of embodiments of the present invention, there is provided a computer readable medium, on which a computer program is stored, which when executed by a processor, implements the method for video playback as described in the above embodiments.
According to an aspect of an embodiment of the present invention, there is provided an electronic apparatus including: one or more processors; a storage device configured to store one or more programs that, when executed by the one or more processors, cause the one or more processors to implement the method for video playback as described in the embodiments above.
In the technical solutions provided in some embodiments of the present invention, the target instant decoding refresh frame that is closest to, and precedes, the current dragging position in the target video segment is found through the index file including the timestamps of the instant decoding refresh frames within each video segment in the video file, so that the positioning accuracy is improved and the video viewing experience of the user can be optimized.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention. It is obvious that the drawings in the following description are only some embodiments of the invention, and that for a person skilled in the art, other drawings can be derived from them without inventive effort. In the drawings:
fig. 1 shows a schematic diagram of an exemplary system architecture of a method for video playback or an apparatus for video playback to which an embodiment of the invention may be applied;
FIG. 2 illustrates a schematic structural diagram of a computer system suitable for use with the electronic device to implement an embodiment of the invention;
fig. 3 is a schematic diagram illustrating a method for video playback in the related art;
FIG. 4 schematically illustrates a flow diagram of a method for video playback according to an embodiment of the present invention;
FIG. 5 is a diagram illustrating a processing procedure of step S430 shown in FIG. 4 in one embodiment;
FIG. 6 schematically shows a flow diagram of a method for video playback according to a further embodiment of the invention;
FIG. 7 schematically illustrates a flow diagram of a method for video playback according to yet another embodiment of the present invention;
FIG. 8 schematically illustrates a flow diagram of a method for video playback according to yet another embodiment of the present invention;
FIG. 9 schematically shows a block diagram of an apparatus for video playback according to an embodiment of the present invention;
FIG. 10 is a schematic interface diagram illustrating a user dragging a progress bar during video playing;
FIG. 11 is a schematic diagram of an interface for locating a target instantaneous decoding refresh frame using the method of FIG. 3;
fig. 12 is a schematic diagram of an interface for locating a target instantaneous decoding refresh frame by the method shown in fig. 4.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to provide a thorough understanding of embodiments of the invention. One skilled in the relevant art will recognize, however, that the invention may be practiced without one or more of the specific details, or with other methods, components, devices, steps, and so forth. In other instances, well-known methods, devices, implementations or operations have not been shown or described in detail to avoid obscuring aspects of the invention.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. I.e. these functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor means and/or microcontroller means.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all of the contents and operations/steps, nor do they necessarily have to be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
Fig. 1 shows a schematic diagram of an exemplary system architecture 100 of a method for video playback or an apparatus for video playback to which embodiments of the present invention may be applied.
As shown in fig. 1, the system architecture 100 may include one or more of terminal devices 101, a network 104, and a server 105. The network 104 is used to provide a medium for communication links between the terminal devices 101 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation. For example, server 105 may be a server cluster comprised of multiple servers, or the like.
A user may use terminal device 101 to interact with server 105 over network 104 to receive or send messages or the like. The terminal device 101 may be a variety of electronic devices having a display screen including, but not limited to, smart phones, tablets, portable and desktop computers, digital cinema projectors, and the like. Various clients, such as a video playing client, can be installed on the terminal device 101.
The server 105 may be a server that provides various services. For example, the user sends a video play request to the server 105 using the terminal apparatus 101. The server 105 may retrieve the matched search result from the database based on the related information carried in the video playing request, and feed back the search result, for example, the corresponding video information to the terminal device 101, so that the user may view the corresponding video based on the content displayed on the terminal device 101.
FIG. 2 illustrates a schematic structural diagram of a computer system suitable for use with the electronic device to implement an embodiment of the invention.
It should be noted that the computer system 200 of the electronic device shown in fig. 2 is only an example, and should not bring any limitation to the functions and the scope of the application of the embodiment of the present invention.
As shown in fig. 2, the computer system 200 includes a Central Processing Unit (CPU)201 that can perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM)202 or a program loaded from a storage section 208 into a Random Access Memory (RAM) 203. In the RAM 203, various programs and data necessary for system operation are also stored. The CPU 201, ROM 202, and RAM 203 are connected to each other via a bus 204. An input/output (I/O) interface 205 is also connected to bus 204.
The following components are connected to the I/O interface 205: an input portion 206 including a keyboard, a mouse, and the like; an output section 207 including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage section 208 including a hard disk and the like; and a communication section 209 including a network interface card such as a LAN card, a modem, or the like. The communication section 209 performs communication processing via a network such as the internet. A drive 210 is also connected to the I/O interface 205 as needed. A removable medium 211 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 210 as necessary, so that a computer program read out therefrom is mounted into the storage section 208 as necessary.
In particular, according to an embodiment of the present invention, the processes described below with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the invention include a computer program product comprising a computer program embodied on a computer-readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 209 and/or installed from the removable medium 211. The computer program, when executed by a Central Processing Unit (CPU)201, performs various functions defined in the methods and/or apparatus of the present application.
It should be noted that the computer readable medium shown in the present invention can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present invention, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present invention, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of methods, apparatus, and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules and/or units described in the embodiments of the present invention may be implemented by software, or may be implemented by hardware, and the described modules and/or units may also be disposed in a processor. Wherein the names of such modules and/or units do not in some way constitute a limitation on the modules and/or units themselves.
As another aspect, the present application also provides a computer-readable medium, which may be contained in the electronic device described in the above embodiments; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by an electronic device, cause the electronic device to implement the method as described in the embodiments below. For example, the electronic device may implement the steps shown in fig. 4, 5, 6, 7, or 8.
Fig. 3 is a schematic diagram illustrating a method for video playback in the related art.
In the related art, the position of the user's SEEK (in the embodiment of the present invention, SEEK refers to the operation of dragging the player's progress bar to the target position) is first obtained, and the TS fragment corresponding to the SEEK position is determined. The IDR frame at the beginning of that TS fragment is then obtained directly, and the TS fragment is decoded and played.
As shown in fig. 3, it is assumed here that a video file includes 3 video clips: 1.ts, 2.ts, and 3.ts, where 1.ts includes two IDR (Instantaneous Decoding Refresh) frames, 2.ts includes three IDR frames, and 3.ts includes two IDR frames; the duration of each of the 3 video clips is 10 seconds, and the current drag (SEEK) position is 15 seconds.
Assume at this time that the contents of the m3u8 index file of the video file are as shown in table 1 below:
TABLE 1 (the standard m3u8 index file for this example; reproduced as an image in the original publication)
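Although Table 1 itself is only available as an image, a standard m3u8 playlist describing three 10-second TS fragments generally takes the following form; the slice file names and the #EXT-X-VERSION value are illustrative assumptions rather than the exact contents of Table 1:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXTINF:10.0,
1.ts
#EXTINF:10.0,
2.ts
#EXTINF:10.0,
3.ts
#EXT-X-ENDLIST
```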
In the embodiment of fig. 3, the playing position after the SEEK is the initial position of the 2.ts file, that is, the 10-second position of the video file, but the actual SEEK position of the user is 15 seconds, so there is an error of 5 seconds between the located IDR frame and the actual SEEK position. That is, with the scheme in the related art, a SEEK can only reach the beginning of each TS slice; a SEEK within a TS slice cannot be realized, and the deviation between the actual playing position and the SEEK position is large.
Fig. 4 schematically shows a flow chart of a method for video playback according to an embodiment of the invention. The method steps of the embodiment of the present invention may be executed by the terminal device, by the server, or by the terminal device and the server interactively, for example by the server 105 and/or the terminal device 101 in fig. 1 described above, but the present invention is not limited thereto.
As shown in fig. 4, the method for playing video according to the embodiment of the present invention may include the following steps.
In step S410, current drag (SEEK, which means that a user drags a video playing progress bar to a designated position or a target position to play) position information of a video file is obtained.
In the embodiment of the present invention, the format of the video file may be an HLS format, but the present invention is not limited to this, and may also be any other suitable video format.
In step S420, a target video segment (TS segment) of the video file is determined according to the current drag position information.
In step S430, a target Instantaneous Decoding Refresh (IDR) frame closest to the current dragging position in the target video segment is searched for according to the index file of the video file.
Wherein the index file may include timestamps of IDR frames within video segments in the video file.
In the embodiment of the present invention, the video file may include I frames (intra-coded frames), P frames (forward predictive coded frames), and B frames (bidirectional predictive interpolation coded frames).
An I frame is an independent frame that carries complete picture information and can be decoded without reference to other images; the first frame in a video sequence is always an I frame. Both I frames and IDR frames use intra prediction. For convenience during encoding and decoding, the first I frame of a sequence needs to be distinguished from other I frames, and this first I frame is called an IDR frame, which makes it easier to control the encoding and decoding processes.
The purpose of an IDR frame is to refresh the decoder immediately so that errors do not propagate: starting from an IDR frame, a new sequence is started and encoding begins afresh. Ordinary I frames do not by themselves provide random access; that capability is carried by IDR frames. An IDR frame causes the DPB (Decoded Picture Buffer, the reference frame list) to be emptied, whereas an ordinary I frame does not. No frame following an IDR frame may reference any frame preceding it; in contrast, for an ordinary I frame, the B and P frames that follow it may still reference frames preceding that I frame. In a randomly accessed video stream, the player can therefore always start playing from an IDR frame, because no subsequent frame references anything before it. In a video without IDR frames, however, playback cannot start from an arbitrary point, because later frames will always reference earlier frames.
In the embodiment of the invention, the SEEK therefore needs to be positioned at an IDR frame for decoding and playing.
According to the method for video playing provided by the embodiment of the invention, the target IDR frame that is closest to, and precedes, the current dragging position in the target video segment is searched through the index file including the timestamps of the IDR frames within each video segment in the video file, so that the positioning precision is improved and the video watching experience is optimized.
Fig. 5 is a schematic diagram illustrating a processing procedure of step S430 shown in fig. 4 in an embodiment. The method steps of the embodiment of the present invention may be executed by the terminal device, the server, or the terminal device and the server interactively, for example, the server 105 or the terminal device 101 in fig. 1 may be executed, but the present invention is not limited thereto.
As shown in fig. 5, step S430 in the above-mentioned embodiment shown in fig. 4 may further include the following steps.
In step S431, a timestamp of the current drag position within the target video segment is obtained.
In step S432, the target IDR frame is determined according to the timestamp of the IDR frame in the target video segment and the timestamp of the current drag position in the target video segment.
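For illustration only, steps S431 and S432 can be sketched as follows, assuming the per-segment IDR information has already been parsed from the index file into a list of (timestamp, offset) pairs; the function and variable names are illustrative and do not form part of the claimed solution:

```python
def find_target_idr(idr_list, seek_ts_in_segment):
    """idr_list: [(timestamp_seconds, byte_offset), ...] for the target video
    segment, sorted by timestamp. Returns the last IDR frame whose timestamp
    does not exceed the drag position's timestamp within the segment (step S432)."""
    target = None
    for ts, offset in idr_list:
        if ts <= seek_ts_in_segment:
            target = (ts, offset)
        else:
            break
    return target  # None if no IDR frame precedes the drag position in this segment
```

Using the 2.ts values discussed later in the description, a drag position of 5 seconds within the segment would select the IDR frame at 4.6 seconds.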
Fig. 6 schematically shows a flow chart of a method for video playback according to a further embodiment of the present invention. The method steps of the embodiment of the present invention may be executed by the terminal device, the server, or the terminal device and the server interactively, for example, the server 105 or the terminal device 101 in fig. 1 may be executed, but the present invention is not limited thereto.
In this embodiment of the present invention, the index file may further include offset positions of IDR frames in each video segment in the video file.
As shown in fig. 6, the embodiment of the present invention is different from the embodiment shown in fig. 4 in that the following steps may be further included.
In step S610, an offset position of the target IDR frame is obtained according to the index file.
In step S620, a video playback continuation request is sent according to the offset position of the target IDR frame.
In step S630, current video data returned in response to the video resume play request is received.
With continued reference to fig. 6, the method may further include the following steps.
In step S640, the current video data is decoded.
In step S650, the decoded current video data is played.
Fig. 7 schematically shows a flow chart of a method for video playback according to a further embodiment of the present invention. The method steps of the embodiment of the present invention may be executed by the terminal device, the server, or the terminal device and the server interactively, for example, the server 105 or the terminal device 101 in fig. 1 may be executed, but the present invention is not limited thereto.
As shown in fig. 7, compared with the above embodiments shown in fig. 4 and/or fig. 6, the embodiment of the present invention may further include the following steps.
In step S710, video frames in each video segment in the video file are traversed, and a frame type, a timestamp, and an offset position of each video frame are obtained, where the frame type may include an instantaneous decoding refresh frame.
For example, an FFmpeg (Fast Forward MPEG) tool or the like may be used to traverse the video frames in each video segment in the video file, obtain the frame type, the timestamp, and the offset position of each video frame, and record the timestamp and offset position of each IDR frame.
FFmpeg is a set of open-source computer programs that can be used to record and convert digital audio and video and to turn them into streams; it provides a complete solution for recording, converting, and streaming audio and video.
In the embodiment of the present invention, the frame types of the video frame may include an I frame, a P frame, and a B frame, where an IDR frame belongs to one of the I frames.
In step S720, the time stamp and the offset position of the instant decoding refresh frame in each video segment in the obtained video file are stored in the index file.
In the embodiment of the present invention, the index file may be an m3u8 file, but the present invention is not limited to this. When the format of the video file is changed, the index file can be changed correspondingly.
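As one possible way to perform the traversal of step S710, the FFmpeg suite's ffprobe utility can list per-frame information; the sketch below is illustrative, and the exact field names (for example pts_time versus pkt_pts_time) vary between FFmpeg versions:

```python
import json
import subprocess

def list_idr_frames(ts_path):
    """List (timestamp_seconds, byte_offset) for the IDR/key frames of one TS
    slice using ffprobe (step S710). The offsets reported by ffprobe are relative
    to the slice file; if, as in the description, offsets relative to the whole
    video file are required, they must be rebased accordingly."""
    cmd = [
        "ffprobe", "-v", "quiet", "-select_streams", "v:0", "-show_frames",
        "-show_entries", "frame=key_frame,pict_type,pts_time,pkt_pos",
        "-of", "json", ts_path,
    ]
    out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
    frames = json.loads(out).get("frames", [])
    return [
        (float(f["pts_time"]), int(f["pkt_pos"]))
        for f in frames
        if int(f.get("key_frame", 0)) == 1 and f.get("pict_type") == "I"
    ]
```

Depending on how the slices were produced, the PTS of the first frame may need to be subtracted so that the recorded timestamps are relative to the start of each slice, as in the private labels of the example below.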
In an exemplary embodiment, the index file may further include standard tags for each video clip.
In an exemplary embodiment, storing the time stamps and offset positions of the IDR frames within the video segments in the video file to the index file may include: inserting a private label in front of a standard label of each video clip in the index file; wherein the private label includes a timestamp and an offset position of an IDR frame within a corresponding video segment.
In an exemplary embodiment, the standard tag may include a duration of a video clip.
This is illustrated by a practical example. Assume that a private label is inserted before each TS slice based on the standard m3u8, as shown in table 2 below.
TABLE 2 (the m3u8 index file with a private label inserted before each TS slice; reproduced as an image in the original publication)
In Table 2 above, #EXTM3U, #EXT-X-TARGETDURATION:10, #EXTINF:10.0, etc. are standard tags, and the value 10.0 after #EXTINF indicates that the duration of the corresponding TS fragment is 10 s.
Here, #QQ-HLS-PRIVATE is the identification of the private label (TAG); it is followed by entries of the form "IDR frame timestamp|offset position", and the entries for different IDR frames are separated by ",".
For example, #QQ-HLS-PRIVATE:0.0|376,3.0|368940,7.0|892060 indicates that the TS slice 1.ts includes three IDR frames: the timestamp of the first IDR frame is 0.0 seconds and its offset position is 376 bytes; the timestamp of the second IDR frame is 3.0 seconds and its offset is 368940 bytes; and the timestamp of the third IDR frame is 7.0 seconds and its offset is 892060 bytes.
In the above example, the time stamp of each IDR frame refers to a relative time in each TS slice, but in other embodiments, the time stamp of each IDR frame may be recorded using an absolute time, which is not limited in the present invention.
In the embodiment of the present invention, the offset position of the IDR frame refers to an offset byte with respect to the initial position of the entire video file.
It should be noted that, in the above example, the index file is obtained by modifying an existing m3u8 file of a video file, but in other embodiments, the time stamp and the offset position of the IDR frame in each video clip in the video file may be separately recorded in a file different from m3u8, which is not limited by the present invention.
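A sketch of how such private labels might be written into the index file, following the format illustrated by Table 2; the #QQ-HLS-PRIVATE identifier and the 1.ts values come from the example above, while the helper itself is illustrative:

```python
def insert_private_labels(m3u8_lines, idr_info):
    """m3u8_lines: the lines of a standard m3u8 playlist.
    idr_info: e.g. {"1.ts": [(0.0, 376), (3.0, 368940), (7.0, 892060)], ...}
    A #QQ-HLS-PRIVATE label is inserted in front of the standard #EXTINF tag
    of each TS slice, as in Table 2 (step S720)."""
    out = []
    lines = iter(m3u8_lines)
    for line in lines:
        if line.startswith("#EXTINF"):
            uri = next(lines)  # the slice file name follows its #EXTINF tag
            idrs = idr_info.get(uri.strip(), [])
            if idrs:
                entries = ",".join(f"{ts}|{off}" for ts, off in idrs)
                out.append(f"#QQ-HLS-PRIVATE:{entries}")
            out.extend([line, uri])
        else:
            out.append(line)
    return out
```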
In this embodiment of the present invention, if step S710 and step S720 are executed by a background server, the method may further include: when a play request from the front-end player is received, sending the m3u8 file with the inserted private labels to the front-end player.
Fig. 8 schematically shows a flow chart of a method for video playback according to a further embodiment of the present invention. The method steps of the embodiment of the present invention may be executed by the terminal device, the server, or the terminal device and the server interactively, for example, the terminal device 101 in fig. 1 may be used to execute the method steps, but the present invention is not limited thereto.
In step S810, a play request of the video file is sent.
For example, the front-end player sends a play request of a video file to the back-end server. Wherein the playback request may include an HLS video playback address.
In step S820, an index file returned in response to the play request is received.
For example, after receiving the play request, the background server may retrieve the corresponding index file according to the HLS video play address in the play request, and return the index file to the front-end player. The index file returned here may be the m3u8 file with the private label inserted in the standard m3u8 file as described above.
In step S830, the index file is parsed to obtain the timestamp and the offset position of the IDR frame in each video segment in the video file.
For example, after receiving the index file, the front-end player may parse the index file to obtain a timestamp and an offset position of an IDR frame in each video segment in the video file, and store the timestamp and the offset position locally.
In the embodiment of the present invention, the m3u8 file may be parsed line by line: a tag is parsed according to the standard-tag specification if it is a standard tag, and according to the private-label specification if it is a private label; depending on the specific parsing policy, an unknown tag may be discarded or may directly cause an error to be reported.
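A line-by-line parsing sketch for step S830 follows; only the #EXTINF and #QQ-HLS-PRIVATE handling is shown, other standard tags and unknown tags are simply skipped, and all names are illustrative:

```python
def parse_index(m3u8_text):
    """Parse the m3u8 line by line: collect each TS slice's duration (from the
    standard #EXTINF tag) and its IDR frame list (from the preceding private label)."""
    segments, pending_idrs = [], []
    for line in m3u8_text.splitlines():
        line = line.strip()
        if line.startswith("#QQ-HLS-PRIVATE:"):
            # private label: "timestamp|offset,timestamp|offset,..."
            pending_idrs = [
                (float(ts), int(off))
                for ts, off in (entry.split("|")
                                for entry in line.split(":", 1)[1].split(","))
            ]
        elif line.startswith("#EXTINF:"):
            # standard tag: the number after #EXTINF is the slice duration
            duration = float(line.split(":", 1)[1].rstrip(",").split(",")[0])
            segments.append({"duration": duration, "idr_frames": pending_idrs})
            pending_idrs = []
        # other standard tags (#EXTM3U, #EXT-X-TARGETDURATION, ...) and unknown
        # tags are ignored in this sketch
    return segments
```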
In the embodiment of the present invention, the method may further include: the front end player may start playing the downloaded m3u8 file, for example, in a standard HLS supported form.
In the embodiment of the present invention, the method may further include: when the user performs a SEEK, locating the corresponding TS fragment according to the SEEK position and the duration of each TS fragment (the number after the #EXTINF tag is the duration of each TS fragment), using the following formula (1):
total duration of the first (N-1) TS fragments <= SEEK position < total duration of the first N TS fragments    (1)
In formula (1), N is the serial number of the target TS fragment, where N is a positive integer greater than or equal to 1.
Taking the m3u8 with the private labels inserted in Table 2 above as an example, if the SEEK position is 15 s, substituting into formula (1) gives N = 2, so the TS fragment containing the SEEK position is 2.ts.
In the embodiment of the present invention, the method may further include: and inquiring the IDR frame closest to the SEEK position as a target IDR frame.
For example, the query may proceed as follows: assuming that the serial number of the TS slice containing the SEEK position is N, the timestamp_offset of the SEEK position within the N-th slice is the SEEK position minus the total duration of the first (N-1) TS slices; the last IDR frame in the private label of the target TS slice whose timestamp is not greater than timestamp_offset is the target IDR frame, and its offset position can be obtained from the same label.
Again taking the m3u8 with the private labels inserted in Table 2 above as an example, assume the SEEK position is 15 seconds. The TS slice containing the SEEK position is 2.ts, and the timestamp_offset within the N-th slice is 15 seconds minus 10 seconds, that is, 5 seconds. Traversing the IDR frame list of the target TS slice and finding the last IDR frame whose timestamp is not greater than timestamp_offset yields the IDR frame at 4.6 seconds, whose corresponding offset position is 468940.
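Putting formula (1) and the in-slice lookup together, the whole SEEK location step can be sketched as follows (the `segments` structure is assumed to come from parsing the index file, as sketched earlier; the helper name is illustrative):

```python
def locate_seek(segments, seek_pos):
    """segments: [{"duration": 10.0, "idr_frames": [(ts, offset), ...]}, ...].
    Returns (slice_number_N, idr_timestamp, idr_offset) for the last IDR frame
    that does not lie after the SEEK position, or None if there is none."""
    elapsed = 0.0
    for n, seg in enumerate(segments, start=1):
        if elapsed <= seek_pos < elapsed + seg["duration"]:   # formula (1)
            timestamp_offset = seek_pos - elapsed             # SEEK offset inside slice N
            target = None
            for ts, off in seg["idr_frames"]:
                if ts <= timestamp_offset:
                    target = (n, ts, off)
            return target
        elapsed += seg["duration"]
    return None
```

With 10-second slices and a SEEK position of 15 seconds, slice N = 2 is selected and timestamp_offset is 5 seconds; if 2.ts carries an IDR frame at 4.6 seconds with offset 468940, as in the example above, that frame is returned.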
It should be noted that the above example assumes that the current playing progress is earlier than the SEEK position: for example, if the current playing progress is 5 seconds and the user drags the progress bar forward to 15 seconds, then with the method provided in the embodiment of the present invention, the IDR frame that is closest to, and located before, the 15-second SEEK position, namely the IDR frame with a timestamp of 4.6 seconds in 2.ts, is queried as the target IDR frame. However, the present invention is not limited thereto; in other embodiments, the user may drag the progress bar backwards, that is, the current playing progress may be later than the SEEK position. In still other embodiments, the IDR frame that is closest to, and located after, the SEEK position may be queried as the target IDR frame.
In the embodiment of the present invention, the method may further include: when the front-end player sends an HTTP request to the background server, the HTTP request header carries the offset position of the target IDR frame. Continuing the above example:
Range:bytes=468940-
indicating that the front-end player requests, from the background server, the byte range of the video file starting at byte 468940.
In the embodiment of the present invention, the method may further include: the front-end player receives the video data returned by the background server in response to the HTTP request, that is, the video data starting at the offset position of the target IDR frame (i.e. the video data following the target IDR frame), and decodes and plays it, thereby realizing fast SEEK within a TS (Transport Stream) slice under the HLS protocol.
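On the front-end side, the continuation request of steps S620/S630 can be sketched with an HTTP Range header as follows; the URL and the use of the `requests` library are illustrative assumptions:

```python
import requests

def request_from_offset(video_url, byte_offset):
    """Request the video data starting at the target IDR frame's offset position
    (e.g. Range: bytes=468940-). Per the description, the offset is counted from
    the start of the whole video file."""
    resp = requests.get(
        video_url,  # e.g. "https://example.com/video/source" (hypothetical)
        headers={"Range": f"bytes={byte_offset}-"},
        stream=True,
    )
    resp.raise_for_status()  # a Range-capable server replies 206 Partial Content
    # the returned chunks are handed to the demuxer/decoder for playback;
    # TS packets are 188 bytes long, so a multiple of 188 is a natural chunk size
    return resp.iter_content(chunk_size=188 * 64)
```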
According to the method for video playing disclosed by the embodiment of the invention, when the m3u8 is generated, a private label is inserted before each TS slice on the basis of the standard m3u8, the private label including the timestamps and offset positions of all IDR frames in that TS slice. When a SEEK occurs, the TS slice containing the SEEK position is located according to the SEEK position, the offset position of the IDR frame to be sought is found according to the IDR frame timestamps in the private label, and the content starting at that offset position is requested, decoded and played. On the one hand, this realizes fast SEEK within a TS slice of the HLS protocol; on the other hand, the deviation between the real playing position and the SEEK position is small, which improves the video watching experience of the user.
Fig. 9 schematically shows a block diagram of an apparatus for video playback according to an embodiment of the present invention. The apparatus 900 for playing a video according to the embodiment of the present invention may be disposed in the terminal device 101 in fig. 1, or may be disposed in the server 105 in fig. 1, or may be a partial module disposed in the terminal device 101 and a partial module disposed in the server 105, which is not limited in this respect.
As shown in fig. 9, an apparatus 900 for video playing provided by an embodiment of the present invention may include: a drag information acquisition module 910, a video segment determination module 920, and a target frame search module 930.
The dragging information obtaining module 910 may be configured to obtain current dragging position information of the video file.
The video clip determining module 920 may be configured to determine a target video clip of the video file according to the current drag position information.
The target frame searching module 930 may be configured to search, according to the index file of the video file, a target instant decoding refresh frame closest to the current dragging position in the target video segment.
Wherein the index file includes timestamps for the instant decode refresh frames within each video segment in the video file.
In an exemplary embodiment, the target frame finding module may include a drag timestamp obtaining unit and a target frame determining unit.
Wherein the drag time stamp obtaining unit may be configured to obtain a time stamp of the current drag position within the target video segment.
The target frame determination unit may be configured to determine the target instant decoding refresh frame according to a timestamp of the instant decoding refresh frame within the target video segment and a timestamp of the current drag position within the target video segment.
In an exemplary embodiment, the index file may further include offset locations of the instantaneous decode refresh frames within each video segment of the video file.
In an exemplary embodiment, the apparatus 900 for video playing may further include an offset position obtaining module, a playing request transmitting module, and a video data receiving module.
The offset position obtaining module may be configured to obtain an offset position of the target instant decoding refresh frame according to the index file.
The play request sending module may be configured to send a video continuous play request according to the offset position of the target instant decoding refresh frame.
The video data receiving module may be configured to receive current video data returned in response to the video resume request.
In an exemplary embodiment, the apparatus 900 for video playback may further include a decoding module and a playback module.
Wherein the decoding module may be configured to decode the current video data.
The play module may be configured to play the decoded current video data.
In an exemplary embodiment, the apparatus 900 for video playing may further include a traversal module and a frame information storage module.
The traversal module may be configured to traverse the video frames in each video segment in the video file and obtain the frame type, timestamp, and offset position of each video frame, where the frame type includes an instant decoding refresh frame.
The frame information storage module may be configured to store the acquired timestamps and offset positions of the instant decoding refresh frames in each video segment of the video file into the index file.
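As one possible (non-normative) way to implement the traversal performed by the traversal module, the sketch below shells out to ffprobe (assumed to be installed separately) and collects the timestamp and byte position of each keyframe packet; for H.264 video in TS segments, these keyframes normally correspond to the instant decoding refresh frames the index file needs:

```python
import json
import subprocess
from typing import List, Tuple


def scan_idr_frames(ts_path: str) -> List[Tuple[float, int]]:
    """Return (timestamp, byte offset) pairs for the keyframe packets of a TS segment."""
    cmd = [
        "ffprobe", "-v", "error",
        "-select_streams", "v:0",
        "-show_packets",
        "-show_entries", "packet=pts_time,pos,flags",
        "-of", "json",
        ts_path,
    ]
    out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
    idr_frames: List[Tuple[float, int]] = []
    for packet in json.loads(out).get("packets", []):
        if "K" not in packet.get("flags", ""):
            continue  # not a keyframe packet
        pts, pos = packet.get("pts_time"), packet.get("pos")
        if pts in (None, "N/A") or pos in (None, "N/A"):
            continue  # packet without usable timing or position information
        # Raw presentation timestamps; rebasing them to the segment start is
        # left to the caller, depending on how the index file is consumed.
        idr_frames.append((float(pts), int(pos)))
    return idr_frames
```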
In an exemplary embodiment, the index file may further include a standard tag for each video segment.
In an exemplary embodiment, the frame information storage module may include a tag insertion unit.
Wherein the tag insertion unit may be configured to insert a private label before the standard tag of each video segment in the index file.
Wherein the private label may include the timestamps and offset positions of the instant decoding refresh frames within the corresponding video segment.
In an exemplary embodiment, the standard tag may include the duration of the corresponding video segment.
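To visualize the resulting layout, the sketch below renders one possible extended playlist in which a private label precedes the standard #EXTINF tag of each segment; the tag name #EXT-X-PRIVATE-IDR and its timestamp:offset encoding are invented for illustration, since the patent only requires that the private label carry the IDR timestamps and offset positions and be placed before the standard tag:

```python
from typing import List, Tuple


def build_index_playlist(
    segments: List[Tuple[str, float, List[Tuple[float, int]]]],
    target_duration: int,
) -> str:
    """Render an m3u8-style index file with a private label before each #EXTINF tag.

    `segments` is a list of (uri, duration, [(idr_timestamp, idr_offset), ...]).
    The private tag name and its attribute encoding below are illustrative only.
    """
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        f"#EXT-X-TARGETDURATION:{target_duration}",
        "#EXT-X-MEDIA-SEQUENCE:0",
    ]
    for uri, duration, idr_frames in segments:
        pairs = ",".join(f"{ts:.3f}:{offset}" for ts, offset in idr_frames)
        lines.append(f"#EXT-X-PRIVATE-IDR:{pairs}")  # private label (invented name)
        lines.append(f"#EXTINF:{duration:.3f},")     # standard tag carrying the duration
        lines.append(uri)
    lines.append("#EXT-X-ENDLIST")
    return "\n".join(lines) + "\n"
```

Because standard HLS clients are expected to ignore playlist tags they do not recognize, such an extended playlist should remain playable by ordinary players that do not implement the private label.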
In an exemplary embodiment, the play request sending module may be further configured to send a play request of the video file.
In an exemplary embodiment, the apparatus 900 for video playing may further include an index file receiving module and an index file parsing module.
Wherein the index file receiving module may be configured to receive an index file returned in response to the play request.
The index file parsing module may be configured to parse the index file to obtain the timestamps and offset positions of the instant decoding refresh frames within each video segment in the video file.
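A matching parser for the illustrative format sketched above (again, the private tag name is an assumption rather than part of the patent) could extract the per-segment durations and IDR lists as follows:

```python
from typing import Dict, List, Tuple

PRIVATE_TAG = "#EXT-X-PRIVATE-IDR:"  # illustrative name, matching the builder sketch above


def parse_index_playlist(text: str) -> List[Dict]:
    """Parse the extended playlist into per-segment URI, duration, and IDR list."""
    segments: List[Dict] = []
    pending_idr: List[Tuple[float, int]] = []
    pending_duration: float = 0.0
    for raw in text.splitlines():
        line = raw.strip()
        if line.startswith(PRIVATE_TAG):
            pending_idr = [
                (float(ts), int(offset))
                for ts, offset in (pair.split(":") for pair in line[len(PRIVATE_TAG):].split(","))
            ]
        elif line.startswith("#EXTINF:"):
            pending_duration = float(line[len("#EXTINF:"):].split(",")[0])
        elif line and not line.startswith("#"):
            # A non-tag line names the TS segment; attach the pending metadata to it.
            segments.append({"uri": line, "duration": pending_duration, "idr_frames": pending_idr})
            pending_idr, pending_duration = [], 0.0
    return segments
```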
For the specific implementation of each module and/or unit in the apparatus for video playing provided in the embodiment of the present invention, reference may be made to the above description of the method for video playing, and details are not repeated here.
According to the apparatus for video playing provided by the embodiment of the present invention, the target IDR frame that is closest to, and not later than, the current dragging position within the target video segment is located by means of an index file that includes the timestamps of the IDR frames in each video segment of the video file, so that the positioning precision is improved and the video watching experience is optimized.
Fig. 10 schematically shows an interface diagram of a user dragging a progress bar during video playing.
As shown in fig. 10, the upper portion of the display screen 1000 of the terminal device is the playing interface 1010 of the player, and the lower portion 1020 may display other picture or text information related to the currently played video file, such as user comments, actor information, or script information, which is not limited in the present invention.
The user can click the play button 1011 to start playing the video file; the displayed total duration of the video file is 44:48. The user then drags the progress bar 1012 to the current drag position 1013, which is assumed here to be 04:33.
Fig. 11 is a schematic diagram of an interface for locating a target instant decoding refresh frame by using the method shown in fig. 3.
As shown in fig. 11, following fig. 10, after the user drags the progress bar to 04:33, playback is positioned to the starting point 04:24 of the segment and the video file continues playing from there; the deviation between the actual playing point 04:24 and the target playing point 04:33 is relatively large (9 seconds).
Fig. 12 is a schematic diagram of an interface for locating a target instant decoding refresh frame by using the method shown in fig. 4.
As shown in fig. 12, following fig. 10, after the user drags the progress bar to 04:33, the method provided by the embodiment of the present invention positions playback to 04:30 according to the private label before continuing to play; the deviation between the actual playing point 04:30 and the target playing point 04:33 is small (3 seconds).
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the invention, the features and functions of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiment of the present invention can be embodied in the form of a software product, which can be stored in a non-volatile storage medium (which can be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which can be a personal computer, a server, a touch terminal, a network device, etc.) to execute the method according to the embodiment of the present invention.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.
It will be understood that the invention is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the invention is limited only by the appended claims.

Claims (9)

1. A method for video playback, comprising:
traversing video frames in each video segment in a video file, and acquiring the frame type, the timestamp and the offset position of each video frame, wherein the frame type comprises an instant decoding refresh frame;
storing the acquired timestamps and offset positions of the instant decoding refresh frames in each video segment in the video file to an index file, wherein the index file further comprises a standard tag of each video segment, and the standard tag comprises the duration of the video segment; wherein storing the acquired timestamps and offset positions of the instant decoding refresh frames in each video segment in the video file to the index file comprises: inserting a private label in front of the standard tag of each video segment in the index file, wherein the private label includes the timestamps and offset positions of the instant decoding refresh frames within the corresponding video segment;
acquiring current dragging position information of a video file;
determining a target video segment of the video file according to the current dragging position information and the duration of each video segment in the standard tags in the index file of the video file, wherein the current dragging position is greater than or equal to the total duration of the first (N-1) video segments in the video file and is less than the total duration of the first N video segments in the video file, N being the sequence number of the target video segment and a positive integer greater than or equal to 1;
obtaining a timestamp of the current dragging position within the target video segment according to the current dragging position and the total duration of the first (N-1) video segments, searching the private label of the target video segment in the index file of the video file for the last timestamp that is not greater than the timestamp of the current dragging position within the target video segment, taking the instant decoding refresh frame corresponding to that timestamp as the target instant decoding refresh frame closest to the current dragging position, and obtaining the offset position of the target instant decoding refresh frame.
2. The method of claim 1, further comprising:
sending a video continuous playing request according to the offset position of the target instant decoding refresh frame;
and receiving current video data returned in response to the video continuous playing request.
3. The method of claim 2, further comprising:
decoding the current video data;
and playing the decoded current video data.
4. The method of claim 1, further comprising:
sending a playing request of the video file;
receiving an index file returned in response to the playing request;
and parsing the index file to obtain the timestamps and offset positions of the instant decoding refresh frames in each video segment in the video file.
5. An apparatus for video playback, comprising:
the traversal module is configured to traverse video frames in each video segment in the video file and acquire a frame type, a timestamp and an offset position of each video frame, wherein the frame type comprises an instant decoding refresh frame;
the frame information storage module is configured to store the time stamp and the offset position of the instant decoding refresh frame in each video segment in the acquired video file into an index file; the index file also comprises a standard label of each video clip, and the standard label comprises the duration of the video clip; the frame information storage module comprises a tag insertion unit, wherein the tag insertion unit is configured to insert a private tag in front of a standard tag of each video clip in the index file; wherein the private label includes a timestamp and an offset position of an instant decode refresh frame within a respective video segment;
the system comprises a dragging information acquisition module, a dragging information acquisition module and a dragging information acquisition module, wherein the dragging information acquisition module is configured to acquire current dragging position information of a video file;
a video segment determination module, configured to determine a target video segment of the video file according to the current dragging position information and the duration of each video segment in the standard tags in the index file of the video file, wherein the current dragging position is greater than or equal to the total duration of the first (N-1) video segments in the video file and is less than the total duration of the first N video segments in the video file, N being the sequence number of the target video segment and a positive integer greater than or equal to 1;
and the target frame searching module is configured to obtain a timestamp of the current dragging position within the target video segment according to the current dragging position and the total duration of the first (N-1) video segments, search the private label of the target video segment in the index file of the video file for the last timestamp that is not greater than the timestamp of the current dragging position within the target video segment, use the instant decoding refresh frame corresponding to that timestamp as the target instant decoding refresh frame closest to the current dragging position, and obtain the offset position of the target instant decoding refresh frame.
6. The apparatus of claim 5, further comprising:
the playing request sending module is configured to send a video continuous playing request according to the offset position of the target instant decoding refresh frame;
and the video data receiving module is configured to receive the current video data returned in response to the video continuous playing request.
7. The apparatus of claim 6, further comprising:
a decoding module configured to decode the current video data;
and the playing module is configured to play the decoded current video data.
8. A computer-readable medium having a computer program stored thereon, wherein the program, when executed by a processor, implements the method according to any one of claims 1 to 4.
9. An electronic device, comprising:
one or more processors;
a storage device configured to store one or more programs that, when executed by the one or more processors, cause the one or more processors to implement the method of any of claims 1-4.
CN201811166174.XA 2018-10-08 2018-10-08 Method and device for video playing, computer readable medium and electronic equipment Active CN109348251B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811166174.XA CN109348251B (en) 2018-10-08 2018-10-08 Method and device for video playing, computer readable medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811166174.XA CN109348251B (en) 2018-10-08 2018-10-08 Method and device for video playing, computer readable medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN109348251A CN109348251A (en) 2019-02-15
CN109348251B (en) 2021-05-11

Family

ID=65308670

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811166174.XA Active CN109348251B (en) 2018-10-08 2018-10-08 Method and device for video playing, computer readable medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN109348251B (en)

Also Published As

Publication number Publication date
CN109348251A (en) 2019-02-15

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant